Compare commits

...

71 Commits

Author SHA1 Message Date
Thomas Boop
496ec0df97 291.1 release notes (#1860)
* Revert "Added ability to run Dockerfile.SUFFIX ContainerAction (#1738)"

This reverts commit 20b7e86e47.

* 291.1 release notes
2022-04-29 10:55:20 -04:00
Ferenc Hammerl
dac4363206 Update releaseVersion 2022-04-29 13:53:22 +02:00
Ferenc Hammerl
7ba4f8587e 2.291.0 Release Notes (#1854)
* Update releaseNote.md

* Update runnerversion

* Update releaseNote.md

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
2022-04-29 12:53:36 +02:00
ruvceskistefan
88f7c56757 Issue 1528: use OS specific path separator (#1617)
* Issue 1528: use OS specific path separator

* Using Path.Combine instead of OS specific c_defaultPathSeparator
2022-04-27 22:16:03 -04:00
Nikola Jokic
20b7e86e47 Added ability to run Dockerfile.SUFFIX ContainerAction (#1738)
* Added ability to run Dockerfile.SUFFIX ContainerAction

* Extracted IsDockerFile method

* reformatted, moved from index to Last()

* extracted IsDockerfile to DockerUtil with L0

* added check for IsDockerfile to account for docker://

* updated test to clearly show path/dockerfile:tag

* fail if Data.Image is not Dockerfile or docker://[image]
2022-04-27 21:23:12 -04:00
Tingluo Huang
bd5f275830 Update runnerversion to match latest release. 2022-04-26 09:54:42 -04:00
Yang Cao
a7aadf5615 Update Actions Summary limit to 1MiB (#1839)
* Update Actions Summary limit to 1MiB

* Making limit a public const so other parts of the codebase are aware of the limit too
2022-04-20 17:08:50 -04:00
Tingluo Huang
1c582abc8b Skip running L0 tests in release workflow to prevent package pollution (#1832) 2022-04-19 16:10:47 -04:00
Tingluo Huang
44d4d076fe Capture telemetry when git errors on unsafe repository. (#1823) 2022-04-13 12:48:52 -04:00
Ferenc Hammerl
b6195624ac 2.290.0 Release notes (#1820)
* 2.290.0 rel notes

* Update releaseNote.md
2022-04-12 10:34:15 -04:00
ruvceskistefan
ead3509d5a Added warning in case of invalid combination of command and flags and/or arguments (#1781)
* Added warning in case of invalid combination of command and flags and/or arguments

* Deleting unnecessary comments

* Added separate list for generic options

* Added PAT to the valid remove options

* Added command name to the error message
2022-04-11 09:23:58 -04:00
Nikola Jokic
fee24199cb Added input context to shell in composite run-step (#1767)
* Added input context to shell in composite run-step

* moved from string-shell-context to string-steps-context
2022-04-11 14:50:45 +02:00
Nikola Jokic
c8cb600ac7 Use StepHost when evaluating inputs to actions (#1762)
* composite action github.action_path set based on the StepHost

* in progress on updating github context for input template

* Fixed updating the context data for evaluation

* refactored logic so it is a little cleaner

* removed resolving the action_path in CompositeActionHandler

* removed added DeepClone

* added feature flag and modified the dict in place

* refactored step host to change context data. Added L0

* repaired spaces

* moved logic from step host to execution context, added recursive translation

* removed empty lines

* moved to extension methods

* Update src/Test/L0/Worker/StepHostL0.cs

Co-authored-by: Ferenc Hammerl <31069338+fhammerl@users.noreply.github.com>
2022-04-11 12:43:24 +00:00
Thomas Boop
f48f314a70 fix an issue where container hooks used the job default working directory (#1809) 2022-04-06 14:33:32 +02:00
Soe Tun
7b677e0618 Mark run as Cancelled/Failed upon HostContext.RunnerShutdownToken state (#1792)
- github/c2c-actions-support#883
2022-04-04 13:46:03 -04:00
Nikola Jokic
d70f9f6174 Continue on error for the composite actions (#1763)
* Added continue on error to composite action

* changed from boolean-strategy-context -> boolean-steps-context for action_yaml

* refactored composite handler to always set outcome

* retrigger checks

* fixed typo in ??= operator

* boolean-steps-context accepts the same context as string-steps-context

* setting the outcome only on continue-on-error

* moved continue on error logic to the execution context

* Added L0 table tests for continue-on-error ExecutionContext

* Added missing mocks on StepsRunnerL0 for this update

* removed empty line and added one line separating the call

* Removed empty line

Co-authored-by: Ferenc Hammerl <31069338+fhammerl@users.noreply.github.com>

Co-authored-by: Ferenc Hammerl <31069338+fhammerl@users.noreply.github.com>
2022-04-01 09:18:53 -04:00
dependabot[bot]
0343e76789 Bump minimist from 1.2.5 to 1.2.6 in /src/Misc/expressionFunc/hashFiles (#1783)
Bumps [minimist](https://github.com/substack/minimist) from 1.2.5 to 1.2.6.
- [Release notes](https://github.com/substack/minimist/releases)
- [Commits](https://github.com/substack/minimist/compare/1.2.5...1.2.6)

---
updated-dependencies:
- dependency-name: minimist
  dependency-type: indirect
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-03-29 22:40:32 -04:00
Yashwanth Anantharaju
909b05eb66 FeedStream: handle websocket close failures (#1789)
* handle close failures

* handle in other place as well

* refactor

* bump runner version

* update release notes
2022-03-28 14:41:21 -04:00
Yashwanth Anantharaju
2e3976cf97 Feedstream websocket: set user agent (#1791)
* set user agent

* let's also add prefix
2022-03-28 14:31:23 -04:00
ruvceskistefan
052ac521b0 Issue 1739: Fixing null reference exception during configuring runner with invalid repo URL or token (#1741)
* Fixing null reference exception when configuring runner with invalid repo URL or token

* Throw exception instead of ConvertFromJson

* Storing the response code
2022-03-28 09:06:24 -04:00
Ferenc Hammerl
408d6c579c Add annotations if Node 12 action is found and FF is on (#1735)
* Add annotations if node 12 action is found

* Better placeholder

* Only warn if FF is on

* Move annotation logic

* Pass in the LTS Url

* Raise annotation right before executing the action

* Match server side FF name

* Change name back to features

* Better warning text

* Update src/Runner.Common/Constants.cs

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
2022-03-22 10:43:25 +01:00
Thomas Boop
46258428cd 2.289.1 release notes (#1771) 2022-03-18 14:22:24 -04:00
Thomas Boop
eb9a604b63 Revert "Added repository name and workflow file name to console output (#1761)" (#1770)
This reverts commit 98aa9c1152.
2022-03-18 14:09:01 -04:00
Thomas Boop
8792d8e5ee cleanup message displayed on job started/completed hooks (#1769) 2022-03-18 14:08:50 -04:00
Thomas Boop
87e86e3d72 2.289.0 release notes (#1766) 2022-03-18 16:12:50 +01:00
Thomas Boop
48b6cd9a42 Update dependencies to latest versions (#1756) 2022-03-17 23:21:35 -04:00
Yashwanth Anantharaju
d081289ed5 postlines: refactor per feedback (#1755)
* refactor per feedback

* feedback

* nit

* commentify

* feedback

* feedback
2022-03-17 21:35:20 -04:00
Ferenc Hammerl
7d5e9cd70f Runner Job Started/Completed Hooks (#1737)
* Prototype for pre job hook

* Remove debug log

* Enable hooks again

* Initialize with hostContext

* Add event_path, fix no-path bug

* Allow script post steps

* Call script handler with correct pre post stage

* Add job completed hook

* Make filecommand work and hardcode shell

* Conditionally print step details and no telemetry for hooks

* Figure out which script to use

* Only check path for managed scripts

* Restore win dependency

* Nits

* Remove unused, add named params

* Telemetry + refactoring

* add message to job

* rename hooks remove stale comment

* cleanup

* Use .CreateService to create step

* Add L0s

* pr feedback

* update tests

* add disclaimer, clean up code

* spacing fix

* little more cleanup

* pr fix

* pr feedback

* Refactor to use JobExtension

* fix tests

* fix typo

* cleanup code

* more cleanup

* little more cleanup

* last bit of cleanup

* fix tests

* nit fix

* Update src/Runner.Worker/JobHookProvider.cs

Co-authored-by: Edward Thomson <ethomson@github.com>

* don't override runner telemetry

* pr feedback

* pr feedback

* pr feedback

Co-authored-by: Thomas Boop <thboop@github.com>
Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
Co-authored-by: Edward Thomson <ethomson@github.com>
2022-03-17 21:35:04 -04:00
ruvceskistefan
98aa9c1152 Added repository name and workflow file name to console output (#1761)
* Adding repo name and workflow file to console output

* Add guard for empty workflow file name
2022-03-17 13:25:28 -04:00
Yashwanth Anantharaju
ddc700e9eb Send postlines via websocket if we can (#1730)
* feed via websocket

* feed via websocket

* feedback

* ensure right schema is used

* fix resiliency

* some fixes

* fix sending message

* chunk data

* let's abort, which will also dispose

* close gracefully
2022-03-15 14:01:18 -04:00
Konrad Pabjan
a0458aebfe Save record order for annotation links when creating issues (#1744)
* Save record order for annotation links when creating issues

* PR feedback

* Add tests for step and line numbers
2022-03-14 11:20:11 -04:00
Tingluo Huang
b2c6d093b2 Validate packages hash before uploading to github release in CD workflow. (#1745) 2022-03-14 09:21:13 -04:00
Thomas Boop
292a2e0ab3 Fix spelling (#1747) 2022-03-11 09:41:54 -05:00
Nikola Jokic
29cee52276 Prefer user who initiated install before (#1714) 2022-03-10 14:08:19 +01:00
Antoine Grondin
ad0d0c4d0a worker: expose github.triggering_actor as an env-var (#1726)
* worker: expose `github.triggering_actor` as an env-var

* worker: sort the allow list
2022-03-02 16:49:26 -05:00
Thomas Boop
2c6064a655 Update to v2.288.1 (#1723)
We hotfixed the releases/m288 branch to update to v2.288.1. This PR brings us to parity on the main branch
2022-03-01 18:19:56 +00:00
ruvceskistefan
af6c8e6edd Issue 1698: Use safe_sleep executable in bash scripts (#1707)
* use safe_sleep executable in bash scripts

* new line at the end of safe_sleep bash script

* Replacing relative paths with absolute paths and changing location of safe_sleep

Co-authored-by: Ferenc Hammerl <31069338+fhammerl@users.noreply.github.com>
2022-03-01 13:08:52 +00:00
Nikola Jokic
c15d3f10b2 Enhancement: RunnerService.js added logic to fail on N attempts if env variable exported (#1693)
* RunnerService.js added logic to fail on N attempts

* removed gracefulShutdown code, removed unused import
2022-03-01 13:55:25 +01:00
TingluoHuang
bdf1e90503 Prepare 2.288.0 runner releases. 2022-02-25 15:08:26 -05:00
Ferenc Hammerl
100c99f263 Force JS Actions Node version to 16 if FF is on unless user opted out (#1716)
* Set GH actions Node version to 16 if FF is on unless user opted out

* Add L0s (WIP)

* Wrap tests into theory

* Only check for node12 actions

* Refactor node version picking
2022-02-25 14:59:16 -05:00
Ferenc Hammerl
e8ccafea63 Add internal to node version function and use better env var name (#1715) 2022-02-25 14:59:02 -05:00
Thomas Boop
02d2cb8fcd Lets allow up to 150 characters for services on linux/mac (#1710)
* Lets allow up to 150 characters on linux/mac, just to avoid some issues with runner naming

* Add 4 randomized digits on mac/linux

* fix pragma issue

* fix test

* Address pr feedback

* reduce complexity

* lets make it cleaner!

* fix test

* fix logic
2022-02-24 21:05:51 -05:00
Rob Herley
0cbf3351f4 Update summary max file size annotation error text (#1712)
* add doc link to summary file size err annotation

* Update src/Runner.Common/Constants.cs

Co-authored-by: Konrad Pabjan <konradpabjan@github.com>

* remove i18n from url

Co-authored-by: Konrad Pabjan <konradpabjan@github.com>
2022-02-24 21:01:49 -05:00
Ferenc Hammerl
6abef8199f Use better exit codes and comparison (#1708) 2022-02-24 21:10:52 +01:00
ruvceskistefan
ec9830836b Issue 1596: Runner throws null ref exception when new line after EOF is missing (#1687)
* Issue 1596: runner throws nullref exception when writing env var

* Adding tests for missing new line after EOF marker

* Changing newline to new line
2022-02-23 09:55:59 -05:00
Nikola Jokic
460c32a337 Repaired hashFiles call so if error was thrown, it was returned to process invoker (#1678)
* hashFiles.ts added exit status on promise action

* generated layoutbin/hashfiles/index.js
2022-02-23 09:51:09 -05:00
ruvceskistefan
934027da60 Issue 1261: inconsistency of outputs (both canceled and cancelled are used) (#1624)
* Issue 1261: inconsistency of outputs

* Changing cancelled to canceled in one error message
2022-02-23 09:48:24 -05:00
Tingluo Huang
28f0027938 Add SHA to useragent. (#1694) 2022-02-17 20:01:48 +00:00
Thomas Boop
17153c9b29 Revert "revert node12 version due to fs.copyFileSync hang https://git… (#1651)
* Revert "revert node12 version due to fs.copyFileSync hang https://github.com/actions/runner/issues/1536 (#1537)"

This reverts commit bef164a12f.

* check hashes before tests because tests rely on right values + update hashes

* fix tests

* use hc trace
2022-02-17 09:54:13 -05:00
Tingluo Huang
a65ac083b4 Skip DeleteAgentSession when the access token has been revoked. (#1692) 2022-02-16 16:10:18 -05:00
Tingluo Huang
882f36dcf8 Sending telemetry about actions usage. (#1688)
* Sending telemetry about actions usage.

* .

* L0 tests.

* .
2022-02-16 12:18:21 -05:00
ruvceskistefan
f2578529b0 Issue 1662: retry policy for methods GetTenantCredential and GetJITRunnerTokenAsync (#1691)
* Issue 1662: Adding retry policy for methods GetTenantCredential and GetJITRunnerTokenAsync

* Adding HttpClient creation to the retry

* Random backoff time
2022-02-16 10:56:45 -05:00
Ferenc Hammerl
bd77ccf34e Prefer node16 over node12 when running internal scripts (#1621)
* Use 16 to run RunnerService.js

* Execute hashfiles using node16

* Run downloadCert.js using node16

* Run makeWebRequest.js using node16

* Run macos-run-invoker.js using node16

* Run hashFiles.js using node16

* Update tests to use node16

* Update documentation to recommend node16

* Duplicate macos service js fix for 16

* Add PR link

* Revert ADR node change

* Merge node12/16 retainment IFs

* Try both node12 and node16

* Close if

* Revert "Update tests to use node16"

This reverts commit bbca7b9f1c.

* Fix condition

* Unfurl if condition

* Allow user to force a node version

* Format update template

* Comment env var

* Rename vars

* Fix naming

* Fix rename

* Set node ver override if job message has it

* Format executionContext

* Can only receive 'forceNode12' or nothing from FF
No specific node version from server

Co-authored-by: Ferenc Hammerl <hammerl.ferenc@gmail.com>
2022-02-14 15:06:08 +01:00
Tingluo Huang
cb19da9638 Move JobTelemetry and StepsTelemetry into GlobalContext. (#1680)
* Move JobTelemetry and StepsTelemetry into GlobalContext.

* .

* .
2022-02-11 16:18:41 -05:00
Ferenc Hammerl
d64190927f Allow mocked updates for E2E testing (#1654)
* Allow mock update messages

* Kill node process see other PR

* Better comments

* Better comparison for archiveFile

* Revert merge comment mistakes

Co-authored-by: Ferenc Hammerl <hammerl.ferenc@gmail.com>
2022-02-11 21:27:26 +01:00
Sam Cook
101b74cab6 Add ability to specify runner group when creating service (#1675) 2022-02-11 21:25:55 +01:00
Nikola Jokic
c06da82ccd updated systemd svc.sh to accept custom service file (#1612)
* updated systemd svc.sh to accept custom service file

* updated systemd and darwin svc templates to accept TEMPLATE_PATH env
2022-02-11 09:29:48 -05:00
Balaga Gayatri
374989b280 Pass jobId to the actionsDownloadInfo controller (#1639)
* Update JobServer.cs

* Update ActionManager.cs

* Update TaskHttpClientBase.cs

* Update ActionManagerL0.cs

* Update ActionManager.cs

* :nit changes

* Update ActionManager.cs

* :nit changes

* Code formatting

* Update JobServer.cs

* Update JobServer.cs

* Update TaskHttpClientBase.cs

* Update ActionManagerL0.cs

* :nit changes

* passing `jobId` as queryparameter to the controller

* :nit changes

* Update src/Sdk/DTGenerated/Generated/TaskHttpClientBase.cs

Co-authored-by: Lokesh Gopu <lokesh755@github.com>

Co-authored-by: Tingluo Huang <tingluohuang@github.com>
Co-authored-by: Lokesh Gopu <lokesh755@github.com>
2022-02-09 14:42:13 -05:00
Tingluo Huang
47fee8cd64 Fix typo in hashFiles.ts. (#1672)
* Fix typo in hashFiles.ts.

* l0

* .
2022-02-09 13:13:51 -05:00
ruvceskistefan
85dcd93d98 Problem with debugging on macOS M1 (#1625)
* Solving issue with debugging on macOS M1

* Fixing problem with debugging on macOS M1

* Adding targetArchitecture in launch.json configs

* Code refactor
2022-02-08 14:17:46 -05:00
Rob Herley
bac91075f4 Use job execution context instead of step for adding summary attachments (#1667)
* use job exec context to queue/attach summaries

* step summary tests: use job ctx and verify against server queue
2022-02-08 10:22:36 -08:00
Ferenc Hammerl
9240a1cf6c Fix windows console runner update crash (#1670)
* Kill node process to recover handle

So we can print to the console in Runner.Listener once again

* Revert testing changes

Co-authored-by: Ferenc Hammerl <hammerl.ferenc@gmail.com>
2022-02-08 15:27:04 +01:00
Tim Burgan
2946801fb6 Added examples and aligned language within docs/checks/actions.md (#1664)
* examples

* Update actions.md
2022-02-07 10:45:23 -05:00
Sven Pfleiderer
1a0d588d3a Add support for Step Summary (#1642)
* First prototype of step summary environment variable

* Fix file contention issue

* Try to simplify cleaning up file references

* use step id as md file name, queue file attachment

* separate logic into attachment summary func

* Fix indentation

* Add (experimental) feature flag support

* reorganize summary upload determination logic

* file i/o exception handling + pr feedback

* Revert changes for now to reintroduce them later

* Add skeleton SetStepSummaryCommand

* Update step summary feature flag name

* Port ShouldUploadAttachment from previous iteration

* Port QueueStepSummaryUpload from previous iteration

* Improve exception handling when uploading attachment

* Add some minor logging improvements

* Refuse to upload files larger than 128k

* Implement secrets scrubbing

* Add TODO comment to remove debugging temp files

* Add first tests

* Add test for secret masking

* Add some naming/style fixes suggested in feedback

* inline check for feature flag

* Inline method for style consistency

* Make sure that scrubbed file doesn't exist before creating it

* Rename SetStepSummaryCommand to CreateStepSummaryCommand

* Fix error handling messages

* Fix file command name when registering extension

* Remove unnecessary file deletion

Co-authored-by: Rob Herley <robherley@github.com>
2022-02-04 13:46:30 -08:00
jeremyd2019
192ebfeccf fix run.cmd script (#1633)
Restore ability to run run.cmd from directories other than the runner root, and fix it exiting the cmd that's running it.  Fixes #1632
2022-02-02 12:09:13 +01:00
Ferenc Hammerl
f2347b7a59 Use absolute path when invoking run-helper.sh or Runner.Listener (#1645) 2022-02-02 12:08:35 +01:00
Ferenc Hammerl
8f160bc084 Reopen 'Make run.sh|cmd handle update without quitting so containers using them as entrypoints don't exit on update ' (#1646)
* Only execute post for actions that have one

* Working container runner update with run.sh

* Revert "Only execute post for actions that have one"

This reverts commit 9675941fdb.

* Relaunch the listener without quitting run.cmd

* Fix typo

* Extract most os run.sh logic so we can update it

* Add bash line endings

* Extract the logic from run.cmd

* Add EoF lines

* Add unexpected ERRORLEVEL messages to cmd

* Simplify contract between run and helper

* Remove unused exit

* WIP: run a copy of the helper so it's safe to update

* Throw NonRetryableException if not configured

* Log and format

* Fix typo

* Fix typo

* Use helper template system for bash as well

* Update run.sh

* Remove unnecessary comments

* Use ping instead of timeout

* Use localhost in ping-timeout (n times, w timeout)

Co-authored-by: Ferenc Hammerl <hammerl.ferenc@gmail.com>
2022-02-02 11:16:01 +01:00
Thomas Boop
47ba1203c9 Revert "Make run.sh|cmd handle update without quitting so container… (#1635)
* Revert "Make `run.sh|cmd` handle update without quitting so containers using them as entrypoints don't exit on update (#1494)"

This reverts commit d8251bf912.

* update runnerversion as well
2022-02-01 15:19:04 +01:00
Thomas Boop
dc8b1b685f Runner 2.287.0 Release Notes (#1631)
* Update runner to 2.287.0

* Update release notes
2022-01-27 11:28:40 -05:00
Tingluo Huang
8eacbdc79f Runner config option to disable auto-update. (#1558)
* Runner config option to disable auto-update.

* Update src/Runner.Listener/Configuration/ConfigurationManager.cs

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

* Update src/Runner.Listener/Configuration/ConfigurationManager.cs

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

* Update src/Runner.Listener/Configuration/ConfigurationManager.cs

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

* Update src/Runner.Listener/Configuration/ConfigurationManager.cs

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

* feedback.

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
2022-01-26 13:23:24 -05:00
Pavel Iakovenko
6b4a95cdb1 Use default 8Mb chunking for the FileContainer uploads (#1626) 2022-01-24 13:57:05 -05:00
98 changed files with 6830 additions and 2488 deletions

View File

@@ -50,13 +50,6 @@ jobs:
${{ matrix.devScript }} layout Release ${{ matrix.runtime }}
working-directory: src
# Run tests
- name: L0
run: |
${{ matrix.devScript }} test
working-directory: src
if: matrix.runtime != 'linux-arm64' && matrix.runtime != 'linux-arm'
# Check runtime/externals hash
- name: Compute/Compare runtime and externals Hash
shell: bash
@@ -80,6 +73,13 @@ jobs:
DOTNET_RUNTIME_HASH: ${{hashFiles('**/_layout_trims/runtime/**/*')}}
EXTERNALS_HASH: ${{hashFiles('**/_layout_trims/externals/**/*')}}
# Run tests
- name: L0
run: |
${{ matrix.devScript }} test
working-directory: src
if: matrix.runtime != 'linux-arm64' && matrix.runtime != 'linux-arm'
# Create runner package tar.gz/zip
- name: Package Release
if: github.event_name != 'pull_request'

View File

@@ -101,11 +101,11 @@ jobs:
working-directory: src
# Run tests
- name: L0
run: |
${{ matrix.devScript }} test
working-directory: src
if: matrix.runtime != 'linux-arm64' && matrix.runtime != 'linux-arm'
#- name: L0
# run: |
# ${{ matrix.devScript }} test
# working-directory: src
# if: matrix.runtime != 'linux-arm64' && matrix.runtime != 'linux-arm'
# Create runner package tar.gz/zip
- name: Package Release
@@ -260,6 +260,17 @@ jobs:
console.log(releaseNote)
core.setOutput('version', runnerVersion);
core.setOutput('note', releaseNote);
- name: Validate Packages HASH
working-directory: _package
run: |
ls -l
echo "${{needs.build.outputs.win-x64-sha}} actions-runner-win-x64-${{ steps.releaseNote.outputs.version }}.zip" | shasum -a 256 -c
echo "${{needs.build.outputs.osx-x64-sha}} actions-runner-osx-x64-${{ steps.releaseNote.outputs.version }}.tar.gz" | shasum -a 256 -c
echo "${{needs.build.outputs.linux-x64-sha}} actions-runner-linux-x64-${{ steps.releaseNote.outputs.version }}.tar.gz" | shasum -a 256 -c
echo "${{needs.build.outputs.linux-arm-sha}} actions-runner-linux-arm-${{ steps.releaseNote.outputs.version }}.tar.gz" | shasum -a 256 -c
echo "${{needs.build.outputs.linux-arm64-sha}} actions-runner-linux-arm64-${{ steps.releaseNote.outputs.version }}.tar.gz" | shasum -a 256 -c
# Create GitHub release
- uses: actions/create-release@master
id: createRelease

.vscode/launch.json
View File

@@ -13,6 +13,7 @@
"cwd": "${workspaceFolder}/src",
"console": "integratedTerminal",
"requireExactSource": false,
"targetArchitecture": "x86_64"
},
{
"name": "Run",
@@ -25,6 +26,7 @@
"cwd": "${workspaceFolder}/src",
"console": "integratedTerminal",
"requireExactSource": false,
"targetArchitecture": "x86_64"
},
{
"name": "Configure",
@@ -38,6 +40,7 @@
"cwd": "${workspaceFolder}/src",
"console": "integratedTerminal",
"requireExactSource": false,
"targetArchitecture": "x86_64"
},
{
"name": "Debug Worker",
@@ -45,6 +48,7 @@
"request": "attach",
"processName": "Runner.Worker",
"requireExactSource": false,
"targetArchitecture": "x86_64"
},
{
"name": "Attach Debugger",
@@ -52,6 +56,8 @@
"request": "attach",
"processId": "${command:pickProcess}",
"requireExactSource": false,
"targetArchitecture": "x86_64"
},
],
}

View File

@@ -6,13 +6,35 @@
Make sure the runner has access to actions service for GitHub.com or GitHub Enterprise Server
- For GitHub.com
- The runner needs to access https://api.github.com for downloading actions.
- The runner needs to access https://vstoken.actions.githubusercontent.com/_apis/.../ for requesting an access token.
- The runner needs to access https://pipelines.actions.githubusercontent.com/_apis/.../ for receiving workflow jobs.
- The runner needs to access `https://api.github.com` for downloading actions.
- The runner needs to access `https://vstoken.actions.githubusercontent.com/_apis/.../` for requesting an access token.
- The runner needs to access `https://pipelines.actions.githubusercontent.com/_apis/.../` for receiving workflow jobs.
These can be tested by running the following `curl` commands from your self-hosted runner machine:
```
curl -v https://api.github.com/api/v3/zen
curl -v https://vstoken.actions.githubusercontent.com/_apis/health
curl -v https://pipelines.actions.githubusercontent.com/_apis/health
```
- For GitHub Enterprise Server
- The runner needs to access https://myGHES.com/api/v3 for downloading actions.
- The runner needs to access https://myGHES.com/_services/vstoken/_apis/.../ for requesting an access token.
- The runner needs to access https://myGHES.com/_services/pipelines/_apis/.../ for receiving workflow jobs.
- The runner needs to access `https://[hostname]/api/v3` for downloading actions.
- The runner needs to access `https://[hostname]/_services/vstoken/_apis/.../` for requesting an access token.
- The runner needs to access `https://[hostname]/_services/pipelines/_apis/.../` for receiving workflow jobs.
These can be tested by running the following `curl` commands from your self-hosted runner machine, replacing `[hostname]` with the hostname of your appliance, for instance `github.example.com`:
```
curl -v https://[hostname]/api/v3/zen
curl -v https://[hostname]/_services/vstoken/_apis/health
curl -v https://[hostname]/_services/pipelines/_apis/health
```
A common cause of these connectivity issues is that your GitHub Enterprise Server appliance is using [the self-signed certificate that is enabled the first time](https://docs.github.com/en/enterprise-server/admin/configuration/configuring-network-settings/configuring-tls) your appliance is started. As self-signed certificates are not trusted by web browsers and Git clients, these clients (including the GitHub Actions runner) will report certificate warnings.
We recommend [uploading a certificate signed by a trusted authority](https://docs.github.com/en/enterprise-server/admin/configuration/configuring-network-settings/configuring-tls) to GitHub Enterprise Server, or enabling the built-in [Let's Encrypt support](https://docs.github.com/en/enterprise-server/admin/configuration/configuring-network-settings/configuring-tls).
## What is checked?
@@ -42,4 +64,4 @@ Make sure the runner has access to actions service for GitHub.com or GitHub Ente
## Still not working?
Contact GitHub customer service or log an issue at https://github.com/actions/runner if you think it's a runner issue.
Contact [GitHub Support](https://support.github.com) if you have further questions, or log an issue at https://github.com/actions/runner if you think it's a runner issue.

View File

@@ -4,9 +4,9 @@
Make sure the built-in node.js has access to GitHub.com or GitHub Enterprise Server.
The runner carries it's own copy of node.js executable under `<runner_root>/externals/node12/`.
The runner carries its own copy of node.js executable under `<runner_root>/externals/node16/`.
All javascript base Actions will get executed by the built-in `node` at `<runner_root>/externals/node12/`.
All javascript base Actions will get executed by the built-in `node` at `<runner_root>/externals/node16/`.
> Not the `node` from `$PATH`

View File

@@ -1,22 +1,11 @@
## Features
- Bump runtime to dotnet 6 (#1471)
- Show service container logs on teardown (#1563)
## Bugs
- Add masks for multiline secrets from ::add-mask:: (#1521)
- fix Log size and retention settings not work (#1507)
- Refactor SelfUpdater adding L0 tests. (#1564)
- Fix test failure: /bin/sleep on Macos 11 (Monterey) does not accept the suffix s. (#1472)
- Fixed a bug where windows path separators were used in generated folders (#1617)
- Fixed an issue where runners invoked via `run.sh` or `run.cmd` did not properly restart after update (#1812). This fix applies to all future updates after installing this version
## Misc
- Update dependency check for dotnet 6. (#1551)
- Produce trimmed down runner packages. (#1556)
- Deleted extra background in github-praph.png, which is displayed in README.md (#1432)
- Relaxed Actions Summary size limit to 1MiB (#1839)
## Windows x64
We recommend configuring the runner in a root folder of the Windows drive (e.g. "C:\actions-runner"). This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows.

View File

@@ -1 +1 @@
<Update to ./src/runnerversion when creating release>
2.291.1

View File

@@ -13,7 +13,7 @@ set -e
flags_found=false
while getopts 's:g:n:u:l:' opt; do
while getopts 's:g:n:r:u:l:' opt; do
flags_found=true
case $opt in
@@ -26,6 +26,9 @@ while getopts 's:g:n:u:l:' opt; do
n)
runner_name=$OPTARG
;;
r)
runner_group=$OPTARG
;;
u)
svc_user=$OPTARG
;;
@@ -44,6 +47,7 @@ Usage:
-s required scope: repo (:owner/:repo) or org (:organization)
-g optional ghe_hostname: the fully qualified domain name of your GitHub Enterprise Server deployment
-n optional name of the runner, defaults to hostname
-r optional name of the runner group to add the runner to, defaults to the Default group
-u optional user svc will run as, defaults to current
-l optional list of labels (split by comma) applied on the runner"
exit 0
@@ -59,6 +63,7 @@ if ! "$flags_found"; then
runner_name=${3:-$(hostname)}
svc_user=${4:-$USER}
labels=${5}
runner_group=${6}
fi
# apply defaults
@@ -164,8 +169,8 @@ fi
echo
echo "Configuring ${runner_name} @ $runner_url"
echo "./config.sh --unattended --url $runner_url --token *** --name $runner_name --labels $labels"
sudo -E -u ${svc_user} ./config.sh --unattended --url $runner_url --token $RUNNER_TOKEN --name $runner_name --labels $labels
echo "./config.sh --unattended --url $runner_url --token *** --name $runner_name ${labels:+--labels $labels} ${runner_group:+--runnergroup \"$runner_group\"}"
sudo -E -u ${svc_user} ./config.sh --unattended --url $runner_url --token $RUNNER_TOKEN --name $runner_name ${labels:+--labels $labels} ${runner_group:+--runnergroup "$runner_group"}
#---------------------------------------
# Configuring as a service
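
The new `config.sh` invocation above relies on Bash's `${var:+...}` expansion so that `--labels` and `--runnergroup` are only passed when a value was actually supplied. A minimal standalone sketch of that pattern (the variable values here are illustrative, not part of the script):

```bash
#!/bin/bash
# ${var:+text} expands to "text" only when var is set and non-empty,
# so an optional flag simply disappears from the command line otherwise.
labels="self-hosted,linux"
runner_group=""   # empty -> no --runnergroup flag is emitted

echo ./config.sh --unattended ${labels:+--labels "$labels"} ${runner_group:+--runnergroup "$runner_group"}
# prints: ./config.sh --unattended --labels self-hosted,linux
```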

View File

@@ -1 +1 @@
6ca4a0e1c50b7079ead05321dcf5835c1c25f23dc632add8c1c4667d416d103e
6ed30a2c1ee403a610d63e82bb230b9ba846a9c25cec9e4ea8672fb6ed4e1a51

View File

@@ -1 +1 @@
b5951dc607d782d9c7571a7224e940eb0975bb23c54ff25c7afdbf959a417081
711c30c51ec52c9b7a9a2eb399d6ab2ab5ee1dc72de11879f2f36f919f163d78

View File

@@ -1 +1 @@
af819e92011cc9cbca90e8299f9f7651f2cf6bf45b42920f9a4ca22795486147
a49479ca4b4988a06c097e8d22c51fd08a11c13f40807366236213d0e008cf6a

View File

@@ -1 +1 @@
aa0e6bf4bfaabf48c962ea3b262dca042629ab332005f73d282faec908847036
8e97df75230b843462a9b4c578ccec604ee4b4a1066120c85b04374317fa372b

View File

@@ -1 +1 @@
40328cff2b8229f9b578f32739183bd8f6aab481c21dadc052b09f1c7e8e4665
f75a671e5a188c76680739689aa75331a2c09d483dce9c80023518c48fd67a18

View File

@@ -1,6 +1,6 @@
{
"plugins": ["jest", "@typescript-eslint"],
"extends": ["plugin:github/es6"],
"plugins": ["@typescript-eslint"],
"extends": ["plugin:github/recommended"],
"parser": "@typescript-eslint/parser",
"parserOptions": {
"ecmaVersion": 9,
@@ -17,13 +17,16 @@
"@typescript-eslint/no-require-imports": "error",
"@typescript-eslint/array-type": "error",
"@typescript-eslint/await-thenable": "error",
"@typescript-eslint/ban-ts-ignore": "error",
"@typescript-eslint/naming-convention": [
"error",
{
"selector": "default",
"format": ["camelCase"]
}
],
"camelcase": "off",
"@typescript-eslint/camelcase": "error",
"@typescript-eslint/class-name-casing": "error",
"@typescript-eslint/explicit-function-return-type": ["error", {"allowExpressions": true}],
"@typescript-eslint/func-call-spacing": ["error", "never"],
"@typescript-eslint/generic-type-naming": ["error", "^[A-Z][A-Za-z]*$"],
"@typescript-eslint/no-array-constructor": "error",
"@typescript-eslint/no-empty-interface": "error",
"@typescript-eslint/no-explicit-any": "error",
@@ -33,7 +36,6 @@
"@typescript-eslint/no-misused-new": "error",
"@typescript-eslint/no-namespace": "error",
"@typescript-eslint/no-non-null-assertion": "warn",
"@typescript-eslint/no-object-literal-type-assertion": "error",
"@typescript-eslint/no-unnecessary-qualifier": "error",
"@typescript-eslint/no-unnecessary-type-assertion": "error",
"@typescript-eslint/no-useless-constructor": "error",
@@ -41,19 +43,19 @@
"@typescript-eslint/prefer-for-of": "warn",
"@typescript-eslint/prefer-function-type": "warn",
"@typescript-eslint/prefer-includes": "error",
"@typescript-eslint/prefer-interface": "error",
"@typescript-eslint/prefer-string-starts-ends-with": "error",
"@typescript-eslint/promise-function-async": "error",
"@typescript-eslint/require-array-sort-compare": "error",
"@typescript-eslint/restrict-plus-operands": "error",
"semi": "off",
"@typescript-eslint/semi": ["error", "never"],
"@typescript-eslint/type-annotation-spacing": "error",
"@typescript-eslint/unbound-method": "error"
"@typescript-eslint/unbound-method": "error",
"filenames/match-regex" : "off",
"github/no-then" : 1, // warning
"semi": "off"
},
"env": {
"node": true,
"es6": true,
"jest/globals": true
"es6": true
}
}

File diff suppressed because it is too large

View File

@@ -25,10 +25,10 @@
},
"devDependencies": {
"@types/node": "^12.7.12",
"@typescript-eslint/parser": "^2.8.0",
"@typescript-eslint/parser": "^5.15.0",
"@zeit/ncc": "^0.20.5",
"eslint": "^6.8.0",
"eslint-plugin-github": "^2.0.0",
"eslint": "^8.11.0",
"eslint-plugin-github": "^4.3.5",
"prettier": "^1.19.1",
"typescript": "^3.6.4"
}

View File

@@ -1,9 +1,9 @@
import * as glob from '@actions/glob'
import * as crypto from 'crypto'
import * as fs from 'fs'
import * as glob from '@actions/glob'
import * as path from 'path'
import * as stream from 'stream'
import * as util from 'util'
import * as path from 'path'
async function run(): Promise<void> {
// arg0 -> node
@@ -45,7 +45,7 @@ async function run(): Promise<void> {
result.end()
if (hasMatch) {
console.log(`Find ${count} files to hash.`)
console.log(`Found ${count} files to hash.`)
console.error(`__OUTPUT__${result.digest('hex')}__OUTPUT__`)
} else {
console.error(`__OUTPUT____OUTPUT__`)
@@ -53,3 +53,11 @@ async function run(): Promise<void> {
}
run()
.then(out => {
console.log(out)
process.exit(0)
})
.catch(err => {
console.error(err)
process.exit(1)
})

View File

@@ -3,7 +3,7 @@ PACKAGERUNTIME=$1
PRECACHE=$2
NODE_URL=https://nodejs.org/dist
NODE12_VERSION="12.13.1"
NODE12_VERSION="12.22.7"
NODE16_VERSION="16.13.0"
get_abs_path() {
@@ -143,7 +143,7 @@ fi
# Download the external tools for Linux PACKAGERUNTIMEs.
if [[ "$PACKAGERUNTIME" == "linux-x64" ]]; then
acquireExternalTool "$NODE_URL/v${NODE12_VERSION}/node-v${NODE12_VERSION}-linux-x64.tar.gz" node12 fix_nested_dir
acquireExternalTool "https://vstsagenttools.blob.core.windows.net/tools/nodejs/${NODE12_VERSION}/alpine/x64/node-${NODE12_VERSION}-alpine-x64.tar.gz" node12_alpine
acquireExternalTool "https://vstsagenttools.blob.core.windows.net/tools/nodejs/${NODE12_VERSION}/alpine/x64/node-v${NODE12_VERSION}-alpine-x64.tar.gz" node12_alpine
acquireExternalTool "$NODE_URL/v${NODE16_VERSION}/node-v${NODE16_VERSION}-linux-x64.tar.gz" node16 fix_nested_dir
acquireExternalTool "https://vstsagenttools.blob.core.windows.net/tools/nodejs/${NODE16_VERSION}/alpine/x64/node-v${NODE16_VERSION}-alpine-x64.tar.gz" node16_alpine
fi

View File

@@ -3,94 +3,135 @@
// Licensed under the MIT license. See LICENSE file in the project root for full license information.
var childProcess = require("child_process");
var path = require("path")
var path = require("path");
var supported = ['linux', 'darwin']
var supported = ["linux", "darwin"];
if (supported.indexOf(process.platform) == -1) {
console.log('Unsupported platform: ' + process.platform);
console.log('Supported platforms are: ' + supported.toString());
console.log("Unsupported platform: " + process.platform);
console.log("Supported platforms are: " + supported.toString());
process.exit(1);
}
var stopping = false;
var listener = null;
var exitServiceAfterNFailures = Number(
process.env.GITHUB_ACTIONS_SERVICE_EXIT_AFTER_N_FAILURES
);
if (exitServiceAfterNFailures <= 0) {
exitServiceAfterNFailures = NaN;
}
var consecutiveFailureCount = 0;
var gracefulShutdown = function () {
console.log("Shutting down runner listener");
stopping = true;
if (listener) {
console.log("Sending SIGINT to runner listener to stop");
listener.kill("SIGINT");
console.log("Sending SIGKILL to runner listener");
setTimeout(() => listener.kill("SIGKILL"), 30000).unref();
}
};
var runService = function () {
var listenerExePath = path.join(__dirname, '../bin/Runner.Listener');
var listenerExePath = path.join(__dirname, "../bin/Runner.Listener");
var interactive = process.argv[2] === "interactive";
if (!stopping) {
try {
if (interactive) {
console.log('Starting Runner listener interactively');
listener = childProcess.spawn(listenerExePath, ['run'], { env: process.env });
console.log("Starting Runner listener interactively");
listener = childProcess.spawn(listenerExePath, ["run"], {
env: process.env,
});
} else {
console.log('Starting Runner listener with startup type: service');
listener = childProcess.spawn(listenerExePath, ['run', '--startuptype', 'service'], { env: process.env });
console.log("Starting Runner listener with startup type: service");
listener = childProcess.spawn(
listenerExePath,
["run", "--startuptype", "service"],
{ env: process.env }
);
}
console.log(`Started listener process, pid: ${listener.pid}`);
listener.stdout.on('data', (data) => {
process.stdout.write(data.toString('utf8'));
listener.stdout.on("data", (data) => {
if (data.toString("utf8").includes("Listening for Jobs")) {
consecutiveFailureCount = 0;
}
process.stdout.write(data.toString("utf8"));
});
listener.stderr.on('data', (data) => {
process.stdout.write(data.toString('utf8'));
listener.stderr.on("data", (data) => {
process.stdout.write(data.toString("utf8"));
});
listener.on("error", (err) => {
console.log(`Runner listener fail to start with error ${err.message}`);
});
listener.on('close', (code) => {
listener.on("close", (code) => {
console.log(`Runner listener exited with error code ${code}`);
if (code === 0) {
console.log('Runner listener exit with 0 return code, stop the service, no retry needed.');
console.log(
"Runner listener exit with 0 return code, stop the service, no retry needed."
);
stopping = true;
} else if (code === 1) {
console.log('Runner listener exit with terminated error, stop the service, no retry needed.');
console.log(
"Runner listener exit with terminated error, stop the service, no retry needed."
);
stopping = true;
} else if (code === 2) {
console.log('Runner listener exit with retryable error, re-launch runner in 5 seconds.');
} else if (code === 3) {
console.log('Runner listener exit because of updating, re-launch runner in 5 seconds.');
console.log(
"Runner listener exit with retryable error, re-launch runner in 5 seconds."
);
consecutiveFailureCount = 0;
} else if (code === 3 || code === 4) {
console.log(
"Runner listener exit because of updating, re-launch runner in 5 seconds."
);
consecutiveFailureCount = 0;
} else {
console.log('Runner listener exit with undefined return code, re-launch runner in 5 seconds.');
var messagePrefix = "Runner listener exit with undefined return code";
consecutiveFailureCount++;
if (
!isNaN(exitServiceAfterNFailures) &&
consecutiveFailureCount >= exitServiceAfterNFailures
) {
console.error(
`${messagePrefix}, exiting service after ${consecutiveFailureCount} consecutive failures`
);
gracefulShutdown();
return;
} else {
console.log(`${messagePrefix}, re-launch runner in 5 seconds.`);
}
}
if (!stopping) {
setTimeout(runService, 5000);
}
});
} catch (ex) {
console.log(ex);
}
}
}
};
runService();
console.log('Started running service');
console.log("Started running service");
var gracefulShutdown = function (code) {
console.log('Shutting down runner listener');
stopping = true;
if (listener) {
console.log('Sending SIGINT to runner listener to stop');
listener.kill('SIGINT');
console.log('Sending SIGKILL to runner listener');
setTimeout(() => listener.kill('SIGKILL'), 30000).unref();
}
}
process.on('SIGINT', () => {
gracefulShutdown(0);
process.on("SIGINT", () => {
gracefulShutdown();
});
process.on('SIGTERM', () => {
gracefulShutdown(0);
process.on("SIGTERM", () => {
gracefulShutdown();
});
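
The `RunnerService.js` hunk above adds an opt-in circuit breaker: consecutive listener failures are counted (the counter resets on success, retryable errors, and self-update exits) and the wrapper shuts itself down once the threshold in `GITHUB_ACTIONS_SERVICE_EXIT_AFTER_N_FAILURES` is reached. A hedged usage sketch, assuming the service wrapper is launched via `runsvc.sh`:

```bash
# Give up after 3 consecutive unexplained listener failures instead of
# re-launching every 5 seconds forever; unset or <= 0 keeps the old behavior.
export GITHUB_ACTIONS_SERVICE_EXIT_AFTER_N_FAILURES=3
./runsvc.sh
```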

View File

@@ -17,7 +17,13 @@ RUNNER_ROOT=`pwd`
LAUNCH_PATH="${HOME}/Library/LaunchAgents"
PLIST_PATH="${LAUNCH_PATH}/${SVC_NAME}.plist"
TEMPLATE_PATH=$GITHUB_ACTIONS_RUNNER_SERVICE_TEMPLATE
IS_CUSTOM_TEMPLATE=0
if [[ -z $TEMPLATE_PATH ]]; then
TEMPLATE_PATH=./bin/actions.runner.plist.template
else
IS_CUSTOM_TEMPLATE=1
fi
TEMP_PATH=./bin/actions.runner.plist.temp
CONFIG_PATH=.service
@@ -29,7 +35,11 @@ function failed()
}
if [ ! -f "${TEMPLATE_PATH}" ]; then
if [[ $IS_CUSTOM_TEMPLATE = 0 ]]; then
failed "Must run from runner root or install is corrupt"
else
failed "Service file at '$GITHUB_ACTIONS_RUNNER_SERVICE_TEMPLATE' using GITHUB_ACTIONS_RUNNER_SERVICE_TEMPLATE env variable is not found"
fi
fi
function install()
@@ -53,7 +63,7 @@ function install()
mkdir -p "${log_path}" || failed "failed to create ${log_path}"
echo Creating ${PLIST_PATH}
sed "s/{{User}}/${SUDO_USER:-$USER}/g; s/{{SvcName}}/$SVC_NAME/g; s@{{RunnerRoot}}@${RUNNER_ROOT}@g; s@{{UserHome}}@$HOME@g;" "${TEMPLATE_PATH}" > "${TEMP_PATH}" || failed "failed to create replacement temp file"
sed "s/{{User}}/${USER:-$SUDO_USER}/g; s/{{SvcName}}/$SVC_NAME/g; s@{{RunnerRoot}}@${RUNNER_ROOT}@g; s@{{UserHome}}@$HOME@g;" "${TEMPLATE_PATH}" > "${TEMP_PATH}" || failed "failed to create replacement temp file"
mv "${TEMP_PATH}" "${PLIST_PATH}" || failed "failed to copy plist"
# Since we started with sudo, runsvc.sh will be owned by root. Change this to current login user.

View File

@@ -43,6 +43,32 @@ module.exports =
/************************************************************************/
/******/ ({
/***/ 82:
/***/ (function(__unusedmodule, exports) {
"use strict";
// We use any as a valid input type
/* eslint-disable @typescript-eslint/no-explicit-any */
Object.defineProperty(exports, "__esModule", { value: true });
/**
* Sanitizes an input into a string so it can be passed into issueCommand safely
* @param input input to sanitize into a string
*/
function toCommandValue(input) {
if (input === null || input === undefined) {
return '';
}
else if (typeof input === 'string' || input instanceof String) {
return input;
}
return JSON.stringify(input);
}
exports.toCommandValue = toCommandValue;
//# sourceMappingURL=utils.js.map
/***/ }),
/***/ 87:
/***/ (function(module) {
@@ -978,6 +1004,42 @@ function regExpEscape (s) {
}
/***/ }),
/***/ 102:
/***/ (function(__unusedmodule, exports, __webpack_require__) {
"use strict";
// For internal use, subject to change.
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) result[k] = mod[k];
result["default"] = mod;
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
// We use any as a valid input type
/* eslint-disable @typescript-eslint/no-explicit-any */
const fs = __importStar(__webpack_require__(747));
const os = __importStar(__webpack_require__(87));
const utils_1 = __webpack_require__(82);
function issueCommand(command, message) {
const filePath = process.env[`GITHUB_${command}`];
if (!filePath) {
throw new Error(`Unable to find environment variable for file command ${command}`);
}
if (!fs.existsSync(filePath)) {
throw new Error(`Missing file at path: ${filePath}`);
}
fs.appendFileSync(filePath, `${utils_1.toCommandValue(message)}${os.EOL}`, {
encoding: 'utf8'
});
}
exports.issueCommand = issueCommand;
//# sourceMappingURL=file-command.js.map
/***/ }),
/***/ 281:
@@ -1495,12 +1557,12 @@ var __importStar = (this && this.__importStar) || function (mod) {
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
const glob = __importStar(__webpack_require__(281));
const crypto = __importStar(__webpack_require__(417));
const fs = __importStar(__webpack_require__(747));
const glob = __importStar(__webpack_require__(281));
const path = __importStar(__webpack_require__(622));
const stream = __importStar(__webpack_require__(413));
const util = __importStar(__webpack_require__(669));
const path = __importStar(__webpack_require__(622));
function run() {
var e_1, _a;
return __awaiter(this, void 0, void 0, function* () {
@@ -1551,7 +1613,7 @@ function run() {
}
result.end();
if (hasMatch) {
console.log(`Find ${count} files to hash.`);
console.log(`Found ${count} files to hash.`);
console.error(`__OUTPUT__${result.digest('hex')}__OUTPUT__`);
}
else {
@@ -1559,7 +1621,15 @@ function run() {
}
});
}
run();
run()
.then(out => {
console.log(out);
process.exit(0);
})
.catch(err => {
console.error(err);
process.exit(1);
});
/***/ }),
@@ -1687,17 +1757,25 @@ module.exports = require("crypto");
"use strict";
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) result[k] = mod[k];
result["default"] = mod;
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
const os = __webpack_require__(87);
const os = __importStar(__webpack_require__(87));
const utils_1 = __webpack_require__(82);
/**
* Commands
*
* Command Format:
* ##[name key=value;key=value]message
* ::name key=value,key=value::message
*
* Examples:
* ##[warning]This is the user warning message
* ##[set-secret name=mypassword]definitelyNotAPassword!
* ::warning::This is the message
* ::set-env name=MY_VAR::some value
*/
function issueCommand(command, properties, message) {
const cmd = new Command(command, properties, message);
@@ -1722,34 +1800,39 @@ class Command {
let cmdStr = CMD_STRING + this.command;
if (this.properties && Object.keys(this.properties).length > 0) {
cmdStr += ' ';
let first = true;
for (const key in this.properties) {
if (this.properties.hasOwnProperty(key)) {
const val = this.properties[key];
if (val) {
// safely append the val - avoid blowing up when attempting to
// call .replace() if message is not a string for some reason
cmdStr += `${key}=${escape(`${val || ''}`)},`;
if (first) {
first = false;
}
else {
cmdStr += ',';
}
cmdStr += `${key}=${escapeProperty(val)}`;
}
}
}
}
cmdStr += CMD_STRING;
// safely append the message - avoid blowing up when attempting to
// call .replace() if message is not a string for some reason
const message = `${this.message || ''}`;
cmdStr += escapeData(message);
cmdStr += `${CMD_STRING}${escapeData(this.message)}`;
return cmdStr;
}
}
function escapeData(s) {
return s.replace(/\r/g, '%0D').replace(/\n/g, '%0A');
return utils_1.toCommandValue(s)
.replace(/%/g, '%25')
.replace(/\r/g, '%0D')
.replace(/\n/g, '%0A');
}
function escape(s) {
return s
function escapeProperty(s) {
return utils_1.toCommandValue(s)
.replace(/%/g, '%25')
.replace(/\r/g, '%0D')
.replace(/\n/g, '%0A')
.replace(/]/g, '%5D')
.replace(/;/g, '%3B');
.replace(/:/g, '%3A')
.replace(/,/g, '%2C');
}
//# sourceMappingURL=command.js.map
@@ -1769,10 +1852,19 @@ var __awaiter = (this && this.__awaiter) || function (thisArg, _arguments, P, ge
step((generator = generator.apply(thisArg, _arguments || [])).next());
});
};
var __importStar = (this && this.__importStar) || function (mod) {
if (mod && mod.__esModule) return mod;
var result = {};
if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) result[k] = mod[k];
result["default"] = mod;
return result;
};
Object.defineProperty(exports, "__esModule", { value: true });
const command_1 = __webpack_require__(431);
const os = __webpack_require__(87);
const path = __webpack_require__(622);
const file_command_1 = __webpack_require__(102);
const utils_1 = __webpack_require__(82);
const os = __importStar(__webpack_require__(87));
const path = __importStar(__webpack_require__(622));
/**
* The code to exit an action
*/
@@ -1793,11 +1885,21 @@ var ExitCode;
/**
* Sets env variable for this action and future actions in the job
* @param name the name of the variable to set
* @param val the value of the variable
* @param val the value of the variable. Non-string values will be converted to a string via JSON.stringify
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
function exportVariable(name, val) {
process.env[name] = val;
command_1.issueCommand('set-env', { name }, val);
const convertedVal = utils_1.toCommandValue(val);
process.env[name] = convertedVal;
const filePath = process.env['GITHUB_ENV'] || '';
if (filePath) {
const delimiter = '_GitHubActionsFileCommandDelimeter_';
const commandValue = `${name}<<${delimiter}${os.EOL}${convertedVal}${os.EOL}${delimiter}`;
file_command_1.issueCommand('ENV', commandValue);
}
else {
command_1.issueCommand('set-env', { name }, convertedVal);
}
}
exports.exportVariable = exportVariable;
/**
@@ -1813,7 +1915,13 @@ exports.setSecret = setSecret;
* @param inputPath
*/
function addPath(inputPath) {
const filePath = process.env['GITHUB_PATH'] || '';
if (filePath) {
file_command_1.issueCommand('PATH', inputPath);
}
else {
command_1.issueCommand('add-path', {}, inputPath);
}
process.env['PATH'] = `${inputPath}${path.delimiter}${process.env['PATH']}`;
}
exports.addPath = addPath;
@@ -1836,12 +1944,22 @@ exports.getInput = getInput;
* Sets the value of an output.
*
* @param name name of the output to set
* @param value value to store
* @param value value to store. Non-string values will be converted to a string via JSON.stringify
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
function setOutput(name, value) {
command_1.issueCommand('set-output', { name }, value);
}
exports.setOutput = setOutput;
/**
* Enables or disables the echoing of commands into stdout for the rest of the step.
* Echoing is disabled by default if ACTIONS_STEP_DEBUG is not set.
*
*/
function setCommandEcho(enabled) {
command_1.issue('echo', enabled ? 'on' : 'off');
}
exports.setCommandEcho = setCommandEcho;
//-----------------------------------------------------------------------
// Results
//-----------------------------------------------------------------------
@@ -1858,6 +1976,13 @@ exports.setFailed = setFailed;
//-----------------------------------------------------------------------
// Logging Commands
//-----------------------------------------------------------------------
/**
* Gets whether Actions Step Debug is on or not
*/
function isDebug() {
return process.env['RUNNER_DEBUG'] === '1';
}
exports.isDebug = isDebug;
/**
* Writes debug message to user log
* @param message debug message
@@ -1868,18 +1993,18 @@ function debug(message) {
exports.debug = debug;
/**
* Adds an error issue
* @param message error issue message
* @param message error issue message. Errors will be converted to string via toString()
*/
function error(message) {
command_1.issue('error', message);
command_1.issue('error', message instanceof Error ? message.toString() : message);
}
exports.error = error;
/**
* Adds an warning issue
* @param message warning issue message
* @param message warning issue message. Errors will be converted to string via toString()
*/
function warning(message) {
command_1.issue('warning', message);
command_1.issue('warning', message instanceof Error ? message.toString() : message);
}
exports.warning = warning;
/**
@@ -1937,8 +2062,9 @@ exports.group = group;
* Saves state for current action, the state can only be retrieved by this action's post job execution.
*
* @param name name of the state to store
* @param value value to store
* @param value value to store. Non-string values will be converted to a string via JSON.stringify
*/
// eslint-disable-next-line @typescript-eslint/no-explicit-any
function saveState(name, value) {
command_1.issueCommand('save-state', { name }, value);
}

View File

@@ -10,10 +10,11 @@ if [ -f ".path" ]; then
echo ".path=${PATH}"
fi
# insert anything to setup env when running as a service
nodever=${GITHUB_ACTIONS_RUNNER_FORCED_NODE_VERSION:-node16}
# insert anything to setup env when running as a service
# run the host process which keep the listener alive
./externals/node12/bin/node ./bin/RunnerService.js &
./externals/$nodever/bin/node ./bin/RunnerService.js &
PID=$!
wait $PID
trap - TERM INT
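
With this change `runsvc.sh` picks the bundled Node runtime from `GITHUB_ACTIONS_RUNNER_FORCED_NODE_VERSION`, defaulting to `node16`. A quick sketch of forcing the older runtime (directory names as laid out under `externals/`):

```bash
# Launch the service wrapper with the bundled node12 instead of the node16 default.
export GITHUB_ACTIONS_RUNNER_FORCED_NODE_VERSION=node12
./runsvc.sh   # resolves to ./externals/node12/bin/node ./bin/RunnerService.js
```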

View File

@@ -10,7 +10,13 @@ arg_2=${2}
RUNNER_ROOT=`pwd`
UNIT_PATH=/etc/systemd/system/${SVC_NAME}
TEMPLATE_PATH=$GITHUB_ACTIONS_RUNNER_SERVICE_TEMPLATE
IS_CUSTOM_TEMPLATE=0
if [[ -z $TEMPLATE_PATH ]]; then
TEMPLATE_PATH=./bin/actions.runner.service.template
else
IS_CUSTOM_TEMPLATE=1
fi
TEMP_PATH=./bin/actions.runner.service.temp
CONFIG_PATH=.service
@@ -31,7 +37,11 @@ function failed()
}
if [ ! -f "${TEMPLATE_PATH}" ]; then
if [[ $IS_CUSTOM_TEMPLATE = 0 ]]; then
failed "Must run from runner root or install is corrupt"
else
failed "Service file at '$GITHUB_ACTIONS_RUNNER_SERVICE_TEMPLATE' using GITHUB_ACTIONS_RUNNER_SERVICE_TEMPLATE env variable is not found"
fi
fi
#check if we run as root
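
This mirrors the darwin change earlier in the diff: `svc.sh` now honors `GITHUB_ACTIONS_RUNNER_SERVICE_TEMPLATE` and fails with an explicit message if the referenced file is missing. A sketch of installing from a custom systemd unit template (the template path here is hypothetical):

```bash
# Point svc.sh at a custom systemd unit template before installing the service.
export GITHUB_ACTIONS_RUNNER_SERVICE_TEMPLATE=/etc/actions-runner/custom.runner.service.template
sudo -E ./svc.sh install   # -E keeps the variable visible to the elevated shell
```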

View File

@@ -30,13 +30,13 @@ date "+[%F %T-%4N] Waiting for $runnerprocessname ($runnerpid) to complete" >> "
while [ -e /proc/$runnerpid ]
do
date "+[%F %T-%4N] Process $runnerpid still running" >> "$logfile" 2>&1
sleep 2
"$rootfolder"/safe_sleep.sh 2
done
date "+[%F %T-%4N] Process $runnerpid finished running" >> "$logfile" 2>&1
# start re-organize folders
date "+[%F %T-%4N] Sleep 1 more second to make sure process exited" >> "$logfile" 2>&1
sleep 1
"$rootfolder"/safe_sleep.sh 1
# the folder structure under runner root will be
# ./bin -> bin.2.100.0 (junction folder)
@@ -125,7 +125,7 @@ attemptedtargetedfix=0
currentplatform=$(uname | awk '{print tolower($0)}')
if [[ "$currentplatform" == 'darwin' && restartinteractiverunner -eq 0 ]]; then
# We needed a fix for https://github.com/actions/runner/issues/743
# We will recreate the ./externals/node12/bin/node of the past runner version that launched the runnerlistener service
# We will recreate the ./externals/nodeXY/bin/node of the past runner version that launched the runnerlistener service
# Otherwise mac gatekeeper kills the processes we spawn on creation as we are running a process with no backing file
# We need the pid for the nodejs loop, get that here, its the parent of the runner C# pid
@@ -135,7 +135,13 @@ if [[ "$currentplatform" == 'darwin' && restartinteractiverunner -eq 0 ]]; then
then
# inspect the open file handles to find the node process
# we can't actually inspect the process using ps because it uses relative paths and doesn't follow symlinks
path=$(lsof -a -g "$procgroup" -F n | grep node12/bin/node | grep externals | tail -1 | cut -c2-)
nodever="node16"
path=$(lsof -a -g "$procgroup" -F n | grep $nodever/bin/node | grep externals | tail -1 | cut -c2-)
if [[ $? -ne 0 || -z "$path" ]] # Fallback if RunnerService.js was started with node12
then
nodever="node12"
path=$(lsof -a -g "$procgroup" -F n | grep $nodever/bin/node | grep externals | tail -1 | cut -c2-)
fi
if [[ $? -eq 0 && -n "$path" ]]
then
# trim the last 5 characters of the path '/node'
@@ -148,7 +154,7 @@ if [[ "$currentplatform" == 'darwin' && restartinteractiverunner -eq 0 ]]; then
then
date "+[%F %T-%4N] Creating fallback node at path $path" >> "$logfile" 2>&1
mkdir -p "$trimmedpath"
cp "$rootfolder/externals/node12/bin/node" "$path"
cp "$rootfolder/externals/$nodever/bin/node" "$path"
else
date "+[%F %T-%4N] Path for fallback node exists, skipping creating $path" >> "$logfile" 2>&1
fi

View File

@@ -10,23 +10,15 @@ fi
# Run
shopt -s nocasematch
safe_sleep() {
if [ ! -x "$(command -v sleep)" ]; then
if [ ! -x "$(command -v ping)" ]; then
COUNT="0"
while [[ $COUNT != 5000 ]]; do
echo "SLEEP" > /dev/null
COUNT=$[$COUNT+1]
SOURCE="${BASH_SOURCE[0]}"
while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symlink
DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
SOURCE="$(readlink "$SOURCE")"
[[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located
done
else
ping -c 5 127.0.0.1 > /dev/null
fi
else
sleep 5
fi
}
DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
"$DIR"/bin/Runner.Listener run $*
bin/Runner.Listener run $*
returnCode=$?
if [[ $returnCode == 0 ]]; then
echo "Runner listener exit with 0 return code, stop the service, no retry needed."
@@ -36,18 +28,18 @@ elif [[ $returnCode == 1 ]]; then
exit 0
elif [[ $returnCode == 2 ]]; then
echo "Runner listener exit with retryable error, re-launch runner in 5 seconds."
safe_sleep
exit 1
"$DIR"/safe_sleep.sh 5
exit 2
elif [[ $returnCode == 3 ]]; then
# Sleep 5 seconds to wait for the runner update process finish
echo "Runner listener exit because of updating, re-launch runner in 5 seconds"
safe_sleep
exit 1
"$DIR"/safe_sleep.sh 5
exit 2
elif [[ $returnCode == 4 ]]; then
# Sleep 5 seconds to wait for the ephemeral runner update process finish
echo "Runner listener exit because of updating, re-launch ephemeral runner in 5 seconds"
safe_sleep
exit 1
"$DIR"/safe_sleep.sh 5
exit 2
else
echo "Exiting with unknown error code: ${returnCode}"
exit 0

View File

@@ -19,7 +19,7 @@ rem Run.
rem ********************************************************************************
:launch_helper
copy run-helper.cmd.template run-helper.cmd /Y
copy "%~dp0run-helper.cmd.template" "%~dp0run-helper.cmd" /Y
call "%~dp0run-helper.cmd" %*
if %ERRORLEVEL% EQU 1 (
@@ -27,5 +27,5 @@ if %ERRORLEVEL% EQU 1 (
goto :launch_helper
) else (
echo "Exiting runner..."
exit 0
exit /b 0
)

View File

@@ -9,13 +9,13 @@ while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symli
[[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located
done
DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
cp -f run-helper.sh.template run-helper.sh
cp -f "$DIR"/run-helper.sh.template "$DIR"/run-helper.sh
# run the helper process which keep the listener alive
while :;
do
"$DIR"/run-helper.sh $*
returnCode=$?
if [[ $returnCode == 1 ]]; then
if [[ $returnCode -eq 2 ]]; then
echo "Restarting runner..."
else
echo "Exiting runner..."

View File

@@ -0,0 +1,6 @@
#!/bin/bash
SECONDS=0
while [[ $SECONDS != $1 ]]; do
:
done

View File

@@ -33,6 +33,9 @@ namespace GitHub.Runner.Common
[DataMember(EmitDefaultValue = false)]
public string PoolName { get; set; }
[DataMember(EmitDefaultValue = false)]
public bool DisableUpdate { get; set; }
[DataMember(EmitDefaultValue = false)]
public bool Ephemeral { get; set; }

View File

@@ -86,7 +86,7 @@ namespace GitHub.Runner.Common
public static class CommandLine
{
//if you are adding a new arg, please make sure you update the
//validArgs array as well present in the CommandSettings.cs
//validOptions dictionary as well present in the CommandSettings.cs
public static class Args
{
public static readonly string Auth = "auth";
@@ -121,7 +121,7 @@ namespace GitHub.Runner.Common
}
//if you are adding a new flag, please make sure you update the
//validFlags array as well present in the CommandSettings.cs
//validOptions dictionary as well present in the CommandSettings.cs
public static class Flags
{
public static readonly string Check = "check";
@@ -129,6 +129,7 @@ namespace GitHub.Runner.Common
public static readonly string Ephemeral = "ephemeral";
public static readonly string Help = "help";
public static readonly string Replace = "replace";
public static readonly string DisableUpdate = "disableupdate";
public static readonly string Once = "once"; // Keep this around since customers still relies on it
public static readonly string RunAsService = "runasservice";
public static readonly string Unattended = "unattended";
@@ -148,6 +149,8 @@ namespace GitHub.Runner.Common
public static class Features
{
public static readonly string DiskSpaceWarning = "runner.diskspace.warning";
public static readonly string Node12Warning = "DistributedTask.AddWarningToNode12Action";
public static readonly string UseContainerPathForTemplate = "DistributedTask.UseContainerPathForTemplate";
}
public static readonly string InternalTelemetryIssueDataKey = "_internal_telemetry";
@@ -155,7 +158,9 @@ namespace GitHub.Runner.Common
public static readonly string LowDiskSpace = "LOW_DISK_SPACE";
public static readonly string UnsupportedCommand = "UNSUPPORTED_COMMAND";
public static readonly string UnsupportedCommandMessageDisabled = "The `{0}` command is disabled. Please upgrade to using Environment Files or opt into unsecure command execution by setting the `ACTIONS_ALLOW_UNSECURE_COMMANDS` environment variable to `true`. For more information see: https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/";
public static readonly string UnsupportedStopCommandTokenDisabled = "You cannot use a endToken that is an empty string, the string 'pause-logging', or another workflow command. For more information see: https://docs.github.com/en/actions/learn-github-actions/workflow-commands-for-github-actions#example-stopping-and-starting-workflow-commands or opt into insecure command execution by setting the `ACTIONS_ALLOW_UNSECURE_STOPCOMMAND_TOKENS` environment variable to `true`.";
public static readonly string UnsupportedStopCommandTokenDisabled = "You cannot use a endToken that is an empty string, the string 'pause-logging', or another workflow command. For more information see: https://docs.github.com/actions/learn-github-actions/workflow-commands-for-github-actions#example-stopping-and-starting-workflow-commands or opt into insecure command execution by setting the `ACTIONS_ALLOW_UNSECURE_STOPCOMMAND_TOKENS` environment variable to `true`.";
public static readonly string UnsupportedSummarySize = "$GITHUB_STEP_SUMMARY upload aborted, supports content up to a size of {0}k, got {1}k. For more information see: https://docs.github.com/actions/using-workflows/workflow-commands-for-github-actions#adding-a-markdown-summary";
public static readonly string Node12DetectedAfterEndOfLife = "Node.js 12 actions are deprecated. Please update the following actions to use Node.js 16: {0}";
}
public static class RunnerEvent
@@ -187,6 +192,12 @@ namespace GitHub.Runner.Common
public static readonly string Success = "success";
}
public static class Hooks
{
public static readonly string JobStartedStepName = "Set up runner";
public static readonly string JobCompletedStepName = "Complete runner";
}
public static class Path
{
public static readonly string ActionsDirectory = "_actions";
@@ -218,11 +229,15 @@ namespace GitHub.Runner.Common
public static readonly string AllowUnsupportedStopCommandTokens = "ACTIONS_ALLOW_UNSECURE_STOPCOMMAND_TOKENS";
public static readonly string RunnerDebug = "ACTIONS_RUNNER_DEBUG";
public static readonly string StepDebug = "ACTIONS_STEP_DEBUG";
public static readonly string AllowActionsUseUnsecureNodeVersion = "ACTIONS_ALLOW_USE_UNSECURE_NODE_VERSION";
}
public static class Agent
{
public static readonly string ToolsDirectory = "agent.ToolsDirectory";
// Set this env var to "node12" to downgrade the node version for internal functions (e.g hashfiles). This does NOT affect the version of node actions.
public static readonly string ForcedInternalNodeVersion = "ACTIONS_RUNNER_FORCED_INTERNAL_NODE_VERSION";
}
public static class System

View File

@@ -60,6 +60,7 @@ namespace GitHub.Runner.Common
case "GitHub.Runner.Worker.IFileCommandExtension":
Add<T>(extensions, "GitHub.Runner.Worker.AddPathFileCommand, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.SetEnvFileCommand, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.CreateStepSummaryCommand, Runner.Worker");
break;
case "GitHub.Runner.Listener.Check.ICheckExtension":
Add<T>(extensions, "GitHub.Runner.Listener.Check.InternetCheck, Runner.Listener");

View File

@@ -216,6 +216,8 @@ namespace GitHub.Runner.Common
_userAgents.Add(new ProductInfoHeaderValue("RunnerId", runnerSettings.AgentId.ToString(CultureInfo.InvariantCulture)));
_userAgents.Add(new ProductInfoHeaderValue("GroupId", runnerSettings.PoolId.ToString(CultureInfo.InvariantCulture)));
}
_userAgents.Add(new ProductInfoHeaderValue("CommitSHA", BuildConstants.Source.CommitHash));
}
public string GetDirectory(WellKnownDirectory directory)

View File

@@ -1,32 +1,39 @@
using GitHub.DistributedTask.WebApi;
using GitHub.DistributedTask.WebApi;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
using GitHub.Services.WebApi.Utilities.Internal;
using Newtonsoft.Json;
namespace GitHub.Runner.Common
{
[ServiceLocator(Default = typeof(JobServer))]
public interface IJobServer : IRunnerService
public interface IJobServer : IRunnerService, IAsyncDisposable
{
Task ConnectAsync(VssConnection jobConnection);
void InitializeWebsocketClient(ServiceEndpoint serviceEndpoint);
// logging and console
Task<TaskLog> AppendLogContentAsync(Guid scopeIdentifier, string hubName, Guid planId, int logId, Stream uploadStream, CancellationToken cancellationToken);
Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, CancellationToken cancellationToken);
Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long startLine, CancellationToken cancellationToken);
Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long? startLine, CancellationToken cancellationToken);
Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, String type, String name, Stream uploadStream, CancellationToken cancellationToken);
Task<TaskLog> CreateLogAsync(Guid scopeIdentifier, string hubName, Guid planId, TaskLog log, CancellationToken cancellationToken);
Task<Timeline> CreateTimelineAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, CancellationToken cancellationToken);
Task<List<TimelineRecord>> UpdateTimelineRecordsAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, IEnumerable<TimelineRecord> records, CancellationToken cancellationToken);
Task RaisePlanEventAsync<T>(Guid scopeIdentifier, string hubName, Guid planId, T eventData, CancellationToken cancellationToken) where T : JobEvent;
Task<Timeline> GetTimelineAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, CancellationToken cancellationToken);
Task<ActionDownloadInfoCollection> ResolveActionDownloadInfoAsync(Guid scopeIdentifier, string hubName, Guid planId, ActionReferenceList actions, CancellationToken cancellationToken);
Task<ActionDownloadInfoCollection> ResolveActionDownloadInfoAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid jobId, ActionReferenceList actions, CancellationToken cancellationToken);
}
public sealed class JobServer : RunnerService, IJobServer
@@ -34,6 +41,20 @@ namespace GitHub.Runner.Common
private bool _hasConnection;
private VssConnection _connection;
private TaskHttpClient _taskClient;
private ClientWebSocket _websocketClient;
private ServiceEndpoint _serviceEndpoint;
private int totalBatchedLinesAttemptedByWebsocket = 0;
private int failedAttemptsToPostBatchedLinesByWebsocket = 0;
private static readonly TimeSpan _minDelayForWebsocketReconnect = TimeSpan.FromMilliseconds(100);
private static readonly TimeSpan _maxDelayForWebsocketReconnect = TimeSpan.FromMilliseconds(500);
private static readonly int _minWebsocketFailurePercentageAllowed = 50;
private static readonly int _minWebsocketBatchedLinesCountToConsider = 5;
private Task _websocketConnectTask;
public async Task ConnectAsync(VssConnection jobConnection)
{
@@ -117,6 +138,21 @@ namespace GitHub.Runner.Common
}
}
public void InitializeWebsocketClient(ServiceEndpoint serviceEndpoint)
{
this._serviceEndpoint = serviceEndpoint;
InitializeWebsocketClient(TimeSpan.Zero);
}
public ValueTask DisposeAsync()
{
CloseWebSocket(WebSocketCloseStatus.NormalClosure, CancellationToken.None);
GC.SuppressFinalize(this);
return ValueTask.CompletedTask;
}
private void CheckConnection()
{
if (!_hasConnection)
@@ -125,6 +161,53 @@ namespace GitHub.Runner.Common
}
}
private void InitializeWebsocketClient(TimeSpan delay)
{
if (_serviceEndpoint.Authorization != null &&
_serviceEndpoint.Authorization.Parameters.TryGetValue(EndpointAuthorizationParameters.AccessToken, out var accessToken) &&
!string.IsNullOrEmpty(accessToken))
{
if (_serviceEndpoint.Data.TryGetValue("FeedStreamUrl", out var feedStreamUrl) && !string.IsNullOrEmpty(feedStreamUrl))
{
// let's ensure we use the right scheme
feedStreamUrl = feedStreamUrl.Replace("https://", "wss://").Replace("http://", "ws://");
Trace.Info($"Creating websocket client ..." + feedStreamUrl);
this._websocketClient = new ClientWebSocket();
this._websocketClient.Options.SetRequestHeader("Authorization", $"Bearer {accessToken}");
var userAgentValues = new List<ProductInfoHeaderValue>();
userAgentValues.AddRange(UserAgentUtility.GetDefaultRestUserAgent());
userAgentValues.AddRange(HostContext.UserAgents);
this._websocketClient.Options.SetRequestHeader("User-Agent", string.Join(" ", userAgentValues.Select(x=>x.ToString())));
this._websocketConnectTask = ConnectWebSocketClient(feedStreamUrl, delay);
}
else
{
Trace.Info($"No FeedStreamUrl found, so we will use Rest API calls for sending feed data");
}
}
else
{
Trace.Info($"No access token from the service endpoint");
}
}
private async Task ConnectWebSocketClient(string feedStreamUrl, TimeSpan delay)
{
try
{
Trace.Info($"Attempting to start websocket client with delay {delay}.");
await Task.Delay(delay);
await this._websocketClient.ConnectAsync(new Uri(feedStreamUrl), default(CancellationToken));
Trace.Info($"Successfully started websocket client.");
}
catch(Exception ex)
{
Trace.Info("Exception caught during websocket client connect, fallback of HTTP would be used now instead of websocket.");
Trace.Error(ex);
}
}
//-----------------------------------------------------------------
// Feedback: WebConsole, TimelineRecords and Logs
//-----------------------------------------------------------------
@@ -135,16 +218,86 @@ namespace GitHub.Runner.Common
return _taskClient.AppendLogContentAsync(scopeIdentifier, hubName, planId, logId, uploadStream, cancellationToken: cancellationToken);
}
public Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, CancellationToken cancellationToken)
public async Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long? startLine, CancellationToken cancellationToken)
{
CheckConnection();
return _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, cancellationToken: cancellationToken);
var pushedLinesViaWebsocket = false;
if (_websocketConnectTask != null)
{
await _websocketConnectTask;
}
public Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long startLine, CancellationToken cancellationToken)
// "_websocketClient != null" implies either: We have a successful connection OR we have to attempt sending again and then reconnect
// ...in other words, if websocket client is null, we will skip sending to websocket and just use rest api calls to send data
if (_websocketClient != null)
{
CheckConnection();
return _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, startLine, cancellationToken: cancellationToken);
var linesWrapper = startLine.HasValue? new TimelineRecordFeedLinesWrapper(stepId, lines, startLine.Value): new TimelineRecordFeedLinesWrapper(stepId, lines);
var jsonData = StringUtil.ConvertToJson(linesWrapper);
try
{
totalBatchedLinesAttemptedByWebsocket++;
var jsonDataBytes = Encoding.UTF8.GetBytes(jsonData);
// break the message into chunks of 1024 bytes
for (var i = 0; i < jsonDataBytes.Length; i += 1 * 1024)
{
var lastChunk = i + (1 * 1024) >= jsonDataBytes.Length;
var chunk = new ArraySegment<byte>(jsonDataBytes, i, Math.Min(1 * 1024, jsonDataBytes.Length - i));
await _websocketClient.SendAsync(chunk, WebSocketMessageType.Text, endOfMessage:lastChunk, cancellationToken);
}
pushedLinesViaWebsocket = true;
}
catch (Exception ex)
{
failedAttemptsToPostBatchedLinesByWebsocket++;
Trace.Info($"Caught exception during append web console line to websocket, let's fallback to sending via non-websocket call (total calls: {totalBatchedLinesAttemptedByWebsocket}, failed calls: {failedAttemptsToPostBatchedLinesByWebsocket}, websocket state: {this._websocketClient?.State}).");
Trace.Error(ex);
if (totalBatchedLinesAttemptedByWebsocket > _minWebsocketBatchedLinesCountToConsider)
{
// let's consider failure percentage
if (failedAttemptsToPostBatchedLinesByWebsocket * 100 / totalBatchedLinesAttemptedByWebsocket > _minWebsocketFailurePercentageAllowed)
{
Trace.Info($"Exhausted websocket allowed retries, we will not attempt websocket connection for this job to post lines again.");
CloseWebSocket(WebSocketCloseStatus.InternalServerError, cancellationToken);
// By setting it to null, we will ensure that we never try websocket path again for this job
_websocketClient = null;
}
}
if (_websocketClient != null)
{
var delay = BackoffTimerHelper.GetRandomBackoff(_minDelayForWebsocketReconnect, _maxDelayForWebsocketReconnect);
Trace.Info($"Websocket is not open, let's attempt to connect back again with random backoff {delay} ms (total calls: {totalBatchedLinesAttemptedByWebsocket}, failed calls: {failedAttemptsToPostBatchedLinesByWebsocket}).");
InitializeWebsocketClient(delay);
}
}
}
if (!pushedLinesViaWebsocket)
{
if (startLine.HasValue)
{
await _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, startLine.Value, cancellationToken: cancellationToken);
}
else
{
await _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, cancellationToken: cancellationToken);
}
}
}
private void CloseWebSocket(WebSocketCloseStatus closeStatus, CancellationToken cancellationToken)
{
try
{
_websocketClient?.CloseOutputAsync(closeStatus, "Closing websocket", cancellationToken);
}
catch (Exception websocketEx)
{
// In some cases this might be okay since the websocket might not be open yet, so just close and don't trace exceptions
Trace.Info($"Failed to close websocket gracefully {websocketEx.GetType().Name}");
}
}
public Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, string type, string name, Stream uploadStream, CancellationToken cancellationToken)
@@ -186,10 +339,10 @@ namespace GitHub.Runner.Common
//-----------------------------------------------------------------
// Action download info
//-----------------------------------------------------------------
public Task<ActionDownloadInfoCollection> ResolveActionDownloadInfoAsync(Guid scopeIdentifier, string hubName, Guid planId, ActionReferenceList actions, CancellationToken cancellationToken)
public Task<ActionDownloadInfoCollection> ResolveActionDownloadInfoAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid jobId, ActionReferenceList actions, CancellationToken cancellationToken)
{
CheckConnection();
return _taskClient.ResolveActionDownloadInfoAsync(scopeIdentifier, hubName, planId, actions, cancellationToken: cancellationToken);
return _taskClient.ResolveActionDownloadInfoAsync(scopeIdentifier, hubName, planId, jobId, actions, cancellationToken: cancellationToken);
}
}
}
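
The new feed path serializes each batch of console lines to JSON and pushes it over the websocket in 1 KiB frames, marking only the final frame as end-of-message; when the socket keeps failing it falls back to the REST call. A minimal sketch of that chunked send, assuming only an already-connected ClientWebSocket (the helper name is illustrative, not part of the runner):

```csharp
using System;
using System.Net.WebSockets;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

static class ChunkedSend
{
    // Send a UTF-8 payload over an open websocket in 1 KiB frames,
    // marking only the last frame as end-of-message.
    public static async Task SendInChunksAsync(ClientWebSocket socket, string json, CancellationToken token)
    {
        const int ChunkSize = 1024;
        var bytes = Encoding.UTF8.GetBytes(json);
        if (bytes.Length == 0)
        {
            return; // nothing to send
        }

        for (var offset = 0; offset < bytes.Length; offset += ChunkSize)
        {
            var count = Math.Min(ChunkSize, bytes.Length - offset);
            var isLast = offset + count >= bytes.Length;
            await socket.SendAsync(new ArraySegment<byte>(bytes, offset, count),
                                   WebSocketMessageType.Text,
                                   endOfMessage: isLast,
                                   cancellationToken: token);
        }
    }
}
```

The fallback decision in the hunk above is a simple failure-rate check: once more than five batches have been attempted and over half of them failed, the socket is closed and the REST path is used for the remainder of the job.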

View File

@@ -89,6 +89,10 @@ namespace GitHub.Runner.Common
{
Trace.Entering();
var serviceEndPoint = jobRequest.Resources.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
_jobServer.InitializeWebsocketClient(serviceEndPoint);
if (_queueInProcess)
{
Trace.Info("No-opt, all queue process tasks are running.");
@@ -156,6 +160,9 @@ namespace GitHub.Runner.Common
await ProcessTimelinesUpdateQueueAsync(runOnce: true);
Trace.Info("Timeline update queue drained.");
Trace.Info($"Disposing job server ...");
await _jobServer.DisposeAsync();
Trace.Info("All queue process tasks have been stopped, and all queues are drained.");
}
@@ -292,15 +299,7 @@ namespace GitHub.Runner.Common
{
try
{
// we will not requeue failed batch, since the web console lines are time sensitive.
if (batch[0].LineNumber.HasValue)
{
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch.Select(logLine => logLine.Line).ToList(), batch[0].LineNumber.Value, default(CancellationToken));
}
else
{
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch.Select(logLine => logLine.Line).ToList(), default(CancellationToken));
}
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch.Select(logLine => logLine.Line).ToList(), batch[0].LineNumber, default(CancellationToken));
if (_firstConsoleOutputs)
{

View File

@@ -0,0 +1,16 @@
using System;
using System.Collections.ObjectModel;
namespace GitHub.Runner.Common.Util
{
public static class NodeUtil
{
private const string _defaultNodeVersion = "node16";
public static readonly ReadOnlyCollection<string> BuiltInNodeVersions = new(new[] {"node12", "node16"});
public static string GetInternalNodeVersion()
{
var forcedNodeVersion = Environment.GetEnvironmentVariable(Constants.Variables.Agent.ForcedInternalNodeVersion);
return !string.IsNullOrEmpty(forcedNodeVersion) && BuiltInNodeVersions.Contains(forcedNodeVersion) ? forcedNodeVersion : _defaultNodeVersion;
}
}
}
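
GetInternalNodeVersion defaults to node16 and only honors ACTIONS_RUNNER_FORCED_INTERNAL_NODE_VERSION when it names one of the built-in versions; callers combine it into the path of the bundled node binary, as the check hunks below do. A small sketch of that call pattern (IHostContext, WellKnownDirectory, and IOUtil are taken from the surrounding hunks):

```csharp
using System.IO;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;

static class InternalNode
{
    // Resolve the bundled node binary used for internal scripts (hashFiles, checks).
    // Yields ".../externals/node16/bin/node" unless the env var forces "node12".
    public static string GetPath(IHostContext hostContext)
    {
        return Path.Combine(
            hostContext.GetDirectory(WellKnownDirectory.Externals),
            NodeUtil.GetInternalNodeVersion(),
            "bin",
            $"node{IOUtil.ExeExtension}");
    }
}
```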

View File

@@ -10,6 +10,7 @@ using System.Net.NetworkInformation;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
@@ -314,12 +315,12 @@ namespace GitHub.Runner.Listener.Check
});
var downloadCertScript = Path.Combine(hostContext.GetDirectory(WellKnownDirectory.Bin), "checkScripts", "downloadCert");
var node12 = Path.Combine(hostContext.GetDirectory(WellKnownDirectory.Externals), "node12", "bin", $"node{IOUtil.ExeExtension}");
result.Logs.Add($"{DateTime.UtcNow.ToString("O")} Run '{node12} \"{downloadCertScript}\"' ");
var node = Path.Combine(hostContext.GetDirectory(WellKnownDirectory.Externals), NodeUtil.GetInternalNodeVersion(), "bin", $"node{IOUtil.ExeExtension}");
result.Logs.Add($"{DateTime.UtcNow.ToString("O")} Run '{node} \"{downloadCertScript}\"' ");
result.Logs.Add($"{DateTime.UtcNow.ToString("O")} {StringUtil.ConvertToJson(env)}");
await processInvoker.ExecuteAsync(
hostContext.GetDirectory(WellKnownDirectory.Root),
node12,
node,
$"\"{downloadCertScript}\"",
env,
true,

View File

@@ -6,6 +6,7 @@ using System.Net;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
namespace GitHub.Runner.Listener.Check
@@ -144,12 +145,12 @@ namespace GitHub.Runner.Listener.Check
});
var makeWebRequestScript = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Bin), "checkScripts", "makeWebRequest.js");
var node12 = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), "node12", "bin", $"node{IOUtil.ExeExtension}");
result.Logs.Add($"{DateTime.UtcNow.ToString("O")} Run '{node12} \"{makeWebRequestScript}\"' ");
var node = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), NodeUtil.GetInternalNodeVersion(), "bin", $"node{IOUtil.ExeExtension}");
result.Logs.Add($"{DateTime.UtcNow.ToString("O")} Run '{node} \"{makeWebRequestScript}\"' ");
result.Logs.Add($"{DateTime.UtcNow.ToString("O")} {StringUtil.ConvertToJson(env)}");
await processInvoker.ExecuteAsync(
HostContext.GetDirectory(WellKnownDirectory.Root),
node12,
node,
$"\"{makeWebRequestScript}\"",
env,
true,

View File

@@ -17,42 +17,57 @@ namespace GitHub.Runner.Listener
private readonly IPromptManager _promptManager;
private readonly Tracing _trace;
private readonly string[] validCommands =
// Valid flags for all commands
private readonly string[] genericOptions =
{
Constants.Runner.CommandLine.Commands.Configure,
Constants.Runner.CommandLine.Commands.Remove,
Constants.Runner.CommandLine.Commands.Run,
Constants.Runner.CommandLine.Commands.Warmup,
Constants.Runner.CommandLine.Flags.Help,
Constants.Runner.CommandLine.Flags.Version,
Constants.Runner.CommandLine.Flags.Commit,
Constants.Runner.CommandLine.Flags.Check
};
private readonly string[] validFlags =
// Valid flags and args for specific command - key: command, value: array of valid flags and args
private readonly Dictionary<string, string[]> validOptions = new Dictionary<string, string[]>
{
Constants.Runner.CommandLine.Flags.Check,
Constants.Runner.CommandLine.Flags.Commit,
// Valid configure flags and args
[Constants.Runner.CommandLine.Commands.Configure] =
new string[]
{
Constants.Runner.CommandLine.Flags.DisableUpdate,
Constants.Runner.CommandLine.Flags.Ephemeral,
Constants.Runner.CommandLine.Flags.Help,
Constants.Runner.CommandLine.Flags.Once,
Constants.Runner.CommandLine.Flags.Replace,
Constants.Runner.CommandLine.Flags.RunAsService,
Constants.Runner.CommandLine.Flags.Unattended,
Constants.Runner.CommandLine.Flags.Version
};
private readonly string[] validArgs =
{
Constants.Runner.CommandLine.Args.Auth,
Constants.Runner.CommandLine.Args.Labels,
Constants.Runner.CommandLine.Args.MonitorSocketAddress,
Constants.Runner.CommandLine.Args.Name,
Constants.Runner.CommandLine.Args.PAT,
Constants.Runner.CommandLine.Args.RunnerGroup,
Constants.Runner.CommandLine.Args.StartupType,
Constants.Runner.CommandLine.Args.Token,
Constants.Runner.CommandLine.Args.Url,
Constants.Runner.CommandLine.Args.UserName,
Constants.Runner.CommandLine.Args.WindowsLogonAccount,
Constants.Runner.CommandLine.Args.WindowsLogonPassword,
Constants.Runner.CommandLine.Args.Work
},
// Valid remove flags and args
[Constants.Runner.CommandLine.Commands.Remove] =
new string[]
{
Constants.Runner.CommandLine.Args.Token,
Constants.Runner.CommandLine.Args.PAT
},
// Valid run flags and args
[Constants.Runner.CommandLine.Commands.Run] =
new string[]
{
Constants.Runner.CommandLine.Flags.Once,
Constants.Runner.CommandLine.Args.StartupType
},
// valid warmup flags and args
[Constants.Runner.CommandLine.Commands.Warmup] =
new string[] { }
};
// Commands.
@@ -68,6 +83,7 @@ namespace GitHub.Runner.Listener
public bool Unattended => TestFlag(Constants.Runner.CommandLine.Flags.Unattended);
public bool Version => TestFlag(Constants.Runner.CommandLine.Flags.Version);
public bool Ephemeral => TestFlag(Constants.Runner.CommandLine.Flags.Ephemeral);
public bool DisableUpdate => TestFlag(Constants.Runner.CommandLine.Flags.DisableUpdate);
// Keep this around since customers still relies on it
public bool RunOnce => TestFlag(Constants.Runner.CommandLine.Flags.Once);
@@ -124,17 +140,48 @@ namespace GitHub.Runner.Listener
List<string> unknowns = new List<string>();
// detect unknown commands
unknowns.AddRange(_parser.Commands.Where(x => !validCommands.Contains(x, StringComparer.OrdinalIgnoreCase)));
unknowns.AddRange(_parser.Commands.Where(x => !validOptions.Keys.Contains(x, StringComparer.OrdinalIgnoreCase)));
// detect unknown flags
unknowns.AddRange(_parser.Flags.Where(x => !validFlags.Contains(x, StringComparer.OrdinalIgnoreCase)));
// detect unknown args
unknowns.AddRange(_parser.Args.Keys.Where(x => !validArgs.Contains(x, StringComparer.OrdinalIgnoreCase)));
if (unknowns.Count == 0)
{
// detect unknown flags and args for valid commands
foreach (var command in _parser.Commands)
{
if (validOptions.TryGetValue(command, out string[] options))
{
unknowns.AddRange(_parser.Flags.Where(x => !options.Contains(x, StringComparer.OrdinalIgnoreCase) && !genericOptions.Contains(x, StringComparer.OrdinalIgnoreCase)));
unknowns.AddRange(_parser.Args.Keys.Where(x => !options.Contains(x, StringComparer.OrdinalIgnoreCase)));
}
}
}
return unknowns;
}
public string GetCommandName()
{
string command = string.Empty;
if (Configure)
{
command = Constants.Runner.CommandLine.Commands.Configure;
}
else if (Remove)
{
command = Constants.Runner.CommandLine.Commands.Remove;
}
else if (Run)
{
command = Constants.Runner.CommandLine.Commands.Run;
}
else if (Warmup)
{
command = Constants.Runner.CommandLine.Commands.Warmup;
}
return command;
}
//
// Interactive flags.
//
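
Unknown-option detection now happens per command: anything that is neither a generic option (help, version, commit, check) nor listed in the command's validOptions entry is reported. A standalone sketch of that lookup, using plain collections in place of the runner's parser (the method and parameter names are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class OptionCheck
{
    // Returns the supplied flags/args that are not valid for the given command.
    public static List<string> FindUnknownOptions(
        string command,
        IEnumerable<string> suppliedOptions,
        Dictionary<string, string[]> validOptions,
        string[] genericOptions)
    {
        if (!validOptions.TryGetValue(command, out var commandOptions))
        {
            return suppliedOptions.ToList(); // unknown command: everything is unknown
        }

        return suppliedOptions
            .Where(x => !commandOptions.Contains(x, StringComparer.OrdinalIgnoreCase)
                     && !genericOptions.Contains(x, StringComparer.OrdinalIgnoreCase))
            .ToList();
    }
}
```

Under this scheme, passing a configure-only flag such as --unattended to the remove command would now be reported as unknown, since remove only accepts token and PAT plus the generic options.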

View File

@@ -196,6 +196,7 @@ namespace GitHub.Runner.Listener.Configuration
TaskAgent agent;
while (true)
{
runnerSettings.DisableUpdate = command.DisableUpdate;
runnerSettings.Ephemeral = command.Ephemeral;
runnerSettings.AgentName = command.GetRunnerName();
@@ -213,11 +214,22 @@ namespace GitHub.Runner.Listener.Configuration
if (command.GetReplace())
{
// Update existing agent with new PublicKey, agent version.
agent = UpdateExistingAgent(agent, publicKey, userLabels, runnerSettings.Ephemeral);
agent = UpdateExistingAgent(agent, publicKey, userLabels, runnerSettings.Ephemeral, command.DisableUpdate);
try
{
agent = await _runnerServer.ReplaceAgentAsync(runnerSettings.PoolId, agent);
if (command.DisableUpdate &&
command.DisableUpdate != agent.DisableUpdate)
{
throw new NotSupportedException("The GitHub server does not support configuring a self-hosted runner with 'DisableUpdate' flag.");
}
if (command.Ephemeral &&
command.Ephemeral != agent.Ephemeral)
{
throw new NotSupportedException("The GitHub server does not support configuring a self-hosted runner with 'Ephemeral' flag.");
}
_term.WriteSuccessMessage("Successfully replaced the runner");
break;
}
@@ -236,11 +248,22 @@ namespace GitHub.Runner.Listener.Configuration
else
{
// Create a new agent.
agent = CreateNewAgent(runnerSettings.AgentName, publicKey, userLabels, runnerSettings.Ephemeral);
agent = CreateNewAgent(runnerSettings.AgentName, publicKey, userLabels, runnerSettings.Ephemeral, command.DisableUpdate);
try
{
agent = await _runnerServer.AddAgentAsync(runnerSettings.PoolId, agent);
if (command.DisableUpdate &&
command.DisableUpdate != agent.DisableUpdate)
{
throw new NotSupportedException("The GitHub server does not support configuring a self-hosted runner with 'DisableUpdate' flag.");
}
if (command.Ephemeral &&
command.Ephemeral != agent.Ephemeral)
{
throw new NotSupportedException("The GitHub server does not support configuring a self-hosted runner with 'Ephemeral' flag.");
}
_term.WriteSuccessMessage("Runner successfully added");
break;
}
@@ -466,7 +489,7 @@ namespace GitHub.Runner.Listener.Configuration
}
private TaskAgent UpdateExistingAgent(TaskAgent agent, RSAParameters publicKey, ISet<string> userLabels, bool ephemeral)
private TaskAgent UpdateExistingAgent(TaskAgent agent, RSAParameters publicKey, ISet<string> userLabels, bool ephemeral, bool disableUpdate)
{
ArgUtil.NotNull(agent, nameof(agent));
agent.Authorization = new TaskAgentAuthorization
@@ -478,6 +501,7 @@ namespace GitHub.Runner.Listener.Configuration
agent.Version = BuildConstants.RunnerPackage.Version;
agent.OSDescription = RuntimeInformation.OSDescription;
agent.Ephemeral = ephemeral;
agent.DisableUpdate = disableUpdate;
agent.MaxParallelism = 1;
agent.Labels.Clear();
@@ -494,7 +518,7 @@ namespace GitHub.Runner.Listener.Configuration
return agent;
}
private TaskAgent CreateNewAgent(string agentName, RSAParameters publicKey, ISet<string> userLabels, bool ephemeral)
private TaskAgent CreateNewAgent(string agentName, RSAParameters publicKey, ISet<string> userLabels, bool ephemeral, bool disableUpdate)
{
TaskAgent agent = new TaskAgent(agentName)
{
@@ -506,6 +530,7 @@ namespace GitHub.Runner.Listener.Configuration
Version = BuildConstants.RunnerPackage.Version,
OSDescription = RuntimeInformation.OSDescription,
Ephemeral = ephemeral,
DisableUpdate = disableUpdate
};
agent.Labels.Add(new AgentLabel("self-hosted", LabelType.System));
@@ -588,6 +613,9 @@ namespace GitHub.Runner.Listener.Configuration
throw new ArgumentException($"'{githubUrl}' should point to an org or repository.");
}
int retryCount = 0;
while(retryCount < 3)
{
using (var httpClientHandler = HostContext.CreateHttpClientHandler())
using (var httpClient = new HttpClient(httpClientHandler))
{
@@ -597,7 +625,11 @@ namespace GitHub.Runner.Listener.Configuration
httpClient.DefaultRequestHeaders.UserAgent.AddRange(HostContext.UserAgents);
httpClient.DefaultRequestHeaders.Accept.ParseAdd("application/vnd.github.v3+json");
var responseStatus = System.Net.HttpStatusCode.OK;
try
{
var response = await httpClient.PostAsync(githubApiUrl, new StringContent(string.Empty));
responseStatus = response.StatusCode;
if (response.IsSuccessStatusCode)
{
@@ -611,10 +643,21 @@ namespace GitHub.Runner.Listener.Configuration
var errorResponse = await response.Content.ReadAsStringAsync();
_term.WriteError(errorResponse);
response.EnsureSuccessStatusCode();
}
}
catch(Exception ex) when (retryCount < 2 && responseStatus != System.Net.HttpStatusCode.NotFound)
{
retryCount++;
Trace.Error($"Failed to get JIT runner token -- Atempt: {retryCount}");
Trace.Error(ex);
}
}
var backOff = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(5));
Trace.Info($"Retrying in {backOff.Seconds} seconds");
await Task.Delay(backOff);
}
return null;
}
}
}
private async Task<GitHubAuthResult> GetTenantCredential(string githubUrl, string githubToken, string runnerEvent)
{
@@ -629,6 +672,9 @@ namespace GitHub.Runner.Listener.Configuration
githubApiUrl = $"{gitHubUrlBuilder.Scheme}://{gitHubUrlBuilder.Host}/api/v3/actions/runner-registration";
}
int retryCount = 0;
while (retryCount < 3)
{
using (var httpClientHandler = HostContext.CreateHttpClientHandler())
using (var httpClient = new HttpClient(httpClientHandler))
{
@@ -641,7 +687,11 @@ namespace GitHub.Runner.Listener.Configuration
{"runner_event", runnerEvent}
};
var responseStatus = System.Net.HttpStatusCode.OK;
try
{
var response = await httpClient.PostAsync(githubApiUrl, new StringContent(StringUtil.ConvertToJson(bodyObject), null, "application/json"));
responseStatus = response.StatusCode;
if(response.IsSuccessStatusCode)
{
@@ -655,9 +705,20 @@ namespace GitHub.Runner.Listener.Configuration
var errorResponse = await response.Content.ReadAsStringAsync();
_term.WriteError(errorResponse);
response.EnsureSuccessStatusCode();
}
}
catch(Exception ex) when (retryCount < 2 && responseStatus != System.Net.HttpStatusCode.NotFound)
{
retryCount++;
Trace.Error($"Failed to get tenant credentials -- Atempt: {retryCount}");
Trace.Error(ex);
}
}
var backOff = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(5));
Trace.Info($"Retrying in {backOff.Seconds} seconds");
await Task.Delay(backOff);
}
return null;
}
}
}
}
}
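
Both registration calls (the JIT runner-token exchange and the tenant-credential exchange) are now wrapped in a retry loop: up to three attempts, a random one-to-five-second backoff between them, and no retry once the server answers 404. A sketch of that pattern in isolation, with a caller-supplied send delegate standing in for the HttpClient calls (the class and method names are illustrative):

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using GitHub.Runner.Sdk; // BackoffTimerHelper, as used in the hunk above

static class RegistrationRetry
{
    // Retry an HTTP call up to 3 times with a random 1-5s backoff; a 404 is not retried.
    public static async Task<HttpResponseMessage> PostWithRetryAsync(Func<Task<HttpResponseMessage>> send)
    {
        for (var attempt = 1; ; attempt++)
        {
            var status = HttpStatusCode.OK;
            try
            {
                var response = await send();
                status = response.StatusCode;
                response.EnsureSuccessStatusCode();
                return response;
            }
            catch (Exception) when (attempt < 3 && status != HttpStatusCode.NotFound)
            {
                var backOff = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(5));
                await Task.Delay(backOff);
            }
        }
    }
}
```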

View File

@@ -48,13 +48,12 @@ namespace GitHub.Runner.Listener.Configuration
string repoOrOrgName = regex.Replace(settings.RepoOrOrgName, "-");
serviceName = StringUtil.Format(serviceNamePattern, repoOrOrgName, settings.AgentName);
if (serviceName.Length > 80)
if (serviceName.Length > MaxServiceNameLength)
{
Trace.Verbose($"Calculated service name is too long (> 80 chars). Trying again by calculating a shorter name.");
int exceededCharLength = serviceName.Length - 80;
string repoOrOrgNameSubstring = StringUtil.SubstringPrefix(repoOrOrgName, 45);
Trace.Verbose($"Calculated service name is too long (> {MaxServiceNameLength} chars). Trying again by calculating a shorter name.");
// Add 5 to add -xxxx random number on the end
int exceededCharLength = serviceName.Length - MaxServiceNameLength + 5;
string repoOrOrgNameSubstring = StringUtil.SubstringPrefix(repoOrOrgName, MaxRepoOrgCharacters);
exceededCharLength -= repoOrOrgName.Length - repoOrOrgNameSubstring.Length;
@@ -66,6 +65,10 @@ namespace GitHub.Runner.Listener.Configuration
runnerNameSubstring = StringUtil.SubstringPrefix(settings.AgentName, settings.AgentName.Length - exceededCharLength);
}
// Let's add a suffix with a random number to reduce the chance of collisions between runner names once we truncate
var random = new Random();
var num = random.Next(1000, 9999).ToString();
runnerNameSubstring +=$"-{num}";
serviceName = StringUtil.Format(serviceNamePattern, repoOrOrgNameSubstring, runnerNameSubstring);
}
@@ -73,5 +76,12 @@ namespace GitHub.Runner.Listener.Configuration
Trace.Info($"Service name '{serviceName}' display name '{serviceDisplayName}' will be used for service configuration.");
}
#if (OS_LINUX || OS_OSX)
const int MaxServiceNameLength = 150;
const int MaxRepoOrgCharacters = 70;
#elif OS_WINDOWS
const int MaxServiceNameLength = 80;
const int MaxRepoOrgCharacters = 45;
#endif
}
}
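
The shortening now works against per-platform limits and reserves five characters for the random -xxxx suffix: the overflow is first absorbed by truncating the repo/org part to its cap, and whatever remains is trimmed from the runner name. A worked example with hypothetical lengths (90-character repo/org, 80-character runner name, roughly 10 fixed characters from the service name pattern) under the Linux limits above:

```csharp
// Hypothetical inputs on Linux: MaxServiceNameLength = 150, MaxRepoOrgCharacters = 70,
// repoOrOrgName.Length = 90, AgentName.Length = 80, ~10 fixed chars from the pattern.
int serviceNameLength = 90 + 80 + 10;        // 180, over the 150 limit
int exceeded = serviceNameLength - 150 + 5;  // 35 (5 reserved for the "-xxxx" suffix)
int repoPart = Math.Min(90, 70);             // repo/org truncated to its 70-char cap
exceeded -= 90 - repoPart;                   // 35 - 20 = 15 still to trim
int runnerPart = 80 - exceeded;              // runner name keeps its first 65 chars
// final: 70 (repo/org) + 65 (runner) + 5 ("-1234") + 10 (pattern) = 150 characters
```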

View File

@@ -285,7 +285,7 @@ namespace GitHub.Runner.Listener
{
// at this point, the job execution might encounter some dead lock and even not able to be cancelled.
// no need to localize the exception string should never happen.
throw new InvalidOperationException($"Job dispatch process for {jobDispatch.JobId} has encountered unexpected error, the dispatch task is not able to be canceled within 45 seconds.");
throw new InvalidOperationException($"Job dispatch process for {jobDispatch.JobId} has encountered unexpected error, the dispatch task is not able to be cancelled within 45 seconds.");
}
}
else
@@ -363,7 +363,7 @@ namespace GitHub.Runner.Listener
Trace.Info($"Start renew job request {requestId} for job {message.JobId}.");
Task renewJobRequest = RenewJobRequestAsync(_poolId, requestId, lockToken, orchestrationId, firstJobRequestRenewed, lockRenewalTokenSource.Token);
// wait till first renew succeed or job request is canceled
// wait till first renew succeed or job request is cancelled
// not even start worker if the first renew fail
await Task.WhenAny(firstJobRequestRenewed.Task, renewJobRequest, Task.Delay(-1, jobRequestCancellationToken));
@@ -704,7 +704,7 @@ namespace GitHub.Runner.Listener
{
// OperationCanceledException may be caused by http timeout or _lockRenewalTokenSource.Cancel();
// Stop renew only on cancellation token fired.
Trace.Info($"job renew has been canceled, stop renew job request {requestId}.");
Trace.Info($"job renew has been cancelled, stop renew job request {requestId}.");
return;
}
catch (Exception ex)
@@ -762,7 +762,7 @@ namespace GitHub.Runner.Listener
}
catch (OperationCanceledException) when (token.IsCancellationRequested)
{
Trace.Info($"job renew has been canceled, stop renew job request {requestId}.");
Trace.Info($"job renew has been cancelled, stop renew job request {requestId}.");
}
}
else

View File

@@ -1,18 +1,18 @@
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Listener.Configuration;
using GitHub.Services.Common;
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Runtime.InteropServices;
using System.Security.Cryptography;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Security.Cryptography;
using System.IO;
using System.Text;
using GitHub.Services.OAuth;
using System.Diagnostics;
using System.Runtime.InteropServices;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Listener.Configuration;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.OAuth;
namespace GitHub.Runner.Listener
{
@@ -33,6 +33,7 @@ namespace GitHub.Runner.Listener
private IRunnerServer _runnerServer;
private TaskAgentSession _session;
private TimeSpan _getNextMessageRetryInterval;
private bool _accessTokenRevoked = false;
private readonly TimeSpan _sessionCreationRetryInterval = TimeSpan.FromSeconds(30);
private readonly TimeSpan _sessionConflictRetryLimit = TimeSpan.FromMinutes(4);
private readonly TimeSpan _clockSkewRetryLimit = TimeSpan.FromMinutes(30);
@@ -111,6 +112,7 @@ namespace GitHub.Runner.Listener
catch (TaskAgentAccessTokenExpiredException)
{
Trace.Info("Runner OAuth token has been revoked. Session creation failed.");
_accessTokenRevoked = true;
throw;
}
catch (Exception ex)
@@ -153,12 +155,19 @@ namespace GitHub.Runner.Listener
public async Task DeleteSessionAsync()
{
if (_session != null && _session.SessionId != Guid.Empty)
{
if (!_accessTokenRevoked)
{
using (var ts = new CancellationTokenSource(TimeSpan.FromSeconds(30)))
{
await _runnerServer.DeleteAgentSessionAsync(_settings.PoolId, _session.SessionId, ts.Token);
}
}
else
{
Trace.Warning("Runner OAuth token has been revoked. Skip deleting session.");
}
}
}
public async Task<TaskAgentMessage> GetNextMessageAsync(CancellationToken token)
@@ -205,6 +214,7 @@ namespace GitHub.Runner.Listener
catch (TaskAgentAccessTokenExpiredException)
{
Trace.Info("Runner OAuth token has been revoked. Unable to pull message.");
_accessTokenRevoked = true;
throw;
}
catch (Exception ex)

View File

@@ -95,7 +95,15 @@ namespace GitHub.Runner.Listener
var unknownCommandlines = command.Validate();
if (unknownCommandlines.Count > 0)
{
terminal.WriteError($"Unrecognized command-line input arguments: '{string.Join(", ", unknownCommandlines)}'. For usage refer to: .\\config.cmd --help or ./config.sh --help");
string commandName = command.GetCommandName();
if (string.IsNullOrEmpty(commandName))
{
terminal.WriteError($"This command does not recognize the command-line input arguments: '{string.Join(", ", unknownCommandlines)}'. For usage refer to: .\\config.cmd --help or ./config.sh --help");
}
else
{
terminal.WriteError($"Unrecognized command-line input arguments for command {commandName}: '{string.Join(", ", unknownCommandlines)}'. For usage refer to: .\\config.cmd --help or ./config.sh --help");
}
}
// Defer to the Runner class to execute the command.

View File

@@ -12,6 +12,7 @@ using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using System.Linq;
using GitHub.Runner.Listener.Check;
using System.Collections.Generic;
namespace GitHub.Runner.Listener
{
@@ -407,6 +408,27 @@ namespace GitHub.Runner.Listener
{
autoUpdateInProgress = true;
var runnerUpdateMessage = JsonUtility.FromString<AgentRefreshMessage>(message.Body);
#if DEBUG
// Can mock the update for testing
if (StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_IS_MOCK_UPDATE")))
{
// The mock_update_messages.json file should be an object with keys being the current version and values being the targeted mock version object
// Example: { "2.283.2": {"targetVersion":"2.284.1"}, "2.284.1": {"targetVersion":"2.285.0"}}
var mockUpdatesPath = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), "mock_update_messages.json");
if (File.Exists(mockUpdatesPath))
{
var mockUpdateMessages = JsonUtility.FromString<Dictionary<string, AgentRefreshMessage>>(File.ReadAllText(mockUpdatesPath));
if (mockUpdateMessages.ContainsKey(BuildConstants.RunnerPackage.Version))
{
var mockTargetVersion = mockUpdateMessages[BuildConstants.RunnerPackage.Version].TargetVersion;
_term.WriteLine($"Mocking update, using version {mockTargetVersion} instead of {runnerUpdateMessage.TargetVersion}");
Trace.Info($"Mocking update, using version {mockTargetVersion} instead of {runnerUpdateMessage.TargetVersion}");
runnerUpdateMessage = new AgentRefreshMessage(runnerUpdateMessage.AgentId, mockTargetVersion, runnerUpdateMessage.Timeout);
}
}
}
#endif
var selfUpdater = HostContext.GetService<ISelfUpdater>();
selfUpdateTask = selfUpdater.SelfUpdate(runnerUpdateMessage, jobDispatcher, false, HostContext.RunnerShutdownToken);
Trace.Info("Refresh message received, kick-off selfupdate background process.");
@@ -540,6 +562,7 @@ Config Options:
--work string Relative runner work directory (default {Constants.Path.WorkDirectory})
--replace Replace any existing runner with the same name (default false)
--pat GitHub personal access token used for checking network connectivity when executing `.{separator}run.{ext} --check`
--disableupdate Disable self-hosted runner automatic update to the latest released version
--ephemeral Configure the runner to only take one job and then let the service un-configure the runner after the job finishes (default false)");
#if OS_WINDOWS

View File

@@ -12,6 +12,7 @@ using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
@@ -104,7 +105,7 @@ namespace GitHub.Runner.Listener
}
}
await DownloadLatestRunner(token);
await DownloadLatestRunner(token, updateMessage.TargetVersion);
Trace.Info($"Download latest runner and unzip into runner root.");
// wait till all running job finish
@@ -206,7 +207,7 @@ namespace GitHub.Runner.Listener
/// </summary>
/// <param name="token"></param>
/// <returns></returns>
private async Task DownloadLatestRunner(CancellationToken token)
private async Task DownloadLatestRunner(CancellationToken token, string targetVersion)
{
string latestRunnerDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Work), Constants.Path.UpdateDirectory);
IOUtil.DeleteDirectory(latestRunnerDirectory, token);
@@ -266,14 +267,57 @@ namespace GitHub.Runner.Listener
try
{
#if DEBUG
// Much of the update process (targetVersion, archive) is server-side, this is a way to control it from here for testing specific update scenarios
// Add files like 'runner2.281.2.tar.gz' or 'runner2.283.0.zip' (depending on your platform) to your runner root folder
// Note that runners still need to be older than the server's runner version in order to receive an 'AgentRefreshMessage' and trigger this update
// Wrapped in #if DEBUG as this should not be in the RELEASE build
if (StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_IS_MOCK_UPDATE")))
{
var waitForDebugger = StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_IS_MOCK_UPDATE_WAIT_FOR_DEBUGGER"));
if (waitForDebugger)
{
int waitInSeconds = 20;
while (!Debugger.IsAttached && waitInSeconds-- > 0)
{
await Task.Delay(1000);
}
Debugger.Break();
}
if (_targetPackage.Platform.StartsWith("win"))
{
archiveFile = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"runner{targetVersion}.zip");
}
else
{
archiveFile = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Root), $"runner{targetVersion}.tar.gz");
}
if (File.Exists(archiveFile))
{
_updateTrace.Enqueue($"Mocking update with file: '{archiveFile}' and targetVersion: '{targetVersion}', nothing is downloaded");
_terminal.WriteLine($"Mocking update with file: '{archiveFile}' and targetVersion: '{targetVersion}', nothing is downloaded");
}
else
{
archiveFile = null;
_terminal.WriteLine($"Mock runner archive not found at {archiveFile} for target version {targetVersion}, proceeding with download instead");
_updateTrace.Enqueue($"Mock runner archive not found at {archiveFile} for target version {targetVersion}, proceeding with download instead");
}
}
#endif
// archiveFile is not null only if we mocked it above
if (string.IsNullOrEmpty(archiveFile))
{
archiveFile = await DownLoadRunner(latestRunnerDirectory, packageDownloadUrl, packageHashValue, token);
if (string.IsNullOrEmpty(archiveFile))
{
throw new TaskCanceledException($"Runner package '{packageDownloadUrl}' failed after {Constants.RunnerDownloadRetryMaxAttempts} download attempts");
}
await ValidateRunnerHash(archiveFile, packageHashValue);
}
await ExtractRunnerPackage(archiveFile, latestRunnerDirectory, token);
}
@@ -460,7 +504,7 @@ namespace GitHub.Runner.Listener
}
catch (OperationCanceledException) when (token.IsCancellationRequested)
{
Trace.Info($"Runner download has been canceled.");
Trace.Info($"Runner download has been cancelled.");
throw;
}
catch (Exception ex)
@@ -761,7 +805,7 @@ namespace GitHub.Runner.Listener
IOUtil.CopyDirectory(_externalsCloneDirectory, Path.Combine(downloadDirectory, Constants.Path.ExternalsDirectory), token);
// try run node.js to see if current node.js works fine after copy over to new location.
var nodeVersions = new[] { "node12", "node16" };
var nodeVersions = NodeUtil.BuiltInNodeVersions;
foreach (var nodeVersion in nodeVersions)
{
var newNodeBinary = Path.Combine(downloadDirectory, Constants.Path.ExternalsDirectory, nodeVersion, "bin", $"node{IOUtil.ExeExtension}");
@@ -1026,7 +1070,7 @@ namespace GitHub.Runner.Listener
var stopWatch = Stopwatch.StartNew();
string binDir = HostContext.GetDirectory(WellKnownDirectory.Bin);
string node = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), "node12", "bin", $"node{IOUtil.ExeExtension}");
string node = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), NodeUtil.GetInternalNodeVersion(), "bin", $"node{IOUtil.ExeExtension}");
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
@@ -1060,6 +1104,8 @@ namespace GitHub.Runner.Listener
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: token);
if (exitCode != 0)

View File

@@ -469,7 +469,7 @@ namespace GitHub.Runner.Plugins.Artifact
try
{
uploadTimer.Restart();
using (HttpResponseMessage response = await _fileContainerHttpClient.UploadFileAsync(_containerId, itemPath, fs, _projectId, cancellationToken: token, chunkSize: 4 * 1024 * 1024))
using (HttpResponseMessage response = await _fileContainerHttpClient.UploadFileAsync(_containerId, itemPath, fs, _projectId, cancellationToken: token))
{
if (response == null || response.StatusCode != HttpStatusCode.Created)
{

View File

@@ -166,7 +166,7 @@ namespace GitHub.Runner.Plugins.Repository.v1_0
}
else
{
// delete the index.lock file left by previous canceled build or any operation cause git.exe crash last time.
// delete the index.lock file left by previous cancelled build or any operation cause git.exe crash last time.
string lockFile = Path.Combine(targetPath, ".git\\index.lock");
if (File.Exists(lockFile))
{
@@ -181,7 +181,7 @@ namespace GitHub.Runner.Plugins.Repository.v1_0
}
}
// delete the shallow.lock file left by previous canceled build or any operation cause git.exe crash last time.
// delete the shallow.lock file left by previous cancelled build or any operation cause git.exe crash last time.
string shallowLockFile = Path.Combine(targetPath, ".git\\shallow.lock");
if (File.Exists(shallowLockFile))
{

View File

@@ -150,7 +150,7 @@ namespace GitHub.Runner.Plugins.Repository.v1_1
}
else
{
// delete the index.lock file left by previous canceled build or any operation cause git.exe crash last time.
// delete the index.lock file left by previous cancelled build or any operation cause git.exe crash last time.
string lockFile = Path.Combine(targetPath, ".git\\index.lock");
if (File.Exists(lockFile))
{
@@ -165,7 +165,7 @@ namespace GitHub.Runner.Plugins.Repository.v1_1
}
}
// delete the shallow.lock file left by previous canceled build or any operation cause git.exe crash last time.
// delete the shallow.lock file left by previous cancelled build or any operation cause git.exe crash last time.
string shallowLockFile = Path.Combine(targetPath, ".git\\shallow.lock");
if (File.Exists(shallowLockFile))
{

View File

@@ -108,7 +108,7 @@ namespace GitHub.Runner.Sdk
}
// Create a new token source for the parallel query. The parallel query should be
// canceled after the first error is encountered. Otherwise the number of exceptions
// cancelled after the first error is encountered. Otherwise the number of exceptions
// could get out of control for a large directory with access denied on every file.
using (var tokenSource = CancellationTokenSource.CreateLinkedTokenSource(cancellationToken))
{

View File

@@ -178,7 +178,7 @@ namespace GitHub.Runner.Worker
Message = $"Invoked ::stopCommand:: with token: [{stopToken}]",
Type = JobTelemetryType.ActionCommand
};
context.JobTelemetry.Add(telemetry);
context.Global.JobTelemetry.Add(telemetry);
}
if (isTokenInvalid && !allowUnsecureStopCommandTokens)

View File

@@ -651,10 +651,10 @@ namespace GitHub.Runner.Worker
{
try
{
actionDownloadInfos = await jobServer.ResolveActionDownloadInfoAsync(executionContext.Global.Plan.ScopeIdentifier, executionContext.Global.Plan.PlanType, executionContext.Global.Plan.PlanId, new WebApi.ActionReferenceList { Actions = actionReferences }, executionContext.CancellationToken);
actionDownloadInfos = await jobServer.ResolveActionDownloadInfoAsync(executionContext.Global.Plan.ScopeIdentifier, executionContext.Global.Plan.PlanType, executionContext.Global.Plan.PlanId, executionContext.Root.Id, new WebApi.ActionReferenceList { Actions = actionReferences }, executionContext.CancellationToken);
break;
}
catch (Exception ex) when (!executionContext.CancellationToken.IsCancellationRequested) // Do not retry if the run is canceled.
catch (Exception ex) when (!executionContext.CancellationToken.IsCancellationRequested) // Do not retry if the run is cancelled.
{
// UnresolvableActionDownloadInfoException is a 422 client error, don't retry
// Some possible cases are:

View File

@@ -1,18 +1,17 @@
using System;
using System.IO;
using System.Text;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using GitHub.DistributedTask.ObjectTemplating;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.Pipelines.ObjectTemplating;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker.Handlers;
using Pipelines = GitHub.DistributedTask.Pipelines;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using System.Collections.Generic;
namespace GitHub.Runner.Worker
{
@@ -141,21 +140,7 @@ namespace GitHub.Runner.Worker
IStepHost stepHost = HostContext.CreateService<IDefaultStepHost>();
// Makes directory for event_path data
var tempDirectory = HostContext.GetDirectory(WellKnownDirectory.Temp);
var workflowDirectory = Path.Combine(tempDirectory, "_github_workflow");
Directory.CreateDirectory(workflowDirectory);
var gitHubEvent = ExecutionContext.GetGitHubContext("event");
// adds the GitHub event path/file if the event exists
if (gitHubEvent != null)
{
var workflowFile = Path.Combine(workflowDirectory, "event.json");
Trace.Info($"Write event payload to {workflowFile}");
File.WriteAllText(workflowFile, gitHubEvent, new UTF8Encoding(false));
ExecutionContext.SetGitHubContext("event_path", workflowFile);
}
ExecutionContext.WriteWebhookPayload();
// Set GITHUB_ACTION_REPOSITORY if this Action is from a repository
if (Action.Reference is Pipelines.RepositoryPathReference repoPathReferenceAction &&
@@ -186,8 +171,16 @@ namespace GitHub.Runner.Worker
// Load the inputs.
ExecutionContext.Debug("Loading inputs");
Dictionary<string, string> inputs;
if (ExecutionContext.Global.Variables.GetBoolean(Constants.Runner.Features.UseContainerPathForTemplate) ?? false)
{
inputs = EvaluateStepInputs(stepHost);
}
else
{
var templateEvaluator = ExecutionContext.ToPipelineTemplateEvaluator();
var inputs = templateEvaluator.EvaluateStepInputs(Action.Inputs, ExecutionContext.ExpressionValues, ExecutionContext.ExpressionFunctions);
inputs = templateEvaluator.EvaluateStepInputs(Action.Inputs, ExecutionContext.ExpressionValues, ExecutionContext.ExpressionFunctions);
}
var userInputs = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
foreach (KeyValuePair<string, string> input in inputs)
@@ -274,8 +267,8 @@ namespace GitHub.Runner.Worker
actionDirectory: definition.Directory,
localActionContainerSetupSteps: localActionContainerSetupSteps);
// Print out action details
handler.PrintActionDetails(Stage);
// Print out action details and log telemetry
handler.PrepareExecution(Stage);
// Run the task.
try
@@ -314,6 +307,15 @@ namespace GitHub.Runner.Worker
return didFullyEvaluate;
}
private Dictionary<String, String> EvaluateStepInputs(IStepHost stepHost)
{
DictionaryContextData expressionValues = ExecutionContext.GetExpressionValues(stepHost);
var templateEvaluator = ExecutionContext.ToPipelineTemplateEvaluator();
var inputs = templateEvaluator.EvaluateStepInputs(Action.Inputs, expressionValues, ExecutionContext.ExpressionFunctions);
return inputs;
}
private string GenerateDisplayName(ActionStep action, DictionaryContextData contextData, IExecutionContext context, out bool didFullyEvaluate)
{
ArgUtil.NotNull(context, nameof(context));
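The new EvaluateStepInputs(stepHost) path exists so that, for steps running in a container, expression values such as github.workspace resolve to container-side paths before inputs are handed to the action. Below is a minimal standalone sketch of the idea; TranslateToContainerPath and the /__w mount are illustrative assumptions, not the runner's actual IStepHost.ResolvePathForStepHost implementation.

// Minimal standalone sketch (not runner code). TranslateToContainerPath stands in for
// IStepHost.ResolvePathForStepHost, and the /__w mount is an assumed example mapping.
using System;
using System.Collections.Generic;

class ContainerPathSketch
{
    static string TranslateToContainerPath(string hostPath) =>
        hostPath.Replace("/home/runner/work", "/__w"); // assumed volume mapping, for illustration only

    static void Main()
    {
        var expressionValues = new Dictionary<string, string>
        {
            ["github.workspace"] = "/home/runner/work/my-repo/my-repo",
        };

        // Without translation, an input like ${{ github.workspace }} would hand the
        // container a host-only path that does not exist inside it.
        var resolved = TranslateToContainerPath(expressionValues["github.workspace"]);
        Console.WriteLine(resolved); // /__w/my-repo/my-repo
    }
}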

View File

@@ -1,25 +1,21 @@
using System;
using System.Collections;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Text;
using System.Text.RegularExpressions;
using System.Threading;
using System.Threading.Tasks;
using System.Web;
using GitHub.DistributedTask.Expressions2;
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.Pipelines.ObjectTemplating;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker.Container;
using GitHub.Services.WebApi;
using GitHub.Runner.Worker.Handlers;
using Newtonsoft.Json;
using ObjectTemplating = GitHub.DistributedTask.ObjectTemplating;
using Pipelines = GitHub.DistributedTask.Pipelines;
@@ -52,8 +48,7 @@ namespace GitHub.Runner.Worker
Dictionary<string, string> IntraActionState { get; }
Dictionary<string, VariableValue> JobOutputs { get; }
ActionsEnvironmentReference ActionsEnvironment { get; }
List<ActionsStepTelemetry> ActionsStepsTelemetry { get; }
List<JobTelemetry> JobTelemetry { get; }
ActionsStepTelemetry StepTelemetry { get; }
DictionaryContextData ExpressionValues { get; }
IList<IFunctionInfo> ExpressionFunctions { get; }
JobContext JobContext { get; }
@@ -109,12 +104,21 @@ namespace GitHub.Runner.Worker
// others
void ForceTaskComplete();
void RegisterPostJobStep(IStep step);
void PublishStepTelemetry();
void ApplyContinueOnError(TemplateToken continueOnError);
void UpdateGlobalStepsContext();
void WriteWebhookPayload();
}
public sealed class ExecutionContext : RunnerService, IExecutionContext
{
private const int _maxIssueCount = 10;
private const int _throttlingDelayReportThreshold = 10 * 1000; // Don't report throttling with less than 10 seconds delay
private const int _maxIssueMessageLength = 4096; // Don't send issue with huge message since we can't forward them from actions to check annotation.
private const int _maxIssueCountInTelemetry = 3; // Only send the first 3 issues to telemetry
private const int _maxIssueMessageLengthInTelemetry = 256; // Only send the first 256 characters of issue message to telemetry
private readonly TimelineRecord _record = new TimelineRecord();
private readonly Dictionary<Guid, TimelineRecord> _detailRecords = new Dictionary<Guid, TimelineRecord>();
@@ -139,6 +143,7 @@ namespace GitHub.Runner.Worker
// only job level ExecutionContext will track throttling delay.
private long _totalThrottlingDelayInMilliseconds = 0;
private bool _stepTelemetryPublished = false;
public Guid Id => _record.Id;
public Guid EmbeddedId { get; private set; }
@@ -152,8 +157,7 @@ namespace GitHub.Runner.Worker
public Dictionary<string, VariableValue> JobOutputs { get; private set; }
public ActionsEnvironmentReference ActionsEnvironment { get; private set; }
public List<ActionsStepTelemetry> ActionsStepsTelemetry { get; private set; }
public List<JobTelemetry> JobTelemetry { get; private set; }
public ActionsStepTelemetry StepTelemetry { get; } = new ActionsStepTelemetry();
public DictionaryContextData ExpressionValues { get; } = new DictionaryContextData();
public IList<IFunctionInfo> ExpressionFunctions { get; } = new List<IFunctionInfo>();
@@ -294,7 +298,20 @@ namespace GitHub.Runner.Worker
Root.PostJobSteps.Push(step);
}
public IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, ActionRunStage stage, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool isEmbedded = false, CancellationTokenSource cancellationTokenSource = null, Guid embeddedId = default(Guid), string siblingScopeName = null)
public IExecutionContext CreateChild(
Guid recordId,
string displayName,
string refName,
string scopeName,
string contextName,
ActionRunStage stage,
Dictionary<string, string> intraActionState = null,
int? recordOrder = null,
IPagingLogger logger = null,
bool isEmbedded = false,
CancellationTokenSource cancellationTokenSource = null,
Guid embeddedId = default(Guid),
string siblingScopeName = null)
{
Trace.Entering();
@@ -306,7 +323,6 @@ namespace GitHub.Runner.Worker
child.Stage = stage;
child.EmbeddedId = embeddedId;
child.SiblingScopeName = siblingScopeName;
child.JobTelemetry = JobTelemetry;
if (intraActionState == null)
{
child.IntraActionState = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
@@ -346,6 +362,9 @@ namespace GitHub.Runner.Worker
}
child.IsEmbedded = isEmbedded;
child.StepTelemetry.StepId = recordId;
child.StepTelemetry.Stage = stage.ToString();
child.StepTelemetry.IsEmbedded = isEmbedded;
return child;
}
@@ -354,7 +373,13 @@ namespace GitHub.Runner.Worker
/// An embedded execution context shares the same record ID, record name, logger,
/// and a linked cancellation token.
/// </summary>
public IExecutionContext CreateEmbeddedChild(string scopeName, string contextName, Guid embeddedId, ActionRunStage stage, Dictionary<string, string> intraActionState = null, string siblingScopeName = null)
public IExecutionContext CreateEmbeddedChild(
string scopeName,
string contextName,
Guid embeddedId,
ActionRunStage stage,
Dictionary<string, string> intraActionState = null,
string siblingScopeName = null)
{
return Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, contextName, stage, logger: _logger, isEmbedded: true, cancellationTokenSource: CancellationTokenSource.CreateLinkedTokenSource(_cancellationTokenSource.Token), intraActionState: intraActionState, embeddedId: embeddedId, siblingScopeName: siblingScopeName);
}
@@ -404,6 +429,8 @@ namespace GitHub.Runner.Worker
}
}
PublishStepTelemetry();
if (Root != this)
{
// only dispose TokenSource for step level ExecutionContext
@@ -412,14 +439,19 @@ namespace GitHub.Runner.Worker
_logger.End();
UpdateGlobalStepsContext();
return Result.Value;
}
public void UpdateGlobalStepsContext()
{
// Skip if generated context name. Generated context names start with "__". After 3.2 the server will never send an empty context name.
if (!string.IsNullOrEmpty(ContextName) && !ContextName.StartsWith("__", StringComparison.Ordinal))
{
Global.StepsContext.SetOutcome(ScopeName, ContextName, (Outcome ?? Result ?? TaskResult.Succeeded).ToActionResult());
Global.StepsContext.SetConclusion(ScopeName, ContextName, (Result ?? TaskResult.Succeeded).ToActionResult());
}
return Result.Value;
}
public void SetRunnerContext(string name, string value)
@@ -519,11 +551,20 @@ namespace GitHub.Runner.Worker
}
issue.Message = HostContext.SecretMasker.MaskSecrets(issue.Message);
if (issue.Message.Length > _maxIssueMessageLength)
{
issue.Message = issue.Message[.._maxIssueMessageLength];
}
// Tracking the line number (logFileLineNumber) and step number (stepNumber) for each issue that gets created
// The Actions UI on the run summary page uses both values to link to the exact location in the logs where each annotation originates
if (_record.Order != null)
{
issue.Data["stepNumber"] = _record.Order.ToString();
}
if (issue.Type == IssueType.Error)
{
// tracking line number for each issue in log file
// the log UI uses this to navigate from an issue to the log
if (!string.IsNullOrEmpty(logMessage))
{
long logLineNumber = Write(WellKnownTags.Error, logMessage);
@@ -539,8 +580,6 @@ namespace GitHub.Runner.Worker
}
else if (issue.Type == IssueType.Warning)
{
// tracking line number for each issue in log file
// log UI use this to navigate from issue to log
if (!string.IsNullOrEmpty(logMessage))
{
long logLineNumber = Write(WellKnownTags.Warning, logMessage);
@@ -556,9 +595,6 @@ namespace GitHub.Runner.Worker
}
else if (issue.Type == IssueType.Notice)
{
// tracking line number for each issue in log file
// log UI use this to navigate from issue to log
if (!string.IsNullOrEmpty(logMessage))
{
long logLineNumber = Write(WellKnownTags.Notice, logMessage);
@@ -648,22 +684,29 @@ namespace GitHub.Runner.Worker
// Variables
Global.Variables = new Variables(HostContext, message.Variables);
if (Global.Variables.GetBoolean("DistributedTask.ForceInternalNodeVersionOnRunnerTo12") ?? false)
{
Environment.SetEnvironmentVariable(Constants.Variables.Agent.ForcedInternalNodeVersion, "node12");
}
// Environment variables shared across all actions
Global.EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer);
// Job defaults shared across all actions
Global.JobDefaults = new Dictionary<string, IDictionary<string, string>>(StringComparer.OrdinalIgnoreCase);
// Job Telemetry
Global.JobTelemetry = new List<JobTelemetry>();
// ActionsStepTelemetry for entire job
Global.StepsTelemetry = new List<ActionsStepTelemetry>();
// Job Outputs
JobOutputs = new Dictionary<string, VariableValue>(StringComparer.OrdinalIgnoreCase);
// Actions environment
ActionsEnvironment = message.ActionsEnvironment;
// ActionsStepTelemetry
ActionsStepsTelemetry = new List<ActionsStepTelemetry>();
JobTelemetry = new List<JobTelemetry>();
// Service container info
Global.ServiceContainers = new List<ContainerInfo>();
@@ -895,6 +938,83 @@ namespace GitHub.Runner.Worker
return Root._matchers ?? Array.Empty<IssueMatcherConfig>();
}
public void PublishStepTelemetry()
{
if (!_stepTelemetryPublished)
{
// Add to the global steps telemetry only if we have something to log.
if (!string.IsNullOrEmpty(StepTelemetry?.Type))
{
if (!IsEmbedded)
{
StepTelemetry.Result = _record.Result;
}
if (!IsEmbedded &&
_record.FinishTime != null &&
_record.StartTime != null)
{
StepTelemetry.ExecutionTimeInSeconds = (int)Math.Ceiling((_record.FinishTime - _record.StartTime)?.TotalSeconds ?? 0);
}
if (!IsEmbedded &&
_record.Issues.Count > 0)
{
foreach (var issue in _record.Issues)
{
if ((issue.Type == IssueType.Error || issue.Type == IssueType.Warning) &&
!string.IsNullOrEmpty(issue.Message))
{
string issueTelemetry;
if (issue.Message.Length > _maxIssueMessageLengthInTelemetry)
{
issueTelemetry = $"{issue.Message[.._maxIssueMessageLengthInTelemetry]}";
}
else
{
issueTelemetry = issue.Message;
}
StepTelemetry.ErrorMessages.Add(issueTelemetry);
// Only send over the first 3 issues to avoid sending too much data.
if (StepTelemetry.ErrorMessages.Count >= _maxIssueCountInTelemetry)
{
break;
}
}
}
}
Trace.Info($"Publish step telemetry for current step {StringUtil.ConvertToJson(StepTelemetry)}.");
Global.StepsTelemetry.Add(StepTelemetry);
_stepTelemetryPublished = true;
}
}
else
{
Trace.Info($"Step telemetry has already been published.");
}
}
public void WriteWebhookPayload()
{
// Makes directory for event_path data
var tempDirectory = HostContext.GetDirectory(WellKnownDirectory.Temp);
var workflowDirectory = Path.Combine(tempDirectory, "_github_workflow");
Directory.CreateDirectory(workflowDirectory);
var gitHubEvent = GetGitHubContext("event");
// adds the GitHub event path/file if the event exists
if (gitHubEvent != null)
{
var workflowFile = Path.Combine(workflowDirectory, "event.json");
Trace.Info($"Write event payload to {workflowFile}");
File.WriteAllText(workflowFile, gitHubEvent, new UTF8Encoding(false));
SetGitHubContext("event_path", workflowFile);
}
}
private void InitializeTimelineRecord(Guid timelineId, Guid timelineRecordId, Guid? parentTimelineRecordId, string recordType, string displayName, string refName, int? order)
{
_mainTimelineId = timelineId;
@@ -949,6 +1069,36 @@ namespace GitHub.Runner.Worker
var newGuid = Guid.NewGuid();
return CreateChild(newGuid, displayName, newGuid.ToString("N"), null, null, ActionRunStage.Post, intraActionState, _childTimelineRecordOrder - Root.PostJobSteps.Count, siblingScopeName: siblingScopeName);
}
public void ApplyContinueOnError(TemplateToken continueOnErrorToken)
{
if (Result != TaskResult.Failed)
{
return;
}
var continueOnError = false;
try
{
var templateEvaluator = this.ToPipelineTemplateEvaluator();
continueOnError = templateEvaluator.EvaluateStepContinueOnError(continueOnErrorToken, ExpressionValues, ExpressionFunctions);
}
catch (Exception ex)
{
Trace.Info("The step failed and an error occurred when attempting to determine whether to continue on error.");
Trace.Error(ex);
this.Error("The step failed and an error occurred when attempting to determine whether to continue on error.");
this.Error(ex);
}
if (continueOnError)
{
Outcome = Result;
Result = TaskResult.Succeeded;
Trace.Info($"Updated step result (continue on error)");
}
UpdateGlobalStepsContext();
}
}
// The Error/Warning/etc methods are created as extension methods to simplify unit testing.
@@ -970,7 +1120,6 @@ namespace GitHub.Runner.Worker
context.Error(ex.Message);
context.Debug(ex.ToString());
}
// Do not add a format string overload. See comment on ExecutionContext.Write().
public static void Error(this IExecutionContext context, string message)
{
@@ -1044,6 +1193,66 @@ namespace GitHub.Runner.Worker
{
return new TemplateTraceWriter(context);
}
public static DictionaryContextData GetExpressionValues(this IExecutionContext context, IStepHost stepHost)
{
if (stepHost is ContainerStepHost)
{
var expressionValues = context.ExpressionValues.Clone() as DictionaryContextData;
context.UpdatePathsInExpressionValues("github", expressionValues, stepHost);
context.UpdatePathsInExpressionValues("runner", expressionValues, stepHost);
return expressionValues;
}
else
{
return context.ExpressionValues.Clone() as DictionaryContextData;
}
}
private static void UpdatePathsInExpressionValues(this IExecutionContext context, string contextName, DictionaryContextData expressionValues, IStepHost stepHost)
{
var dict = expressionValues[contextName].AssertDictionary($"expected context {contextName} to be a dictionary");
context.ResolvePathsInExpressionValuesDictionary(dict, stepHost);
expressionValues[contextName] = dict;
}
private static void ResolvePathsInExpressionValuesDictionary(this IExecutionContext context, DictionaryContextData dict, IStepHost stepHost)
{
foreach (var key in dict.Keys.ToList())
{
if (dict[key] is StringContextData)
{
var value = dict[key].ToString();
if (!string.IsNullOrEmpty(value))
{
dict[key] = new StringContextData(stepHost.ResolvePathForStepHost(value));
}
}
else if (dict[key] is DictionaryContextData)
{
var innerDict = dict[key].AssertDictionary("expected dictionary");
context.ResolvePathsInExpressionValuesDictionary(innerDict, stepHost);
var updatedDict = new DictionaryContextData();
foreach (var k in innerDict.Keys.ToList())
{
updatedDict[k] = innerDict[k];
}
dict[key] = updatedDict;
}
else if (dict[key] is CaseSensitiveDictionaryContextData)
{
var innerDict = dict[key].AssertDictionary("expected dictionary");
context.ResolvePathsInExpressionValuesDictionary(innerDict, stepHost);
var updatedDict = new CaseSensitiveDictionaryContextData();
foreach (var k in innerDict.Keys.ToList())
{
updatedDict[k] = innerDict[k];
}
dict[key] = updatedDict;
}
}
}
}
internal sealed class TemplateTraceWriter : ObjectTemplating.ITraceWriter
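The extracted UpdateGlobalStepsContext plus the new ApplyContinueOnError give a failed continue-on-error step the familiar split between steps.<id>.outcome and steps.<id>.conclusion. A minimal standalone sketch of that interaction, using simplified stand-in types rather than the runner's ExecutionContext:

// Minimal standalone sketch (not runner code): simplified stand-ins for the runner's
// TaskResult and ExecutionContext show how a failed step that continues on error ends up
// with outcome=failure and conclusion=success in the steps context.
using System;

enum TaskResult { Succeeded, Failed, Canceled }

class StepState
{
    public TaskResult? Result;   // what UpdateGlobalStepsContext publishes as the conclusion
    public TaskResult? Outcome;  // what it publishes as the outcome (falls back to Result)

    public void ApplyContinueOnError(bool continueOnError)
    {
        if (Result != TaskResult.Failed || !continueOnError)
        {
            return;
        }
        Outcome = Result;              // keep the real result visible as the outcome
        Result = TaskResult.Succeeded; // the conclusion no longer fails the job
    }
}

class ContinueOnErrorSketch
{
    static void Main()
    {
        var step = new StepState { Result = TaskResult.Failed };
        step.ApplyContinueOnError(continueOnError: true);
        Console.WriteLine($"outcome={step.Outcome ?? step.Result}, conclusion={step.Result}");
        // outcome=Failed, conclusion=Succeeded
    }
}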

View File

@@ -7,6 +7,7 @@ using GitHub.Runner.Sdk;
using System.Reflection;
using System.Threading;
using System.Collections.Generic;
using GitHub.Runner.Common.Util;
namespace GitHub.Runner.Worker.Expressions
{
@@ -62,7 +63,7 @@ namespace GitHub.Runner.Worker.Expressions
string binDir = Path.GetDirectoryName(Assembly.GetEntryAssembly().Location);
string runnerRoot = new DirectoryInfo(binDir).Parent.FullName;
string node = Path.Combine(runnerRoot, "externals", "node12", "bin", $"node{IOUtil.ExeExtension}");
string node = Path.Combine(runnerRoot, "externals", NodeUtil.GetInternalNodeVersion(), "bin", $"node{IOUtil.ExeExtension}");
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
var p = new ProcessInvoker(new HashFilesTrace(context.Trace));

View File

@@ -181,6 +181,10 @@ namespace GitHub.Runner.Worker
{
throw new Exception($"Invalid environment variable value. Matching delimiter not found '{delimiter}'");
}
if (newline == null)
{
throw new Exception($"Invalid environment variable value. EOF marker missing new line.");
}
endIndex = index - newline.Length;
tempLine = ReadLine(text, ref index, out newline);
}
@@ -259,4 +263,72 @@ namespace GitHub.Runner.Worker
return text.Substring(originalIndex, lfIndex - originalIndex);
}
}
public sealed class CreateStepSummaryCommand : RunnerService, IFileCommandExtension
{
public const int AttachmentSizeLimit = 1024 * 1024;
public string ContextName => "step_summary";
public string FilePrefix => "step_summary_";
public Type ExtensionType => typeof(IFileCommandExtension);
public void ProcessCommand(IExecutionContext context, string filePath, ContainerInfo container)
{
if (!context.Global.Variables.GetBoolean("DistributedTask.UploadStepSummary") ?? true)
{
Trace.Info("Step Summary is disabled; skipping attachment upload");
return;
}
if (String.IsNullOrEmpty(filePath) || !File.Exists(filePath))
{
Trace.Info($"Step Summary file ({filePath}) does not exist; skipping attachment upload");
return;
}
try
{
var fileSize = new FileInfo(filePath).Length;
if (fileSize == 0)
{
Trace.Info($"Step Summary file ({filePath}) is empty; skipping attachment upload");
return;
}
if (fileSize > AttachmentSizeLimit)
{
context.Error(String.Format(Constants.Runner.UnsupportedSummarySize, AttachmentSizeLimit / 1024, fileSize / 1024));
Trace.Info($"Step Summary file ({filePath}) is too large ({fileSize} bytes); skipping attachment upload");
return;
}
Trace.Verbose($"Step Summary file exists: {filePath} and has a file size of {fileSize} bytes");
var scrubbedFilePath = filePath + "-scrubbed";
using (var streamReader = new StreamReader(filePath))
using (var streamWriter = new StreamWriter(scrubbedFilePath))
{
string line;
while ((line = streamReader.ReadLine()) != null)
{
var maskedLine = HostContext.SecretMasker.MaskSecrets(line);
streamWriter.WriteLine(maskedLine);
}
}
var attachmentName = context.Id.ToString();
Trace.Info($"Queueing file ({filePath}) for attachment upload ({attachmentName})");
// Attachments must be added to the parent context (job), not the current context (step)
context.Root.QueueAttachFile(ChecksAttachmentType.StepSummary, attachmentName, scrubbedFilePath);
}
catch (Exception e)
{
Trace.Error($"Error while processing file ({filePath}): {e}");
context.Error($"Failed to create step summary using 'GITHUB_STEP_SUMMARY': {e.Message}");
}
}
}
}
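CreateStepSummaryCommand only queues an attachment when the GITHUB_STEP_SUMMARY file exists, is non-empty, and is at or under the new 1 MiB AttachmentSizeLimit; otherwise it logs and returns. A standalone sketch of that gate, with the secret-masking pass reduced to a comment:

// Standalone sketch (not runner code) of the upload gate; the real command additionally
// masks secrets line by line into a "-scrubbed" copy before queueing the attachment.
using System;
using System.IO;

class StepSummaryGateSketch
{
    const int AttachmentSizeLimit = 1024 * 1024; // 1 MiB, matching the new public const

    static bool ShouldUpload(string filePath, out string reason)
    {
        if (string.IsNullOrEmpty(filePath) || !File.Exists(filePath)) { reason = "file missing"; return false; }
        var size = new FileInfo(filePath).Length;
        if (size == 0) { reason = "file empty"; return false; }
        if (size > AttachmentSizeLimit) { reason = $"file too large ({size} bytes)"; return false; }
        reason = "ok";
        return true;
    }

    static void Main()
    {
        var summaryFile = Path.GetTempFileName();
        File.WriteAllText(summaryFile, "### Build summary\nAll tests passed.\n");
        Console.WriteLine(ShouldUpload(summaryFile, out var reason) ? "upload" : $"skip: {reason}");
    }
}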

View File

@@ -8,10 +8,10 @@ namespace GitHub.Runner.Worker
{
private readonly HashSet<string> _contextEnvAllowlist = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
"action",
"action_path",
"action_ref",
"action_repository",
"action",
"actor",
"api_url",
"base_ref",
@@ -22,18 +22,20 @@ namespace GitHub.Runner.Worker
"head_ref",
"job",
"path",
"ref",
"ref_name",
"ref_protected",
"ref_type",
"repository",
"ref",
"repository_owner",
"repository",
"retention_days",
"run_attempt",
"run_id",
"run_number",
"server_url",
"sha",
"step_summary",
"triggering_actor",
"workflow",
"workspace",
};

View File

@@ -14,6 +14,8 @@ namespace GitHub.Runner.Worker
public PlanFeatures Features { get; set; }
public IList<String> FileTable { get; set; }
public IDictionary<String, IDictionary<String, String>> JobDefaults { get; set; }
public List<ActionsStepTelemetry> StepsTelemetry { get; set; }
public List<JobTelemetry> JobTelemetry { get; set; }
public TaskOrchestrationPlanReference Plan { get; set; }
public List<string> PrependPath { get; set; }
public List<ContainerInfo> ServiceContainers { get; set; }

View File

@@ -1,8 +1,6 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.Expressions2;
@@ -13,7 +11,6 @@ using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Expressions;
using Pipelines = GitHub.DistributedTask.Pipelines;
@@ -67,7 +64,7 @@ namespace GitHub.Runner.Worker.Handlers
steps = Data.Steps;
}
// Add Telemetry to JobContext to send with JobCompleteMessage
// Set extra telemetry based on the current context.
if (stage == ActionRunStage.Main)
{
var hasRunsStep = false;
@@ -83,19 +80,15 @@ namespace GitHub.Runner.Worker.Handlers
hasUsesStep = true;
}
}
var pathReference = Action as Pipelines.RepositoryPathReference;
var telemetry = new ActionsStepTelemetry {
Ref = GetActionRef(),
HasPreStep = Data.HasPre,
HasPostStep = Data.HasPost,
IsEmbedded = ExecutionContext.IsEmbedded,
Type = "composite",
HasRunsStep = hasRunsStep,
HasUsesStep = hasUsesStep,
StepCount = steps.Count
};
ExecutionContext.Root.ActionsStepsTelemetry.Add(telemetry);
ExecutionContext.StepTelemetry.HasPreStep = Data.HasPre;
ExecutionContext.StepTelemetry.HasPostStep = Data.HasPost;
ExecutionContext.StepTelemetry.HasRunsStep = hasRunsStep;
ExecutionContext.StepTelemetry.HasUsesStep = hasUsesStep;
ExecutionContext.StepTelemetry.StepCount = steps.Count;
}
ExecutionContext.StepTelemetry.Type = "composite";
try
{
@@ -303,7 +296,7 @@ namespace GitHub.Runner.Worker.Handlers
// Register the job cancellation callback only if the job cancellation token has not been fired before each step run
if (!ExecutionContext.Root.CancellationToken.IsCancellationRequested)
{
// Test the condition again. The job was canceled after the condition was originally evaluated.
// Test the condition again. The job was cancelled after the condition was originally evaluated.
jobCancelRegister = ExecutionContext.Root.CancellationToken.Register(() =>
{
// Mark job as cancelled
@@ -403,7 +396,7 @@ namespace GitHub.Runner.Worker.Handlers
jobCancelRegister = null;
}
}
// Check failed or canceled
// Check failed or cancelled
if (step.ExecutionContext.Result == TaskResult.Failed || step.ExecutionContext.Result == TaskResult.Canceled)
{
Trace.Info($"Update job result with current composite step result '{step.ExecutionContext.Result}'.");
@@ -411,7 +404,7 @@ namespace GitHub.Runner.Worker.Handlers
}
// Update context
SetStepsContext(step);
step.ExecutionContext.UpdateGlobalStepsContext();
}
}
@@ -456,23 +449,17 @@ namespace GitHub.Runner.Worker.Handlers
SetStepConclusion(step, Common.Util.TaskResultUtil.MergeTaskResults(step.ExecutionContext.Result, step.ExecutionContext.CommandResult.Value));
}
step.ExecutionContext.ApplyContinueOnError(step.ContinueOnError);
Trace.Info($"Step result: {step.ExecutionContext.Result}");
step.ExecutionContext.Debug($"Finished: {step.DisplayName}");
step.ExecutionContext.PublishStepTelemetry();
}
private void SetStepConclusion(IStep step, TaskResult result)
{
step.ExecutionContext.Result = result;
SetStepsContext(step);
}
private void SetStepsContext(IStep step)
{
if (!string.IsNullOrEmpty(step.ExecutionContext.ContextName) && !step.ExecutionContext.ContextName.StartsWith("__", StringComparison.Ordinal))
{
// TODO: when we support continue on error, we may need to do logic here to change conclusion based on the continue on error result
step.ExecutionContext.Global.StepsContext.SetOutcome(step.ExecutionContext.ScopeName, step.ExecutionContext.ContextName, (step.ExecutionContext.Result ?? TaskResult.Succeeded).ToActionResult());
step.ExecutionContext.Global.StepsContext.SetConclusion(step.ExecutionContext.ScopeName, step.ExecutionContext.ContextName, (step.ExecutionContext.Result ?? TaskResult.Succeeded).ToActionResult());
}
step.ExecutionContext.UpdateGlobalStepsContext();
}
}
}

View File

@@ -1,14 +1,14 @@
using System.Collections.Generic;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using System;
using GitHub.Runner.Worker.Container;
using Pipelines = GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using GitHub.DistributedTask.WebApi;
using GitHub.DistributedTask.Pipelines.ContextData;
using System.Linq;
using GitHub.Runner.Worker.Container;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker.Handlers
{
@@ -70,18 +70,13 @@ namespace GitHub.Runner.Worker.Handlers
}
string type = Action.Type == Pipelines.ActionSourceType.Repository ? "Dockerfile" : "DockerHub";
// Add Telemetry to JobContext to send with JobCompleteMessage
// Set extra telemetry based on the current context.
if (stage == ActionRunStage.Main)
{
var telemetry = new ActionsStepTelemetry {
Ref = GetActionRef(),
HasPreStep = Data.HasPre,
HasPostStep = Data.HasPost,
IsEmbedded = ExecutionContext.IsEmbedded,
Type = type
};
ExecutionContext.Root.ActionsStepsTelemetry.Add(telemetry);
ExecutionContext.StepTelemetry.HasPreStep = Data.HasPre;
ExecutionContext.StepTelemetry.HasPostStep = Data.HasPost;
}
ExecutionContext.StepTelemetry.Type = type;
// run container
var container = new ContainerInfo(HostContext)

View File

@@ -1,13 +1,13 @@
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using System.Linq;
using System.IO;
using Pipelines = GitHub.DistributedTask.Pipelines;
using System.Linq;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker.Handlers
{
@@ -22,7 +22,7 @@ namespace GitHub.Runner.Worker.Handlers
string ActionDirectory { get; set; }
List<JobExtensionRunner> LocalActionContainerSetupSteps { get; set; }
Task RunAsync(ActionRunStage stage);
void PrintActionDetails(ActionRunStage stage);
void PrepareExecution(ActionRunStage stage);
}
public abstract class Handler : RunnerService
@@ -36,6 +36,7 @@ namespace GitHub.Runner.Worker.Handlers
protected IActionCommandManager ActionCommandManager { get; private set; }
public Pipelines.ActionStepDefinitionReference Action { get; set; }
public bool IsActionStep => Action != null;
public Dictionary<string, string> Environment { get; set; }
public Variables RuntimeVariables { get; set; }
public IExecutionContext ExecutionContext { get; set; }
@@ -44,29 +45,50 @@ namespace GitHub.Runner.Worker.Handlers
public string ActionDirectory { get; set; }
public List<JobExtensionRunner> LocalActionContainerSetupSteps { get; set; }
public virtual string GetActionRef()
public void PrepareExecution(ActionRunStage stage)
{
if (Action.Type == Pipelines.ActionSourceType.ContainerRegistry)
// Print out action details
PrintActionDetails(stage);
// Get telemetry for the action
PopulateActionTelemetry(stage);
}
protected void PopulateActionTelemetry(ActionRunStage stage)
{
if (!IsActionStep)
{
ExecutionContext.StepTelemetry.Type = "runner";
ExecutionContext.StepTelemetry.Action = $"{stage} Job Hook";
}
else if (Action.Type == Pipelines.ActionSourceType.ContainerRegistry)
{
ExecutionContext.StepTelemetry.Type = "docker";
var registryAction = Action as Pipelines.ContainerRegistryReference;
return registryAction.Image;
ExecutionContext.StepTelemetry.Action = registryAction.Image;
}
else if (Action.Type == Pipelines.ActionSourceType.Script)
{
ExecutionContext.StepTelemetry.Type = "run";
}
else if (Action.Type == Pipelines.ActionSourceType.Repository)
{
ExecutionContext.StepTelemetry.Type = "repository";
var repoAction = Action as Pipelines.RepositoryPathReference;
if (string.Equals(repoAction.RepositoryType, Pipelines.PipelineConstants.SelfAlias, StringComparison.OrdinalIgnoreCase))
{
return repoAction.Path;
ExecutionContext.StepTelemetry.Action = repoAction.Path;
}
else
{
ExecutionContext.StepTelemetry.Ref = repoAction.Ref;
if (string.IsNullOrEmpty(repoAction.Path))
{
return $"{repoAction.Name}@{repoAction.Ref}";
ExecutionContext.StepTelemetry.Action = repoAction.Name;
}
else
{
return $"{repoAction.Name}/{repoAction.Path}@{repoAction.Ref}";
ExecutionContext.StepTelemetry.Action = $"{repoAction.Name}/{repoAction.Path}";
}
}
}
@@ -75,9 +97,9 @@ namespace GitHub.Runner.Worker.Handlers
// this should never happen
Trace.Error($"Can't generate ref for {Action.Type.ToString()}");
}
return "";
}
public virtual void PrintActionDetails(ActionRunStage stage)
protected virtual void PrintActionDetails(ActionRunStage stage)
{
if (stage == ActionRunStage.Post)
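PopulateActionTelemetry replaces GetActionRef as the source of step telemetry, mapping each action source to a Type and Action value (with Ref recorded separately for repository actions). A standalone sketch that collapses that mapping into a pure function; the input strings are illustrative, not the Pipelines.ActionSourceType enum itself:

// Standalone sketch (not runner code): the Type/Action pairs PopulateActionTelemetry derives
// from the action source, collapsed into a pure function over illustrative string inputs.
using System;

class TelemetryMappingSketch
{
    static (string Type, string Action) Map(string sourceType, string image = null, string name = null, string path = null)
        => sourceType switch
        {
            "ContainerRegistry" => ("docker", image),
            "Script"            => ("run", null),
            "Repository"        => ("repository", string.IsNullOrEmpty(path) ? name : $"{name}/{path}"),
            _                   => ("runner", "Pre Job Hook"), // non-action steps, e.g. job hooks
        };

    static void Main()
    {
        Console.WriteLine(Map("Repository", name: "actions/checkout"));             // (repository, actions/checkout)
        Console.WriteLine(Map("ContainerRegistry", image: "docker://alpine:3.15")); // (docker, docker://alpine:3.15)
    }
}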

View File

@@ -55,7 +55,23 @@ namespace GitHub.Runner.Worker.Handlers
else if (data.ExecutionType == ActionExecutionType.NodeJS)
{
handler = HostContext.CreateService<INodeScriptActionHandler>();
(handler as INodeScriptActionHandler).Data = data as NodeJSActionExecutionData;
var nodeData = data as NodeJSActionExecutionData;
// With node12 EoL in 04/2022, we want to be able to uniformly upgrade all JS actions to node16 from the server
if (string.Equals(nodeData.NodeVersion, "node12", StringComparison.InvariantCultureIgnoreCase) &&
(executionContext.Global.Variables.GetBoolean("DistributedTask.ForceGithubJavascriptActionsToNode16") ?? false))
{
// The user can opt out of this behaviour by setting this variable to true, either via 'env' in their workflow or as an environment variable on their machine
executionContext.Global.EnvironmentVariables.TryGetValue(Constants.Variables.Actions.AllowActionsUseUnsecureNodeVersion, out var workflowOptOut);
var isWorkflowOptOutSet = !string.IsNullOrEmpty(workflowOptOut);
var isLocalOptOut = StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable(Constants.Variables.Actions.AllowActionsUseUnsecureNodeVersion));
bool isOptOut = isWorkflowOptOutSet ? StringUtil.ConvertToBoolean(workflowOptOut) : isLocalOptOut;
if (!isOptOut)
{
nodeData.NodeVersion = "node16";
}
}
(handler as INodeScriptActionHandler).Data = nodeData;
}
else if (data.ExecutionType == ActionExecutionType.Script)
{
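The node16 upgrade is gated on the DistributedTask.ForceGithubJavascriptActionsToNode16 variable, and a workflow-level opt-out value, when present, takes precedence over the machine-level environment variable. A standalone sketch of that precedence; bool.TryParse stands in for the runner's more permissive StringUtil.ConvertToBoolean:

// Standalone sketch (not runner code) of the opt-out precedence: a workflow-level value,
// when present, wins over the machine-level variable.
using System;

class NodeUpgradeSketch
{
    static string ResolveNodeVersion(string requested, string workflowOptOut, string machineOptOut, bool forceNode16)
    {
        if (!forceNode16 || !string.Equals(requested, "node12", StringComparison.OrdinalIgnoreCase))
        {
            return requested;
        }
        bool isWorkflowOptOutSet = !string.IsNullOrEmpty(workflowOptOut);
        bool isOptOut = isWorkflowOptOutSet
            ? bool.TryParse(workflowOptOut, out var w) && w
            : bool.TryParse(machineOptOut, out var m) && m;
        return isOptOut ? "node12" : "node16";
    }

    static void Main()
    {
        Console.WriteLine(ResolveNodeVersion("node12", workflowOptOut: null, machineOptOut: null, forceNode16: true));   // node16
        Console.WriteLine(ResolveNodeVersion("node12", workflowOptOut: "true", machineOptOut: null, forceNode16: true)); // node12
    }
}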

View File

@@ -1,12 +1,13 @@
using System.IO;
using System;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using GitHub.DistributedTask.WebApi;
using Pipelines = GitHub.DistributedTask.Pipelines;
using System;
using System.Linq;
namespace GitHub.Runner.Worker.Handlers
{
@@ -74,19 +75,13 @@ namespace GitHub.Runner.Worker.Handlers
target = Data.Post;
}
// Add Telemetry to JobContext to send with JobCompleteMessage
// Set extra telemetry based on the current context.
if (stage == ActionRunStage.Main)
{
var telemetry = new ActionsStepTelemetry
{
Ref = GetActionRef(),
HasPreStep = Data.HasPre,
HasPostStep = Data.HasPost,
IsEmbedded = ExecutionContext.IsEmbedded,
Type = Data.NodeVersion
};
ExecutionContext.Root.ActionsStepsTelemetry.Add(telemetry);
ExecutionContext.StepTelemetry.HasPreStep = Data.HasPre;
ExecutionContext.StepTelemetry.HasPostStep = Data.HasPost;
}
ExecutionContext.StepTelemetry.Type = Data.NodeVersion;
ArgUtil.NotNullOrEmpty(target, nameof(target));
target = Path.Combine(ActionDirectory, target);
@@ -119,6 +114,17 @@ namespace GitHub.Runner.Worker.Handlers
// Remove environment variable that may cause conflicts with the node within the runner.
Environment.Remove("NODE_ICU_DATA"); // https://github.com/actions/runner/issues/795
if (Data.NodeVersion == "node12" && (ExecutionContext.Global.Variables.GetBoolean(Constants.Runner.Features.Node12Warning) ?? false))
{
if (!ExecutionContext.JobContext.ContainsKey("Node12ActionsWarnings"))
{
ExecutionContext.JobContext["Node12ActionsWarnings"] = new ArrayContextData();
}
var repoAction = Action as RepositoryPathReference;
var actionDisplayName = new StringContextData(repoAction.Name ?? repoAction.Path); // local actions don't have a 'Name'
ExecutionContext.JobContext["Node12ActionsWarnings"].AssertArray("Node12ActionsWarnings").Add(actionDisplayName);
}
using (var stdoutManager = new OutputManager(ExecutionContext, ActionCommandManager))
using (var stderrManager = new OutputManager(ExecutionContext, ActionCommandManager))
{

View File

@@ -151,6 +151,11 @@ namespace GitHub.Runner.Worker.Handlers
}
}
if (line.Contains("fatal: unsafe repository", StringComparison.OrdinalIgnoreCase))
{
_executionContext.StepTelemetry.ErrorMessages.Add(line);
}
// Regular output
_executionContext.Output(line);
}

View File

@@ -1,7 +1,7 @@
using System.Threading.Tasks;
using System;
using GitHub.Runner.Sdk;
using System;
using System.Threading.Tasks;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker.Handlers
@@ -35,6 +35,8 @@ namespace GitHub.Runner.Worker.Handlers
}
ArgUtil.NotNullOrEmpty(plugin, nameof(plugin));
// Set extra telemetry based on the current context.
ExecutionContext.StepTelemetry.Type = plugin;
// Update the env dictionary.
AddPrependPathToEnvironment();

View File

@@ -1,12 +1,13 @@
using System;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Linq;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker.Handlers
@@ -21,18 +22,24 @@ namespace GitHub.Runner.Worker.Handlers
{
public ScriptActionExecutionData Data { get; set; }
public override void PrintActionDetails(ActionRunStage stage)
protected override void PrintActionDetails(ActionRunStage stage)
{
if (stage == ActionRunStage.Post)
// if we're executing a Job Extension, we won't have an 'Action'
if (!IsActionStep)
{
throw new NotSupportedException("Script action should not have 'Post' job action.");
if (Inputs.TryGetValue("path", out var path))
{
ExecutionContext.Output($"##[group]Run '{path}'");
}
else
{
throw new InvalidOperationException("Inputs 'path' must be set for job extensions");
}
}
else if (Action.Type == Pipelines.ActionSourceType.Script)
{
Inputs.TryGetValue("script", out string contents);
contents = contents ?? string.Empty;
if (Action.Type == Pipelines.ActionSourceType.Script)
{
var firstLine = contents.TrimStart(' ', '\t', '\r', '\n');
var firstNewLine = firstLine.IndexOfAny(new[] { '\r', '\n' });
if (firstNewLine >= 0)
@@ -41,18 +48,17 @@ namespace GitHub.Runner.Worker.Handlers
}
ExecutionContext.Output($"##[group]Run {firstLine}");
}
else
{
throw new InvalidOperationException($"Invalid action type {Action.Type} for {nameof(ScriptHandler)}");
}
var multiLines = contents.Replace("\r\n", "\n").TrimEnd('\n').Split('\n');
foreach (var line in multiLines)
{
// Bright Cyan color
ExecutionContext.Output($"\x1b[36;1m{line}\x1b[0m");
}
}
else
{
throw new InvalidOperationException($"Invalid action type {Action?.Type} for {nameof(ScriptHandler)}");
}
string argFormat;
string shellCommand;
@@ -131,11 +137,6 @@ namespace GitHub.Runner.Worker.Handlers
public async Task RunAsync(ActionRunStage stage)
{
if (stage == ActionRunStage.Post)
{
throw new NotSupportedException("Script action should not have 'Post' job action.");
}
// Validate args
Trace.Entering();
ArgUtil.NotNull(ExecutionContext, nameof(ExecutionContext));
@@ -144,17 +145,6 @@ namespace GitHub.Runner.Worker.Handlers
var githubContext = ExecutionContext.ExpressionValues["github"] as GitHubContext;
ArgUtil.NotNull(githubContext, nameof(githubContext));
// Add Telemetry to JobContext to send with JobCompleteMessage
if (stage == ActionRunStage.Main)
{
var telemetry = new ActionsStepTelemetry
{
IsEmbedded = ExecutionContext.IsEmbedded,
Type = "run",
};
ExecutionContext.Root.ActionsStepsTelemetry.Add(telemetry);
}
var tempDirectory = HostContext.GetDirectory(WellKnownDirectory.Temp);
Inputs.TryGetValue("script", out var contents);
@@ -163,7 +153,8 @@ namespace GitHub.Runner.Worker.Handlers
string workingDirectory = null;
if (!Inputs.TryGetValue("workingDirectory", out workingDirectory))
{
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.Global.JobDefaults.TryGetValue("run", out var runDefaults))
// Don't use job level working directories for hooks
if (IsActionStep && string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.Global.JobDefaults.TryGetValue("run", out var runDefaults))
{
if (runDefaults.TryGetValue("working-directory", out workingDirectory))
{
@@ -222,15 +213,35 @@ namespace GitHub.Runner.Worker.Handlers
}
}
// Don't override runner telemetry here
if (!string.IsNullOrEmpty(shellCommand) && IsActionStep)
{
ExecutionContext.StepTelemetry.Action = shellCommand;
}
// No arg format was given, shell must be a built-in
if (string.IsNullOrEmpty(argFormat) || !argFormat.Contains("{0}"))
{
throw new ArgumentException("Invalid shell option. Shell must be a valid built-in (bash, sh, cmd, powershell, pwsh) or a format string containing '{0}'");
}
string scriptFilePath, resolvedScriptPath;
if (IsActionStep)
{
// We do not know the full path until we know what shell is being used, so that we can determine the file extension
var scriptFilePath = Path.Combine(tempDirectory, $"{Guid.NewGuid()}{ScriptHandlerHelpers.GetScriptFileExtension(shellCommand)}");
var resolvedScriptPath = $"{StepHost.ResolvePathForStepHost(scriptFilePath).Replace("\"", "\\\"")}";
scriptFilePath = Path.Combine(tempDirectory, $"{Guid.NewGuid()}{ScriptHandlerHelpers.GetScriptFileExtension(shellCommand)}");
resolvedScriptPath = $"{StepHost.ResolvePathForStepHost(scriptFilePath).Replace("\"", "\\\"")}";
}
else
{
// JobExtensionRunners run a script file, we load that from the inputs here
if (!Inputs.ContainsKey("path"))
{
throw new ArgumentException("Expected 'path' input to be set");
}
scriptFilePath = Inputs["path"];
ArgUtil.NotNullOrEmpty(scriptFilePath, "path");
resolvedScriptPath = Inputs["path"].Replace("\"", "\\\"");
}
// Format arg string with script path
var arguments = string.Format(argFormat, resolvedScriptPath);
@@ -247,8 +258,11 @@ namespace GitHub.Runner.Worker.Handlers
// Don't add a BOM. It causes the script to fail on some operating systems (e.g. on Ubuntu 14).
var encoding = new UTF8Encoding(false);
#endif
if (IsActionStep)
{
// Script is written to local path (ie host) but executed relative to the StepHost, which may be a container
File.WriteAllText(scriptFilePath, contents, encoding);
}
// Prepend PATH
AddPrependPathToEnvironment();
@@ -271,10 +285,10 @@ namespace GitHub.Runner.Worker.Handlers
if (Environment.ContainsKey("DYLD_INSERT_LIBRARIES")) // We don't check `isContainerStepHost` because we don't support container on macOS
{
// launch `node macOSRunInvoker.js shell args` instead of `shell args` to avoid macOS SIP remove `DYLD_INSERT_LIBRARIES` when launch process
string node12 = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), "node12", "bin", $"node{IOUtil.ExeExtension}");
string node = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), NodeUtil.GetInternalNodeVersion(), "bin", $"node{IOUtil.ExeExtension}");
string macOSRunInvoker = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Bin), "macos-run-invoker.js");
arguments = $"\"{macOSRunInvoker.Replace("\"", "\\\"")}\" \"{fileName.Replace("\"", "\\\"")}\" {arguments}";
fileName = node12;
fileName = node;
}
#endif
var systemConnection = ExecutionContext.Global.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));

View File

@@ -1,6 +1,8 @@
using System;
using System.Collections.Generic;
using System.IO;
using GitHub.Runner.Sdk;
namespace GitHub.Runner.Worker.Handlers
{
@@ -79,5 +81,22 @@ namespace GitHub.Runner.Worker.Handlers
throw new ArgumentException($"Failed to parse COMMAND [..ARGS] from {shellOption}");
}
}
internal static string GetDefaultShellForScript(string path, Common.Tracing trace, string prependPath)
{
var format = "{0} {1}";
switch (Path.GetExtension(path))
{
case ".sh":
// use 'sh' args but prefer bash
var pathToShell = WhichUtil.Which("bash", false, trace, prependPath) ?? WhichUtil.Which("sh", true, trace, prependPath);
return string.Format(format, pathToShell, _defaultArguments["sh"]);
case ".ps1":
var pathToPowershell = WhichUtil.Which("pwsh", false, trace, prependPath) ?? WhichUtil.Which("powershell", true, trace, prependPath);
return string.Format(format, pathToPowershell, _defaultArguments["powershell"]);
default:
throw new ArgumentException($"{path} is not a valid path to a script. Make sure it ends in '.sh' or '.ps1'.");
}
}
}
}
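GetDefaultShellForScript picks a shell for hook scripts purely from the file extension, preferring bash over sh and pwsh over powershell. A standalone sketch of the mapping; the returned strings are simplified and omit the WhichUtil lookup and the per-shell default arguments:

// Standalone sketch (not runner code): the extension-to-shell mapping used for hook scripts.
using System;
using System.IO;

class DefaultShellSketch
{
    static string GetDefaultShellForScript(string path) =>
        Path.GetExtension(path) switch
        {
            ".sh"  => "bash {0}", // prefer bash, fall back to sh
            ".ps1" => "pwsh {0}", // prefer pwsh, fall back to powershell
            _ => throw new ArgumentException($"{path} is not a valid path to a script. Make sure it ends in '.sh' or '.ps1'."),
        };

    static void Main()
    {
        Console.WriteLine(GetDefaultShellForScript("/opt/hooks/job-started.sh"));  // bash {0}
        Console.WriteLine(GetDefaultShellForScript("/opt/hooks/job-started.ps1")); // pwsh {0}
    }
}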

View File

@@ -1,18 +1,13 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Worker.Container;
using GitHub.Services.WebApi;
using Newtonsoft.Json;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using System.Linq;
using GitHub.DistributedTask.Pipelines.ContextData;
namespace GitHub.Runner.Worker.Handlers
{

View File

@@ -15,6 +15,7 @@ using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker
@@ -56,6 +57,8 @@ namespace GitHub.Runner.Worker
// Create a new timeline record for 'Set up job'
IExecutionContext context = jobContext.CreateChild(Guid.NewGuid(), "Set up job", $"{nameof(JobExtension)}_Init", null, null, ActionRunStage.Pre);
context.StepTelemetry.Type = "runner";
context.StepTelemetry.Action = "setup_job";
List<IStep> preJobSteps = new List<IStep>();
List<IStep> jobSteps = new List<IStep>();
@@ -246,6 +249,19 @@ namespace GitHub.Runner.Worker
Trace.Info("Downloading actions");
var actionManager = HostContext.GetService<IActionManager>();
var prepareResult = await actionManager.PrepareActionsAsync(context, message.Steps);
// add hook to preJobSteps
var startedHookPath = Environment.GetEnvironmentVariable("ACTIONS_RUNNER_HOOK_JOB_STARTED");
if (!string.IsNullOrEmpty(startedHookPath))
{
var hookProvider = HostContext.GetService<IJobHookProvider>();
var jobHookData = new JobHookData(ActionRunStage.Pre, startedHookPath);
preJobSteps.Add(new JobExtensionRunner(runAsync: hookProvider.RunHook,
condition: $"{PipelineTemplateConstants.Always}()",
displayName: Constants.Hooks.JobStartedStepName,
data: (object)jobHookData));
}
preJobSteps.AddRange(prepareResult.ContainerSetupSteps);
// Add start-container steps, record and stop-container steps
@@ -313,6 +329,8 @@ namespace GitHub.Runner.Worker
ArgUtil.NotNull(extensionStep, extensionStep.DisplayName);
Guid stepId = Guid.NewGuid();
extensionStep.ExecutionContext = jobContext.CreateChild(stepId, extensionStep.DisplayName, stepId.ToString("N"), null, stepId.ToString("N"), ActionRunStage.Pre);
extensionStep.ExecutionContext.StepTelemetry.Type = "runner";
extensionStep.ExecutionContext.StepTelemetry.Action = extensionStep.DisplayName.ToLowerInvariant().Replace(' ', '_');
}
else if (step is IActionRunner actionStep)
{
@@ -333,6 +351,18 @@ namespace GitHub.Runner.Worker
}
}
// Register Job Completed hook if the variable is set
var completedHookPath = Environment.GetEnvironmentVariable("ACTIONS_RUNNER_HOOK_JOB_COMPLETED");
if (!string.IsNullOrEmpty(completedHookPath))
{
var hookProvider = HostContext.GetService<IJobHookProvider>();
var jobHookData = new JobHookData(ActionRunStage.Post, completedHookPath);
jobContext.RegisterPostJobStep(new JobExtensionRunner(runAsync: hookProvider.RunHook,
condition: $"{PipelineTemplateConstants.Always}()",
displayName: Constants.Hooks.JobCompletedStepName,
data: (object)jobHookData));
}
List<IStep> steps = new List<IStep>();
steps.AddRange(preJobSteps);
steps.AddRange(jobSteps);
@@ -401,6 +431,8 @@ namespace GitHub.Runner.Worker
// create a new timeline record node for 'Finalize job'
IExecutionContext context = jobContext.CreateChild(Guid.NewGuid(), "Complete job", $"{nameof(JobExtension)}_Final", null, null, ActionRunStage.Post);
context.StepTelemetry.Type = "runner";
context.StepTelemetry.Action = "complete_job";
using (var register = jobContext.CancellationToken.Register(() => { context.CancelToken(); }))
{
try

View File

@@ -0,0 +1,95 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using System.Linq;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker.Handlers;
namespace GitHub.Runner.Worker
{
[ServiceLocator(Default = typeof(JobHookProvider))]
public interface IJobHookProvider : IRunnerService
{
Task RunHook(IExecutionContext executionContext, object data);
}
public class JobHookData
{
public string Path {get; private set;}
public ActionRunStage Stage {get; private set;}
public JobHookData(ActionRunStage stage, string path)
{
Path = path;
Stage = stage;
}
}
public class JobHookProvider : RunnerService, IJobHookProvider
{
public override void Initialize(IHostContext hostContext)
{
base.Initialize(hostContext);
}
public async Task RunHook(IExecutionContext executionContext, object data)
{
// Get Inputs
var hookData = data as JobHookData;
ArgUtil.NotNull(hookData, nameof(JobHookData));
var displayName = hookData.Stage == ActionRunStage.Pre ? "job started hook" : "job completed hook";
// Log to users so that they know how this step was injected
executionContext.Output($"A {displayName} has been configured by the self-hosted runner administrator");
// Validate script file.
if (!File.Exists(hookData.Path))
{
throw new FileNotFoundException("File doesn't exist");
}
executionContext.WriteWebhookPayload();
// Create the handler data.
var scriptDirectory = Path.GetDirectoryName(hookData.Path);
var stepHost = HostContext.CreateService<IDefaultStepHost>();
var prependPath = string.Join(Path.PathSeparator.ToString(), executionContext.Global.PrependPath.Reverse<string>());
Dictionary<string, string> inputs = new()
{
["path"] = hookData.Path,
["shell"] = ScriptHandlerHelpers.GetDefaultShellForScript(hookData.Path, Trace, prependPath)
};
// Create the handler
var handlerFactory = HostContext.GetService<IHandlerFactory>();
var handler = handlerFactory.Create(
executionContext,
action: null,
stepHost,
new ScriptActionExecutionData(),
inputs,
environment: new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer),
executionContext.Global.Variables,
actionDirectory: scriptDirectory,
localActionContainerSetupSteps: null);
handler.PrepareExecution(hookData.Stage);
// Setup file commands
var fileCommandManager = HostContext.CreateService<IFileCommandManager>();
fileCommandManager.InitializeFiles(executionContext, null);
// Run the step and process the file commands
try
{
await handler.RunAsync(hookData.Stage);
}
finally
{
fileCommandManager.ProcessFiles(executionContext, executionContext.Global.Container);
}
}
}
}
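Together with the JobExtension changes above, job hooks are driven entirely by two environment variables on the self-hosted runner, ACTIONS_RUNNER_HOOK_JOB_STARTED and ACTIONS_RUNNER_HOOK_JOB_COMPLETED, each pointing at a .sh or .ps1 script. A standalone sketch of that gating; HookStep is a stand-in for JobExtensionRunner:

// Standalone sketch (not runner code): the started/completed hook steps are added only when
// the corresponding environment variable is set.
using System;
using System.Collections.Generic;

class HookRegistrationSketch
{
    record HookStep(string DisplayName, string ScriptPath);

    static void AddHookIfConfigured(List<HookStep> steps, string variable, string displayName)
    {
        var path = Environment.GetEnvironmentVariable(variable);
        if (!string.IsNullOrEmpty(path))
        {
            steps.Add(new HookStep(displayName, path));
        }
    }

    static void Main()
    {
        var preJobSteps = new List<HookStep>();
        var postJobSteps = new List<HookStep>();
        AddHookIfConfigured(preJobSteps, "ACTIONS_RUNNER_HOOK_JOB_STARTED", "job started hook");
        AddHookIfConfigured(postJobSteps, "ACTIONS_RUNNER_HOOK_JOB_COMPLETED", "job completed hook");
        Console.WriteLine($"pre hooks: {preJobSteps.Count}, post hooks: {postJobSteps.Count}");
    }
}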

View File

@@ -1,18 +1,19 @@
using GitHub.DistributedTask.WebApi;
using Pipelines = GitHub.DistributedTask.Pipelines;
using GitHub.Runner.Common.Util;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
using System;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Net.Http;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker
{
@@ -25,6 +26,7 @@ namespace GitHub.Runner.Worker
public sealed class JobRunner : RunnerService, IJobRunner
{
private IJobServerQueue _jobServerQueue;
private RunnerSettings _runnerSettings;
private ITempDirectoryManager _tempDirectoryManager;
public async Task<TaskResult> RunAsync(Pipelines.AgentJobRequestMessage message, CancellationToken jobRequestCancellationToken)
@@ -108,8 +110,8 @@ namespace GitHub.Runner.Worker
jobContext.SetRunnerContext("os", VarUtil.OS);
jobContext.SetRunnerContext("arch", VarUtil.OSArchitecture);
var runnerSettings = HostContext.GetService<IConfigurationStore>().GetSettings();
jobContext.SetRunnerContext("name", runnerSettings.AgentName);
_runnerSettings = HostContext.GetService<IConfigurationStore>().GetSettings();
jobContext.SetRunnerContext("name", _runnerSettings.AgentName);
string toolsDirectory = HostContext.GetDirectory(WellKnownDirectory.Tools);
Directory.CreateDirectory(toolsDirectory);
@@ -130,9 +132,9 @@ namespace GitHub.Runner.Worker
}
catch (OperationCanceledException ex) when (jobContext.CancellationToken.IsCancellationRequested)
{
// set the job to canceled
// set the job to cancelled
// don't log error issue to job ExecutionContext, since server owns the job level issue
Trace.Error($"Job is canceled during initialize.");
Trace.Error($"Job is cancelled during initialize.");
Trace.Error($"Caught exception: {ex}");
return await CompleteJobAsync(jobServer, jobContext, message, TaskResult.Canceled);
}
@@ -209,6 +211,59 @@ namespace GitHub.Runner.Worker
jobContext.Debug($"Finishing: {message.JobDisplayName}");
TaskResult result = jobContext.Complete(taskResult);
if (_runnerSettings.DisableUpdate == true)
{
try
{
var currentVersion = new PackageVersion(BuildConstants.RunnerPackage.Version);
ServiceEndpoint systemConnection = message.Resources.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
VssCredentials serverCredential = VssUtil.GetVssCredential(systemConnection);
var runnerServer = HostContext.GetService<IRunnerServer>();
await runnerServer.ConnectAsync(systemConnection.Url, serverCredential);
var serverPackages = await runnerServer.GetPackagesAsync("agent", BuildConstants.RunnerPackage.PackageName, 5, includeToken: false, cancellationToken: CancellationToken.None);
if (serverPackages.Count > 0)
{
serverPackages = serverPackages.OrderByDescending(x => x.Version).ToList();
Trace.Info($"Newer packages {StringUtil.ConvertToJson(serverPackages.Select(x => x.Version.ToString()))}");
var warnOnFailedJob = false; // any minor/patch version behind.
var warnOnOldRunnerVersion = false; // >= 2 minor version behind
if (serverPackages.Any(x => x.Version.CompareTo(currentVersion) > 0))
{
Trace.Info($"Current runner version {currentVersion} is behind the latest runner version {serverPackages[0].Version}.");
warnOnFailedJob = true;
}
if (serverPackages.Where(x => x.Version.Major == currentVersion.Major && x.Version.Minor > currentVersion.Minor).Count() > 1)
{
Trace.Info($"Current runner version {currentVersion} is way behind the latest runner version {serverPackages[0].Version}.");
warnOnOldRunnerVersion = true;
}
if (result == TaskResult.Failed && warnOnFailedJob)
{
jobContext.Warning($"This job failure may be caused by using an out of date self-hosted runner. You are currently using runner version {currentVersion}. Please update to the latest version {serverPackages[0].Version}");
}
else if (warnOnOldRunnerVersion)
{
jobContext.Warning($"This self-hosted runner is currently using runner version {currentVersion}. This version is out of date. Please update to the latest version {serverPackages[0].Version}");
}
}
}
catch (Exception ex)
{
// Ignore any error since suggesting a runner update is best effort.
Trace.Error($"Caught exception during runner version check: {ex}");
}
}
if (jobContext.JobContext.ContainsKey("Node12ActionsWarnings"))
{
var actions = string.Join(", ", jobContext.JobContext["Node12ActionsWarnings"].AssertArray("Node12ActionsWarnings").Select(action => action.ToString()));
jobContext.Warning(string.Format(Constants.Runner.Node12DetectedAfterEndOfLife, actions));
}
try
{
await ShutdownQueue(throwOnFailure: true);
@@ -231,14 +286,13 @@ namespace GitHub.Runner.Worker
}
// Load any upgrade telemetry
LoadFromTelemetryFile(jobContext.JobTelemetry);
LoadFromTelemetryFile(jobContext.Global.JobTelemetry);
// Make sure we don't submit secrets as telemetry
MaskTelemetrySecrets(jobContext.JobTelemetry);
Trace.Info("Raising job completed event.");
var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, result, jobContext.JobOutputs, jobContext.ActionsEnvironment, jobContext.ActionsStepsTelemetry, jobContext.JobTelemetry);
MaskTelemetrySecrets(jobContext.Global.JobTelemetry);
Trace.Info($"Raising job completed event");
var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, result, jobContext.JobOutputs, jobContext.ActionsEnvironment, jobContext.Global.StepsTelemetry, jobContext.Global.JobTelemetry);
var completeJobRetryLimit = 5;
var exceptions = new List<Exception>();
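When auto-update is disabled, the runner warns in two tiers: any newer release produces a warning only on failed jobs, while being two or more minor releases behind warns regardless of the job result. A standalone sketch of those conditions using System.Version in place of PackageVersion; the version numbers are examples only:

// Standalone sketch (not runner code) of the two warning conditions.
using System;
using System.Linq;

class UpdateWarningSketch
{
    static (bool WarnOnFailedJob, bool WarnAlways) Evaluate(Version current, Version[] available)
    {
        bool anyNewer = available.Any(v => v > current);                     // warn if the job failed
        bool farBehind = available.Count(v => v.Major == current.Major
                                           && v.Minor > current.Minor) > 1;  // >= 2 newer minor releases
        return (anyNewer, farBehind);
    }

    static void Main()
    {
        var current = new Version(2, 289, 1);
        var server = new[] { new Version(2, 291, 1), new Version(2, 290, 0), new Version(2, 289, 2) };
        Console.WriteLine(Evaluate(current, server)); // (True, True)
    }
}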

View File

@@ -1,5 +1,6 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
@@ -130,11 +131,13 @@ namespace GitHub.Runner.Worker
// Register the job cancellation callback only if the job cancellation token has not been fired before each step run
if (!jobContext.CancellationToken.IsCancellationRequested)
{
// Test the condition again. The job was canceled after the condition was originally evaluated.
// Test the condition again. The job was cancelled after the condition was originally evaluated.
jobCancelRegister = jobContext.CancellationToken.Register(() =>
{
// Mark job as cancelled
jobContext.Result = TaskResult.Canceled;
// Mark job as Cancelled or Failed depending on HostContext shutdown token's cancellation
jobContext.Result = HostContext.RunnerShutdownToken.IsCancellationRequested
? TaskResult.Failed
: TaskResult.Canceled;
jobContext.JobContext.Status = jobContext.Result?.ToActionResult();
step.ExecutionContext.Debug($"Re-evaluate condition on job cancellation for step: '{step.DisplayName}'.");
@@ -172,8 +175,10 @@ namespace GitHub.Runner.Worker
{
if (jobContext.Result != TaskResult.Canceled)
{
// Mark job as cancelled
jobContext.Result = TaskResult.Canceled;
// Mark job as Cancelled or Failed depending on HostContext shutdown token's cancellation
jobContext.Result = HostContext.RunnerShutdownToken.IsCancellationRequested
? TaskResult.Failed
: TaskResult.Canceled;
jobContext.JobContext.Status = jobContext.Result?.ToActionResult();
}
}
@@ -319,29 +324,8 @@ namespace GitHub.Runner.Worker
step.ExecutionContext.Result = TaskResultUtil.MergeTaskResults(step.ExecutionContext.Result, step.ExecutionContext.CommandResult.Value);
}
// Fixup the step result if ContinueOnError
if (step.ExecutionContext.Result == TaskResult.Failed)
{
var continueOnError = false;
try
{
continueOnError = templateEvaluator.EvaluateStepContinueOnError(step.ContinueOnError, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions);
}
catch (Exception ex)
{
Trace.Info("The step failed and an error occurred when attempting to determine whether to continue on error.");
Trace.Error(ex);
step.ExecutionContext.Error("The step failed and an error occurred when attempting to determine whether to continue on error.");
step.ExecutionContext.Error(ex);
}
step.ExecutionContext.ApplyContinueOnError(step.ContinueOnError);
if (continueOnError)
{
step.ExecutionContext.Outcome = step.ExecutionContext.Result;
step.ExecutionContext.Result = TaskResult.Succeeded;
Trace.Info($"Updated step result (continue on error)");
}
}
Trace.Info($"Step result: {step.ExecutionContext.Result}");
// Complete the step context

View File

@@ -129,9 +129,10 @@
"required": true
},
"env": "step-env",
"continue-on-error": "boolean-steps-context",
"working-directory": "string-steps-context",
"shell": {
"type": "non-empty-string",
"type": "string-steps-context",
"required": true
}
}
@@ -147,6 +148,7 @@
"type": "non-empty-string",
"required": true
},
"continue-on-error": "boolean-steps-context",
"with": "step-with",
"env": "step-env"
}
@@ -201,6 +203,20 @@
],
"string": {}
},
"boolean-steps-context": {
"context": [
"github",
"inputs",
"strategy",
"matrix",
"steps",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"boolean": {}
},
"step-env": {
"context": [
"github",

View File

@@ -23,7 +23,6 @@ namespace GitHub.Services.Common.ClientStorage
private readonly string m_filePath;
private readonly VssFileStorageReader m_reader;
private readonly IVssClientStorageWriter m_writer;
private const char c_defaultPathSeparator = '\\';
private const bool c_defaultIgnoreCaseInPaths = false;
@@ -192,7 +191,7 @@ namespace GitHub.Services.Common.ClientStorage
// Windows Impersonation is being used.
// Check to see if we can find the user's local application data directory.
string subDir = "GitHub\\ActionsService";
string subDir = Path.Combine("GitHub", "ActionsService");
string path = Environment.GetEnvironmentVariable("localappdata");
SafeGetFolderPath(Environment.SpecialFolder.LocalApplicationData);
if (string.IsNullOrEmpty(path))

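
Replacing the hard-coded '\\' separator with Path.Combine is what makes this storage path correct on non-Windows hosts. A small, self-contained illustration of that behaviour; it is not part of the change itself.

using System;
using System.IO;

public static class PathSeparatorDemo
{
    public static void Main()
    {
        // Path.Combine inserts Path.DirectorySeparatorChar for the current OS,
        // so this prints "GitHub\ActionsService" on Windows and
        // "GitHub/ActionsService" on Linux and macOS.
        string subDir = Path.Combine("GitHub", "ActionsService");
        Console.WriteLine(subDir);
    }
}
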

@@ -1,4 +1,4 @@
/*
* ---------------------------------------------------------
* Copyright(C) Microsoft Corporation. All rights reserved.
* ---------------------------------------------------------
@@ -324,6 +324,7 @@ namespace GitHub.DistributedTask.WebApi
/// <param name="scopeIdentifier">The project GUID to scope the request</param>
/// <param name="hubName">The name of the server hub: "build" for the Build server or "rm" for the Release Management server</param>
/// <param name="planId"></param>
/// <param name="jobId"></param>
/// <param name="actionReferenceList"></param>
/// <param name="userState"></param>
/// <param name="cancellationToken">The cancellation token to cancel operation.</param>
@@ -331,6 +332,7 @@ namespace GitHub.DistributedTask.WebApi
Guid scopeIdentifier,
string hubName,
Guid planId,
Guid jobId,
ActionReferenceList actionReferenceList,
object userState = null,
CancellationToken cancellationToken = default)
@@ -340,11 +342,14 @@ namespace GitHub.DistributedTask.WebApi
object routeValues = new { scopeIdentifier = scopeIdentifier, hubName = hubName, planId = planId };
HttpContent content = new ObjectContent<ActionReferenceList>(actionReferenceList, new VssJsonMediaTypeFormatter(true));
List<KeyValuePair<string, string>> queryParams = new List<KeyValuePair<string, string>>();
queryParams.Add("jobId", jobId);
return SendAsync<ActionDownloadInfoCollection>(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(6.0, 1),
queryParameters: queryParams,
userState: userState,
cancellationToken: cancellationToken,
content: content);
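
This hunk adds a jobId parameter to the action-download resolution call and forwards it to the service as the "jobId" query string parameter. A sketch of a call site, based on the signature above and on the IJobServer mock updated in the L0 tests later in this diff; the IActionDownloadClient interface and the surrounding names are assumptions introduced only for the sketch.

using System;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;

// Sketch only: IActionDownloadClient stands in for whichever client exposes the
// updated method (for example the IJobServer mocked in the tests below).
public interface IActionDownloadClient
{
    Task<ActionDownloadInfoCollection> ResolveActionDownloadInfoAsync(
        Guid scopeIdentifier,
        string hubName,
        Guid planId,
        Guid jobId,
        ActionReferenceList actionReferenceList,
        CancellationToken cancellationToken);
}

public static class ActionDownloadSketch
{
    public static Task<ActionDownloadInfoCollection> ResolveAsync(
        IActionDownloadClient client,
        Guid scopeIdentifier,
        Guid planId,
        Guid jobId,
        ActionReferenceList actions,
        CancellationToken token)
    {
        // The new jobId argument is forwarded by the client as the
        // "jobId" query string parameter shown in the hunk above.
        return client.ResolveActionDownloadInfoAsync(scopeIdentifier, "build", planId, jobId, actions, token);
    }
}
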


@@ -1,4 +1,6 @@
using System.Runtime.Serialization;
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
@@ -8,6 +10,13 @@ namespace GitHub.DistributedTask.WebApi
[DataContract]
public class ActionsStepTelemetry
{
public ActionsStepTelemetry()
{
this.ErrorMessages = new List<string>();
}
[DataMember(EmitDefaultValue = false)]
public string Action { get; set; }
[DataMember(EmitDefaultValue = false)]
public string Ref { get; set; }
@@ -15,6 +24,12 @@ namespace GitHub.DistributedTask.WebApi
[DataMember(EmitDefaultValue = false)]
public string Type { get; set; }
[DataMember(EmitDefaultValue = false)]
public string Stage { get; set; }
[DataMember(EmitDefaultValue = false)]
public Guid StepId { get; set; }
[DataMember(EmitDefaultValue = false)]
public bool? HasRunsStep { get; set; }
@@ -32,5 +47,14 @@ namespace GitHub.DistributedTask.WebApi
[DataMember(EmitDefaultValue = false)]
public int? StepCount { get; set; }
[DataMember(EmitDefaultValue = false)]
public TaskResult? Result { get; set; }
[DataMember(EmitDefaultValue = false)]
public List<string> ErrorMessages { get; set; }
[DataMember(EmitDefaultValue = false)]
public int? ExecutionTimeInSeconds { get; set; }
}
}
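
The new fields above (Stage, StepId, Result, ErrorMessages, ExecutionTimeInSeconds) extend the per-step telemetry record. A short sketch of populating one, using only members visible in this diff; the values are invented for illustration.

using System;
using GitHub.DistributedTask.WebApi;

public static class StepTelemetrySketch
{
    public static ActionsStepTelemetry Describe()
    {
        var telemetry = new ActionsStepTelemetry
        {
            Action = "actions/checkout", // example values only
            Ref = "v2",
            Type = "node16",
            Stage = "main",
            StepId = Guid.NewGuid(),
            Result = TaskResult.Failed,
            ExecutionTimeInSeconds = 3,
        };

        // The parameterless constructor initializes ErrorMessages to an empty
        // list, so callers can append without a null check.
        telemetry.ErrorMessages.Add("example error message");
        return telemetry;
    }
}
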


@@ -25,6 +25,7 @@ namespace GitHub.DistributedTask.WebApi
this.ProvisioningState = referenceToBeCloned.ProvisioningState;
this.AccessPoint = referenceToBeCloned.AccessPoint;
this.Ephemeral = referenceToBeCloned.Ephemeral;
this.DisableUpdate = referenceToBeCloned.DisableUpdate;
if (referenceToBeCloned.m_links != null)
{
@@ -92,6 +93,16 @@ namespace GitHub.DistributedTask.WebApi
set;
}
/// <summary>
/// Whether or not this agent should auto-update to latest version.
/// </summary>
[DataMember(EmitDefaultValue = false)]
public bool? DisableUpdate
{
get;
set;
}
/// <summary>
/// Whether or not the agent is online.
/// </summary>


@@ -102,4 +102,10 @@ namespace GitHub.DistributedTask.WebApi
public static readonly String FileAttachment = "DistributedTask.Core.FileAttachment";
public static readonly String DiagnosticLog = "DistributedTask.Core.DiagnosticLog";
}
[GenerateAllConstants]
public class ChecksAttachmentType
{
public static readonly String StepSummary = "Checks.Step.Summary";
}
}
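
Checks.Step.Summary is the attachment type the worker uses when it queues a step summary file for upload. A sketch of that hand-off, with the QueueFileUpload signature inferred from the CreateStepSummaryCommand L0 tests later in this diff; the parameter meanings and the IJobServerQueue namespace are assumptions.

using System;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common; // assumed home of IJobServerQueue

public static class StepSummaryUploadSketch
{
    // Sketch only: queue the secret-scrubbed summary file as a
    // "Checks.Step.Summary" attachment named after the step's timeline record.
    public static void UploadSummary(IJobServerQueue queue, Guid timelineId, Guid stepRecordId, string scrubbedFile)
    {
        queue.QueueFileUpload(
            timelineId,
            stepRecordId,
            ChecksAttachmentType.StepSummary,
            stepRecordId.ToString(), // attachment name
            scrubbedFile,            // path to the scrubbed summary file
            true);                   // assumed: delete the source file after upload
    }
}
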


@@ -708,33 +708,85 @@ namespace GitHub.Runner.Common.Tests
}
}
[Fact]
[Theory]
[InlineData("configure", "once")]
[InlineData("remove", "disableupdate")]
[InlineData("remove", "ephemeral")]
[InlineData("remove", "once")]
[InlineData("remove", "replace")]
[InlineData("remove", "runasservice")]
[InlineData("remove", "unattended")]
[InlineData("run", "disableupdate")]
[InlineData("run", "ephemeral")]
[InlineData("run", "replace")]
[InlineData("run", "runasservice")]
[InlineData("run", "unattended")]
[InlineData("warmup", "disableupdate")]
[InlineData("warmup", "ephemeral")]
[InlineData("warmup", "once")]
[InlineData("warmup", "replace")]
[InlineData("warmup", "runasservice")]
[InlineData("warmup", "unattended")]
[Trait("Level", "L0")]
[Trait("Category", nameof(CommandSettings))]
public void ValidateFlags()
public void ValidateInvalidFlagCommandCombination(string validCommand, string flag)
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var command = new CommandSettings(hc, args: new string[] { "--badflag" });
var command = new CommandSettings(hc, args: new string[] { validCommand, $"--{flag}" });
// Assert.
Assert.Contains("badflag", command.Validate());
Assert.Contains(flag, command.Validate());
}
}
[Fact]
[Theory]
[InlineData("remove", "auth", "bar arg value")]
[InlineData("remove", "labels", "bar arg value")]
[InlineData("remove", "monitorsocketaddress", "bar arg value")]
[InlineData("remove", "name", "bar arg value")]
[InlineData("remove", "runnergroup", "bar arg value")]
[InlineData("remove", "url", "bar arg value")]
[InlineData("remove", "username", "bar arg value")]
[InlineData("remove", "windowslogonaccount", "bar arg value")]
[InlineData("remove", "windowslogonpassword", "bar arg value")]
[InlineData("remove", "work", "bar arg value")]
[InlineData("run", "auth", "bad arg value")]
[InlineData("run", "labels", "bad arg value")]
[InlineData("run", "monitorsocketaddress", "bad arg value")]
[InlineData("run", "name", "bad arg value")]
[InlineData("run", "pat", "bad arg value")]
[InlineData("run", "runnergroup", "bad arg value")]
[InlineData("run", "token", "bad arg value")]
[InlineData("run", "url", "bad arg value")]
[InlineData("run", "username", "bad arg value")]
[InlineData("run", "windowslogonaccount", "bad arg value")]
[InlineData("run", "windowslogonpassword", "bad arg value")]
[InlineData("run", "work", "bad arg value")]
[InlineData("warmup", "auth", "bad arg value")]
[InlineData("warmup", "labels", "bad arg value")]
[InlineData("warmup", "monitorsocketaddress", "bad arg value")]
[InlineData("warmup", "name", "bad arg value")]
[InlineData("warmup", "pat", "bad arg value")]
[InlineData("warmup", "runnergroup", "bad arg value")]
[InlineData("warmup", "token", "bad arg value")]
[InlineData("warmup", "url", "bad arg value")]
[InlineData("warmup", "username", "bad arg value")]
[InlineData("warmup", "windowslogonaccount", "bad arg value")]
[InlineData("warmup", "windowslogonpassword", "bad arg value")]
[InlineData("warmup", "work", "bad arg value")]
[Trait("Level", "L0")]
[Trait("Category", nameof(CommandSettings))]
public void ValidateArgs()
public void ValidateInvalidArgCommandCombination(string validCommand, string arg, string argValue)
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var command = new CommandSettings(hc, args: new string[] { "--badargname", "bad arg value" });
var command = new CommandSettings(hc, args: new string[] { validCommand, $"--{arg}", argValue });
// Assert.
Assert.Contains("badargname", command.Validate());
Assert.Contains(arg, command.Validate());
}
}
@@ -758,6 +810,73 @@ namespace GitHub.Runner.Common.Tests
}
}
[Theory]
[InlineData("configure", "help")]
[InlineData("configure", "version")]
[InlineData("configure", "commit")]
[InlineData("configure", "check")]
[InlineData("configure", "disableupdate")]
[InlineData("configure", "ephemeral")]
[InlineData("configure", "replace")]
[InlineData("configure", "runasservice")]
[InlineData("configure", "unattended")]
[InlineData("remove", "help")]
[InlineData("remove", "version")]
[InlineData("remove", "commit")]
[InlineData("remove", "check")]
[InlineData("run", "help")]
[InlineData("run", "version")]
[InlineData("run", "commit")]
[InlineData("run", "check")]
[InlineData("run", "once")]
[InlineData("warmup", "help")]
[InlineData("warmup", "version")]
[InlineData("warmup", "commit")]
[InlineData("warmup", "check")]
[Trait("Level", "L0")]
[Trait("Category", nameof(CommandSettings))]
public void ValidateGoodFlagCommandCombination(string validCommand, string flag)
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var command = new CommandSettings(hc, args: new string[] { validCommand, $"--{flag}" });
// Assert.
Assert.True(command.Validate().Count == 0);
}
}
[Theory]
[InlineData("configure", "auth", "good arg value")]
[InlineData("configure", "labels", "good arg value")]
[InlineData("configure", "monitorsocketaddress", "good arg value")]
[InlineData("configure", "name", "good arg value")]
[InlineData("configure", "pat", "good arg value")]
[InlineData("configure", "runnergroup", "good arg value")]
[InlineData("configure", "token", "good arg value")]
[InlineData("configure", "url", "good arg value")]
[InlineData("configure", "username", "good arg value")]
[InlineData("configure", "windowslogonaccount", "good arg value")]
[InlineData("configure", "windowslogonpassword", "good arg value")]
[InlineData("configure", "work", "good arg value")]
[InlineData("remove", "token", "good arg value")]
[InlineData("remove", "pat", "good arg value")]
[InlineData("run", "startuptype", "good arg value")]
[Trait("Level", "L0")]
[Trait("Category", nameof(CommandSettings))]
public void ValidateGoodArgCommandCombination(string validCommand, string arg, string argValue)
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var command = new CommandSettings(hc, args: new string[] { validCommand, $"--{arg}", argValue });
// Assert.
Assert.True(command.Validate().Count == 0);
}
}
private TestHostContext CreateTestContext([CallerMemberName] string testName = "")
{
TestHostContext hc = new TestHostContext(this, testName);


@@ -66,7 +66,7 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
_serviceControlManager = new Mock<ILinuxServiceControlManager>();
#endif
var expectedAgent = new TaskAgent(_expectedAgentName) { Id = 1 };
var expectedAgent = new TaskAgent(_expectedAgentName) { Id = 1, Ephemeral = true, DisableUpdate = true };
expectedAgent.Authorization = new TaskAgentAuthorization
{
ClientId = Guid.NewGuid(),
@@ -163,6 +163,8 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
"--token", _expectedToken,
"--labels", userLabels,
"--ephemeral",
"--disableupdate",
"--unattended",
});
trace.Info("Constructed.");
_store.Setup(x => x.IsConfigured()).Returns(false);


@@ -1,18 +1,18 @@
using GitHub.DistributedTask.WebApi;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
using GitHub.Runner.Listener;
using GitHub.Runner.Listener.Configuration;
using Moq;
using System;
using System.Runtime.CompilerServices;
using System.Threading.Tasks;
using Xunit;
using System.Threading;
using System.Reflection;
using System;
using System.Collections.Generic;
using System.IO;
using System.Reflection;
using System.Runtime.CompilerServices;
using System.Security.Cryptography;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Listener;
using GitHub.Runner.Listener.Configuration;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
using Moq;
using Xunit;
namespace GitHub.Runner.Common.Tests.Listener
{
@@ -256,5 +256,67 @@ namespace GitHub.Runner.Common.Tests.Listener
tokenSource.Token), Times.Once());
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void SkipDeleteSession_WhenGetNextMessageGetTaskAgentAccessTokenExpiredException()
{
using (TestHostContext tc = CreateTestContext())
using (var tokenSource = new CancellationTokenSource())
{
Tracing trace = tc.GetTrace();
// Arrange.
var expectedSession = new TaskAgentSession();
PropertyInfo sessionIdProperty = expectedSession.GetType().GetProperty("SessionId", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(sessionIdProperty);
sessionIdProperty.SetValue(expectedSession, Guid.NewGuid());
_runnerServer
.Setup(x => x.CreateAgentSessionAsync(
_settings.PoolId,
It.Is<TaskAgentSession>(y => y != null),
tokenSource.Token))
.Returns(Task.FromResult(expectedSession));
_credMgr.Setup(x => x.LoadCredentials()).Returns(new VssCredentials());
_store.Setup(x => x.GetCredentials()).Returns(new CredentialData() { Scheme = Constants.Configuration.OAuthAccessToken });
_store.Setup(x => x.GetMigratedCredentials()).Returns(default(CredentialData));
// Act.
MessageListener listener = new MessageListener();
listener.Initialize(tc);
bool result = await listener.CreateSessionAsync(tokenSource.Token);
Assert.True(result);
_runnerServer
.Setup(x => x.GetAgentMessageAsync(
_settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), tokenSource.Token))
.Throws(new TaskAgentAccessTokenExpiredException("test"));
try
{
await listener.GetNextMessageAsync(tokenSource.Token);
}
catch (TaskAgentAccessTokenExpiredException)
{
//expected
}
finally
{
await listener.DeleteSessionAsync();
}
//Assert
_runnerServer
.Verify(x => x.GetAgentMessageAsync(
_settings.PoolId, expectedSession.SessionId, It.IsAny<long?>(), tokenSource.Token), Times.Once);
_runnerServer
.Verify(x => x.DeleteAgentSessionAsync(
_settings.PoolId, expectedSession.SessionId, It.IsAny<CancellationToken>()), Times.Never);
}
}
}
}


@@ -774,8 +774,15 @@ namespace GitHub.Runner.Common.Tests.Listener
var traceFile = Path.GetTempFileName();
File.Copy(hc.TraceFileName, traceFile, true);
if (File.ReadAllText(traceFile).Contains("Use trimmed (runtime+externals) package"))
{
Assert.Contains("Something wrong with the trimmed runner package, failback to use the full package for runner updates", File.ReadAllText(traceFile));
}
else
{
hc.GetTrace().Warning("Skipping the 'TestSelfUpdateAsync_FallbackToFullPackage' test, as the `externals` or `runtime` hashes have been updated");
}
}
}
finally
{


@@ -2,13 +2,13 @@
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using Xunit;
using GitHub.Runner.Common.Util;
using System.Threading.Channels;
using GitHub.Runner.Sdk;
using System.Linq;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using Xunit;
namespace GitHub.Runner.Common.Tests
{
@@ -150,5 +150,128 @@ namespace GitHub.Runner.Common.Tests
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_CheckDotnetRuntimeHash()
{
using (TestHostContext hc = new TestHostContext(this))
{
Tracing trace = hc.GetTrace();
var dotnetRuntimeHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}");
trace.Info($"Current hash: {File.ReadAllText(dotnetRuntimeHashFile)}");
string layoutTrimsRuntimeAssets = Path.Combine(TestUtil.GetSrcPath(), @"../_layout_trims/runtime");
string binDir = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
#if OS_WINDOWS
string node = Path.Combine(TestUtil.GetSrcPath(), @"..\_layout\externals\node12\bin\node");
#else
string node = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/externals/node12/bin/node");
#endif
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
p1.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
trace.Info($"Hash result: '{hashResult}'");
}
else
{
trace.Info(data.Data);
}
};
p1.OutputDataReceived += (_, data) =>
{
trace.Info(data.Data);
};
var env = new Dictionary<string, string>
{
["patterns"] = "**"
};
int exitCode = await p1.ExecuteAsync(workingDirectory: layoutTrimsRuntimeAssets,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: true,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: CancellationToken.None);
Assert.True(string.Equals(hashResult, File.ReadAllText(dotnetRuntimeHashFile).Trim()), $"Hash mismatch for dotnet runtime. You might need to update `Misc/contentHash/dotnetRuntime/{BuildConstants.RunnerPackage.PackageName}` or check if `hashFiles.ts` ever changed recently.");
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task RunnerLayoutParts_CheckExternalsHash()
{
using (TestHostContext hc = new TestHostContext(this))
{
Tracing trace = hc.GetTrace();
var externalsHashFile = Path.Combine(TestUtil.GetSrcPath(), $"Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}");
trace.Info($"Current hash: {File.ReadAllText(externalsHashFile)}");
string layoutTrimsExternalsAssets = Path.Combine(TestUtil.GetSrcPath(), @"../_layout_trims/externals");
string binDir = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/bin");
#if OS_WINDOWS
string node = Path.Combine(TestUtil.GetSrcPath(), @"..\_layout\externals\node12\bin\node");
#else
string node = Path.Combine(TestUtil.GetSrcPath(), @"../_layout/externals/node12/bin/node");
#endif
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
var p1 = new ProcessInvokerWrapper();
p1.Initialize(hc);
p1.ErrorDataReceived += (_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
trace.Info($"Hash result: '{hashResult}'");
}
else
{
trace.Info(data.Data);
}
};
p1.OutputDataReceived += (_, data) =>
{
trace.Info(data.Data);
};
var env = new Dictionary<string, string>
{
["patterns"] = "**"
};
int exitCode = await p1.ExecuteAsync(workingDirectory: layoutTrimsExternalsAssets,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: true,
outputEncoding: null,
killProcessOnCancel: true,
cancellationToken: CancellationToken.None);
Assert.True(string.Equals(hashResult, File.ReadAllText(externalsHashFile).Trim()), $"Hash mismatch for externals. You might need to update `Misc/contentHash/externals/{BuildConstants.RunnerPackage.PackageName}` or check if `hashFiles.ts` ever changed recently.");
}
}
}
}


@@ -1,5 +1,6 @@
using System;
using System.Runtime.CompilerServices;
using System.Text.RegularExpressions;
using GitHub.Runner.Common;
using GitHub.Runner.Listener.Configuration;
using Xunit;
@@ -15,7 +16,7 @@ namespace GitHub.Runner.Common.Tests
{
RunnerSettings settings = new RunnerSettings();
settings.AgentName = "thisiskindofalongrunnername1";
settings.AgentName = "thisiskindofalongrunnerabcde";
settings.ServerUrl = "https://example.githubusercontent.com/12345678901234567890123456789012345678901234567890";
settings.GitHubUrl = "https://github.com/myorganizationexample/myrepoexample";
@@ -43,7 +44,7 @@ namespace GitHub.Runner.Common.Tests
Assert.Equal("actions", serviceNameParts[0]);
Assert.Equal("runner", serviceNameParts[1]);
Assert.Equal("myorganizationexample-myrepoexample", serviceNameParts[2]); // '/' has been replaced with '-'
Assert.Equal("thisiskindofalongrunnername1", serviceNameParts[3]);
Assert.Equal("thisiskindofalongrunnerabcde", serviceNameParts[3]);
}
}
@@ -54,7 +55,7 @@ namespace GitHub.Runner.Common.Tests
{
RunnerSettings settings = new RunnerSettings();
settings.AgentName = "thisiskindofalongrunnername12";
settings.AgentName = "thisiskindofalongrunnernabcde";
settings.ServerUrl = "https://example.githubusercontent.com/12345678901234567890123456789012345678901234567890";
settings.GitHubUrl = "https://github.com/myorganizationexample/myrepoexample";
@@ -82,10 +83,11 @@ namespace GitHub.Runner.Common.Tests
Assert.Equal("actions", serviceNameParts[0]);
Assert.Equal("runner", serviceNameParts[1]);
Assert.Equal("myorganizationexample-myrepoexample", serviceNameParts[2]); // '/' has been replaced with '-'
Assert.Equal("thisiskindofalongrunnername12", serviceNameParts[3]);
Assert.Equal("thisiskindofalongrunnernabcde", serviceNameParts[3]); // should not have random numbers unless we exceed 80
}
}
#if OS_WINDOWS
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Service")]
@@ -121,7 +123,7 @@ namespace GitHub.Runner.Common.Tests
Assert.Equal("actions", serviceNameParts[0]); // Never shortened
Assert.Equal("runner", serviceNameParts[1]); // Never shortened
Assert.Equal("myreallylongorganizationexample-myreallylongr", serviceNameParts[2]); // First 45 chars, '/' has been replaced with '-'
Assert.Equal("thisisareallyreally", serviceNameParts[3]); // Remainder of unused chars
Assert.Matches(@"^(thisisareallyr-[0-9]{4})$", serviceNameParts[3]); // Remainder of unused chars, 4 random numbers added at the end
}
}
@@ -164,6 +166,46 @@ namespace GitHub.Runner.Common.Tests
Assert.Equal("name", serviceNameParts[3]);
}
}
#else
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Service")]
public void CalculateServiceNameLimitsServiceNameTo150Chars()
{
RunnerSettings settings = new RunnerSettings();
settings.AgentName = "thisisareallyreallylongbutstillvalidagentnameiamusingforthisexampletotestverylongnamelimits";
settings.ServerUrl = "https://example.githubusercontent.com/12345678901234567890123456789012345678901234567890";
settings.GitHubUrl = "https://github.com/myreallylongorganizationexampleonlinux/myreallylongrepoexampleonlinux1234";
string serviceNamePattern = "actions.runner.{0}.{1}";
string serviceDisplayNamePattern = "GitHub Actions Runner ({0}.{1})";
using (TestHostContext hc = CreateTestContext())
{
ServiceControlManager scm = new ServiceControlManager();
scm.Initialize(hc);
scm.CalculateServiceName(
settings,
serviceNamePattern,
serviceDisplayNamePattern,
out string serviceName,
out string serviceDisplayName);
// Verify name has been shortened to 150
Assert.Equal(150, serviceName.Length);
var serviceNameParts = serviceName.Split('.');
// Verify that each component has been shortened to a sensible length
Assert.Equal("actions", serviceNameParts[0]); // Never shortened
Assert.Equal("runner", serviceNameParts[1]); // Never shortened
Assert.Equal("myreallylongorganizationexampleonlinux-myreallylongrepoexampleonlinux1", serviceNameParts[2]); // First 70 chars, '/' has been replaced with '-'
Assert.Matches(@"^(thisisareallyreallylongbutstillvalidagentnameiamusingforthi-[0-9]{4})$", serviceNameParts[3]);
}
}
#endif
private TestHostContext CreateTestContext([CallerMemberName] string testName = "")
{


@@ -121,6 +121,7 @@ namespace GitHub.Runner.Common.Tests.Worker
using (TestHostContext hc = CreateTestContext())
{
_ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
_ec.Object.Global.JobTelemetry = new List<JobTelemetry>();
var expressionValues = new DictionaryContextData
{
["env"] =
@@ -131,7 +132,6 @@ namespace GitHub.Runner.Common.Tests.Worker
#endif
};
_ec.Setup(x => x.ExpressionValues).Returns(expressionValues);
_ec.Setup(x => x.JobTelemetry).Returns(new List<JobTelemetry>());
Assert.True(_commandManager.TryProcessCommand(_ec.Object, $"::stop-commands::{invalidToken}", null));
}
@@ -148,8 +148,8 @@ namespace GitHub.Runner.Common.Tests.Worker
using (TestHostContext hc = CreateTestContext())
{
_ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
_ec.Object.Global.JobTelemetry = new List<JobTelemetry>();
_ec.Setup(x => x.ExpressionValues).Returns(GetExpressionValues());
_ec.Setup(x => x.JobTelemetry).Returns(new List<JobTelemetry>());
Assert.Throws<Exception>(() => _commandManager.TryProcessCommand(_ec.Object, $"::stop-commands::{invalidToken}", null));
}
}


@@ -1,4 +1,4 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
@@ -2135,6 +2135,7 @@ runs:
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Global).Returns(new GlobalContext());
_ec.Setup(x => x.CancellationToken).Returns(_ecTokenSource.Token);
_ec.Setup(x => x.Root).Returns(new GitHub.Runner.Worker.ExecutionContext());
var variables = new Dictionary<string, VariableValue>();
if (enableComposite)
{
@@ -2156,8 +2157,8 @@ runs:
_dockerManager.Setup(x => x.DockerBuild(_ec.Object, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>())).Returns(Task.FromResult(0));
_jobServer = new Mock<IJobServer>();
_jobServer.Setup(x => x.ResolveActionDownloadInfoAsync(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<Guid>(), It.IsAny<ActionReferenceList>(), It.IsAny<CancellationToken>()))
.Returns((Guid scopeIdentifier, string hubName, Guid planId, ActionReferenceList actions, CancellationToken cancellationToken) =>
_jobServer.Setup(x => x.ResolveActionDownloadInfoAsync(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<Guid>(), It.IsAny<Guid>(), It.IsAny<ActionReferenceList>(), It.IsAny<CancellationToken>()))
.Returns((Guid scopeIdentifier, string hubName, Guid planId, Guid jobId, ActionReferenceList actions, CancellationToken cancellationToken) =>
{
var result = new ActionDownloadInfoCollection { Actions = new Dictionary<string, ActionDownloadInfo>() };
foreach (var action in actions.Actions)


@@ -118,7 +118,7 @@ namespace GitHub.Runner.Common.Tests.Worker
await _actionRunner.RunAsync();
//Assert
_ec.Verify(x => x.SetGitHubContext("event_path", Path.Combine(_hc.GetDirectory(WellKnownDirectory.Temp), "_github_workflow", "event.json")), Times.Once);
_ec.Verify(x => x.WriteWebhookPayload(), Times.Once);
}
[Fact]


@@ -0,0 +1,289 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Runtime.CompilerServices;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker;
using Moq;
using Xunit;
using DTWebApi = GitHub.DistributedTask.WebApi;
using GitHub.DistributedTask.WebApi;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Common.Tests.Worker
{
public sealed class CreateStepSummaryCommandL0
{
private Mock<IExecutionContext> _executionContext;
private Mock<IJobServerQueue> _jobServerQueue;
private ExecutionContext _jobExecutionContext;
private List<Tuple<DTWebApi.Issue, string>> _issues;
private Variables _variables;
private string _rootDirectory;
private CreateStepSummaryCommand _createStepCommand;
private ITraceWriter _trace;
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CreateStepSummaryCommand_FeatureDisabled()
{
using (var hostContext = Setup(featureFlagState: "false"))
{
var stepSummaryFile = Path.Combine(_rootDirectory, "feature-off");
_createStepCommand.ProcessCommand(_executionContext.Object, stepSummaryFile, null);
_jobExecutionContext.Complete();
_jobServerQueue.Verify(x => x.QueueFileUpload(It.IsAny<Guid>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>()), Times.Never());
Assert.Equal(0, _issues.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CreateStepSummaryCommand_FileNull()
{
using (var hostContext = Setup())
{
_createStepCommand.ProcessCommand(_executionContext.Object, null, null);
_jobExecutionContext.Complete();
_jobServerQueue.Verify(x => x.QueueFileUpload(It.IsAny<Guid>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>()), Times.Never());
Assert.Equal(0, _issues.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CreateStepSummaryCommand_DirectoryNotFound()
{
using (var hostContext = Setup())
{
var stepSummaryFile = Path.Combine(_rootDirectory, "directory-not-found", "env");
_createStepCommand.ProcessCommand(_executionContext.Object, stepSummaryFile, null);
_jobExecutionContext.Complete();
_jobServerQueue.Verify(x => x.QueueFileUpload(It.IsAny<Guid>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>()), Times.Never());
Assert.Equal(0, _issues.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CreateStepSummaryCommand_FileNotFound()
{
using (var hostContext = Setup())
{
var stepSummaryFile = Path.Combine(_rootDirectory, "file-not-found");
_createStepCommand.ProcessCommand(_executionContext.Object, stepSummaryFile, null);
_jobExecutionContext.Complete();
_jobServerQueue.Verify(x => x.QueueFileUpload(It.IsAny<Guid>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>()), Times.Never());
Assert.Equal(0, _issues.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CreateStepSummaryCommand_EmptyFile()
{
using (var hostContext = Setup())
{
var stepSummaryFile = Path.Combine(_rootDirectory, "empty-file");
File.Create(stepSummaryFile).Dispose();
_createStepCommand.ProcessCommand(_executionContext.Object, stepSummaryFile, null);
_jobExecutionContext.Complete();
_jobServerQueue.Verify(x => x.QueueFileUpload(It.IsAny<Guid>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>()), Times.Never());
Assert.Equal(0, _issues.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CreateStepSummaryCommand_LargeFile()
{
using (var hostContext = Setup())
{
var stepSummaryFile = Path.Combine(_rootDirectory, "large-file");
File.WriteAllBytes(stepSummaryFile, new byte[CreateStepSummaryCommand.AttachmentSizeLimit + 1]);
_createStepCommand.ProcessCommand(_executionContext.Object, stepSummaryFile, null);
_jobExecutionContext.Complete();
_jobServerQueue.Verify(x => x.QueueFileUpload(It.IsAny<Guid>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<bool>()), Times.Never());
Assert.Equal(1, _issues.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CreateStepSummaryCommand_Simple()
{
using (var hostContext = Setup())
{
var stepSummaryFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
"# This is some markdown content",
"",
"## This is more markdown content",
};
WriteContent(stepSummaryFile, content);
_createStepCommand.ProcessCommand(_executionContext.Object, stepSummaryFile, null);
_jobExecutionContext.Complete();
_jobServerQueue.Verify(x => x.QueueFileUpload(It.IsAny<Guid>(), It.IsAny<Guid>(), ChecksAttachmentType.StepSummary, _executionContext.Object.Id.ToString(), stepSummaryFile + "-scrubbed", It.IsAny<bool>()), Times.Once());
Assert.Equal(0, _issues.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CreateStepSummaryCommand_ScrubSecrets()
{
using (var hostContext = Setup())
{
// configure secretmasker to actually mask secrets
hostContext.SecretMasker.AddRegex("Password=.*");
hostContext.SecretMasker.AddRegex("ghs_.*");
var stepSummaryFile = Path.Combine(_rootDirectory, "simple");
var scrubbedFile = stepSummaryFile + "-scrubbed";
var content = new List<string>
{
"# Password=ThisIsMySecretPassword!",
"",
"# GITHUB_TOKEN ghs_verysecuretoken",
};
WriteContent(stepSummaryFile, content);
_createStepCommand.ProcessCommand(_executionContext.Object, stepSummaryFile, null);
var scrubbedFileContents = File.ReadAllText(scrubbedFile);
Assert.DoesNotContain("ThisIsMySecretPassword!", scrubbedFileContents);
Assert.DoesNotContain("ghs_verysecuretoken", scrubbedFileContents);
_jobExecutionContext.Complete();
_jobServerQueue.Verify(x => x.QueueFileUpload(It.IsAny<Guid>(), It.IsAny<Guid>(), ChecksAttachmentType.StepSummary, _executionContext.Object.Id.ToString(), scrubbedFile, It.IsAny<bool>()), Times.Once());
Assert.Equal(0, _issues.Count);
}
}
private void WriteContent(
string path,
List<string> content,
string newline = null)
{
if (string.IsNullOrEmpty(newline))
{
newline = Environment.NewLine;
}
var encoding = new UTF8Encoding(true); // Emit BOM
var contentStr = string.Join(newline, content);
File.WriteAllText(path, contentStr, encoding);
}
private TestHostContext Setup([CallerMemberName] string name = "", string featureFlagState = "true")
{
var hostContext = new TestHostContext(this, name);
_issues = new List<Tuple<DTWebApi.Issue, string>>();
// Setup a job request
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "Summary Job";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
Id = "github",
Version = "sha1"
});
jobRequest.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
jobRequest.Variables["ACTIONS_STEP_DEBUG"] = "true";
// Server queue for job
_jobServerQueue = new Mock<IJobServerQueue>();
_jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
hostContext.SetSingleton(_jobServerQueue.Object);
// Configuration store (required singleton)
var configurationStore = new Mock<IConfigurationStore>();
configurationStore.Setup(x => x.GetSettings()).Returns(new RunnerSettings());
hostContext.SetSingleton(configurationStore.Object);
// Paging Logger (required singleton)
var pagingLogger = new Mock<IPagingLogger>();
hostContext.EnqueueInstance(pagingLogger.Object);
// Trace
_trace = hostContext.GetTrace();
// Variables to test for secret scrubbing & FF options
_variables = new Variables(hostContext, new Dictionary<string, VariableValue>
{
{ "MySecretName", new VariableValue("My secret value", true) },
{ "DistributedTask.UploadStepSummary", featureFlagState },
});
// Directory for test data
var workDirectory = hostContext.GetDirectory(WellKnownDirectory.Work);
ArgUtil.NotNullOrEmpty(workDirectory, nameof(workDirectory));
Directory.CreateDirectory(workDirectory);
_rootDirectory = Path.Combine(workDirectory, nameof(CreateStepSummaryCommandL0));
Directory.CreateDirectory(_rootDirectory);
// Job execution context
_jobExecutionContext = new ExecutionContext();
_jobExecutionContext.Initialize(hostContext);
_jobExecutionContext.InitializeJob(jobRequest, System.Threading.CancellationToken.None);
// Step execution context
_executionContext = new Mock<IExecutionContext>();
_executionContext.Setup(x => x.Global)
.Returns(new GlobalContext
{
EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer),
WriteDebug = true,
Variables = _variables,
});
_executionContext.Setup(x => x.AddIssue(It.IsAny<DTWebApi.Issue>(), It.IsAny<string>()))
.Callback((DTWebApi.Issue issue, string logMessage) =>
{
_issues.Add(new Tuple<DTWebApi.Issue, string>(issue, logMessage));
var message = !string.IsNullOrEmpty(logMessage) ? logMessage : issue.Message;
_trace.Info($"Issue '{issue.Type}': {message}");
});
_executionContext.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Callback((string tag, string message) =>
{
_trace.Info($"{tag}{message}");
});
_executionContext.SetupGet(x => x.Root).Returns(_jobExecutionContext);
//CreateStepSummaryCommand
_createStepCommand = new CreateStepSummaryCommand();
_createStepCommand.Initialize(hostContext);
return hostContext;
}
}
}

View File

@@ -1,13 +1,16 @@
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Worker;
using Moq;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Threading;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Container;
using GitHub.Runner.Worker.Handlers;
using Moq;
using Xunit;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Common.Tests.Worker
@@ -90,6 +93,166 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void ApplyContinueOnError_CheckResultAndOutcome()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange: Create a job request message.
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
Id = "github",
Version = "sha1"
});
jobRequest.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
jobRequest.Variables["ACTIONS_STEP_DEBUG"] = "true";
// Arrange: Setup the paging logger.
var pagingLogger = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<long>())).Callback((Guid id, string msg, long? lineNumber) => { hc.GetTrace().Info(msg); });
hc.EnqueueInstance(pagingLogger.Object);
hc.SetSingleton(jobServerQueue.Object);
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
// Act.
ec.InitializeJob(jobRequest, CancellationToken.None);
foreach (var tc in new List<(TemplateToken token, TaskResult result, TaskResult? expectedResult, TaskResult? expectedOutcome)> {
(token: new BooleanToken(null, null, null, true), result: TaskResult.Failed, expectedResult: TaskResult.Succeeded, expectedOutcome: TaskResult.Failed),
(token: new BooleanToken(null, null, null, true), result: TaskResult.Succeeded, expectedResult: TaskResult.Succeeded, expectedOutcome: null),
(token: new BooleanToken(null, null, null, true), result: TaskResult.Canceled, expectedResult: TaskResult.Canceled, expectedOutcome: null),
(token: new BooleanToken(null, null, null, false), result: TaskResult.Failed, expectedResult: TaskResult.Failed, expectedOutcome: null),
(token: new BooleanToken(null, null, null, false), result: TaskResult.Succeeded, expectedResult: TaskResult.Succeeded, expectedOutcome: null),
(token: new BooleanToken(null, null, null, false), result: TaskResult.Canceled, expectedResult: TaskResult.Canceled, expectedOutcome: null),
})
{
ec.Result = tc.result;
ec.Outcome = null;
ec.ApplyContinueOnError(tc.token);
Assert.Equal(ec.Result, tc.expectedResult);
Assert.Equal(ec.Outcome, tc.expectedOutcome);
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void AddIssue_TrimMessageSize()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange: Create a job request message.
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
Id = "github",
Version = "sha1"
});
jobRequest.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
// Arrange: Setup the paging logger.
var pagingLogger = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
hc.EnqueueInstance(pagingLogger.Object);
hc.SetSingleton(jobServerQueue.Object);
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
// Act.
ec.InitializeJob(jobRequest, CancellationToken.None);
var bigMessage = "";
for (var i = 0; i < 5000; i++)
{
bigMessage += "a";
}
ec.AddIssue(new Issue() { Type = IssueType.Error, Message = bigMessage });
ec.AddIssue(new Issue() { Type = IssueType.Warning, Message = bigMessage });
ec.AddIssue(new Issue() { Type = IssueType.Notice, Message = bigMessage });
ec.Complete();
// Assert.
jobServerQueue.Verify(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.Is<TimelineRecord>(t => t.Issues.Where(i => i.Type == IssueType.Error).Single().Message.Length <= 4096)), Times.AtLeastOnce);
jobServerQueue.Verify(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.Is<TimelineRecord>(t => t.Issues.Where(i => i.Type == IssueType.Warning).Single().Message.Length <= 4096)), Times.AtLeastOnce);
jobServerQueue.Verify(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.Is<TimelineRecord>(t => t.Issues.Where(i => i.Type == IssueType.Notice).Single().Message.Length <= 4096)), Times.AtLeastOnce);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void AddIssue_AddStepAndLineNumberInformation()
{
using (TestHostContext hc = CreateTestContext())
{
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
Id = "github",
Version = "sha1"
});
jobRequest.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
// Arrange: Setup the paging logger.
var pagingLogger = new Mock<IPagingLogger>();
var pagingLogger2 = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
hc.EnqueueInstance(pagingLogger.Object);
hc.EnqueueInstance(pagingLogger2.Object);
hc.SetSingleton(jobServerQueue.Object);
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
ec.InitializeJob(jobRequest, CancellationToken.None);
ec.Start();
var embeddedStep = ec.CreateChild(Guid.NewGuid(), "action_1_pre", "action_1_pre", null, null, ActionRunStage.Main, isEmbedded: true);
embeddedStep.Start();
embeddedStep.AddIssue(new Issue() { Type = IssueType.Error, Message = "error annotation that should have step and line number information" });
embeddedStep.AddIssue(new Issue() { Type = IssueType.Warning, Message = "warning annotation that should have step and line number information" });
embeddedStep.AddIssue(new Issue() { Type = IssueType.Notice, Message = "notice annotation that should have step and line number information" });
jobServerQueue.Verify(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.Is<TimelineRecord>(t => t.Issues.Where(i => i.Data.ContainsKey("stepNumber") && i.Data.ContainsKey("logFileLineNumber") && i.Type == IssueType.Error).Count() == 1)), Times.AtLeastOnce);
jobServerQueue.Verify(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.Is<TimelineRecord>(t => t.Issues.Where(i => i.Data.ContainsKey("stepNumber") && i.Data.ContainsKey("logFileLineNumber") && i.Type == IssueType.Warning).Count() == 1)), Times.AtLeastOnce);
jobServerQueue.Verify(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.Is<TimelineRecord>(t => t.Issues.Where(i => i.Data.ContainsKey("stepNumber") && i.Data.ContainsKey("logFileLineNumber") && i.Type == IssueType.Notice).Count() == 1)), Times.AtLeastOnce);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -378,6 +541,177 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void PublishStepTelemetry_RegularStep_NoOpt()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange: Create a job request message.
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
Id = "github",
Version = "sha1"
});
jobRequest.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
// Arrange: Setup the paging logger.
var pagingLogger = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
hc.EnqueueInstance(pagingLogger.Object);
hc.SetSingleton(jobServerQueue.Object);
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
// Act.
ec.InitializeJob(jobRequest, CancellationToken.None);
ec.Start();
ec.Complete();
// Assert.
Assert.Equal(0, ec.Global.StepsTelemetry.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void PublishStepTelemetry_RegularStep()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange: Create a job request message.
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
Id = "github",
Version = "sha1"
});
jobRequest.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
// Arrange: Setup the paging logger.
var pagingLogger = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
hc.EnqueueInstance(pagingLogger.Object);
hc.SetSingleton(jobServerQueue.Object);
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
// Act.
ec.InitializeJob(jobRequest, CancellationToken.None);
ec.Start();
ec.StepTelemetry.Type = "node16";
ec.StepTelemetry.Action = "actions/checkout";
ec.StepTelemetry.Ref = "v2";
ec.StepTelemetry.IsEmbedded = false;
ec.StepTelemetry.StepId = Guid.NewGuid();
ec.StepTelemetry.Stage = "main";
ec.AddIssue(new Issue() { Type = IssueType.Error, Message = "error" });
ec.AddIssue(new Issue() { Type = IssueType.Warning, Message = "warning" });
ec.AddIssue(new Issue() { Type = IssueType.Notice, Message = "notice" });
ec.AddIssue(new Issue() { Type = IssueType.Error, Message = "error" });
ec.AddIssue(new Issue() { Type = IssueType.Warning, Message = "warning" });
ec.AddIssue(new Issue() { Type = IssueType.Notice, Message = "notice" });
ec.Complete();
// Assert.
Assert.Equal(1, ec.Global.StepsTelemetry.Count);
Assert.Equal("node16", ec.Global.StepsTelemetry.Single().Type);
Assert.Equal("actions/checkout", ec.Global.StepsTelemetry.Single().Action);
Assert.Equal("v2", ec.Global.StepsTelemetry.Single().Ref);
Assert.Equal(TaskResult.Succeeded, ec.Global.StepsTelemetry.Single().Result);
Assert.NotNull(ec.Global.StepsTelemetry.Single().ExecutionTimeInSeconds);
Assert.Equal(3, ec.Global.StepsTelemetry.Single().ErrorMessages.Count);
Assert.DoesNotContain(ec.Global.StepsTelemetry.Single().ErrorMessages, x => x.Contains("notice"));
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void PublishStepTelemetry_EmbeddedStep()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange: Create a job request message.
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
Id = "github",
Version = "sha1"
});
jobRequest.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
// Arrange: Setup the paging logger.
var pagingLogger = new Mock<IPagingLogger>();
var pagingLogger2 = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
hc.EnqueueInstance(pagingLogger.Object);
hc.EnqueueInstance(pagingLogger2.Object);
hc.SetSingleton(jobServerQueue.Object);
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
// Act.
ec.InitializeJob(jobRequest, CancellationToken.None);
ec.Start();
var embeddedStep = ec.CreateChild(Guid.NewGuid(), "action_1_pre", "action_1_pre", null, null, ActionRunStage.Main, isEmbedded: true);
embeddedStep.Start();
embeddedStep.StepTelemetry.Type = "node16";
embeddedStep.StepTelemetry.Action = "actions/checkout";
embeddedStep.StepTelemetry.Ref = "v2";
embeddedStep.AddIssue(new Issue() { Type = IssueType.Error, Message = "error" });
embeddedStep.AddIssue(new Issue() { Type = IssueType.Warning, Message = "warning" });
embeddedStep.AddIssue(new Issue() { Type = IssueType.Notice, Message = "notice" });
embeddedStep.PublishStepTelemetry();
// Assert.
Assert.Equal(1, ec.Global.StepsTelemetry.Count);
Assert.Equal("node16", ec.Global.StepsTelemetry.Single().Type);
Assert.Equal("actions/checkout", ec.Global.StepsTelemetry.Single().Action);
Assert.Equal("v2", ec.Global.StepsTelemetry.Single().Ref);
Assert.Equal(ActionRunStage.Main.ToString(), ec.Global.StepsTelemetry.Single().Stage);
Assert.True(ec.Global.StepsTelemetry.Single().IsEmbedded);
Assert.Null(ec.Global.StepsTelemetry.Single().Result);
Assert.Null(ec.Global.StepsTelemetry.Single().ExecutionTimeInSeconds);
Assert.Equal(0, ec.Global.StepsTelemetry.Single().ErrorMessages.Count);
}
}
private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
{
var hc = new TestHostContext(this, testName);
@@ -392,5 +726,149 @@ namespace GitHub.Runner.Common.Tests.Worker
return hc;
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void GetExpressionValues_ContainerStepHost()
{
using (TestHostContext hc = CreateTestContext())
{
const string source = "/home/username/Projects/work/runner/_layout";
var containerInfo = new ContainerInfo();
containerInfo.ContainerId = "test";
containerInfo.AddPathTranslateMapping($"{source}/_work", "/__w");
containerInfo.AddPathTranslateMapping($"{source}/_temp", "/__t");
containerInfo.AddPathTranslateMapping($"{source}/externals", "/__e");
containerInfo.AddPathTranslateMapping($"{source}/_work/_temp/_github_home", "/github/home");
containerInfo.AddPathTranslateMapping($"{source}/_work/_temp/_github_workflow", "/github/workflow");
foreach (var v in new List<string>() {
$"{source}/_work",
$"{source}/externals",
$"{source}/_work/_temp",
$"{source}/_work/_actions",
$"{source}/_work/_tool",
})
{
containerInfo.MountVolumes.Add(new MountVolume(v, containerInfo.TranslateToContainerPath(v)));
};
var stepHost = new ContainerStepHost();
stepHost.Container = containerInfo;
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
var inputGithubContext = new GitHubContext();
var inputRunnerContext = new RunnerContext();
// string context data
inputGithubContext["action_path"] = new StringContextData("/home/username/Projects/work/runner/_layout/_work/_actions/owner/composite/main");
inputGithubContext["action"] = new StringContextData("__owner_composite");
inputGithubContext["api_url"] = new StringContextData("https://api.github.com/custom/path");
inputGithubContext["env"] = new StringContextData("/home/username/Projects/work/runner/_layout/_work/_temp/_runner_file_commands/set_env_265698aa-7f38-40f5-9316-5c01a3153672");
inputGithubContext["path"] = new StringContextData("/home/username/Projects/work/runner/_layout/_work/_temp/_runner_file_commands/add_path_265698aa-7f38-40f5-9316-5c01a3153672");
inputGithubContext["event_path"] = new StringContextData("/home/username/Projects/work/runner/_layout/_work/_temp/_github_workflow/event.json");
inputGithubContext["repository"] = new StringContextData("owner/repo-name");
inputGithubContext["run_id"] = new StringContextData("2033211332");
inputGithubContext["workflow"] = new StringContextData("Name of Workflow");
inputGithubContext["workspace"] = new StringContextData("/home/username/Projects/work/runner/_layout/_work/step-order/step-order");
inputeRunnerContext["temp"] = new StringContextData("/home/username/Projects/work/runner/_layout/_work/_temp");
inputeRunnerContext["tool_cache"] = new StringContextData("/home/username/Projects/work/runner/_layout/_work/_tool");
// dictionary context data
var githubEvent = new DictionaryContextData();
githubEvent["inputs"] = null;
githubEvent["ref"] = new StringContextData("refs/heads/main");
githubEvent["repository"] = new DictionaryContextData();
githubEvent["sender"] = new DictionaryContextData();
githubEvent["workflow"] = new StringContextData(".github/workflows/composite_step_host_translate.yaml");
inputGithubContext["event"] = githubEvent;
ec.ExpressionValues["github"] = inputGithubContext;
ec.ExpressionValues["runner"] = inputeRunnerContext;
var ecExpect = new Runner.Worker.ExecutionContext();
ecExpect.Initialize(hc);
var expectedGithubEvent = new DictionaryContextData();
expectedGithubEvent["inputs"] = null;
expectedGithubEvent["ref"] = new StringContextData("refs/heads/main");
expectedGithubEvent["repository"] = new DictionaryContextData();
expectedGithubEvent["sender"] = new DictionaryContextData();
expectedGithubEvent["workflow"] = new StringContextData(".github/workflows/composite_step_host_translate.yaml");
var expectedGithubContext = new GitHubContext();
var expectedRunnerContext = new RunnerContext();
expectedGithubContext["action_path"] = new StringContextData("/__w/_actions/owner/composite/main");
expectedGithubContext["action"] = new StringContextData("__owner_composite");
expectedGithubContext["api_url"] = new StringContextData("https://api.github.com/custom/path");
expectedGithubContext["env"] = new StringContextData("/__w/_temp/_runner_file_commands/set_env_265698aa-7f38-40f5-9316-5c01a3153672");
expectedGithubContext["path"] = new StringContextData("/__w/_temp/_runner_file_commands/add_path_265698aa-7f38-40f5-9316-5c01a3153672");
expectedGithubContext["event_path"] = new StringContextData("/github/workflow/event.json");
expectedGithubContext["repository"] = new StringContextData("owner/repo-name");
expectedGithubContext["run_id"] = new StringContextData("2033211332");
expectedGithubContext["workflow"] = new StringContextData("Name of Workflow");
expectedGithubContext["workspace"] = new StringContextData("/__w/step-order/step-order");
expectedGithubContext["event"] = expectedGithubEvent;
expectedRunnerContext["temp"] = new StringContextData("/__w/_temp");
expectedRunnerContext["tool_cache"] = new StringContextData("/__w/_tool");
ecExpect.ExpressionValues["github"] = expectedGithubContext;
ecExpect.ExpressionValues["runner"] = expectedRunnerContext;
var translatedExpressionValues = ec.GetExpressionValues(stepHost);
foreach (var contextName in new string[] { "github", "runner" })
{
var dict = translatedExpressionValues[contextName].AssertDictionary($"expected context {contextName} to be a dictionary");
var expectedExpressionValues = ecExpect.ExpressionValues[contextName].AssertDictionary("expect dict");
foreach (var key in dict.Keys.ToList())
{
if (dict[key] is StringContextData)
{
var actual = dict[key].AssertString("expect string");
var expected = expectedExpressionValues[key].AssertString("expect string");
Assert.Equal(expected.Value, actual.Value);
}
else if (dict[key] is DictionaryContextData || dict[key] is CaseSensitiveDictionaryContextData)
{
var actualDict = dict[key].AssertDictionary("expect dict");
var expectedDict = expectedExpressionValues[key].AssertDictionary("expect dict");
Assert.True(ExpressionValuesAssertEqual(expectedDict, actualDict));
}
}
}
}
}
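// Recursively compares two expression-value dictionaries: string entries must match by value,
// nested dictionaries are compared key-by-key, and keys present only in the second argument
// are ignored because the walk is driven by the first argument's keys.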
private bool ExpressionValuesAssertEqual(DictionaryContextData expect, DictionaryContextData actual)
{
foreach (var key in expect.Keys.ToList())
{
if (expect[key] is StringContextData)
{
var expectValue = expect[key].AssertString("expect string");
var actualValue = actual[key].AssertString("expect string");
if (expectValue.Value != actualValue.Value)
{
return false;
}
}
else if (expect[key] is DictionaryContextData || expect[key] is CaseSensitiveDictionaryContextData)
{
var expectDict = expect[key].AssertDictionary("expect dict");
var actualDict = actual[key].AssertDictionary("expect dict");
if (!ExpressionValuesAssertEqual(expectDict, actualDict))
{
return false;
}
}
}
return true;
}
}
}

View File

@@ -0,0 +1,100 @@
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using Moq;
using Xunit;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Handlers;
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.WebApi;
namespace GitHub.Runner.Common.Tests.Worker
{
public sealed class HandlerFactoryL0
{
private Mock<IExecutionContext> _ec;
private TestHostContext CreateTestContext([CallerMemberName] string testName = "")
{
var hostContext = new TestHostContext(this, testName);
_ec = new Mock<IExecutionContext>();
_ec.SetupAllProperties();
_ec.Object.Initialize(hostContext);
var handler = new Mock<INodeScriptActionHandler>();
handler.SetupAllProperties();
hostContext.EnqueueInstance(handler.Object);
//hostContext.EnqueueInstance(new ActionCommandManager() as IActionCommandManager);
return hostContext;
}
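// Each InlineData row is (inputVersion, serverFeatureFlag, workflowOptOut, machineOptOut, expectedVersion).
// node12 is only upgraded to node16 when the server feature flag is on and neither the workflow
// nor the machine opts out; an explicit workflow value takes precedence over the machine setting.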
[Theory]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
[InlineData("node12", "", "", "", "node12")]
[InlineData("node12", "true", "", "", "node16")]
[InlineData("node12", "true", "", "true", "node12")]
[InlineData("node12", "true", "true", "", "node12")]
[InlineData("node12", "true", "true", "true", "node12")]
[InlineData("node12", "true", "false", "true", "node16")] // workflow overrides env
[InlineData("node16", "", "", "", "node16")]
[InlineData("node16", "true", "", "", "node16")]
[InlineData("node16", "true", "", "true", "node16")]
[InlineData("node16", "true", "true", "", "node16")]
[InlineData("node16", "true", "true", "true", "node16")]
[InlineData("node16", "true", "false", "true", "node16")]
public void IsNodeVersionUpgraded(string inputVersion, string serverFeatureFlag, string workflowOptOut, string machineOptOut, string expectedVersion)
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var hf = new HandlerFactory();
hf.Initialize(hc);
// Server Feature Flag
var variables = new Dictionary<string, VariableValue>();
if (!string.IsNullOrEmpty(serverFeatureFlag))
{
variables["DistributedTask.ForceGithubJavascriptActionsToNode16"] = serverFeatureFlag;
}
Variables serverVariables = new Variables(hc, variables);
// Workflow opt-out
var workflowVariables = new Dictionary<string, string>();
if (!string.IsNullOrEmpty(workflowOptOut))
{
workflowVariables[Constants.Variables.Actions.AllowActionsUseUnsecureNodeVersion] = workflowOptOut;
}
// Machine opt-out
if (!string.IsNullOrEmpty(machineOptOut))
{
Environment.SetEnvironmentVariable(Constants.Variables.Actions.AllowActionsUseUnsecureNodeVersion, machineOptOut);
}
_ec.Setup(x => x.Global).Returns(new GlobalContext()
{
Variables = serverVariables,
EnvironmentVariables = workflowVariables
});
// Act.
var data = new NodeJSActionExecutionData();
data.NodeVersion = inputVersion;
var handler = hf.Create(
_ec.Object,
new ScriptReference(),
new Mock<IStepHost>().Object,
data,
new Dictionary<string, string>(),
new Dictionary<string, string>(),
new Variables(hc, new Dictionary<string, VariableValue>()),
"",
new List<JobExtensionRunner>()
) as INodeScriptActionHandler;
// Assert.
Assert.Equal(expectedVersion, handler.Data.NodeVersion);
Environment.SetEnvironmentVariable(Constants.Variables.Actions.AllowActionsUseUnsecureNodeVersion, null);
}
}
}
}

View File

@@ -0,0 +1,88 @@
using System;
using System.Runtime.CompilerServices;
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Handlers;
using Moq;
using Xunit;
namespace GitHub.Runner.Common.Tests.Worker
{
public sealed class HandlerL0
{
private Mock<IExecutionContext> _ec;
private ActionsStepTelemetry _stepTelemetry;
private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
{
var hc = new TestHostContext(this, testName);
_stepTelemetry = new ActionsStepTelemetry();
_ec = new Mock<IExecutionContext>();
_ec.SetupAllProperties();
_ec.Setup(x => x.StepTelemetry).Returns(_stepTelemetry);
var trace = hc.GetTrace();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { trace.Info($"[{tag}]{message}"); });
hc.EnqueueInstance<IActionCommandManager>(new Mock<IActionCommandManager>().Object);
return hc;
}
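// PrepareExecution should record the kind of action being run in the step telemetry:
// repository actions report type "repository" plus the action name and ref, while container
// actions report type "docker" plus the image reference.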
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void PrepareExecution_PopulateTelemetry_RepoActions()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var nodeHandler = new NodeScriptActionHandler();
nodeHandler.Initialize(hc);
nodeHandler.ExecutionContext = _ec.Object;
nodeHandler.Action = new RepositoryPathReference()
{
Name = "actions/checkout",
Ref = "v2"
};
// Act.
nodeHandler.PrepareExecution(ActionRunStage.Main);
hc.GetTrace().Info($"Telemetry: {StringUtil.ConvertToJson(_stepTelemetry)}");
// Assert.
Assert.Equal("repository", _stepTelemetry.Type);
Assert.Equal("actions/checkout", _stepTelemetry.Action);
Assert.Equal("v2", _stepTelemetry.Ref);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void PrepareExecution_PopulateTelemetry_DockerActions()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var nodeHandler = new NodeScriptActionHandler();
nodeHandler.Initialize(hc);
nodeHandler.ExecutionContext = _ec.Object;
nodeHandler.Action = new ContainerRegistryReference()
{
Image = "ubuntu:20.04"
};
// Act.
nodeHandler.PrepareExecution(ActionRunStage.Main);
hc.GetTrace().Info($"Telemetry: {StringUtil.ConvertToJson(_stepTelemetry)}");
// Assert.
Assert.Equal("docker", _stepTelemetry.Type);
Assert.Equal("ubuntu:20.04", _stepTelemetry.Action);
}
}
}
}

View File

@@ -25,6 +25,7 @@ namespace GitHub.Runner.Common.Tests.Worker
private Mock<IPagingLogger> _logger;
private Mock<IContainerOperationProvider> _containerProvider;
private Mock<IDiagnosticLogManager> _diagnosticLogManager;
private Mock<IJobHookProvider> _jobHookProvider;
private CancellationTokenSource _tokenSource;
private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
@@ -40,6 +41,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_directoryManager = new Mock<IPipelineDirectoryManager>();
_directoryManager.Setup(x => x.PrepareDirectory(It.IsAny<IExecutionContext>(), It.IsAny<Pipelines.WorkspaceOptions>()))
.Returns(new TrackingConfig() { PipelineDirectory = "runner", WorkspaceDirectory = "runner/runner" });
_jobHookProvider = new Mock<IJobHookProvider>();
IActionRunner step1 = new ActionRunner();
IActionRunner step2 = new ActionRunner();
@@ -111,7 +113,9 @@ namespace GitHub.Runner.Common.Tests.Worker
hc.SetSingleton(_containerProvider.Object);
hc.SetSingleton(_directoryManager.Object);
hc.SetSingleton(_diagnosticLogManager.Object);
hc.SetSingleton(_jobHookProvider.Object);
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // JobExecutionContext
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // job start hook
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // Initial Job
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // step1
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // step2
@@ -120,6 +124,7 @@ namespace GitHub.Runner.Common.Tests.Worker
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // step5
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // prepare1
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // prepare2
hc.EnqueueInstance<IPagingLogger>(_logger.Object); // job complete hook
hc.EnqueueInstance<IActionRunner>(step1);
hc.EnqueueInstance<IActionRunner>(step2);
@@ -348,5 +353,62 @@ namespace GitHub.Runner.Common.Tests.Worker
Assert.Equal(TaskResult.Succeeded, _jobEc.Result);
}
}
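// When ACTIONS_RUNNER_HOOK_JOB_STARTED / ACTIONS_RUNNER_HOOK_JOB_COMPLETED point at scripts,
// the job gains two extra steps: the started hook runs first in the job step list and the
// completed hook is appended as the last post-job step. Without those variables no hook
// steps are created, which is what the second test below asserts.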
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task EnsurePreAndPostHookStepsIfEnvExists()
{
Environment.SetEnvironmentVariable("ACTIONS_RUNNER_HOOK_JOB_STARTED", "/foo/bar");
Environment.SetEnvironmentVariable("ACTIONS_RUNNER_HOOK_JOB_COMPLETED", "/bar/foo");
using (TestHostContext hc = CreateTestContext())
{
var jobExtension = new JobExtension();
jobExtension.Initialize(hc);
_actionManager.Setup(x => x.PrepareActionsAsync(It.IsAny<IExecutionContext>(), It.IsAny<IEnumerable<Pipelines.JobStep>>(), It.IsAny<Guid>()))
.Returns(Task.FromResult(new PrepareResult(new List<JobExtensionRunner>(), new Dictionary<Guid, IActionRunner>())));
List<IStep> result = await jobExtension.InitializeJob(_jobEc, _message);
var trace = hc.GetTrace();
var hookStart = result.First() as JobExtensionRunner;
jobExtension.FinalizeJob(_jobEc, _message, DateTime.UtcNow);
Assert.Equal(Constants.Hooks.JobStartedStepName, hookStart.DisplayName);
Assert.Equal(Constants.Hooks.JobCompletedStepName, (_jobEc.PostJobSteps.Last() as JobExtensionRunner).DisplayName);
}
Environment.SetEnvironmentVariable("ACTIONS_RUNNER_HOOK_JOB_STARTED", null);
Environment.SetEnvironmentVariable("ACTIONS_RUNNER_HOOK_JOB_COMPLETED", null);
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void EnsureNoPreAndPostHookSteps()
{
using (TestHostContext hc = CreateTestContext())
{
var jobExtension = new JobExtension();
jobExtension.Initialize(hc);
_message.ActionsEnvironment = null;
_jobEc = new Runner.Worker.ExecutionContext {Result = TaskResult.Succeeded};
_jobEc.Initialize(hc);
_jobEc.InitializeJob(_message, _tokenSource.Token);
var x = _jobEc.JobSteps;
jobExtension.FinalizeJob(_jobEc, _message, DateTime.UtcNow);
Assert.Equal(TaskResult.Succeeded, _jobEc.Result);
Assert.Equal(0, _jobEc.PostJobSteps.Count);
}
}
}
}

View File

@@ -3,9 +3,9 @@ using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Container;
@@ -937,6 +937,19 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
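// git prints "fatal: unsafe repository (... is owned by someone else)" when the workspace
// directory ownership does not match the current user; the output manager should copy that
// line into the step telemetry error messages.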
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void CaptureTelemetryForGitUnsafeRepository()
{
using (Setup())
using (_outputManager)
{
Process("fatal: unsafe repository ('/github/workspace' is owned by someone else)");
Assert.Contains("fatal: unsafe repository ('/github/workspace' is owned by someone else)", _executionContext.Object.StepTelemetry.ErrorMessages);
}
}
private TestHostContext Setup(
[CallerMemberName] string name = "",
IssueMatchersConfig matchers = null,
@@ -962,6 +975,8 @@ namespace GitHub.Runner.Common.Tests.Worker
Variables = _variables,
WriteDebug = true,
});
_executionContext.Setup(x => x.StepTelemetry)
.Returns(new DTWebApi.ActionsStepTelemetry());
_executionContext.Setup(x => x.GetMatchers())
.Returns(matchers?.Matchers ?? new List<IssueMatcherConfig>());
_executionContext.Setup(x => x.Add(It.IsAny<OnMatcherChanged>()))

View File

@@ -302,6 +302,50 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
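// Env file heredoc syntax is NAME<<DELIMITER, followed by the value lines, followed by the
// delimiter alone on its own line, e.g.:
//   MY_ENV<<EOF
//   line one
//   EOF
// The two tests below feed malformed files (WriteContent joins the lines with a space here,
// presumably to simulate missing newlines) and expect ProcessCommand to throw.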
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_MissingNewLine()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<EOF",
"line one",
"line two",
"line three",
"EOF",
};
WriteContent(envFile, content, " ");
var ex = Assert.Throws<Exception>(() => _setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null));
Assert.Contains("Matching delimiter not found", ex.Message);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_MissingNewLineMultipleLinesEnv()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<EOF",
@"line one
line two
line three",
"EOF",
};
WriteContent(envFile, content, " ");
var ex = Assert.Throws<Exception>(() => _setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null));
Assert.Contains("EOF marker missing new line", ex.Message);
}
}
#if OS_WINDOWS
[Fact]
[Trait("Level", "L0")]

View File

@@ -7,6 +7,9 @@ using Xunit;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Handlers;
using GitHub.Runner.Worker.Container;
using GitHub.DistributedTask.Pipelines.ContextData;
using System.Linq;
using GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Common.Tests.Worker
{

View File

@@ -622,6 +622,40 @@ namespace GitHub.Runner.Common.Tests.Worker
_stepContext.SetOutcome("", stepContext.Object.ContextName, (stepContext.Object.Outcome ?? stepContext.Object.Result ?? TaskResult.Succeeded).ToActionResult());
_stepContext.SetConclusion("", stepContext.Object.ContextName, (stepContext.Object.Result ?? TaskResult.Succeeded).ToActionResult());
});
stepContext.Setup(x => x.UpdateGlobalStepsContext()).Callback(() =>
{
if (!string.IsNullOrEmpty(stepContext.Object.ContextName) && !stepContext.Object.ContextName.StartsWith("__", StringComparison.Ordinal))
{
stepContext.Object.Global.StepsContext.SetOutcome(stepContext.Object.ScopeName, stepContext.Object.ContextName, (stepContext.Object.Outcome ?? stepContext.Object.Result ?? TaskResult.Succeeded).ToActionResult());
stepContext.Object.Global.StepsContext.SetConclusion(stepContext.Object.ScopeName, stepContext.Object.ContextName, (stepContext.Object.Result ?? TaskResult.Succeeded).ToActionResult());
}
});
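// Mirrors the real ApplyContinueOnError behavior: when a failed step evaluates
// continue-on-error to true, the failure is preserved as the step's Outcome and the
// Result is flipped to Succeeded before the steps context is updated.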
stepContext.Setup(x => x.ApplyContinueOnError(It.IsAny<TemplateToken>())).Callback((TemplateToken token) =>
{
if (stepContext.Object.Result != TaskResult.Failed)
{
return;
}
var continueOnError = false;
try
{
var templateEvaluator = stepContext.Object.ToPipelineTemplateEvaluator();
continueOnError = templateEvaluator.EvaluateStepContinueOnError(token, stepContext.Object.ExpressionValues, stepContext.Object.ExpressionFunctions);
}
catch (Exception ex)
{
stepContext.Object.Error("The step failed and an error occurred when attempting to determine whether to continue on error.");
stepContext.Object.Error(ex);
}
if (continueOnError)
{
stepContext.Object.Outcome = stepContext.Object.Result;
stepContext.Object.Result = TaskResult.Succeeded;
}
stepContext.Object.UpdateGlobalStepsContext();
});
var trace = hc.GetTrace();
stepContext.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { trace.Info($"[{tag}]{message}"); });
stepContext.Object.Result = result;

View File

@@ -0,0 +1 @@
{ "2.283.2": {"targetVersion":"2.284.1"}, "2.284.1": {"targetVersion":"2.285.0"}}

View File

@@ -128,6 +128,7 @@ function layout ()
chmod +x "${LAYOUT_DIR}/bin/Runner.Worker"
chmod +x "${LAYOUT_DIR}/bin/Runner.PluginHost"
chmod +x "${LAYOUT_DIR}/bin/installdependencies.sh"
chmod +x "${LAYOUT_DIR}/safe_sleep.sh"
fi
heading "Setup externals folder for $RUNTIME_ID runner's layout"

View File

@@ -1 +1 @@
2.286.0
2.291.1