Compare commits

...

80 Commits

Author SHA1 Message Date
David Kale
de955418e4 Merge branch 'main' into releases/m275
Update release version to 2.275.1
2020-12-14 16:37:14 -05:00
David Kale
7ff6ff6afa Prepare 2.275.1 2020-12-14 16:36:31 -05:00
Tingluo Huang
56529a1c2f fix compat issue in timeline record state. (#861) 2020-12-14 15:43:00 -05:00
David Kale
0fe3c90573 Release 2.275.0 2020-12-14 11:14:30 -05:00
David Kale
510fadf71a Prepare m275 (#860) 2020-12-14 11:02:44 -05:00
klassiker
007ac8138b Add proxy support for container actions (#840)
* Add proxy support for container actions in Runner.Worker/StepsRunner

* Move proxy modifications to ContainerActionHandler
2020-12-11 13:08:45 -05:00
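A rough sketch of the behavior described in #840 (illustrative only; the runner's actual change lives in Runner.Worker/StepsRunner and ContainerActionHandler): the runner's proxy settings get forwarded into the container environment, conceptually similar to:

    # hypothetical docker invocation; the image name is a placeholder
    docker run --rm \
      -e http_proxy -e https_proxy -e no_proxy \
      ghcr.io/example/my-container-action:latest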
Yang Cao
1e12b8909a Count actions resolve failures as infra failures (#851)
During job run we may fail to resolve actions download info, and this
stack is fully controlled by GitHub actions so it should be counted as
infrastructure failure instead of user failure.
2020-12-11 11:07:43 -05:00
Tingluo Huang
9ceb3d481a unset GTIHUB_ACTION_REPOSITORY and GITHUB_ACTION_REF for non-repo based actions. (#804) 2020-12-11 11:04:07 -05:00
Bruno FERNANDO
3bce2eb09c feat(scripts): add labels in the script that register runner (#844) 2020-12-11 11:03:04 -05:00
David Kale
80bf68db81 Crypto cleanup and enable usage of FIPS compliant crypto when required (#806)
* Use FIPS compliant crypto when required

* Comment cleanup

* Store OAuth signing scheme in credentialData instead of runner setting

Add encryption scheme for job message encyption key to session

Further cleanup of unused crypto code

* Update windows rsa key manager to use crossplat dotnet RSA api

* Undo unneeded ConfigurationManager change
2020-12-04 11:35:16 -05:00
Thomas Boop
a2e32170fd Disable set-env and add-path commands (#779)
* Disable Old Runner Commands set-env and add-path

* update dotnet install scripts

* update runner version and release notes
2020-11-16 08:20:43 -05:00
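For context (not part of the commit text): the disabled workflow commands have documented file-based replacements that the runner reads after each step, e.g. in a bash run step:

    # previously: echo "::set-env name=FOO::bar"
    echo "FOO=bar" >> "$GITHUB_ENV"

    # previously: echo "::add-path::$HOME/.local/bin"
    echo "$HOME/.local/bin" >> "$GITHUB_PATH"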
Thomas Boop
35dda19491 Add deprecation date and release 2.274.1 version (#796) 2020-11-09 09:01:47 -05:00
Julio Barba
36bdf50bc6 Prepare the release of 2.274.0 runner 2020-11-05 10:25:24 -05:00
Chris Gavin
95e2158dc6 Add an environment variable to indicate which repository the currently running Action came from. (#585)
* add `workflow_dispatch`

* Add an environment variable to indicate which repository the currently running Action came from.

* Expose the Action ref as well.

* Move setting `github.action_repository` and `github.action_ref` to `ActionRunner.cs`.

* Don't set `action_repository` and `action_ref` for local Actions.

Co-authored-by: Tingluo Huang <tingluohuang@github.com>
2020-11-03 14:39:17 -05:00
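As a small illustration (an assumption, not text from the PR), a script step inside an action can inspect the new variables like this:

    # both variables are unset for local, non-repository actions (cf. #804 earlier in this list)
    echo "action repository: ${GITHUB_ACTION_REPOSITORY:-<unset>}"
    echo "action ref:        ${GITHUB_ACTION_REF:-<unset>}"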
Jason Laqua
3ebaeb9f19 Fixes #759 doesn't change proxy environment variables (#760)
* Fixes #759 doesn't change proxy environment variables

* Update RunnerWebProxy.cs

* Update RunnerWebProxyL0.cs

Co-authored-by: Tingluo Huang <tingluohuang@github.com>
2020-11-03 10:47:30 -05:00
shinriyo
9d678cb270 DRY and add sudo (#687)
remove 3 "redundant" text and put one text for DRY.
and developers always forget `sudo` and annoying `Need to run with sudo privilege` message.
so, add first.
2020-11-02 21:38:35 -05:00
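In practice (a sketch, not from the commit itself), the dependency script on a configured Linux runner is invoked with elevated privileges:

    sudo ./bin/installdependencies.sh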
Tingluo Huang
27788491ea raise error for set-env, block set node_options. (#784)
* raise error for set-env, block set node_options.

* feedback.
2020-11-02 14:09:29 -05:00
Yashwanth Anantharaju
5ba7affea4 fix in correct check (#778) 2020-10-30 14:34:00 -04:00
Robin Neatherway
ce92d7a6b5 Change ping .. > nul to sleep (#647)
* Change `ping .. > nul` to `sleep`

The filename `nul` is a Windows-ism that causes the update script to
create such a file in the current working directory. The `ping`
utility is also an dependency not installed by
`installdependencies.sh`, so it seemed easier to change it to the
standard `sleep` command.

* Update dotnet-install script as requested by test

* Update dotnet-install.ps1

Co-authored-by: Tingluo Huang <tingluohuang@github.com>
2020-10-29 10:15:30 -04:00
Temtaime
d23ca0ba7a Add .editorconfig (#768)
* Add .editorconfig

* Create .editorconfig
2020-10-27 10:50:50 -04:00
dependabot[bot]
9d1c81f018 Bump @actions/core in /src/Misc/expressionFunc/hashFiles (#729)
Bumps [@actions/core](https://github.com/actions/toolkit/tree/HEAD/packages/core) from 1.2.0 to 1.2.6.
- [Release notes](https://github.com/actions/toolkit/releases)
- [Changelog](https://github.com/actions/toolkit/blob/main/packages/core/RELEASES.md)
- [Commits](https://github.com/actions/toolkit/commits/HEAD/packages/core)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2020-10-26 23:29:08 -04:00
Josh Soref
7a8abe726a Improve apt handling (#708)
* Unify apt/apt-get logic

The previous logic was buggy in that it tried to use `apt` in the `apt-get` branch after deciding that `apt` was unavailable...

* Prefer apt-get over apt

apt does not have a stable cli and using it from scripts yields annoying messages

* Improve English for missing apt-get & apt case

* Fix apt-get/apt fallback behavior for $ patterns

If there's a `$` in the apt install pattern, it will not fail if it selects a thing and decides it isn't interested in installing it.

* Fix spelling of libssl
2020-10-26 23:27:09 -04:00
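A minimal sketch of the preference order this commit describes (illustrative shell only, not the script's exact code):

    # prefer apt-get; apt has no stable CLI and warns when used from scripts
    if command -v apt-get >/dev/null 2>&1; then
        pkg_tool=apt-get
    elif command -v apt >/dev/null 2>&1; then
        pkg_tool=apt
    else
        echo "Neither apt-get nor apt found; install the runner dependencies manually." >&2
        exit 1
    fi
    # package names below are placeholders; the real list varies by distro release
    sudo "$pkg_tool" install -y libssl-dev libicu-dev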
Łukasz Łaniewski-Wołłk
a9135e61a0 Correcting bug in check of libicu presence (#695) 2020-10-26 23:14:17 -04:00
Justin Weissig
feafd3e1d7 fixed grammar issues (#672)
Nothing major here just minor wording.
2020-10-26 23:11:30 -04:00
Justin Weissig
dc3b2d3a36 fixed wording (#671)
Fixed a few minor grammar issues
2020-10-26 23:10:59 -04:00
Justin Weissig
a371309079 minor spelling & grammar tweaks (#670)
Fixed a few minor spelling & grammar issues.
2020-10-26 23:10:26 -04:00
Justin Weissig
5dd6bde4ca fixed minor spelling mistake (#669)
Changed enhancment to enhancement.
2020-10-26 23:09:38 -04:00
Tingluo Huang
c196103e58 update dotnet install script. 2020-10-26 23:07:57 -04:00
Fabian Mastenbroek
d55070da3e Update to .NET Core SDK 3.1.302 (#681)
This change updates the .NET Core SDK used by the Actions Runner to
version 3.1.302 to address the issues that are caused by the following issue:
    https://github.com/dotnet/runtime/issues/13475
See #574 for more information.

Fixes #574
2020-10-26 22:51:29 -04:00
Yashwanth Anantharaju
8279ae9a70 Support environment URL parsing (#762)
* environment URL parsing
2020-10-21 12:14:21 -04:00
Hayden Faulds
2e3b03623f log runner group name (#696)
* log runner group name

* linting
2020-10-16 14:56:06 +01:00
Thomas Boop
c18c8746db Release notes for 2.273.5 (#734) 2020-10-02 11:49:49 -04:00
Thomas Boop
6332a52d76 Notify on unsecure commands (#731)
* notify on unsecure commands
2020-10-02 11:34:37 -04:00
Yang Cao
8bb588bb69 Expose retention days in env for toolkit/artifacts package (#714) 2020-09-17 15:11:12 -04:00
David Kale
4510f69c73 Prepare 273.4 release 2020-09-17 18:19:42 +00:00
David Kale
c7b8552edf Prepare 2.273.3 release 2020-09-16 15:06:07 +00:00
Julio Barba
0face6e3af Preparing the release of 2.273.2 runner 2020-09-14 13:06:41 -04:00
eric sciple
306be41266 fix bug w checkout v1 updating GITHUB_WORKSPACE (#704) 2020-09-14 12:00:00 -04:00
David Kale
4e85b8f3b7 Allow registry credentials for job/service containers (#694)
* Log in with container credentials if given

* Stub in registry aware auth for later

* Fix hang if password is empty

* Remove default param to fix build

* PR Feedback. Add some tests and fix parse
2020-09-11 12:28:58 -04:00
Julio Barba
444332ca88 Prepare the release of 2.273.1 runner 2020-09-08 13:01:36 -04:00
Thomas Boop
e6eb9e381d Cleanup FileCommands (#693) 2020-09-04 15:35:36 -04:00
eric sciple
3a76a2e291 read env file (#683) 2020-08-29 23:18:35 -04:00
Thomas Boop
9976cb92a0 Add Runner File Commands (#684)
* Add File Runner Commands
2020-08-28 15:32:25 -04:00
Thomas Brumley
d900654c42 Add in Log line numbers for streaming logs (#663)
* Add in Log line

Co-authored-by: yaananth (Yash) <yaananth@github.com>
2020-08-25 12:02:29 -04:00
Julio Barba
65e3ec86b4 Set executable bit 2020-08-18 16:09:04 -04:00
Julio Barba
a7f205593a Update dotnet scripts 2020-08-18 16:03:06 -04:00
Julio Barba
55f60a4ffc Prepare the release of 2.273.0 runner 2020-08-17 15:41:15 -04:00
Ethan Chiu
ca13b25240 Fix Outputs Example (#658) 2020-08-14 11:55:27 -04:00
Timo Schilling
b0c2734380 fix endgroup maker (#640) 2020-08-14 11:53:30 -04:00
Ethan Chiu
9e7b56f698 Fix Null Ref Issues Composite Actions (#657) 2020-08-12 17:12:54 -04:00
Ethan Chiu
8c29e33e88 Fix DisplayName Changing in middle of composite action run (#645) 2020-08-10 16:26:23 -04:00
Ethan Chiu
976217d6ec Free up memory from step level outputs in composite action (#641) 2020-08-10 14:31:30 -04:00
Ethan Chiu
562eafab3a Adding Documentation to ADR for Support for Script Execution + Explicit Definition (#616) 2020-08-10 11:30:34 -04:00
Joe Bourne
9015b95a72 Updating virtual environment terminology (#651)
* Dropping pool terminology

* Update README.md
2020-08-07 15:52:56 -04:00
eric sciple
7d4bbf46de fix feature flag check; omit context for generated context names (#638) 2020-08-04 11:12:40 -04:00
Christopher Johnson
7b608e3e92 Adding help text for the new runnergroup feature (#626)
Co-authored-by: Christopher Johnson <thchrisjohnson@github.com>
2020-07-30 12:03:40 -04:00
Ethan Chiu
f028b4e2b0 Revert JobSteps to Queue Data Structure (#625)
* Revert JobSteps to Queue data structure

* Revert tests
2020-07-29 16:19:04 -04:00
TingluoHuang
38f816c2ae prepare release 2.272.0 runner. 2020-07-29 15:31:45 -04:00
efyx
bc1fe2cfe0 Fix poor performance of process spawned from svc daemon (#614) 2020-07-29 15:20:28 -04:00
Ethan Chiu
89a13db2c3 Remove TESTING_COMPOSITE_ACTIONS_ALPHA Env Variable (#624) 2020-07-29 15:12:15 -04:00
Ethan Chiu
d59092d973 GITHUB_ACTION_PATH + GITHUB_ACTION so that we can run scripts for Composite Run Steps (#615)
* Add environment variable for GITHUB_ACTION_PATH

* ah

* Remove debugging messages

* Set github action path at step level instead of global scope to avoid necessary removal

* Remove set context for github action

* Set github action path before and after composite action

* Copy GitHub Context, use this copied context for each composit step, and then set the action_path for each one (to avoid stamping over parent pointer GitHubContext
2020-07-29 14:28:14 -04:00
Ethan Chiu
855b90c3d4 Explicitly define what is allowed for a composite action (#605)
* Explicitly define what is allowed for an action

* Add step-env

* Remove secrets + defaults

* new line

* Add safety check to prevent from checking defaults in ScriptHandler for composite action

* Revert "Add safety check to prevent from checking defaults in ScriptHandler for composite action"

This reverts commit aeae15de7b.

* Need to explictly use ActionStep type since we need the .Inputs attribute which is only found in the ActionStep not IStep

* Fix ActionManifestManager

* Remove todos

* Revert "Revert "Add safety check to prevent from checking defaults in ScriptHandler for composite action""

This reverts commit a22fcbc036.

* revert

* Remove needs in env

* Make shell required + add inputs

* Remove passing context to all composite steps attribuyte
2020-07-28 10:15:46 -04:00
Christopher Johnson
48ac96307c Add ability to register a runner to the non-default self-hosted runner group (#613)
Co-authored-by: Christopher Johnson <thchrisjohnson@github.com>
2020-07-23 17:46:48 -04:00
Ethan Chiu
2e50dffb37 Clean Up Composite UI (#610)
* Remove redundant code (display name is already evaluated in ActionRunner beforehand for each step)

* remove

* Remove nesting information for composite steps.

* put messages in debug logs if composite. if not, put these messages as outputs

* Fix group issue

* Fix end group issue
2020-07-23 09:45:00 -04:00
Tingluo Huang
e7b0844772 Add manual trigger 2020-07-23 00:34:36 -04:00
Ethan Chiu
d5a5550649 Fix Timeout-minutes for Whole Composite Action Step (#599)
* Exploring child Linked Cancellation Tokens

* Preliminary Timeout-minutes fix

* Final Solution for resolving cancellation token's timeout vs. cancellation

* Clean up + Fix error handling

* Use linked tokens instead

* Clean up

* one liner

* Remove JobExecutionContext => Replace with public Root accessor

* Move CreateLinkedTokenSource in the CreateCompositeStep Function
2020-07-22 18:01:50 -04:00
Ethan Chiu
3d0147d322 Improve Debugging Messages for Empty Tokens (#609)
* Improve Debugging Messages for Empty Tokens

* fix tests
2020-07-22 17:34:40 -04:00
Ethan Chiu
bd1f245aac Clarify details for defaults, shell, and working-dir (#607) 2020-07-22 16:11:55 -04:00
Steven Maude
005f1c15b1 Fix "propogate" typo in ADR 0549 (#600) 2020-07-22 16:10:57 -04:00
David Kale
da3cb5506f Fold logs for intermediate docker commands (#608) 2020-07-22 14:55:49 -04:00
dependabot[bot]
32d439070b Bump lodash in /src/Misc/expressionFunc/hashFiles (#603)
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.15 to 4.17.19.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.15...4.17.19)

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2020-07-20 10:21:01 -04:00
jeffrey
ec9f8f1682 dbl quotes around variable so CD works if path contains spaces (#602) 2020-07-20 10:19:37 -04:00
eric sciple
0921af735a move shared ExecutionContext properties under .Global (#594) 2020-07-19 19:05:47 -04:00
eric sciple
1cc3c08cf2 Prepare to switch GITHUB_ACTION to use ContextName instead of refname (#593)
This PR changes GITHUB_ACTION to use the step ContextName, instead of refname. The behavior is behind a feature flag. Refname is an otherwise deprecated property.

Primary motivation: For composite actions, we need a distinct GITHUB_ACTION for each nested step. This PR adds code to generate a default context name for nested steps.

For nested steps, GITHUB_ACTION will be set to "{ScopeName}.{ContextName}" to ensure no collisions.

A corresponding change will be made on the server so context name is never empty. Generated context names will start with "__".

A follow-up PR is required to avoid tracking "step" context values (outputs/conclusion/result) for generated context names. Waiting on telemetry from the server to confirm it's safe to assume leading "__" is a generate context name.
2020-07-19 17:19:13 -04:00
Ethan Chiu
f9dca15c63 Composite Run Steps Refactoring (#591)
* Add basic framework for baby steps runner

* Basic logic for adding steps / invoking composite action steps

* Composite Steps Runner MVP

* Fix null object reference error

* intialize composiute

* Comment out code that is handled by stepsrunner

* Add composite clean up step

* Remove previous 'workarounds' from StepsRunner. Clean Up PR

* Remove todo

* Remove todo

* Fix using unitialized object yikes

* Remove time delay

* Format handler

* Move output handler into action handler

* Add try to evaluate display name

* Remove while loop yikes

* Abstract away the windows encoding check during step running

* Github context set to {ScopeName}.{ContextName} or {ContextName} if ScopeName is null

* Remove setting result to sucess since result defaults to sucess

* Fix windows error

* Fix windows

* revert:

* Windows fix

* Fix Windows Error in Abstraction

* Remove Composite Steps Runner => consolidate into Composite Steps Runner

* Remove unn. attribute in ExecutionContext

* Change protection levels, plus change function name to more clear meaning

* Remove location param

* location pt.2 fix

* Remove outputs step

* Remove temp directory

* new line

* Add arguitl not null

* better comment

* Change encoding name

* Check count > 0 for composite steps, import System.Threading

* Change function header encodingutil

* Add TODO

* Add await

* Handle Failed Step

* Move over SetAllCompositeOutputs to the handler

* Remove timeout-minutes setting in steps-level

* Use only ExecutionContext

* Move using to the top

* Remove redundant check

* Change function name

* Remove testing code

* Consolidate error code

* Consolidate code

* Change HandleOutput => ProcessCompositeActionOutputs

* Remove set the timeout comment

* Add Cancelling functionality + Remove unn. parameter
2020-07-17 16:31:48 -04:00
eric sciple
0877d9a533 Update StringUtil.cs 2020-07-16 10:30:42 -04:00
eric sciple
d5e40c6a60 Update 0549-composite-run-steps.md 2020-07-15 20:00:45 -04:00
eric sciple
391bc35bb9 Update 0549-composite-run-steps.md 2020-07-15 19:59:13 -04:00
eric sciple
e4267b8434 Update 0549-composite-run-steps.md 2020-07-15 19:57:22 -04:00
TingluoHuang
2709cbc0ea rename master to main. 2020-07-14 13:37:20 -04:00
101 changed files with 2978 additions and 1820 deletions

View File

@@ -1,9 +1,10 @@
 name: Runner CI
 on:
+  workflow_dispatch:
   push:
     branches:
-    - master
+    - main
     - releases/*
     paths-ignore:
     - '**.md'

View File

@@ -1,13 +1,14 @@
 name: Runner CD
 on:
+  workflow_dispatch:
   push:
     paths:
     - releaseVersion
 jobs:
   check:
-    if: startsWith(github.ref, 'refs/heads/releases/') || github.ref == 'refs/heads/master'
+    if: startsWith(github.ref, 'refs/heads/releases/') || github.ref == 'refs/heads/main'
     runs-on: ubuntu-latest
     steps:
     - uses: actions/checkout@v2

View File

@@ -6,7 +6,7 @@
 [![Actions Status](https://github.com/actions/runner/workflows/Runner%20CI/badge.svg)](https://github.com/actions/runner/actions)
-The runner is the application that runs a job from a GitHub Actions workflow. The runner can run on the [hosted machine pools](https://github.com/actions/virtual-environments) or run on [self-hosted environments](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-self-hosted-runners).
+The runner is the application that runs a job from a GitHub Actions workflow. It is used by GitHub Actions in the [hosted virtual environments](https://github.com/actions/virtual-environments), or you can [self-host the runner](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-self-hosted-runners) in your own environment.
 ## Get Started

View File

@@ -22,7 +22,7 @@ These are described in detail below:
   - http://proxy.com
   - http://127.0.0.1:8080
   - http://user:password@proxy.com
-- `no_proxy` a comma seperated list of hosts that should not use the proxy. An optional port may be specified
+- `no_proxy` a comma separated list of hosts that should not use the proxy. An optional port may be specified
   - `google.com`
   - `yahoo.com:443`
   - `google.com,bing.com`
@@ -31,9 +31,9 @@ We won't use `http_proxy` for https traffic when `https_proxy` is not set, this
 Otherwise action authors and workflow users need to adjust to differences between the runner proxy convention, and tools used by their actions and scripts.
 Example:
-Customer set `http_proxy=http://127.0.0.1:8888` and configure the runner against `https://github.com/owner/repo`, with the `https_proxy` -> `http_proxy` fallback, the runner will connect to server without any problem. However, if user runs `git push` to `https://github.com/owner/repo`, `git` won't use the proxy since it require `https_proxy` to be set for any https traffic.
+Customer set `http_proxy=http://127.0.0.1:8888` and configure the runner against `https://github.com/owner/repo`, with the `https_proxy` -> `http_proxy` fallback, the runner will connect to the server without any problem. However, if a user runs `git push` to `https://github.com/owner/repo`, `git` won't use the proxy since it requires `https_proxy` to be set for any https traffic.
-> `golang`, `node.js` and other dev tools from the linux community use `http_proxy` for both http and https traffic base on my research.
+> `golang`, `node.js` and other dev tools from the linux community use `http_proxy` for both http and https traffic based on my research.
 A majority of our users are using Linux where these variables are commonly required to be set by various programs. By reading these values, we simplify the process for self hosted runners to set up proxy, and expose it in a way users are already familiar with.
@@ -43,7 +43,7 @@ We will support the lowercase and uppercase variants, with lowercase taking prio
 ### No Proxy Format
-While exact implementations are different per application on handle `no_proxy` env, most applications accept a comma separated list of hosts. Some accept wildcard characters (*). We are going to do exact case-insentive matches, and not support wildcards at this time.
+While exact implementations are different per application on handle `no_proxy` env, most applications accept a comma separated list of hosts. Some accept wildcard characters (*). We are going to do exact case-insensitive matches, and not support wildcards at this time.
 For example:
 - example.com will match example.com, foo.example.com, foo.bar.example.com
 - foo.example.com will match bar.foo.example.com and foo.example.com
@@ -57,5 +57,5 @@ We will not support IP addresses for `no_proxy`, only hostnames.
 3. The runner will read from the environmental variables during config and runtime and use the provided proxy if it exists
 4. Users may need to pass these environmental variables into other applications if they do not natively take these variables
 5. Action authors may need to update their workflows to react to the these environment variables
-6. We will document the way of setting environmental variables for runners using the environmental variables and how the runner uses them
+6. We will document the way of setting environmental variables for runners using the environment variables and how the runner uses them
 7. Like all other secrets, users will be able to relatively easily figure out proxy password if they can modify a workflow file running on a self hosted machine
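A short sketch of the flow the ADR describes for a self-hosted runner (host names, port, and credentials are placeholders):

    export https_proxy=http://user:password@proxy.example.com:8080
    export no_proxy=example.com,mycompany.local
    ./config.sh --url https://github.com/owner/repo --token <registration-token>
    ./run.sh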

View File

@@ -34,7 +34,7 @@ A way out for rare cases where scoping is a problem.
 `##[remove-matcher]owner`
-For the this to be usable, the `owner` needs to be discoverable. Therefore, debug print the owner on registration.
+For this to be usable, the `owner` needs to be discoverable. Therefore, debug print the owner on registration.
 ### Single line matcher
@@ -184,7 +184,7 @@ Solving this problem means:
 - Use the `github.workspace` (where the repo is cloned on disk)
 - Match against a repository to determine the relative path within the repo
-This is a place where we diverge from VSCode. VSCode task configuration are specific to the local workspace (workspace root is known or can be specified). We're solving a more generic problem, so we need more information - specifically the `fromPath` property - in order to accurately root the path.
+This is a place where we diverge from VSCode. VSCode task configurations are specific to the local workspace (workspace root is known or can be specified). We're solving a more generic problem, so we need more information - specifically the `fromPath` property - in order to accurately root the path.
 In order to avoid creating inaccurate hyperlinks on the error issues, the agent will verify the file exists and is in the main repository. Otherwise omit the file property from the error issue and debug trace what happened.
@@ -203,7 +203,7 @@ Problem matchers are unable to interpret severity strings other than `warning` a
 However some tools indicate error/warning in different ways. For example `flake8` uses codes like `E100`, `W200`, and `F300` (error, warning, fatal, respectively).
-Therefore, allow a property `severity`, sibling to `owner`, which identifies the default severity for the problem matcher. This allows two problem matchers are registered - one for warnings and one for errors.
+Therefore, allow a property `severity`, sibling to `owner`, which identifies the default severity for the problem matcher. This allows two problem matchers to be registered - one for warnings and one for errors.
 For example, given the following `flake8` output:

View File

@@ -84,7 +84,7 @@ powershell/pwsh
 - Users can always opt out by not using the builtins, and providing a shell option like: `pwsh -File {0}`, or `powershell -Command "& '{0}'"`, depending on need
 cmd
-- There doesnt seem to be a way to fully opt in to fail-fast behavior other than writing your script to check each error code and respond accordingly, so we cant actually provide that behavior by default, it will be completely up to the user to write this behavior into their script
+- There doesn't seem to be a way to fully opt in to fail-fast behavior other than writing your script to check each error code and respond accordingly, so we can't actually provide that behavior by default, it will be completely up to the user to write this behavior into their script
 - cmd.exe will exit (return the error code to the runner) with the errorlevel of the last program it executed. This is internally consistent with the previous default behavior (sh, pwsh) and is the cmd.exe default, so we keep that behavior
 ## Consequences

View File

@@ -1,10 +1,8 @@
-# ADR 054x: Composite Run Steps
+# ADR 0549: Composite Run Steps
 **Date**: 2020-06-17
-**Status**: Proposed
+**Status**: Accepted
-**Relevant PR**: https://github.com/actions/runner/pull/549
 ## Context
@@ -12,18 +10,39 @@ Customers want to be able to compose actions from actions (ex: https://github.co
 An important step towards meeting this goal is to build in functionality for actions where users can simply execute any number of steps.
-## Guiding Principles
+### Guiding Principles
 We don't want the workflow author to need to know how the internal workings of the action work. Users shouldn't know the internal workings of the composite action (for example, `default.shell` and `default.workingDir` should not be inherited from the workflow file to the action file). When deciding how to design certain parts of composite run steps, we want to think one logical step from the consumer.
-A composite action is treated as **one** individual job step (aka encapsulation).
+A composite action is treated as **one** individual job step (this is known as encapsulation).
 ## Decision
 **In this ADR, we only support running multiple run steps in an Action.** In doing so, we build in support for mapping and flowing the inputs, outputs, and env variables (ex: All nested steps should have access to its parents' input variables and nested steps can overwrite the input variables).
-## Steps
+### Composite Run Steps Features
+This feature supports at the top action level:
+- name
+- description
+- inputs
+- runs
+- outputs
+This feature supports at the run step level:
+- name
+- id
+- run
+- env
+- shell
+- working-directory
+This feature **does not support** at the run step level:
+- timeout-minutes
+- secrets
+- conditionals (needs, if, etc.)
+- continue-on-error
+### Steps
 Example `workflow.yml`
@@ -51,7 +70,9 @@ runs:
   using: "composite"
   steps:
     - run: pip install -r requirements.txt
+      shell: bash
     - run: npm install
+      shell: bash
 ```
 Example Output
@@ -65,7 +86,70 @@ echo hello world 4
 We add a token called "composite" which allows our Runner code to process composite actions. By invoking "using: composite", our Runner code then processes the "steps" attribute, converts this template code to a list of steps, and finally runs each run step sequentially. If any step fails and there are no `if` conditions defined, the whole composite action job fails.
-## Inputs
+### Defaults
+We will not support "defaults" in a composite action.
+### Shell and Working-directory
+For each run step in a composite action, the action author can set the `shell` and `working-directory` attributes for that step. The shell attribute is **required** for each run step because the action author does not know what the workflow author is using for the operating system so we need to explicitly prevent unknown behavior by making sure that each run step has an explicit shell **set by the action author.** On the other hand, `working-directory` is optional. Moreover, the composite action author can map in values from the `inputs` for it's `shell` and `working-directory` attributes at the step level for an action.
+For example,
+`action.yml`
+```yaml
+inputs:
+  shell_1:
+    description: 'Your name'
+    default: 'pwsh'
+steps:
+  - run: echo 1
+    shell: ${{ inputs.shell_1 }}
+```
+Note, the workflow file and action file are treated as separate entities. **So, the workflow `defaults` will never change the `shell` and `working-directory` value in the run steps in a composite action.** Note, `defaults` in a workflow only apply to run steps not "uses" steps (steps that use an action).
+### Running Local Scripts
+Example 'workflow.yml':
+```yaml
+jobs:
+  build:
+    runs-on: self-hosted
+    steps:
+      - uses: user/composite@v1
+```
+Example `user/composite/action.yml`:
+```yaml
+runs:
+  using: "composite"
+  steps:
+    - run: chmod +x ${{ github.action_path }}/test/script2.sh
+      shell: bash
+    - run: chmod +x $GITHUB_ACTION_PATH/script.sh
+      shell: bash
+    - run: ${{ github.action_path }}/test/script2.sh
+      shell: bash
+    - run: $GITHUB_ACTION_PATH/script.sh
+      shell: bash
+```
+Where `user/composite` has the file structure:
+```
+.
++-- action.yml
++-- script.sh
++-- test
+|   +-- script2.sh
+```
+Users will be able to run scripts located in their action folder by first prepending the relative path and script name with `$GITHUB_ACTION_PATH` or `github.action_path` which contains the path in which the composite action is downloaded to and where those "files" live. Note, you'll have to use `chmod` before running each script if you do not git check in your script files into your github repo with the executable bit turned on.
+### Inputs
 Example `workflow.yml`:
@@ -88,6 +172,7 @@ runs:
   using: "composite"
   steps:
     - run: echo hello ${{ inputs.your_name }}
+      shell: bash
 ```
 Example Output:
@@ -98,7 +183,7 @@ hello Octocat
 Each input variable in the composite action is only viewable in its own scope.
-## Outputs
+### Outputs
 Example `workflow.yml`:
@@ -108,6 +193,7 @@ steps:
 - id: foo
   uses: user/composite@v1
 - run: echo random-number ${{ steps.foo.outputs.random-number }}
+  shell: bash
 ```
 Example `user/composite/action.yml`:
@@ -121,7 +207,8 @@ runs:
   using: "composite"
   steps:
     - id: random-number-generator
-      run: echo "::set-output name=random-number::$(echo $RANDOM)"
+      run: echo "::set-output name=random-id::$(echo $RANDOM)"
+      shell: bash
 ```
 Example Output:
@@ -135,22 +222,26 @@ Each of the output variables from the composite action is viewable from the work
 Moreover, the output ids are only accessible within the scope where it was defined. Note that in the example above, in our `workflow.yml` file, it should not have access to output id (i.e. `random-id`). The reason why we are doing this is because we don't want to require the workflow author to know the internal workings of the composite action.
-## Context
+### Context
 Similar to the workflow file, the composite action has access to the [same context objects](https://help.github.com/en/actions/reference/context-and-expression-syntax-for-github-actions#contexts) (ex: `github`, `env`, `strategy`).
-## Environment
+### Environment
 In the Composite Action, you'll only be able to use `::set-env::` to set environment variables just like you could with other actions.
-## Secrets
+### Secrets
-**Note** : This feature will be focused on in a future ADR.
+**We will not support "Secrets" in a composite action for now. This functionality will be focused on in a future ADR.**
 We'll pass the secrets from the composite action's parents (ex: the workflow file) to the composite action. Secrets can be created in the composite action with the secrets context. In the actions yaml, we'll automatically mask the secret.
-## If Condition
+### If Condition
+** If and needs conditions will not be supported in the composite run steps feature. It will be supported later on in a new feature. **
+Old reasoning:
 Example `workflow.yml`:
@@ -168,24 +259,30 @@ runs:
   using: "composite"
   steps:
     - run: echo "just succeeding"
+      shell: bash
    - run: echo "I will run, as my current scope is succeeding"
+      shell: bash
      if: success()
    - run: exit 1
+      shell: bash
    - run: echo "I will not run, as my current scope is now failing"
+      shell: bash
 ```
+**We will not support "if Condition" in a composite action for now. This functionality will be focused on in a future ADR.**
 See the paragraph below for a rudimentary approach (thank you to @cybojenix for the idea, example, and explanation for this approach):
 The `if` statement in the parent (in the example above, this is the `workflow.yml`) shows whether or not we should run the composite action. So, our composite action will run since the `if` condition for running the composite action is `always()`.
-**Note that the if condition on the parent does not propogate to the rest of its children though.**
+**Note that the if condition on the parent does not propagate to the rest of its children though.**
 In the child action (in this example, this is the `action.yml`), it starts with a clean slate (in other words, no imposing if conditions). Similar to the logic in the paragraph above, `echo "I will run, as my current scope is succeeding"` will run since the `if` condition checks if the previous steps **within this composite action** has not failed. `run: echo "I will not run, as my current scope is now failing"` will not run since the previous step resulted in an error and by default, the if expression is set to `success()` if the if condition is not set for a step.
 What if a step has `cancelled()`? We do the opposite of our approach above if `cancelled()` is used for any of our composite run steps. We will cancel any step that has this condition if the workflow is cancelled at all.
-## Timeout-minutes
+### Timeout-minutes
 Example `workflow.yml`:
@@ -205,13 +302,18 @@ runs:
    - id: foo1
      run: echo test 1
      timeout-minutes: 10
+      shell: bash
    - id: foo2
      run: echo test 2
+      shell: bash
    - id: foo3
      run: echo test 3
      timeout-minutes: 10
+      shell: bash
 ```
+**We will not support "timeout-minutes" in a composite action for now. This functionality will be focused on in a future ADR.**
 A composite action in its entirety is a job. You can set both timeout-minutes for the whole composite action or its steps as long as the the sum of the `timeout-minutes` for each composite action step that has the attribute `timeout-minutes` is less than or equals to `timeout-minutes` for the composite action. There is no default timeout-minutes for each composite action step.
 If the time taken for any of the steps in combination or individually exceed the whole composite action `timeout-minutes` attribute, the whole job will fail (1). If an individual step exceeds its own `timeout-minutes` attribute but the total time that has been used including this step is below the overall composite action `timeout-minutes`, the individual step will fail but the rest of the steps will run based on their own `timeout-minutes` attribute (they will still abide by condition (1) though).
@@ -223,7 +325,7 @@ The rationale behind this is that users can configure their steps with the `if`
 [Usage limits still apply](https://help.github.com/en/actions/reference/workflow-syntax-for-github-actions?query=if%28%29#usage-limits)
-## Continue-on-error
+### Continue-on-error
 Example `workflow.yml`:
@@ -245,18 +347,18 @@ runs:
   steps:
    - run: exit 1
      continue-on-error: true
+      shell: bash
    - run: echo "Hello World 2" <----- This step will run
+      shell: bash
 ```
+**We will not support "continue-on-error" in a composite action for now. This functionality will be focused on in a future ADR.**
 If any of the steps fail in the composite action and the `continue-on-error` is set to `false` for the whole composite action step in the workflow file, then the steps below it will run. On the flip side, if `continue-on-error` is set to `true` for the whole composite action step in the workflow file, the next job step will run.
 For the composite action steps, it follows the same logic as above. In this example, `"Hello World 2"` will be outputted because the previous step has `continue-on-error` set to `true` although that previous step errored.
-## Defaults
+### Visualizing Composite Action in the GitHub Actions UI
-The composite action author will be required to set the `shell` and `workingDir` of the composite action. Moreover, the composite action author will be able to explicitly set the shell for each composite run step. The workflow author will not have the ability to change these attributes.
-## Visualizing Composite Action in the GitHub Actions UI
 We want all the composite action's steps to be condensed into the original composite action node.
 Here is a visual represenation of the [first example](#Steps)
@@ -271,5 +373,6 @@ Here is a visual represenation of the [first example](#Steps)
 ```
-## Conclusion
+## Consequences
 This ADR lays the framework for eventually supporting nested Composite Actions within Composite Actions. This ADR allows for users to run multiple run steps within a GitHub Composite Action with the support of inputs, outputs, environment, and context for use in any steps as well as the if, timeout-minutes, and the continue-on-error attributes for each Composite Action step.

View File

@@ -14,7 +14,7 @@ Issues in this repository should be for the runner application. Note that the V
 We ask that before significant effort is put into code changes, that we have agreement on taking the change before time is invested in code changes.
-1. Create a feature request. Once agreed we will take the enhancment
+1. Create a feature request. Once agreed we will take the enhancement
 2. Create an ADR to agree on the details of the change.
 An ADR is an Architectural Decision Record. This allows consensus on the direction forward and also serves as a record of the change and motivation. [Read more here](adrs/README.md)

View File

@@ -1,18 +1,14 @@
 ## Features
-- Resolve action download info from server (#508, #515, #550)
-- Print runner and machine name to log. (#539)
+- Add labels in the script that register runner (#844)
+- Add proxy support for container actions (#840)
 ## Bugs
-- Reduce input validation warnings (#506)
-- Fix null ref exception in SecretMasker caused by `hashfiles` timeout. (#516)
-- Add libicu66 to `./installDependencies.sh` for Ubuntu 20.04 (#535)
-- Fix DataContract with Token service (#532)
-- Skip search $PATH on command with fully qualified path (#526)
-- Restore SELinux context on service file when SELinux is enabled (#525)
+- Unset GTIHUB_ACTION_REPOSITORY and GITHUB_ACTION_REF for non-repo based actions #804
+- fix compat issue in timeline record state. #861
 ## Misc
-- Remove SPS/Token migration code. Remove GHES url manipulate code. (#513)
-- Add sub-step for developer flow for clarity (#523)
-- Update Links and Language to Git + VSCode (#522)
-- Update runner configuration exception message (#540)
+- Crypto cleanup and enable usage of FIPS compliant crypto when required (#806)
+- Count actions resolve failures as infra failures (#851)
 ## Windows x64
 We recommend configuring the runner in a root folder of the Windows drive (e.g. "C:\actions-runner"). This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows.

View File

@@ -1 +1 @@
-<Update to ./src/runnerversion when creating release>
+2.275.1

View File

@@ -12,12 +12,13 @@ set -e
 #
 # Usage:
 # export RUNNER_CFG_PAT=<yourPAT>
-# ./create-latest-svc scope [ghe_domain] [name] [user]
+# ./create-latest-svc scope [ghe_domain] [name] [user] [labels]
 #
 # scope required repo (:owner/:repo) or org (:organization)
 # ghe_domain optional the fully qualified domain name of your GitHub Enterprise Server deployment
 # name optional defaults to hostname
 # user optional user svc will run as. defaults to current
+# labels optional list of labels (split by comma) applied on the runner
 #
 # Notes:
 # PATS over envvars are more secure
@@ -30,6 +31,7 @@ runner_scope=${1}
 ghe_hostname=${2}
 runner_name=${3:-$(hostname)}
 svc_user=${4:-$USER}
+labels=${5}
 echo "Configuring runner @ ${runner_scope}"
 sudo echo
@@ -130,8 +132,8 @@ fi
 echo
 echo "Configuring ${runner_name} @ $runner_url"
-echo "./config.sh --unattended --url $runner_url --token *** --name $runner_name"
+echo "./config.sh --unattended --url $runner_url --token *** --name $runner_name --labels $labels"
-sudo -E -u ${svc_user} ./config.sh --unattended --url $runner_url --token $RUNNER_TOKEN --name $runner_name
+sudo -E -u ${svc_user} ./config.sh --unattended --url $runner_url --token $RUNNER_TOKEN --name $runner_name --labels $labels
 #---------------------------------------
 # Configuring as a service
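Based on the updated usage comment above, a registration that passes labels might look like this (all values are placeholders; empty arguments fall back to the defaults):

    export RUNNER_CFG_PAT=<yourPAT>
    ./create-latest-svc.sh owner/repo '' '' '' self-hosted,linux,gpu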

src/.editorconfig (new file, 10 lines added)
View File

@@ -0,0 +1,10 @@
[*.cs]
charset = utf-8
insert_final_newline = true
csharp_new_line_before_else = true
csharp_new_line_before_catch = true
csharp_new_line_before_finally = true
csharp_new_line_before_open_brace = all
csharp_space_after_keywords_in_control_flow_statements = true

View File

@@ -1,4 +1,4 @@
 
 Microsoft Visual Studio Solution File, Format Version 12.00
 # Visual Studio Version 16
 VisualStudioVersion = 16.0.29411.138
@@ -21,6 +21,11 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Sdk", "Sdk\Sdk.csproj", "{D
 EndProject
 Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Test", "Test\Test.csproj", "{C932061F-F6A1-4F1E-B854-A6C6B30DC3EF}"
 EndProject
+Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution Items", "{EFB254FC-7927-445E-BA64-6676ADB309E9}"
+	ProjectSection(SolutionItems) = preProject
+		.editorconfig = .editorconfig
+	EndProjectSection
+EndProject
 Global
 GlobalSection(SolutionConfigurationPlatforms) = preSolution
 Debug|Any CPU = Debug|Any CPU

View File

@@ -69,6 +69,8 @@
 .PARAMETER ProxyUseDefaultCredentials
     Default: false
     Use default credentials, when using proxy address.
+.PARAMETER ProxyBypassList
+    If set with ProxyAddress, will provide the list of comma separated urls that will bypass the proxy
 .PARAMETER SkipNonVersionedFiles
     Default: false
     Skips installing non-versioned files if they already exist, such as dotnet.exe.
@@ -96,6 +98,7 @@ param(
    [string]$FeedCredential,
    [string]$ProxyAddress,
    [switch]$ProxyUseDefaultCredentials,
+   [string[]]$ProxyBypassList=@(),
    [switch]$SkipNonVersionedFiles,
    [switch]$NoCdn
 )
@@ -119,12 +122,28 @@ $VersionRegEx="/\d+\.\d+[^/]+/"
 $OverrideNonVersionedFiles = !$SkipNonVersionedFiles
 function Say($str) {
+    try
+    {
         Write-Host "dotnet-install: $str"
     }
+    catch
+    {
+        # Some platforms cannot utilize Write-Host (Azure Functions, for instance). Fall back to Write-Output
+        Write-Output "dotnet-install: $str"
+    }
+}
 function Say-Verbose($str) {
+    try
+    {
         Write-Verbose "dotnet-install: $str"
     }
+    catch
+    {
+        # Some platforms cannot utilize Write-Verbose (Azure Functions, for instance). Fall back to Write-Output
+        Write-Output "dotnet-install: $str"
+    }
+}
 function Say-Invocation($Invocation) {
     $command = $Invocation.MyCommand;
@@ -176,7 +195,7 @@ function Get-CLIArchitecture-From-Architecture([string]$Architecture) {
         { $_ -eq "x86" } { return "x86" }
         { $_ -eq "arm" } { return "arm" }
         { $_ -eq "arm64" } { return "arm64" }
-        default { throw "Architecture not supported. If you think this is a bug, report it at https://github.com/dotnet/sdk/issues" }
+        default { throw "Architecture '$Architecture' not supported. If you think this is a bug, report it at https://github.com/dotnet/install-scripts/issues" }
     }
 }
@@ -237,7 +256,11 @@ function GetHTTPResponse([Uri] $Uri)
         if($ProxyAddress) {
             $HttpClientHandler = New-Object System.Net.Http.HttpClientHandler
-            $HttpClientHandler.Proxy = New-Object System.Net.WebProxy -Property @{Address=$ProxyAddress;UseDefaultCredentials=$ProxyUseDefaultCredentials}
+            $HttpClientHandler.Proxy = New-Object System.Net.WebProxy -Property @{
+                Address=$ProxyAddress;
+                UseDefaultCredentials=$ProxyUseDefaultCredentials;
+                BypassList = $ProxyBypassList;
+            }
             $HttpClient = New-Object System.Net.Http.HttpClient -ArgumentList $HttpClientHandler
         }
         else {
@@ -372,17 +395,20 @@ function Get-Specific-Version-From-Version([string]$AzureFeed, [string]$Channel,
 function Get-Download-Link([string]$AzureFeed, [string]$SpecificVersion, [string]$CLIArchitecture) {
     Say-Invocation $MyInvocation
+    # If anything fails in this lookup it will default to $SpecificVersion
+    $SpecificProductVersion = Get-Product-Version -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion
     if ($Runtime -eq "dotnet") {
-        $PayloadURL = "$AzureFeed/Runtime/$SpecificVersion/dotnet-runtime-$SpecificVersion-win-$CLIArchitecture.zip"
+        $PayloadURL = "$AzureFeed/Runtime/$SpecificVersion/dotnet-runtime-$SpecificProductVersion-win-$CLIArchitecture.zip"
     }
     elseif ($Runtime -eq "aspnetcore") {
-        $PayloadURL = "$AzureFeed/aspnetcore/Runtime/$SpecificVersion/aspnetcore-runtime-$SpecificVersion-win-$CLIArchitecture.zip"
+        $PayloadURL = "$AzureFeed/aspnetcore/Runtime/$SpecificVersion/aspnetcore-runtime-$SpecificProductVersion-win-$CLIArchitecture.zip"
     }
     elseif ($Runtime -eq "windowsdesktop") {
-        $PayloadURL = "$AzureFeed/Runtime/$SpecificVersion/windowsdesktop-runtime-$SpecificVersion-win-$CLIArchitecture.zip"
+        $PayloadURL = "$AzureFeed/Runtime/$SpecificVersion/windowsdesktop-runtime-$SpecificProductVersion-win-$CLIArchitecture.zip"
     }
     elseif (-not $Runtime) {
-        $PayloadURL = "$AzureFeed/Sdk/$SpecificVersion/dotnet-sdk-$SpecificVersion-win-$CLIArchitecture.zip"
+        $PayloadURL = "$AzureFeed/Sdk/$SpecificVersion/dotnet-sdk-$SpecificProductVersion-win-$CLIArchitecture.zip"
     }
     else {
         throw "Invalid value for `$Runtime"
@@ -390,7 +416,7 @@ function Get-Download-Link([string]$AzureFeed, [string]$SpecificVersion, [string
     Say-Verbose "Constructed primary named payload URL: $PayloadURL"
-    return $PayloadURL
+    return $PayloadURL, $SpecificProductVersion
 }
 function Get-LegacyDownload-Link([string]$AzureFeed, [string]$SpecificVersion, [string]$CLIArchitecture) {
@@ -411,6 +437,51 @@ function Get-LegacyDownload-Link([string]$AzureFeed, [string]$SpecificVersion, [
     return $PayloadURL
 }
+function Get-Product-Version([string]$AzureFeed, [string]$SpecificVersion) {
+    Say-Invocation $MyInvocation
+    if ($Runtime -eq "dotnet") {
+        $ProductVersionTxtURL = "$AzureFeed/Runtime/$SpecificVersion/productVersion.txt"
+    }
+    elseif ($Runtime -eq "aspnetcore") {
+        $ProductVersionTxtURL = "$AzureFeed/aspnetcore/Runtime/$SpecificVersion/productVersion.txt"
+    }
+    elseif ($Runtime -eq "windowsdesktop") {
+        $ProductVersionTxtURL = "$AzureFeed/Runtime/$SpecificVersion/productVersion.txt"
+    }
+    elseif (-not $Runtime) {
+        $ProductVersionTxtURL = "$AzureFeed/Sdk/$SpecificVersion/productVersion.txt"
+    }
+    else {
+        throw "Invalid value '$Runtime' specified for `$Runtime"
+    }
+    Say-Verbose "Checking for existence of $ProductVersionTxtURL"
+    try {
+        $productVersionResponse = GetHTTPResponse($productVersionTxtUrl)
+        if ($productVersionResponse.StatusCode -eq 200) {
+            $productVersion = $productVersionResponse.Content.ReadAsStringAsync().Result.Trim()
+            if ($productVersion -ne $SpecificVersion)
+            {
+                Say "Using alternate version $productVersion found in $ProductVersionTxtURL"
+            }
+            return $productVersion
+        }
+        else {
+            Say-Verbose "Got StatusCode $($productVersionResponse.StatusCode) trying to get productVersion.txt at $productVersionTxtUrl, so using default value of $SpecificVersion"
+            $productVersion = $SpecificVersion
+        }
+    } catch {
+        Say-Verbose "Could not read productVersion.txt at $productVersionTxtUrl, so using default value of $SpecificVersion (Exception: '$($_.Exception.Message)' )"
+        $productVersion = $SpecificVersion
+    }
+    return $productVersion
+}
 function Get-User-Share-Path() {
     Say-Invocation $MyInvocation
@@ -564,9 +635,14 @@ function Prepend-Sdk-InstallRoot-To-Path([string]$InstallRoot, [string]$BinFolde
     }
 }
+Say "Note that the intended use of this script is for Continuous Integration (CI) scenarios, where:"
+Say "- The SDK needs to be installed without user interaction and without admin rights."
+Say "- The SDK installation doesn't need to persist across multiple CI runs."
+Say "To set up a development environment or to run apps, use installers rather than this script. Visit https://dotnet.microsoft.com/download to get the installer.`r`n"
 $CLIArchitecture = Get-CLIArchitecture-From-Architecture $Architecture
 $SpecificVersion = Get-Specific-Version-From-Version -AzureFeed $AzureFeed -Channel $Channel -Version $Version -JSonFile $JSonFile
$DownloadLink = Get-Download-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture $DownloadLink, $EffectiveVersion = Get-Download-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
$LegacyDownloadLink = Get-LegacyDownload-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture $LegacyDownloadLink = Get-LegacyDownload-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
$InstallRoot = Resolve-Installation-Path $InstallDir $InstallRoot = Resolve-Installation-Path $InstallDir
@@ -592,6 +668,11 @@ if ($DryRun) {
} }
} }
Say "Repeatable invocation: $RepeatableCommand" Say "Repeatable invocation: $RepeatableCommand"
if ($SpecificVersion -ne $EffectiveVersion)
{
Say "NOTE: Due to finding a version manifest with this runtime, it would actually install with version '$EffectiveVersion'"
}
exit 0 exit 0
} }
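The dry-run path above now reports the effective version whenever productVersion.txt resolves to a different build than the requested folder version. A minimal way to observe this without installing anything is a dry run; this sketch assumes `pwsh` is on the PATH and uses the script's existing -Channel, -Runtime and -DryRun switches (the channel and runtime values are only examples).

# Dry run of the Windows install script from a shell prompt; if productVersion.txt names a
# different build, the "NOTE: ... it would actually install with version '...'" line added above is printed.
pwsh ./dotnet-install.ps1 -Channel 3.1 -Runtime dotnet -DryRun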
@@ -615,6 +696,12 @@ else {
throw "Invalid value for `$Runtime" throw "Invalid value for `$Runtime"
} }
if ($SpecificVersion -ne $EffectiveVersion)
{
Say "Performing installation checks for effective version: $EffectiveVersion"
$SpecificVersion = $EffectiveVersion
}
# Check if the SDK version is already installed. # Check if the SDK version is already installed.
$isAssetInstalled = Is-Dotnet-Package-Installed -InstallRoot $InstallRoot -RelativePathToPackage $dotnetPackageRelativePath -SpecificVersion $SpecificVersion $isAssetInstalled = Is-Dotnet-Package-Installed -InstallRoot $InstallRoot -RelativePathToPackage $dotnetPackageRelativePath -SpecificVersion $SpecificVersion
if ($isAssetInstalled) { if ($isAssetInstalled) {
@@ -691,14 +778,15 @@ Remove-Item $ZipPath
Prepend-Sdk-InstallRoot-To-Path -InstallRoot $InstallRoot -BinFolderRelativePath $BinFolderRelativePath Prepend-Sdk-InstallRoot-To-Path -InstallRoot $InstallRoot -BinFolderRelativePath $BinFolderRelativePath
Say "Note that the script does not resolve dependencies during installation."
Say "To check the list of dependencies, go to https://docs.microsoft.com/dotnet/core/install/windows#dependencies"
Say "Installation finished" Say "Installation finished"
exit 0 exit 0
# SIG # Begin signature block # SIG # Begin signature block
# MIIjhwYJKoZIhvcNAQcCoIIjeDCCI3QCAQExDzANBglghkgBZQMEAgEFADB5Bgor # MIIjlgYJKoZIhvcNAQcCoIIjhzCCI4MCAQExDzANBglghkgBZQMEAgEFADB5Bgor
# BgEEAYI3AgEEoGswaTA0BgorBgEEAYI3AgEeMCYCAwEAAAQQH8w7YFlLCE63JNLG # BgEEAYI3AgEEoGswaTA0BgorBgEEAYI3AgEeMCYCAwEAAAQQH8w7YFlLCE63JNLG
# KX7zUQIBAAIBAAIBAAIBAAIBADAxMA0GCWCGSAFlAwQCAQUABCAiKYSY4KtkeThH # KX7zUQIBAAIBAAIBAAIBAAIBADAxMA0GCWCGSAFlAwQCAQUABCA+isugNMwZSGLd
# d5M1aXqv1K0/pff07QwfUbYZ/qX5LqCCDYUwggYDMIID66ADAgECAhMzAAABiK9S # kfBd0C2Ud//U2Nbj31s1jg3Yf9gh4KCCDYUwggYDMIID66ADAgECAhMzAAABiK9S
# 1rmSbej5AAAAAAGIMA0GCSqGSIb3DQEBCwUAMH4xCzAJBgNVBAYTAlVTMRMwEQYD # 1rmSbej5AAAAAAGIMA0GCSqGSIb3DQEBCwUAMH4xCzAJBgNVBAYTAlVTMRMwEQYD
# VQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNy # VQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNy
# b3NvZnQgQ29ycG9yYXRpb24xKDAmBgNVBAMTH01pY3Jvc29mdCBDb2RlIFNpZ25p # b3NvZnQgQ29ycG9yYXRpb24xKDAmBgNVBAMTH01pY3Jvc29mdCBDb2RlIFNpZ25p
@@ -770,119 +858,119 @@ exit 0
# BL7fQccOKO7eZS/sl/ahXJbYANahRr1Z85elCUtIEJmAH9AAKcWxm6U/RXceNcbS # BL7fQccOKO7eZS/sl/ahXJbYANahRr1Z85elCUtIEJmAH9AAKcWxm6U/RXceNcbS
# oqKfenoi+kiVH6v7RyOA9Z74v2u3S5fi63V4GuzqN5l5GEv/1rMjaHXmr/r8i+sL # oqKfenoi+kiVH6v7RyOA9Z74v2u3S5fi63V4GuzqN5l5GEv/1rMjaHXmr/r8i+sL
# gOppO6/8MO0ETI7f33VtY5E90Z1WTk+/gFcioXgRMiF670EKsT/7qMykXcGhiJtX # gOppO6/8MO0ETI7f33VtY5E90Z1WTk+/gFcioXgRMiF670EKsT/7qMykXcGhiJtX
# cVZOSEXAQsmbdlsKgEhr/Xmfwb1tbWrJUnMTDXpQzTGCFVgwghVUAgEBMIGVMH4x # cVZOSEXAQsmbdlsKgEhr/Xmfwb1tbWrJUnMTDXpQzTGCFWcwghVjAgEBMIGVMH4x
# CzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRt # CzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRt
# b25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKDAmBgNVBAMTH01p # b25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKDAmBgNVBAMTH01p
# Y3Jvc29mdCBDb2RlIFNpZ25pbmcgUENBIDIwMTECEzMAAAGIr1LWuZJt6PkAAAAA # Y3Jvc29mdCBDb2RlIFNpZ25pbmcgUENBIDIwMTECEzMAAAGIr1LWuZJt6PkAAAAA
# AYgwDQYJYIZIAWUDBAIBBQCgga4wGQYJKoZIhvcNAQkDMQwGCisGAQQBgjcCAQQw # AYgwDQYJYIZIAWUDBAIBBQCgga4wGQYJKoZIhvcNAQkDMQwGCisGAQQBgjcCAQQw
# HAYKKwYBBAGCNwIBCzEOMAwGCisGAQQBgjcCARUwLwYJKoZIhvcNAQkEMSIEIFxZ # HAYKKwYBBAGCNwIBCzEOMAwGCisGAQQBgjcCARUwLwYJKoZIhvcNAQkEMSIEIK4I
# Yezh3liQqiGQuXNa+zYfoSIbLqOpdEn2ZKskBkisMEIGCisGAQQBgjcCAQwxNDAy # CDH7/r/eeMqTtDETJ67ogfneVRo0/P6ogV2vy4tXMEIGCisGAQQBgjcCAQwxNDAy
# oBSAEgBNAGkAYwByAG8AcwBvAGYAdKEagBhodHRwOi8vd3d3Lm1pY3Jvc29mdC5j # oBSAEgBNAGkAYwByAG8AcwBvAGYAdKEagBhodHRwOi8vd3d3Lm1pY3Jvc29mdC5j
# b20wDQYJKoZIhvcNAQEBBQAEggEAjLUrwCXJCPHZulZuKAQSX+MfnIRFAhlN7ru2 # b20wDQYJKoZIhvcNAQEBBQAEggEAOnmVmILEjI6ZiuuSOvvTvijidkBez61Vz97A
# 6H8rudvhkWgqMISkLb9gFDPR5FhR4sqdYgKW4P0ERao9ypCGi1FWDLqygC2XBbHj # jV3AOsfmUvLpVaTVa1Mt2iPDuq1QLqRPaT7BD8PAUwr91pYllVgEd8NqivCIaCZg
# NEQHBxHJs5SMsMAXNSIcYHqVAvhF3nXoseaNBkhOTrkQ1FS/fW7AfDGRbsiiESzv # QyIRiTmHQxbozWsLcjxMvX2VxSmNKDw7IOHzUbXtmiEGhygyZpdh/uiCj7ziSxp3
# lebf92shZylBFKOsKQLAL0mF/B7xrxHJIj5dgQoD1phATRNHOEQj3jgmkidFWowV # lQBR8mUE1NL9dxaxKWLhGeORqAepw6nId9oO+mHRh4JRK7uqZOFAES7/21M9vPZi
# 4r8MzbxRhAEORbnJexlUoDQJQH3YwxuUyXkTvrYMTKSbGJLlwRaZQbrcBU0k4gCH # XYilJLgIoyMkvqYSdoouzn6+m74kgzkNkyK9GYz2mmO2BCMnai9Njze2d0+kY+37
# y8Sci+p9Rq+aOTzLCoNrZyh9E7OdwVDm1FJAtY30bV50T2WSFKGCEuIwghLeBgor # kt10BmJDw3FHaZ+/fH/TMTgo0ZcAOicP9ccdIh/CzzpU52o+Q6GCEvEwghLtBgor
# BgEEAYI3AwMBMYISzjCCEsoGCSqGSIb3DQEHAqCCErswghK3AgEDMQ8wDQYJYIZI # BgEEAYI3AwMBMYIS3TCCEtkGCSqGSIb3DQEHAqCCEsowghLGAgEDMQ8wDQYJYIZI
# AWUDBAIBBQAwggFRBgsqhkiG9w0BCRABBKCCAUAEggE8MIIBOAIBAQYKKwYBBAGE # AWUDBAIBBQAwggFVBgsqhkiG9w0BCRABBKCCAUQEggFAMIIBPAIBAQYKKwYBBAGE
# WQoDATAxMA0GCWCGSAFlAwQCAQUABCD7JNcBBSfhlKPL1tN3CEKRKJuT/dZ8RO9K # WQoDATAxMA0GCWCGSAFlAwQCAQUABCBSbhMJwNER+BICn3iLUnPrP8dptyUphcFC
# orYLXJeLTwIGXvN89YD7GBMyMDIwMDcwMTE0MTYyMC40MDVaMASAAgH0oIHQpIHN # A/NsIgnPLwIGX4hEzP6WGBMyMDIwMTEwOTE0NDY1Mi4yMzNaMASAAgH0oIHUpIHR
# MIHKMQswCQYDVQQGEwJVUzELMAkGA1UECBMCV0ExEDAOBgNVBAcTB1JlZG1vbmQx # MIHOMQswCQYDVQQGEwJVUzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMH
# HjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEtMCsGA1UECxMkTWljcm9z # UmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMSkwJwYDVQQL
# b2Z0IElyZWxhbmQgT3BlcmF0aW9ucyBMaW1pdGVkMSYwJAYDVQQLEx1UaGFsZXMg # EyBNaWNyb3NvZnQgT3BlcmF0aW9ucyBQdWVydG8gUmljbzEmMCQGA1UECxMdVGhh
# VFNTIEVTTjoxNzlFLTRCQjAtODI0NjElMCMGA1UEAxMcTWljcm9zb2Z0IFRpbWUt # bGVzIFRTUyBFU046MEE1Ni1FMzI5LTRENEQxJTAjBgNVBAMTHE1pY3Jvc29mdCBU
# U3RhbXAgU2VydmljZaCCDjkwggTxMIID2aADAgECAhMzAAABDKp4btzMQkzBAAAA # aW1lLVN0YW1wIFNlcnZpY2Wggg5EMIIE9TCCA92gAwIBAgITMwAAAScvbqPvkagZ
# AAEMMA0GCSqGSIb3DQEBCwUAMHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNo # qAAAAAABJzANBgkqhkiG9w0BAQsFADB8MQswCQYDVQQGEwJVUzETMBEGA1UECBMK
# aW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29y # V2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0
# cG9yYXRpb24xJjAkBgNVBAMTHU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEw # IENvcnBvcmF0aW9uMSYwJAYDVQQDEx1NaWNyb3NvZnQgVGltZS1TdGFtcCBQQ0Eg
# MB4XDTE5MTAyMzIzMTkxNloXDTIxMDEyMTIzMTkxNlowgcoxCzAJBgNVBAYTAlVT # MjAxMDAeFw0xOTEyMTkwMTE0NTlaFw0yMTAzMTcwMTE0NTlaMIHOMQswCQYDVQQG
# MQswCQYDVQQIEwJXQTEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UEChMVTWljcm9z # EwJVUzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwG
# b2Z0IENvcnBvcmF0aW9uMS0wKwYDVQQLEyRNaWNyb3NvZnQgSXJlbGFuZCBPcGVy # A1UEChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMSkwJwYDVQQLEyBNaWNyb3NvZnQg
# YXRpb25zIExpbWl0ZWQxJjAkBgNVBAsTHVRoYWxlcyBUU1MgRVNOOjE3OUUtNEJC # T3BlcmF0aW9ucyBQdWVydG8gUmljbzEmMCQGA1UECxMdVGhhbGVzIFRTUyBFU046
# MC04MjQ2MSUwIwYDVQQDExxNaWNyb3NvZnQgVGltZS1TdGFtcCBTZXJ2aWNlMIIB # MEE1Ni1FMzI5LTRENEQxJTAjBgNVBAMTHE1pY3Jvc29mdCBUaW1lLVN0YW1wIFNl
# IjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAq5011+XqVJmQKtiw39igeEMv # cnZpY2UwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQD4Ad5xEZ5On0uN
# CLcZ1forbmxsDkpnCN1SrThKI+n2Pr3zqTzJVgdJFCoKm1ks1gtRJ7HaL6tDkrOw # L71ng9xwoDPRKeMUyEIj5yVxPRPh5GVbU7D3pqDsoXzQMhfeRP61L1zlU1HCRS+1
# 8XJmfJaxyQAluCQ+e40NI+A4w+u59Gy89AVY5lJNrmCva6gozfg1kxw6abV5WWr+ # 29eo0yj1zjbAlmPAwosUgyIonesWt9E4hFlXCGUcIg5XMdvQ+Ouzk2r+awNRuk8A
# PjEpNCshO4hxv3UqgMcCKnT2YVSZzF1Gy7APub1fY0P1vNEuOFKrNCEEvWIKRrqs # BGOa0I4VBy6zqCYHyX2pGauiB43frJSNP6pcrO0CBmpBZNjgepof5Z/50vBuJDUS
# eyBB73G8KD2yw6jfz0VKxNSRAdhJV/ghOyrDt5a+L6C3m1rpr8sqiof3iohv3ANI # ug6OIMQ7ZwUhSzX4bEmZUUjAycBb62dhQpGqHsXe6ypVDTgAEnGONdSBKkHiNT8H
# gNqw6ex+4+G+B7JMbIHbGpPdebedL6ePbuBCnbgJoDn340k0aw6ij21GvvUnkQID # 0Zt2lm0vCLwHyTwtgIdi67T/LCp+X2mlPHqXsY3u72X3GYn/3G8YFCkrSc6m3b0w
# AQABo4IBGzCCARcwHQYDVR0OBBYEFAlCOq9DDIa0A0oqgKtM5vjuZeK+MB8GA1Ud # TXPd5/2fAgMBAAGjggEbMIIBFzAdBgNVHQ4EFgQU5fSWVYBfOTEkW2JTiV24WNNt
# IwQYMBaAFNVjOlyKMZDzQ3t8RhvFM2hahW1VMFYGA1UdHwRPME0wS6BJoEeGRWh0 # lfIwHwYDVR0jBBgwFoAU1WM6XIoxkPNDe3xGG8UzaFqFbVUwVgYDVR0fBE8wTTBL
# dHA6Ly9jcmwubWljcm9zb2Z0LmNvbS9wa2kvY3JsL3Byb2R1Y3RzL01pY1RpbVN0 # oEmgR4ZFaHR0cDovL2NybC5taWNyb3NvZnQuY29tL3BraS9jcmwvcHJvZHVjdHMv
# YVBDQV8yMDEwLTA3LTAxLmNybDBaBggrBgEFBQcBAQROMEwwSgYIKwYBBQUHMAKG # TWljVGltU3RhUENBXzIwMTAtMDctMDEuY3JsMFoGCCsGAQUFBwEBBE4wTDBKBggr
# Pmh0dHA6Ly93d3cubWljcm9zb2Z0LmNvbS9wa2kvY2VydHMvTWljVGltU3RhUENB # BgEFBQcwAoY+aHR0cDovL3d3dy5taWNyb3NvZnQuY29tL3BraS9jZXJ0cy9NaWNU
# XzIwMTAtMDctMDEuY3J0MAwGA1UdEwEB/wQCMAAwEwYDVR0lBAwwCgYIKwYBBQUH # aW1TdGFQQ0FfMjAxMC0wNy0wMS5jcnQwDAYDVR0TAQH/BAIwADATBgNVHSUEDDAK
# AwgwDQYJKoZIhvcNAQELBQADggEBAET3xBg/IZ9zdOfwbDGK7cK3qKYt/qUOlbRB # BggrBgEFBQcDCDANBgkqhkiG9w0BAQsFAAOCAQEACsqNfNFVxwalZ42cEMuzZc12
# zgeNjb32K86nGeRGkBee10dVOEGWUw6KtBeWh1LQ70b64/tLtiLcsf9JzaAyDYb1 # 6Nvluanx8UewDVeUQZEZHRmppMFHAzS/g6RzmxTyR2tKE3mChNGW5dTL730vEbRh
# sRmMi5fjRZ753TquaT8V7NJ7RfEuYfvZlubfQD0MVbU4tzsdZdYuxE37V2J9pN89 # nYRmBgiX/gT3f4AQrOPnZGXY7zszcrlbgzxpakOX+x0u4rkP3Ashh3B2CdJ11XsB
# j7GoFNtAnSnCn1MRxENAILgt9XzeQzTEDhFYW0N2DNphTkRPXGjpDmwi6WtkJ5fv # di5PiZa1spB6U5S8D15gqTUfoIniLT4v1DBdkWExsKI1vsiFcDcjGJ4xRlMRF+fw
# 0iTyB4dwEC+/ed0lGbFLcytJoMwfTNMdH6gcnHlMzsniornGFZa5PPiV78XoZ9Fe # 7SY0WZoOzwRzKxDTdg4DusAXpaeKbch9iithLFk/vIxQrqCr/niW8tEA+eSzeX/E
# upKo8ZKNGhLLLB5GTtqfHex5no3ioVSq+NthvhX0I/V+iXJsopowggZxMIIEWaAD # q1D0ZyvOn4e2lTnwoJUKH6OQAWSBogyK4OCbFeJOqdKAUiBTgHKkQIYh/tbKQjCC
# AgECAgphCYEqAAAAAAACMA0GCSqGSIb3DQEBCwUAMIGIMQswCQYDVQQGEwJVUzET # BnEwggRZoAMCAQICCmEJgSoAAAAAAAIwDQYJKoZIhvcNAQELBQAwgYgxCzAJBgNV
# MBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UEChMV # BAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4w
# TWljcm9zb2Z0IENvcnBvcmF0aW9uMTIwMAYDVQQDEylNaWNyb3NvZnQgUm9vdCBD # HAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xMjAwBgNVBAMTKU1pY3Jvc29m
# ZXJ0aWZpY2F0ZSBBdXRob3JpdHkgMjAxMDAeFw0xMDA3MDEyMTM2NTVaFw0yNTA3 # dCBSb290IENlcnRpZmljYXRlIEF1dGhvcml0eSAyMDEwMB4XDTEwMDcwMTIxMzY1
# MDEyMTQ2NTVaMHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAw # NVoXDTI1MDcwMTIxNDY1NVowfDELMAkGA1UEBhMCVVMxEzARBgNVBAgTCldhc2hp
# DgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24x # bmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jw
# JjAkBgNVBAMTHU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEwMIIBIjANBgkq # b3JhdGlvbjEmMCQGA1UEAxMdTWljcm9zb2Z0IFRpbWUtU3RhbXAgUENBIDIwMTAw
# hkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAqR0NvHcRijog7PwTl/X6f2mUa3RUENWl # ggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQCpHQ28dxGKOiDs/BOX9fp/
# CgCChfvtfGhLLF/Fw+Vhwna3PmYrW/AVUycEMR9BGxqVHc4JE458YTBZsTBED/Fg # aZRrdFQQ1aUKAIKF++18aEssX8XD5WHCdrc+Zitb8BVTJwQxH0EbGpUdzgkTjnxh
# iIRUQwzXTbg4CLNC3ZOs1nMwVyaCo0UN0Or1R4HNvyRgMlhgRvJYR4YyhB50YWeR # MFmxMEQP8WCIhFRDDNdNuDgIs0Ldk6zWczBXJoKjRQ3Q6vVHgc2/JGAyWGBG8lhH
# X4FUsc+TTJLBxKZd0WETbijGGvmGgLvfYfxGwScdJGcSchohiq9LZIlQYrFd/Xcf # hjKEHnRhZ5FfgVSxz5NMksHEpl3RYRNuKMYa+YaAu99h/EbBJx0kZxJyGiGKr0tk
# PfBXday9ikJNQFHRD5wGPmd/9WbAA5ZEfu/QS/1u5ZrKsajyeioKMfDaTgaRtogI # iVBisV39dx898Fd1rL2KQk1AUdEPnAY+Z3/1ZsADlkR+79BL/W7lmsqxqPJ6Kgox
# Neh4HLDpmc085y9Euqf03GS9pAHBIAmTeM38vMDJRF1eFpwBBU8iTQIDAQABo4IB # 8NpOBpG2iAg16HgcsOmZzTznL0S6p/TcZL2kAcEgCZN4zfy8wMlEXV4WnAEFTyJN
# 5jCCAeIwEAYJKwYBBAGCNxUBBAMCAQAwHQYDVR0OBBYEFNVjOlyKMZDzQ3t8RhvF # AgMBAAGjggHmMIIB4jAQBgkrBgEEAYI3FQEEAwIBADAdBgNVHQ4EFgQU1WM6XIox
# M2hahW1VMBkGCSsGAQQBgjcUAgQMHgoAUwB1AGIAQwBBMAsGA1UdDwQEAwIBhjAP # kPNDe3xGG8UzaFqFbVUwGQYJKwYBBAGCNxQCBAweCgBTAHUAYgBDAEEwCwYDVR0P
# BgNVHRMBAf8EBTADAQH/MB8GA1UdIwQYMBaAFNX2VsuP6KJcYmjRPZSQW9fOmhjE # BAQDAgGGMA8GA1UdEwEB/wQFMAMBAf8wHwYDVR0jBBgwFoAU1fZWy4/oolxiaNE9
# MFYGA1UdHwRPME0wS6BJoEeGRWh0dHA6Ly9jcmwubWljcm9zb2Z0LmNvbS9wa2kv # lJBb186aGMQwVgYDVR0fBE8wTTBLoEmgR4ZFaHR0cDovL2NybC5taWNyb3NvZnQu
# Y3JsL3Byb2R1Y3RzL01pY1Jvb0NlckF1dF8yMDEwLTA2LTIzLmNybDBaBggrBgEF # Y29tL3BraS9jcmwvcHJvZHVjdHMvTWljUm9vQ2VyQXV0XzIwMTAtMDYtMjMuY3Js
# BQcBAQROMEwwSgYIKwYBBQUHMAKGPmh0dHA6Ly93d3cubWljcm9zb2Z0LmNvbS9w # MFoGCCsGAQUFBwEBBE4wTDBKBggrBgEFBQcwAoY+aHR0cDovL3d3dy5taWNyb3Nv
# a2kvY2VydHMvTWljUm9vQ2VyQXV0XzIwMTAtMDYtMjMuY3J0MIGgBgNVHSABAf8E # ZnQuY29tL3BraS9jZXJ0cy9NaWNSb29DZXJBdXRfMjAxMC0wNi0yMy5jcnQwgaAG
# gZUwgZIwgY8GCSsGAQQBgjcuAzCBgTA9BggrBgEFBQcCARYxaHR0cDovL3d3dy5t # A1UdIAEB/wSBlTCBkjCBjwYJKwYBBAGCNy4DMIGBMD0GCCsGAQUFBwIBFjFodHRw
# aWNyb3NvZnQuY29tL1BLSS9kb2NzL0NQUy9kZWZhdWx0Lmh0bTBABggrBgEFBQcC # Oi8vd3d3Lm1pY3Jvc29mdC5jb20vUEtJL2RvY3MvQ1BTL2RlZmF1bHQuaHRtMEAG
# AjA0HjIgHQBMAGUAZwBhAGwAXwBQAG8AbABpAGMAeQBfAFMAdABhAHQAZQBtAGUA # CCsGAQUFBwICMDQeMiAdAEwAZQBnAGEAbABfAFAAbwBsAGkAYwB5AF8AUwB0AGEA
# bgB0AC4gHTANBgkqhkiG9w0BAQsFAAOCAgEAB+aIUQ3ixuCYP4FxAz2do6Ehb7Pr # dABlAG0AZQBuAHQALiAdMA0GCSqGSIb3DQEBCwUAA4ICAQAH5ohRDeLG4Jg/gXED
# psz1Mb7PBeKp/vpXbRkws8LFZslq3/Xn8Hi9x6ieJeP5vO1rVFcIK1GCRBL7uVOM # PZ2joSFvs+umzPUxvs8F4qn++ldtGTCzwsVmyWrf9efweL3HqJ4l4/m87WtUVwgr
# zPRgEop2zEBAQZvcXBf/XPleFzWYJFZLdO9CEMivv3/Gf/I3fVo/HPKZeUqRUgCv # UYJEEvu5U4zM9GASinbMQEBBm9xcF/9c+V4XNZgkVkt070IQyK+/f8Z/8jd9Wj8c
# OA8X9S95gWXZqbVr5MfO9sp6AG9LMEQkIjzP7QOllo9ZKby2/QThcJ8ySif9Va8v # 8pl5SpFSAK84Dxf1L3mBZdmptWvkx872ynoAb0swRCQiPM/tA6WWj1kpvLb9BOFw
# /rbljjO7Yl+a21dA6fHOmWaQjP9qYn/dxUoLkSbiOewZSnFjnXshbcOco6I8+n99 # nzJKJ/1Vry/+tuWOM7tiX5rbV0Dp8c6ZZpCM/2pif93FSguRJuI57BlKcWOdeyFt
# lmqQeKZt0uGc+R38ONiU9MalCpaGpL2eGq4EQoO4tYCbIjggtSXlZOz39L9+Y1kl # w5yjojz6f32WapB4pm3S4Zz5Hfw42JT0xqUKloakvZ4argRCg7i1gJsiOCC1JeVk
# D3ouOVd2onGqBooPiRa6YacRy5rYDkeagMXQzafQ732D8OE7cQnfXXSYIghh2rBQ # 7Pf0v35jWSUPei45V3aicaoGig+JFrphpxHLmtgOR5qAxdDNp9DvfYPw4TtxCd9d
# Hm+98eEA3+cxB6STOvdlR3jo+KhIq/fecn5ha293qYHLpwmsObvsxsvYgrRyzR30 # dJgiCGHasFAeb73x4QDf5zEHpJM692VHeOj4qEir995yfmFrb3epgcunCaw5u+zG
# uIUBHoD7G4kqVDmyW9rIDVWZeodzOwjmmC3qjeAzLhIp9cAvVCch98isTtoouLGp # y9iCtHLNHfS4hQEegPsbiSpUObJb2sgNVZl6h3M7COaYLeqN4DMuEin1wC9UJyH3
# 25ayp0Kiyc8ZQU3ghvkqmqMRZjDTu3QyS99je/WZii8bxyGvWbWu3EQ8l1Bx16HS # yKxO2ii4sanblrKnQqLJzxlBTeCG+SqaoxFmMNO7dDJL32N79ZmKLxvHIa9Zta7c
# xVXjad5XwdHeMMD9zOZN+w2/XU/pnR4ZOC+8z1gFLu8NoFA12u8JJxzVs341Hgi6 # RDyXUHHXodLFVeNp3lfB0d4wwP3M5k37Db9dT+mdHhk4L7zPWAUu7w2gUDXa7wkn
# 2jbb01+P3nSISRKhggLLMIICNAIBATCB+KGB0KSBzTCByjELMAkGA1UEBhMCVVMx # HNWzfjUeCLraNtvTX4/edIhJEqGCAtIwggI7AgEBMIH8oYHUpIHRMIHOMQswCQYD
# CzAJBgNVBAgTAldBMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNyb3Nv # VQQGEwJVUzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEe
# ZnQgQ29ycG9yYXRpb24xLTArBgNVBAsTJE1pY3Jvc29mdCBJcmVsYW5kIE9wZXJh # MBwGA1UEChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMSkwJwYDVQQLEyBNaWNyb3Nv
# dGlvbnMgTGltaXRlZDEmMCQGA1UECxMdVGhhbGVzIFRTUyBFU046MTc5RS00QkIw # ZnQgT3BlcmF0aW9ucyBQdWVydG8gUmljbzEmMCQGA1UECxMdVGhhbGVzIFRTUyBF
# LTgyNDYxJTAjBgNVBAMTHE1pY3Jvc29mdCBUaW1lLVN0YW1wIFNlcnZpY2WiIwoB # U046MEE1Ni1FMzI5LTRENEQxJTAjBgNVBAMTHE1pY3Jvc29mdCBUaW1lLVN0YW1w
# ATAHBgUrDgMCGgMVAMsg9FQ9pgPLXI2Ld5z7xDS0QAZ9oIGDMIGApH4wfDELMAkG # IFNlcnZpY2WiIwoBATAHBgUrDgMCGgMVALOVuE5sgxzETO4s+poBqI6r1x8zoIGD
# A1UEBhMCVVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQx # MIGApH4wfDELMAkGA1UEBhMCVVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNV
# HjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEmMCQGA1UEAxMdTWljcm9z # BAcTB1JlZG1vbmQxHjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEmMCQG
# b2Z0IFRpbWUtU3RhbXAgUENBIDIwMTAwDQYJKoZIhvcNAQEFBQACBQDipo0MMCIY # A1UEAxMdTWljcm9zb2Z0IFRpbWUtU3RhbXAgUENBIDIwMTAwDQYJKoZIhvcNAQEF
# DzIwMjAwNzAxMTIxODIwWhgPMjAyMDA3MDIxMjE4MjBaMHQwOgYKKwYBBAGEWQoE # BQACBQDjU7byMCIYDzIwMjAxMTA5MTYzOTE0WhgPMjAyMDExMTAxNjM5MTRaMHcw
# ATEsMCowCgIFAOKmjQwCAQAwBwIBAAICE70wBwIBAAICEeIwCgIFAOKn3owCAQAw # PQYKKwYBBAGEWQoEATEvMC0wCgIFAONTtvICAQAwCgIBAAICIt0CAf8wBwIBAAIC
# NgYKKwYBBAGEWQoEAjEoMCYwDAYKKwYBBAGEWQoDAqAKMAgCAQACAwehIKEKMAgC # EcQwCgIFAONVCHICAQAwNgYKKwYBBAGEWQoEAjEoMCYwDAYKKwYBBAGEWQoDAqAK
# AQACAwGGoDANBgkqhkiG9w0BAQUFAAOBgQCOPjlHOH8nYtgt2XnpKXenxPUR03ED # MAgCAQACAwehIKEKMAgCAQACAwGGoDANBgkqhkiG9w0BAQUFAAOBgQAQhyIIAC/A
# xPBm8XR5Z1vIq53RU9jG6yYcYNTdK+q38SGZtu0W/SgagTfKCQhjhRakuv7rGSs2 # P+VJdbhL9IQgm8WTa1DmPPE+BQSuRbBy2MmzC1KostixdEkr2OaNSjcYuZBNIJgv
# dlhx9LGCoc/q1vqmZpRSjkqWVcc/NzmldUWIWnLlV6rmLGoDmfCH5BcsiU6Eo6wU # vE8CWhVDD+sbBpVcOdoSfoBwHXKfvqSTiWvovoexkF0X5aon7yr3PkJ/kEqoLyUM
# iUVwnnXoqsCaBzGCAw0wggMJAgEBMIGTMHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQI # xRvdWKJdHOL1sT0/aWHn048c6aGin/zc8DGCAw0wggMJAgEBMIGTMHwxCzAJBgNV
# EwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNyb3Nv # BAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4w
# ZnQgQ29ycG9yYXRpb24xJjAkBgNVBAMTHU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBD # HAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xJjAkBgNVBAMTHU1pY3Jvc29m
# QSAyMDEwAhMzAAABDKp4btzMQkzBAAAAAAEMMA0GCWCGSAFlAwQCAQUAoIIBSjAa # dCBUaW1lLVN0YW1wIFBDQSAyMDEwAhMzAAABJy9uo++RqBmoAAAAAAEnMA0GCWCG
# BgkqhkiG9w0BCQMxDQYLKoZIhvcNAQkQAQQwLwYJKoZIhvcNAQkEMSIEIDpwhjyu # SAFlAwQCAQUAoIIBSjAaBgkqhkiG9w0BCQMxDQYLKoZIhvcNAQkQAQQwLwYJKoZI
# zgu3Kmxpnpz86ZlthBqEzG5vaEMOkYRyuFCaMIH6BgsqhkiG9w0BCRACLzGB6jCB # hvcNAQkEMSIEIJZkrbvF4R8oqYYpN6ZPGOj+QEZTQriEi/Yw9gW6zMqRMIH6Bgsq
# 5zCB5DCBvQQgg5AWKX7M1+m2//+V7qmRvt1K/ww5Muu8XzGJBqygVCkwgZgwgYCk # hkiG9w0BCRACLzGB6jCB5zCB5DCBvQQgG5LoSxKGHWoW/wVMlbMztlQ4upAdzEmq
# fjB8MQswCQYDVQQGEwJVUzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMH # H//vLu0jPiIwgZgwgYCkfjB8MQswCQYDVQQGEwJVUzETMBEGA1UECBMKV2FzaGlu
# UmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMSYwJAYDVQQD # Z3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0IENvcnBv
# Ex1NaWNyb3NvZnQgVGltZS1TdGFtcCBQQ0EgMjAxMAITMwAAAQyqeG7czEJMwQAA # cmF0aW9uMSYwJAYDVQQDEx1NaWNyb3NvZnQgVGltZS1TdGFtcCBQQ0EgMjAxMAIT
# AAABDDAiBCD11urvv5vgo4gFVQ2NMVrzgxT87Yuiq16YdswYbaYeITANBgkqhkiG # MwAAAScvbqPvkagZqAAAAAABJzAiBCDwhEViCRvqKwQV3MxociF2iGYrDP4p1BK+
# 9w0BAQsFAASCAQAi3q8hwcT2ft4b2EleaiyZxOImV/cKusmth1dtCh5/Jb0GbOld # s4tStO4vSDANBgkqhkiG9w0BAQsFAASCAQAkgmDo8lVmar0ZIqTG1it3skG8PZC9
# f5cSalrjf42MNPodWAtgmWozkYrQF6HxnsOiYiamfRA8E3E7xyRMy7AFfAhjcwMi # iqEEC1vxcz8OSfsjl2QSkQ5T2+3xWpxWA4uy2+Byv0bi8EsfQEnnn4vtdthS6/kb
# xaW4Iye6E1Ec6LtULANxfDtG/KIdCWdZxKqOezL3nzFNQWmm1mXPV+UnKpnJkA3E # vB/LLQiqoMhJ0rasf3/y/4KnQZEtztpg1+cCaNwFUgI6o+E8YEFt1frhLwFs/0WH
# DsQOUWk8J6ojDurhrP536WI+3arg8PcnppHBLd/xNKYdlsTb+6qndgzKXkDDt1CV # 5pyBFx9ECEs0M22SLIpW13gexv9fgk6ZboIfSreAI28DLveeJpkgwggxHRpuVOVD
# 4zCyuZ7bO8eyZAmNoSZz22k7vus9UjBz/CDhXylo20N43nr29rWPItUgH4uvOGQn # 4D7QQJAvJ0VU6p+yJlbvQXR9iltwb1REhlsJ5mADJ/FkzPVX/swMSUIoyE2inlxK
# t26Y/yjBaQImz32psrfJEMbQ7cl789s8WOx8 # LEiPkkZYwiFYCifFYUTnQjWU1Ls0EV+ysosL+jhzCxO8S6oRdp5TAi4F
# SIG # End signature block # SIG # End signature block


@@ -241,42 +241,6 @@ check_min_reqs() {
return 0 return 0
} }
check_pre_reqs() {
eval $invocation
if [ "${DOTNET_INSTALL_SKIP_PREREQS:-}" = "1" ]; then
return 0
fi
if [ "$(uname)" = "Linux" ]; then
if is_musl_based_distro; then
if ! command -v scanelf > /dev/null; then
say_warning "scanelf not found, please install pax-utils package."
return 0
fi
LDCONFIG_COMMAND="scanelf --ldpath -BF '%f'"
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep libintl)" ] && say_warning "Unable to locate libintl. Probable prerequisite missing; install libintl (or gettext)."
else
if [ ! -x "$(command -v ldconfig)" ]; then
say_verbose "ldconfig is not in PATH, trying /sbin/ldconfig."
LDCONFIG_COMMAND="/sbin/ldconfig"
else
LDCONFIG_COMMAND="ldconfig"
fi
local librarypath=${LD_LIBRARY_PATH:-}
LDCONFIG_COMMAND="$LDCONFIG_COMMAND -NXv ${librarypath//:/ }"
fi
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep zlib)" ] && say_warning "Unable to locate zlib. Probable prerequisite missing; install zlib."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep ssl)" ] && say_warning "Unable to locate libssl. Probable prerequisite missing; install libssl."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep libicu)" ] && say_warning "Unable to locate libicu. Probable prerequisite missing; install libicu."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep lttng)" ] && say_warning "Unable to locate liblttng. Probable prerequisite missing; install libcurl."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep libcurl)" ] && say_warning "Unable to locate libcurl. Probable prerequisite missing; install libcurl."
fi
return 0
}
# args: # args:
# input - $1 # input - $1
to_lowercase() { to_lowercase() {
@@ -373,7 +337,7 @@ get_normalized_architecture_from_architecture() {
;; ;;
esac esac
say_err "Architecture \`$architecture\` not supported. If you think this is a bug, report it at https://github.com/dotnet/sdk/issues" say_err "Architecture \`$architecture\` not supported. If you think this is a bug, report it at https://github.com/dotnet/install-scripts/issues"
return 1 return 1
} }
@@ -468,7 +432,6 @@ parse_jsonfile_for_version() {
sdk_list=$(echo $sdk_section | awk -F"[{}]" '{print $2}') sdk_list=$(echo $sdk_section | awk -F"[{}]" '{print $2}')
sdk_list=${sdk_list//[\" ]/} sdk_list=${sdk_list//[\" ]/}
sdk_list=${sdk_list//,/$'\n'} sdk_list=${sdk_list//,/$'\n'}
sdk_list="$(echo -e "${sdk_list}" | tr -d '[[:space:]]')"
local version_info="" local version_info=""
while read -r line; do while read -r line; do
@@ -545,17 +508,18 @@ construct_download_link() {
local channel="$2" local channel="$2"
local normalized_architecture="$3" local normalized_architecture="$3"
local specific_version="${4//[$'\t\r\n']}" local specific_version="${4//[$'\t\r\n']}"
local specific_product_version="$(get_specific_product_version "$1" "$4")"
local osname local osname
osname="$(get_current_os_name)" || return 1 osname="$(get_current_os_name)" || return 1
local download_link=null local download_link=null
if [[ "$runtime" == "dotnet" ]]; then if [[ "$runtime" == "dotnet" ]]; then
download_link="$azure_feed/Runtime/$specific_version/dotnet-runtime-$specific_version-$osname-$normalized_architecture.tar.gz" download_link="$azure_feed/Runtime/$specific_version/dotnet-runtime-$specific_product_version-$osname-$normalized_architecture.tar.gz"
elif [[ "$runtime" == "aspnetcore" ]]; then elif [[ "$runtime" == "aspnetcore" ]]; then
download_link="$azure_feed/aspnetcore/Runtime/$specific_version/aspnetcore-runtime-$specific_version-$osname-$normalized_architecture.tar.gz" download_link="$azure_feed/aspnetcore/Runtime/$specific_version/aspnetcore-runtime-$specific_product_version-$osname-$normalized_architecture.tar.gz"
elif [ -z "$runtime" ]; then elif [ -z "$runtime" ]; then
download_link="$azure_feed/Sdk/$specific_version/dotnet-sdk-$specific_version-$osname-$normalized_architecture.tar.gz" download_link="$azure_feed/Sdk/$specific_version/dotnet-sdk-$specific_product_version-$osname-$normalized_architecture.tar.gz"
else else
return 1 return 1
fi fi
@@ -564,6 +528,50 @@ construct_download_link() {
return 0 return 0
} }
# args:
# azure_feed - $1
# specific_version - $2
get_specific_product_version() {
# If we find a 'productVersion.txt' at the root of any folder, we'll use its contents
# to resolve the version of what's in the folder, superseding the specified version.
eval $invocation
local azure_feed="$1"
local specific_version="${2//[$'\t\r\n']}"
local specific_product_version=$specific_version
local download_link=null
if [[ "$runtime" == "dotnet" ]]; then
download_link="$azure_feed/Runtime/$specific_version/productVersion.txt${feed_credential}"
elif [[ "$runtime" == "aspnetcore" ]]; then
download_link="$azure_feed/aspnetcore/Runtime/$specific_version/productVersion.txt${feed_credential}"
elif [ -z "$runtime" ]; then
download_link="$azure_feed/Sdk/$specific_version/productVersion.txt${feed_credential}"
else
return 1
fi
if machine_has "curl"
then
specific_product_version=$(curl -s --fail "$download_link")
if [ $? -ne 0 ]
then
specific_product_version=$specific_version
fi
elif machine_has "wget"
then
specific_product_version=$(wget -qO- "$download_link")
if [ $? -ne 0 ]
then
specific_product_version=$specific_version
fi
fi
specific_product_version="${specific_product_version//[$'\t\r\n']}"
echo "$specific_product_version"
return 0
}
# args: # args:
# azure_feed - $1 # azure_feed - $1
# channel - $2 # channel - $2
@@ -771,6 +779,7 @@ calculate_vars() {
say_verbose "normalized_architecture=$normalized_architecture" say_verbose "normalized_architecture=$normalized_architecture"
specific_version="$(get_specific_version_from_version "$azure_feed" "$channel" "$normalized_architecture" "$version" "$json_file")" specific_version="$(get_specific_version_from_version "$azure_feed" "$channel" "$normalized_architecture" "$version" "$json_file")"
specific_product_version="$(get_specific_product_version "$azure_feed" "$specific_version")"
say_verbose "specific_version=$specific_version" say_verbose "specific_version=$specific_version"
if [ -z "$specific_version" ]; then if [ -z "$specific_version" ]; then
say_err "Could not resolve version information." say_err "Could not resolve version information."
@@ -869,12 +878,12 @@ install_dotnet() {
fi fi
# Check if the standard SDK version is installed. # Check if the standard SDK version is installed.
say_verbose "Checking installation: version = $specific_version" say_verbose "Checking installation: version = $specific_product_version"
if is_dotnet_package_installed "$install_root" "$asset_relative_path" "$specific_version"; then if is_dotnet_package_installed "$install_root" "$asset_relative_path" "$specific_product_version"; then
return 0 return 0
fi fi
say_err "\`$asset_name\` with version = $specific_version failed to install with an unknown error." say_err "\`$asset_name\` with version = $specific_product_version failed to install with an unknown error."
return 1 return 1
} }
@@ -1058,6 +1067,11 @@ if [ "$no_cdn" = true ]; then
azure_feed="$uncached_feed" azure_feed="$uncached_feed"
fi fi
say "Note that the intended use of this script is for Continuous Integration (CI) scenarios, where:"
say "- The SDK needs to be installed without user interaction and without admin rights."
say "- The SDK installation doesn't need to persist across multiple CI runs."
say "To set up a development environment or to run apps, use installers rather than this script. Visit https://dotnet.microsoft.com/download to get the installer.\n"
check_min_reqs check_min_reqs
calculate_vars calculate_vars
script_name=$(basename "$0") script_name=$(basename "$0")
@@ -1079,7 +1093,6 @@ if [ "$dry_run" = true ]; then
exit 0 exit 0
fi fi
check_pre_reqs
install_dotnet install_dotnet
bin_path="$(get_absolute_path "$(combine_paths "$install_root" "$bin_folder_relative_path")")" bin_path="$(get_absolute_path "$(combine_paths "$install_root" "$bin_folder_relative_path")")"
@@ -1090,4 +1103,6 @@ else
say "Binaries of dotnet can be found in $bin_path" say "Binaries of dotnet can be found in $bin_path"
fi fi
say "Note that the script does not resolve dependencies during installation."
say "To check the list of dependencies, go to https://docs.microsoft.com/dotnet/core/install, select your operating system and check the \"Dependencies\" section."
say "Installation finished successfully." say "Installation finished successfully."


@@ -5,9 +5,9 @@
"requires": true, "requires": true,
"dependencies": { "dependencies": {
"@actions/core": { "@actions/core": {
"version": "1.2.0", "version": "1.2.6",
"resolved": "https://registry.npmjs.org/@actions/core/-/core-1.2.0.tgz", "resolved": "https://registry.npmjs.org/@actions/core/-/core-1.2.6.tgz",
"integrity": "sha512-ZKdyhlSlyz38S6YFfPnyNgCDZuAF2T0Qv5eHflNWytPS8Qjvz39bZFMry9Bb/dpSnqWcNeav5yM2CTYpJeY+Dw==" "integrity": "sha512-ZQYitnqiyBc3D+k7LsgSBmMDVkOVidaagDG7j3fOym77jNunWRuYx7VSHa9GNfFZh+zh61xsCjRj4JxMZlDqTA=="
}, },
"@actions/glob": { "@actions/glob": {
"version": "0.1.0", "version": "0.1.0",
@@ -1683,9 +1683,9 @@
} }
}, },
"lodash": { "lodash": {
"version": "4.17.15", "version": "4.17.19",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz", "resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.19.tgz",
"integrity": "sha512-8xOcRHvCjnocdS5cpwXQXVzmmh5e5+saE2QGoeQmbKmRS6J3VQppPOIt0MnmE+4xlZoumy0GPG0D0MVIQbNA1A==", "integrity": "sha512-JNvd8XER9GQX0v2qJgsaN/mzFCNA5BRe/j8JN9d+tWyGLSodKQHKFicdwNYzWwI3wjRnaKPsGj1XkBjx/F96DQ==",
"dev": true "dev": true
}, },
"lodash.unescape": { "lodash.unescape": {


@@ -23,5 +23,7 @@
<key>ACTIONS_RUNNER_SVC</key> <key>ACTIONS_RUNNER_SVC</key>
<string>1</string> <string>1</string>
</dict> </dict>
<key>ProcessType</key>
<string>Interactive</string>
</dict> </dict>
</plist> </plist>
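The LaunchAgent definition for the macOS service gains a ProcessType of Interactive. An already-configured service will not pick up the updated plist until it is re-registered; a rough sketch, assuming the runner was set up with the repository's ./svc.sh helper and run from the runner root:

# Re-register the macOS service so launchd reloads the regenerated plist (hypothetical sequence).
./svc.sh stop
./svc.sh uninstall
./svc.sh install
./svc.sh start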


@@ -49,70 +49,68 @@ then
cat /etc/debian_version cat /etc/debian_version
echo "------------------------------" echo "------------------------------"
# prefer apt over apt-get # prefer apt-get over apt
command -v apt
if [ $? -eq 0 ]
then
apt update && apt install -y liblttng-ust0 libkrb5-3 zlib1g
if [ $? -ne 0 ]
then
echo "'apt' failed with exit code '$?'"
print_errormessage
exit 1
fi
# libissl version prefer: libssl1.1 -> libssl1.0.2 -> libssl1.0.0
apt install -y libssl1.1$ || apt install -y libssl1.0.2$ || apt install -y libssl1.0.0$
if [ $? -ne 0 ]
then
echo "'apt' failed with exit code '$?'"
print_errormessage
exit 1
fi
# libicu version prefer: libicu66 -> libicu63 -> libicu60 -> libicu57 -> libicu55 -> libicu52
apt install -y libicu66 || apt install -y libicu63 || apt install -y libicu60 || apt install -y libicu57 || apt install -y libicu55 || apt install -y libicu52
if [ $? -ne 0 ]
then
echo "'apt' failed with exit code '$?'"
print_errormessage
exit 1
fi
else
command -v apt-get command -v apt-get
if [ $? -eq 0 ] if [ $? -eq 0 ]
then then
apt-get update && apt-get install -y liblttng-ust0 libkrb5-3 zlib1g apt_get=apt-get
else
command -v apt
if [ $? -eq 0 ]
then
apt_get=apt
else
echo "Found neither 'apt-get' nor 'apt'"
print_errormessage
exit 1
fi
fi
$apt_get update && $apt_get install -y liblttng-ust0 libkrb5-3 zlib1g
if [ $? -ne 0 ] if [ $? -ne 0 ]
then then
echo "'apt-get' failed with exit code '$?'" echo "'$apt_get' failed with exit code '$?'"
print_errormessage print_errormessage
exit 1 exit 1
fi fi
# libissl version prefer: libssl1.1 -> libssl1.0.2 -> libssl1.0.0 apt_get_with_fallbacks() {
apt-get install -y libssl1.1$ || apt-get install -y libssl1.0.2$ || apt install -y libssl1.0.0$ $apt_get install -y $1
fail=$?
if [ $fail -eq 0 ]
then
if [ "${1#"${1%?}"}" = '$' ]; then
dpkg -l "${1%?}" > /dev/null 2> /dev/null
fail=$?
fi
fi
if [ $fail -ne 0 ]
then
shift
if [ -n "$1" ]
then
apt_get_with_fallbacks "$@"
fi
fi
}
# libssl version prefer: libssl1.1 -> libssl1.0.2 -> libssl1.0.0
apt_get_with_fallbacks libssl1.1$ libssl1.0.2$ libssl1.0.0$
if [ $? -ne 0 ] if [ $? -ne 0 ]
then then
echo "'apt-get' failed with exit code '$?'" echo "'$apt_get' failed with exit code '$?'"
print_errormessage print_errormessage
exit 1 exit 1
fi fi
# libicu version prefer: libicu66 -> libicu63 -> libicu60 -> libicu57 -> libicu55 -> libicu52 # libicu version prefer: libicu66 -> libicu63 -> libicu60 -> libicu57 -> libicu55 -> libicu52
apt-get install -y libicu66 || apt-get install -y libicu63 || apt-get install -y libicu60 || apt install -y libicu57 || apt install -y libicu55 || apt install -y libicu52 apt_get_with_fallbacks libicu66 libicu63 libicu60 libicu57 libicu55 libicu52
if [ $? -ne 0 ] if [ $? -ne 0 ]
then then
echo "'apt-get' failed with exit code '$?'" echo "'$apt_get' failed with exit code '$?'"
print_errormessage print_errormessage
exit 1 exit 1
fi fi
else
echo "Can not find 'apt' or 'apt-get'"
print_errormessage
exit 1
fi
fi
elif [ -e /etc/redhat-release ] elif [ -e /etc/redhat-release ]
then then
echo "The current OS is Fedora based" echo "The current OS is Fedora based"

src/Misc/layoutbin/update.sh.template (file mode changed: Normal file → Executable file)

@@ -28,13 +28,13 @@ date "+[%F %T-%4N] Waiting for $runnerprocessname ($runnerpid) to complete" >> "
while [ -e /proc/$runnerpid ] while [ -e /proc/$runnerpid ]
do do
date "+[%F %T-%4N] Process $runnerpid still running" >> "$logfile" 2>&1 date "+[%F %T-%4N] Process $runnerpid still running" >> "$logfile" 2>&1
ping -c 2 127.0.0.1 >nul sleep 2
done done
date "+[%F %T-%4N] Process $runnerpid finished running" >> "$logfile" 2>&1 date "+[%F %T-%4N] Process $runnerpid finished running" >> "$logfile" 2>&1
# start re-organize folders # start re-organize folders
date "+[%F %T-%4N] Sleep 1 more second to make sure process exited" >> "$logfile" 2>&1 date "+[%F %T-%4N] Sleep 1 more second to make sure process exited" >> "$logfile" 2>&1
ping -c 2 127.0.0.1 >nul sleep 1
# the folder structure under runner root will be # the folder structure under runner root will be
# ./bin -> bin.2.100.0 (junction folder) # ./bin -> bin.2.100.0 (junction folder)


@@ -18,24 +18,26 @@ then
exit 1 exit 1
fi fi
message="Execute sudo ./bin/installdependencies.sh to install any missing Dotnet Core 3.0 dependencies."
ldd ./bin/libcoreclr.so | grep 'not found' ldd ./bin/libcoreclr.so | grep 'not found'
if [ $? -eq 0 ]; then if [ $? -eq 0 ]; then
echo "Dependencies is missing for Dotnet Core 3.0" echo "Dependencies is missing for Dotnet Core 3.0"
echo "Execute ./bin/installdependencies.sh to install any missing Dotnet Core 3.0 dependencies." echo $message
exit 1 exit 1
fi fi
ldd ./bin/System.Security.Cryptography.Native.OpenSsl.so | grep 'not found' ldd ./bin/System.Security.Cryptography.Native.OpenSsl.so | grep 'not found'
if [ $? -eq 0 ]; then if [ $? -eq 0 ]; then
echo "Dependencies is missing for Dotnet Core 3.0" echo "Dependencies is missing for Dotnet Core 3.0"
echo "Execute ./bin/installdependencies.sh to install any missing Dotnet Core 3.0 dependencies." echo $message
exit 1 exit 1
fi fi
ldd ./bin/System.IO.Compression.Native.so | grep 'not found' ldd ./bin/System.IO.Compression.Native.so | grep 'not found'
if [ $? -eq 0 ]; then if [ $? -eq 0 ]; then
echo "Dependencies is missing for Dotnet Core 3.0" echo "Dependencies is missing for Dotnet Core 3.0"
echo "Execute ./bin/installdependencies.sh to install any missing Dotnet Core 3.0 dependencies." echo $message
exit 1 exit 1
fi fi
@@ -53,7 +55,7 @@ then
$LDCONFIG_COMMAND -NXv ${libpath//:/ } 2>&1 | grep libicu >/dev/null 2>&1 $LDCONFIG_COMMAND -NXv ${libpath//:/ } 2>&1 | grep libicu >/dev/null 2>&1
if [ $? -ne 0 ]; then if [ $? -ne 0 ]; then
echo "Libicu's dependencies is missing for Dotnet Core 3.0" echo "Libicu's dependencies is missing for Dotnet Core 3.0"
echo "Execute ./bin/installdependencies.sh to install any missing Dotnet Core 3.0 dependencies." echo $message
exit 1 exit 1
fi fi
fi fi
@@ -67,7 +69,7 @@ while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symli
[[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located [[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located
done done
DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )" DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
cd $DIR cd "$DIR"
source ./env.sh source ./env.sh
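The three ldd probes above now share a single $message string instead of repeating the instruction. The same check can be run by hand from the runner root; a small sketch of what the launcher script verifies before it starts:

# Probe the runner's native libraries for unresolved dependencies (mirrors the checks above).
for lib in ./bin/libcoreclr.so ./bin/System.Security.Cryptography.Native.OpenSsl.so ./bin/System.IO.Compression.Native.so; do
    if ldd "$lib" | grep -q 'not found'; then
        echo "Missing native dependencies for $lib"
        echo "Execute sudo ./bin/installdependencies.sh to install any missing Dotnet Core 3.0 dependencies."
    fi
done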


@@ -90,7 +90,7 @@ namespace GitHub.Runner.Common
public static readonly string Labels = "labels"; public static readonly string Labels = "labels";
public static readonly string MonitorSocketAddress = "monitorsocketaddress"; public static readonly string MonitorSocketAddress = "monitorsocketaddress";
public static readonly string Name = "name"; public static readonly string Name = "name";
public static readonly string Pool = "pool"; public static readonly string RunnerGroup = "runnergroup";
public static readonly string StartupType = "startuptype"; public static readonly string StartupType = "startuptype";
public static readonly string Url = "url"; public static readonly string Url = "url";
public static readonly string UserName = "username"; public static readonly string UserName = "username";
@@ -140,6 +140,8 @@ namespace GitHub.Runner.Common
public static readonly string InternalTelemetryIssueDataKey = "_internal_telemetry"; public static readonly string InternalTelemetryIssueDataKey = "_internal_telemetry";
public static readonly string WorkerCrash = "WORKER_CRASH"; public static readonly string WorkerCrash = "WORKER_CRASH";
public static readonly string UnsupportedCommand = "UNSUPPORTED_COMMAND";
public static readonly string UnsupportedCommandMessageDisabled = "The `{0}` command is disabled. Please upgrade to using Environment Files or opt into unsecure command execution by setting the `ACTIONS_ALLOW_UNSECURE_COMMANDS` environment variable to `true`. For more information see: https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/";
} }
public static class RunnerEvent public static class RunnerEvent
@@ -198,6 +200,7 @@ namespace GitHub.Runner.Common
// //
// Keep alphabetical // Keep alphabetical
// //
public static readonly string AllowUnsupportedCommands = "ACTIONS_ALLOW_UNSECURE_COMMANDS";
public static readonly string RunnerDebug = "ACTIONS_RUNNER_DEBUG"; public static readonly string RunnerDebug = "ACTIONS_RUNNER_DEBUG";
public static readonly string StepDebug = "ACTIONS_STEP_DEBUG"; public static readonly string StepDebug = "ACTIONS_STEP_DEBUG";
} }
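UnsupportedCommandMessageDisabled points workflow authors at Environment Files as the replacement for the disabled set-env and add-path commands, with ACTIONS_ALLOW_UNSECURE_COMMANDS as a temporary opt-out. A short sketch of the migration inside a workflow step (variable and path values are placeholders):

# Instead of the deprecated workflow commands:
#   echo "::set-env name=MY_VAR::some-value"
#   echo "::add-path::$HOME/.local/bin"
# append to the Environment Files read by the new file command handlers (SetEnvFileCommand / AddPathFileCommand below):
echo "MY_VAR=some-value" >> "$GITHUB_ENV"
echo "$HOME/.local/bin" >> "$GITHUB_PATH"
# Temporary escape hatch while migrating: set ACTIONS_ALLOW_UNSECURE_COMMANDS=true in the job or step environment.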


@@ -56,6 +56,10 @@ namespace GitHub.Runner.Common
Add<T>(extensions, "GitHub.Runner.Worker.EndGroupCommandExtension, Runner.Worker"); Add<T>(extensions, "GitHub.Runner.Worker.EndGroupCommandExtension, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.EchoCommandExtension, Runner.Worker"); Add<T>(extensions, "GitHub.Runner.Worker.EchoCommandExtension, Runner.Worker");
break; break;
case "GitHub.Runner.Worker.IFileCommandExtension":
Add<T>(extensions, "GitHub.Runner.Worker.AddPathFileCommand, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.SetEnvFileCommand, Runner.Worker");
break;
default: default:
// This should never happen. // This should never happen.
throw new NotSupportedException($"Unexpected extension type: '{typeof(T).FullName}'"); throw new NotSupportedException($"Unexpected extension type: '{typeof(T).FullName}'");


@@ -16,6 +16,7 @@ namespace GitHub.Runner.Common
// logging and console // logging and console
Task<TaskLog> AppendLogContentAsync(Guid scopeIdentifier, string hubName, Guid planId, int logId, Stream uploadStream, CancellationToken cancellationToken); Task<TaskLog> AppendLogContentAsync(Guid scopeIdentifier, string hubName, Guid planId, int logId, Stream uploadStream, CancellationToken cancellationToken);
Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, CancellationToken cancellationToken); Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, CancellationToken cancellationToken);
Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long startLine, CancellationToken cancellationToken);
Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, String type, String name, Stream uploadStream, CancellationToken cancellationToken); Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, String type, String name, Stream uploadStream, CancellationToken cancellationToken);
Task<TaskLog> CreateLogAsync(Guid scopeIdentifier, string hubName, Guid planId, TaskLog log, CancellationToken cancellationToken); Task<TaskLog> CreateLogAsync(Guid scopeIdentifier, string hubName, Guid planId, TaskLog log, CancellationToken cancellationToken);
Task<Timeline> CreateTimelineAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, CancellationToken cancellationToken); Task<Timeline> CreateTimelineAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, CancellationToken cancellationToken);
@@ -79,6 +80,12 @@ namespace GitHub.Runner.Common
return _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, cancellationToken: cancellationToken); return _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, cancellationToken: cancellationToken);
} }
public Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long startLine, CancellationToken cancellationToken)
{
CheckConnection();
return _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, startLine, cancellationToken: cancellationToken);
}
public Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, string type, string name, Stream uploadStream, CancellationToken cancellationToken) public Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, string type, string name, Stream uploadStream, CancellationToken cancellationToken)
{ {
CheckConnection(); CheckConnection();


@@ -18,7 +18,7 @@ namespace GitHub.Runner.Common
event EventHandler<ThrottlingEventArgs> JobServerQueueThrottling; event EventHandler<ThrottlingEventArgs> JobServerQueueThrottling;
Task ShutdownAsync(); Task ShutdownAsync();
void Start(Pipelines.AgentJobRequestMessage jobRequest); void Start(Pipelines.AgentJobRequestMessage jobRequest);
void QueueWebConsoleLine(Guid stepRecordId, string line); void QueueWebConsoleLine(Guid stepRecordId, string line, long? lineNumber = null);
void QueueFileUpload(Guid timelineId, Guid timelineRecordId, string type, string name, string path, bool deleteSource); void QueueFileUpload(Guid timelineId, Guid timelineRecordId, string type, string name, string path, bool deleteSource);
void QueueTimelineRecordUpdate(Guid timelineId, TimelineRecord timelineRecord); void QueueTimelineRecordUpdate(Guid timelineId, TimelineRecord timelineRecord);
} }
@@ -155,10 +155,10 @@ namespace GitHub.Runner.Common
Trace.Info("All queue process tasks have been stopped, and all queues are drained."); Trace.Info("All queue process tasks have been stopped, and all queues are drained.");
} }
public void QueueWebConsoleLine(Guid stepRecordId, string line) public void QueueWebConsoleLine(Guid stepRecordId, string line, long? lineNumber)
{ {
Trace.Verbose("Enqueue web console line queue: {0}", line); Trace.Verbose("Enqueue web console line queue: {0}", line);
_webConsoleLineQueue.Enqueue(new ConsoleLineInfo(stepRecordId, line)); _webConsoleLineQueue.Enqueue(new ConsoleLineInfo(stepRecordId, line, lineNumber));
} }
public void QueueFileUpload(Guid timelineId, Guid timelineRecordId, string type, string name, string path, bool deleteSource) public void QueueFileUpload(Guid timelineId, Guid timelineRecordId, string type, string name, string path, bool deleteSource)
@@ -214,7 +214,7 @@ namespace GitHub.Runner.Common
} }
// Group consolelines by timeline record of each step // Group consolelines by timeline record of each step
Dictionary<Guid, List<string>> stepsConsoleLines = new Dictionary<Guid, List<string>>(); Dictionary<Guid, List<TimelineRecordLogLine>> stepsConsoleLines = new Dictionary<Guid, List<TimelineRecordLogLine>>();
List<Guid> stepRecordIds = new List<Guid>(); // We need to keep lines in order List<Guid> stepRecordIds = new List<Guid>(); // We need to keep lines in order
int linesCounter = 0; int linesCounter = 0;
ConsoleLineInfo lineInfo; ConsoleLineInfo lineInfo;
@@ -222,7 +222,7 @@ namespace GitHub.Runner.Common
{ {
if (!stepsConsoleLines.ContainsKey(lineInfo.StepRecordId)) if (!stepsConsoleLines.ContainsKey(lineInfo.StepRecordId))
{ {
stepsConsoleLines[lineInfo.StepRecordId] = new List<string>(); stepsConsoleLines[lineInfo.StepRecordId] = new List<TimelineRecordLogLine>();
stepRecordIds.Add(lineInfo.StepRecordId); stepRecordIds.Add(lineInfo.StepRecordId);
} }
@@ -232,7 +232,7 @@ namespace GitHub.Runner.Common
lineInfo.Line = $"{lineInfo.Line.Substring(0, 1024)}..."; lineInfo.Line = $"{lineInfo.Line.Substring(0, 1024)}...";
} }
stepsConsoleLines[lineInfo.StepRecordId].Add(lineInfo.Line); stepsConsoleLines[lineInfo.StepRecordId].Add(new TimelineRecordLogLine(lineInfo.Line, lineInfo.LineNumber));
linesCounter++; linesCounter++;
// process at most about 500 lines of web console line during regular timer dequeue task. // process at most about 500 lines of web console line during regular timer dequeue task.
@@ -247,13 +247,13 @@ namespace GitHub.Runner.Common
{ {
// Split consolelines into batch, each batch will container at most 100 lines. // Split consolelines into batch, each batch will container at most 100 lines.
int batchCounter = 0; int batchCounter = 0;
List<List<string>> batchedLines = new List<List<string>>(); List<List<TimelineRecordLogLine>> batchedLines = new List<List<TimelineRecordLogLine>>();
foreach (var line in stepsConsoleLines[stepRecordId]) foreach (var line in stepsConsoleLines[stepRecordId])
{ {
var currentBatch = batchedLines.ElementAtOrDefault(batchCounter); var currentBatch = batchedLines.ElementAtOrDefault(batchCounter);
if (currentBatch == null) if (currentBatch == null)
{ {
batchedLines.Add(new List<string>()); batchedLines.Add(new List<TimelineRecordLogLine>());
currentBatch = batchedLines.ElementAt(batchCounter); currentBatch = batchedLines.ElementAt(batchCounter);
} }
@@ -275,7 +275,6 @@ namespace GitHub.Runner.Common
{ {
Trace.Info($"Skip {batchedLines.Count - 2} batches web console lines for last run"); Trace.Info($"Skip {batchedLines.Count - 2} batches web console lines for last run");
batchedLines = batchedLines.TakeLast(2).ToList(); batchedLines = batchedLines.TakeLast(2).ToList();
batchedLines[0].Insert(0, "...");
} }
int errorCount = 0; int errorCount = 0;
@@ -284,7 +283,15 @@ namespace GitHub.Runner.Common
try try
{ {
// we will not requeue failed batch, since the web console lines are time sensitive. // we will not requeue failed batch, since the web console lines are time sensitive.
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch, default(CancellationToken)); if (batch[0].LineNumber.HasValue)
{
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch.Select(logLine => logLine.Line).ToList(), batch[0].LineNumber.Value, default(CancellationToken));
}
else
{
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch.Select(logLine => logLine.Line).ToList(), default(CancellationToken));
}
if (_firstConsoleOutputs) if (_firstConsoleOutputs)
{ {
HostContext.WritePerfCounter($"WorkerJobServerQueueAppendFirstConsoleOutput_{_planId.ToString()}"); HostContext.WritePerfCounter($"WorkerJobServerQueueAppendFirstConsoleOutput_{_planId.ToString()}");
@@ -653,13 +660,15 @@ namespace GitHub.Runner.Common
internal class ConsoleLineInfo internal class ConsoleLineInfo
{ {
public ConsoleLineInfo(Guid recordId, string line) public ConsoleLineInfo(Guid recordId, string line, long? lineNumber)
{ {
this.StepRecordId = recordId; this.StepRecordId = recordId;
this.Line = line; this.Line = line;
this.LineNumber = lineNumber;
} }
public Guid StepRecordId { get; set; } public Guid StepRecordId { get; set; }
public string Line { get; set; } public string Line { get; set; }
public long? LineNumber { get; set; }
} }
} }


@@ -0,0 +1,51 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Runner.Sdk;
using GitHub.Runner.Common;
namespace GitHub.Runner.Common.Util
{
public static class EncodingUtil
{
public static async Task SetEncoding(IHostContext hostContext, Tracing trace, CancellationToken cancellationToken)
{
#if OS_WINDOWS
try
{
if (Console.InputEncoding.CodePage != 65001)
{
using (var p = hostContext.CreateService<IProcessInvoker>())
{
// Use UTF8 code page
int exitCode = await p.ExecuteAsync(workingDirectory: hostContext.GetDirectory(WellKnownDirectory.Work),
fileName: WhichUtil.Which("chcp", true, trace),
arguments: "65001",
environment: null,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: false,
redirectStandardIn: null,
inheritConsoleHandler: true,
cancellationToken: cancellationToken);
if (exitCode == 0)
{
trace.Info("Successfully returned to code page 65001 (UTF8)");
}
else
{
trace.Warning($"'chcp 65001' failed with exit code {exitCode}");
}
}
}
}
catch (Exception ex)
{
trace.Warning($"'chcp 65001' failed with exception {ex.Message}");
}
#endif
// Dummy variable to prevent compiler error CS1998: "This async method lacks 'await' operators and will run synchronously..."
await Task.CompletedTask;
}
}
}


@@ -42,7 +42,7 @@ namespace GitHub.Runner.Listener
Constants.Runner.CommandLine.Args.Labels, Constants.Runner.CommandLine.Args.Labels,
Constants.Runner.CommandLine.Args.MonitorSocketAddress, Constants.Runner.CommandLine.Args.MonitorSocketAddress,
Constants.Runner.CommandLine.Args.Name, Constants.Runner.CommandLine.Args.Name,
Constants.Runner.CommandLine.Args.Pool, Constants.Runner.CommandLine.Args.RunnerGroup,
Constants.Runner.CommandLine.Args.StartupType, Constants.Runner.CommandLine.Args.StartupType,
Constants.Runner.CommandLine.Args.Token, Constants.Runner.CommandLine.Args.Token,
Constants.Runner.CommandLine.Args.Url, Constants.Runner.CommandLine.Args.Url,
@@ -169,6 +169,15 @@ namespace GitHub.Runner.Listener
validator: Validators.NonEmptyValidator); validator: Validators.NonEmptyValidator);
} }
public string GetRunnerGroupName(string defaultPoolName = null)
{
return GetArgOrPrompt(
name: Constants.Runner.CommandLine.Args.RunnerGroup,
description: "Enter the name of the runner group to add this runner to:",
defaultValue: defaultPoolName ?? "default",
validator: Validators.NonEmptyValidator);
}
public string GetToken() public string GetToken()
{ {
return GetArgOrPrompt( return GetArgOrPrompt(
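GetRunnerGroupName exposes the new "runnergroup" command-line argument and otherwise prompts with the default group preselected. A hedged sketch of a registration that passes it alongside the existing url/token/name/labels arguments; all values below are placeholders, and the flag name simply mirrors the constant added above:

# Register a runner into a specific self-hosted runner group (hypothetical org URL, token and names).
./config.sh --url https://github.com/your-org \
    --token <REGISTRATION_TOKEN> \
    --name build-runner-01 \
    --labels self-hosted,linux,x64 \
    --runnergroup "my-runner-group"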


@@ -159,17 +159,34 @@ namespace GitHub.Runner.Listener.Configuration
_term.WriteSection("Runner Registration"); _term.WriteSection("Runner Registration");
//Get all the agent pools, and select the first private pool // If we have more than one runner group available, allow the user to specify which one to be added into
string poolName = null;
TaskAgentPool agentPool = null;
List<TaskAgentPool> agentPools = await _runnerServer.GetAgentPoolsAsync(); List<TaskAgentPool> agentPools = await _runnerServer.GetAgentPoolsAsync();
TaskAgentPool agentPool = agentPools?.Where(x => x.IsHosted == false).FirstOrDefault(); TaskAgentPool defaultPool = agentPools?.Where(x => x.IsInternal).FirstOrDefault();
if (agentPool == null) if (agentPools?.Where(x => !x.IsHosted).Count() > 1)
{ {
throw new TaskAgentPoolNotFoundException($"Could not find any private pool. Contact support."); poolName = command.GetRunnerGroupName(defaultPool?.Name);
_term.WriteLine();
agentPool = agentPools.Where(x => string.Equals(poolName, x.Name, StringComparison.OrdinalIgnoreCase) && !x.IsHosted).FirstOrDefault();
} }
else else
{ {
Trace.Info("Found a private pool with id {1} and name {2}", agentPool.Id, agentPool.Name); agentPool = defaultPool;
}
if (agentPool == null && poolName == null)
{
throw new TaskAgentPoolNotFoundException($"Could not find any self-hosted runner groups. Contact support.");
}
else if (agentPool == null && poolName != null)
{
throw new TaskAgentPoolNotFoundException($"Could not find any self-hosted runner group named \"{poolName}\".");
}
else
{
Trace.Info("Found a self-hosted runner group with id {1} and name {2}", agentPool.Id, agentPool.Name);
runnerSettings.PoolId = agentPool.Id; runnerSettings.PoolId = agentPool.Id;
runnerSettings.PoolName = agentPool.Name; runnerSettings.PoolName = agentPool.Name;
} }
@@ -246,6 +263,7 @@ namespace GitHub.Runner.Listener.Configuration
{ {
{ "clientId", agent.Authorization.ClientId.ToString("D") }, { "clientId", agent.Authorization.ClientId.ToString("D") },
{ "authorizationUrl", agent.Authorization.AuthorizationUrl.AbsoluteUri }, { "authorizationUrl", agent.Authorization.AuthorizationUrl.AbsoluteUri },
{ "requireFipsCryptography", agent.Properties.GetValue("RequireFipsCryptography", false).ToString() }
}, },
}; };


@@ -20,7 +20,7 @@ namespace GitHub.Runner.Listener.Configuration
/// key is returned to the caller. /// key is returned to the caller.
/// </summary> /// </summary>
/// <returns>An <c>RSACryptoServiceProvider</c> instance representing the key for the runner</returns> /// <returns>An <c>RSACryptoServiceProvider</c> instance representing the key for the runner</returns>
RSACryptoServiceProvider CreateKey(); RSA CreateKey();
/// <summary> /// <summary>
/// Deletes the RSA key managed by the key manager. /// Deletes the RSA key managed by the key manager.
@@ -32,7 +32,7 @@ namespace GitHub.Runner.Listener.Configuration
/// </summary> /// </summary>
/// <returns>An <c>RSACryptoServiceProvider</c> instance representing the key for the runner</returns> /// <returns>An <c>RSACryptoServiceProvider</c> instance representing the key for the runner</returns>
/// <exception cref="CryptographicException">No key exists in the store</exception> /// <exception cref="CryptographicException">No key exists in the store</exception>
RSACryptoServiceProvider GetKey(); RSA GetKey();
} }
// Newtonsoft 10 is not working properly with dotnet RSAParameters class // Newtonsoft 10 is not working properly with dotnet RSAParameters class

View File

@@ -36,7 +36,7 @@ namespace GitHub.Runner.Listener.Configuration
// We expect the key to be in the machine store at this point. Configuration should have set all of // We expect the key to be in the machine store at this point. Configuration should have set all of
// this up correctly so we can use the key to generate access tokens. // this up correctly so we can use the key to generate access tokens.
var keyManager = context.GetService<IRSAKeyManager>(); var keyManager = context.GetService<IRSAKeyManager>();
var signingCredentials = VssSigningCredentials.Create(() => keyManager.GetKey()); var signingCredentials = VssSigningCredentials.Create(() => keyManager.GetKey(), StringUtil.ConvertToBoolean(CredentialData.Data.GetValueOrDefault("requireFipsCryptography"), false));
var clientCredential = new VssOAuthJwtBearerClientCredential(clientId, authorizationUrl, signingCredentials); var clientCredential = new VssOAuthJwtBearerClientCredential(clientId, authorizationUrl, signingCredentials);
var agentCredential = new VssOAuthCredential(new Uri(oauthEndpointUrl, UriKind.Absolute), VssOAuthGrant.ClientCredentials, clientCredential); var agentCredential = new VssOAuthCredential(new Uri(oauthEndpointUrl, UriKind.Absolute), VssOAuthGrant.ClientCredentials, clientCredential);

View File

@@ -13,14 +13,14 @@ namespace GitHub.Runner.Listener.Configuration
private string _keyFile; private string _keyFile;
private IHostContext _context; private IHostContext _context;
public RSACryptoServiceProvider CreateKey() public RSA CreateKey()
{ {
RSACryptoServiceProvider rsa = null; RSA rsa = null;
if (!File.Exists(_keyFile)) if (!File.Exists(_keyFile))
{ {
Trace.Info("Creating new RSA key using 2048-bit key length"); Trace.Info("Creating new RSA key using 2048-bit key length");
rsa = new RSACryptoServiceProvider(2048); rsa = RSA.Create(2048);
// Now write the parameters to disk // Now write the parameters to disk
SaveParameters(rsa.ExportParameters(true)); SaveParameters(rsa.ExportParameters(true));
@@ -30,7 +30,7 @@ namespace GitHub.Runner.Listener.Configuration
{ {
Trace.Info("Found existing RSA key parameters file {0}", _keyFile); Trace.Info("Found existing RSA key parameters file {0}", _keyFile);
rsa = new RSACryptoServiceProvider(); rsa = RSA.Create();
rsa.ImportParameters(LoadParameters()); rsa.ImportParameters(LoadParameters());
} }
@@ -46,7 +46,7 @@ namespace GitHub.Runner.Listener.Configuration
} }
} }
public RSACryptoServiceProvider GetKey() public RSA GetKey()
{ {
if (!File.Exists(_keyFile)) if (!File.Exists(_keyFile))
{ {
@@ -55,7 +55,7 @@ namespace GitHub.Runner.Listener.Configuration
Trace.Info("Loading RSA key parameters from file {0}", _keyFile); Trace.Info("Loading RSA key parameters from file {0}", _keyFile);
var rsa = new RSACryptoServiceProvider(); var rsa = RSA.Create();
rsa.ImportParameters(LoadParameters()); rsa.ImportParameters(LoadParameters());
return rsa; return rsa;
} }

View File
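
The key managers above switch from RSACryptoServiceProvider (Windows-only CAPI) to the cross-platform RSA.Create factory. A minimal sketch of that pattern, persisting the key between runs; the PKCS#1 export helpers below are a simplification I am assuming for brevity, whereas the runner itself serializes RSAParameters through its own wrapper type:

using System.IO;
using System.Security.Cryptography;

static class RsaKeyStore
{
    // Create a 2048-bit key on first use, otherwise reload it from disk.
    public static RSA GetOrCreate(string keyFile)
    {
        var rsa = RSA.Create(2048);
        if (File.Exists(keyFile))
        {
            // Re-import the previously saved private key.
            rsa.ImportRSAPrivateKey(File.ReadAllBytes(keyFile), out _);
        }
        else
        {
            // First run: persist the private key so later sessions reuse it.
            File.WriteAllBytes(keyFile, rsa.ExportRSAPrivateKey());
        }
        return rsa;
    }
}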

@@ -14,14 +14,14 @@ namespace GitHub.Runner.Listener.Configuration
private string _keyFile; private string _keyFile;
private IHostContext _context; private IHostContext _context;
public RSACryptoServiceProvider CreateKey() public RSA CreateKey()
{ {
RSACryptoServiceProvider rsa = null; RSA rsa = null;
if (!File.Exists(_keyFile)) if (!File.Exists(_keyFile))
{ {
Trace.Info("Creating new RSA key using 2048-bit key length"); Trace.Info("Creating new RSA key using 2048-bit key length");
rsa = new RSACryptoServiceProvider(2048); rsa = RSA.Create(2048);
// Now write the parameters to disk // Now write the parameters to disk
IOUtil.SaveObject(new RSAParametersSerializable(rsa.ExportParameters(true)), _keyFile); IOUtil.SaveObject(new RSAParametersSerializable(rsa.ExportParameters(true)), _keyFile);
@@ -54,7 +54,7 @@ namespace GitHub.Runner.Listener.Configuration
{ {
Trace.Info("Found existing RSA key parameters file {0}", _keyFile); Trace.Info("Found existing RSA key parameters file {0}", _keyFile);
rsa = new RSACryptoServiceProvider(); rsa = RSA.Create();
rsa.ImportParameters(IOUtil.LoadObject<RSAParametersSerializable>(_keyFile).RSAParameters); rsa.ImportParameters(IOUtil.LoadObject<RSAParametersSerializable>(_keyFile).RSAParameters);
} }
@@ -70,7 +70,7 @@ namespace GitHub.Runner.Listener.Configuration
} }
} }
public RSACryptoServiceProvider GetKey() public RSA GetKey()
{ {
if (!File.Exists(_keyFile)) if (!File.Exists(_keyFile))
{ {
@@ -80,7 +80,7 @@ namespace GitHub.Runner.Listener.Configuration
Trace.Info("Loading RSA key parameters from file {0}", _keyFile); Trace.Info("Loading RSA key parameters from file {0}", _keyFile);
var parameters = IOUtil.LoadObject<RSAParametersSerializable>(_keyFile).RSAParameters; var parameters = IOUtil.LoadObject<RSAParametersSerializable>(_keyFile).RSAParameters;
var rsa = new RSACryptoServiceProvider(); var rsa = RSA.Create();
rsa.ImportParameters(parameters); rsa.ImportParameters(parameters);
return rsa; return rsa;
} }

View File

@@ -319,7 +319,8 @@ namespace GitHub.Runner.Listener
var keyManager = HostContext.GetService<IRSAKeyManager>(); var keyManager = HostContext.GetService<IRSAKeyManager>();
using (var rsa = keyManager.GetKey()) using (var rsa = keyManager.GetKey())
{ {
-                    return aes.CreateDecryptor(rsa.Decrypt(_session.EncryptionKey.Value, RSAEncryptionPadding.OaepSHA1), message.IV);
+                    var padding = _session.UseFipsEncryption ? RSAEncryptionPadding.OaepSHA256 : RSAEncryptionPadding.OaepSHA1;
+                    return aes.CreateDecryptor(rsa.Decrypt(_session.EncryptionKey.Value, padding), message.IV);
} }
} }
else else

View File
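
A minimal sketch of the padding selection shown above, assuming a boolean FIPS flag on the session: OAEP-SHA256 is used when FIPS-compliant crypto is required, otherwise OAEP-SHA1, and the decrypted value seeds the AES decryptor for the message body:

using System.Security.Cryptography;

static class MessageDecryption
{
    // encryptedKey: the AES session key, RSA-encrypted by the service.
    // iv: the AES initialization vector carried on the message.
    public static ICryptoTransform CreateDecryptor(RSA rsa, byte[] encryptedKey, byte[] iv, bool useFips)
    {
        var padding = useFips ? RSAEncryptionPadding.OaepSHA256 : RSAEncryptionPadding.OaepSHA1;
        using var aes = Aes.Create();
        return aes.CreateDecryptor(rsa.Decrypt(encryptedKey, padding), iv);
    }
}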

@@ -466,6 +466,7 @@ Config Options:
--url string Repository to add the runner to. Required if unattended --url string Repository to add the runner to. Required if unattended
--token string Registration token. Required if unattended --token string Registration token. Required if unattended
--name string Name of the runner to configure (default {Environment.MachineName ?? "myrunner"}) --name string Name of the runner to configure (default {Environment.MachineName ?? "myrunner"})
--runnergroup string Name of the runner group to add this runner to (defaults to the default runner group)
--labels string Extra labels in addition to the default: 'self-hosted,{Constants.Runner.Platform},{Constants.Runner.PlatformArchitecture}' --labels string Extra labels in addition to the default: 'self-hosted,{Constants.Runner.Platform},{Constants.Runner.PlatformArchitecture}'
--work string Relative runner work directory (default {Constants.Path.WorkDirectory}) --work string Relative runner work directory (default {Constants.Path.WorkDirectory})
--replace Replace any existing runner with the same name (default false)"); --replace Replace any existing runner with the same name (default false)");

View File

@@ -1,4 +1,4 @@
using System; using System;
using System.Collections.Generic; using System.Collections.Generic;
using System.Net; using System.Net;
using System.Text.RegularExpressions; using System.Text.RegularExpressions;
@@ -71,7 +71,7 @@ namespace GitHub.Runner.Sdk
if (!string.IsNullOrEmpty(httpProxyAddress) && Uri.TryCreate(httpProxyAddress, UriKind.Absolute, out var proxyHttpUri)) if (!string.IsNullOrEmpty(httpProxyAddress) && Uri.TryCreate(httpProxyAddress, UriKind.Absolute, out var proxyHttpUri))
{ {
_httpProxyAddress = proxyHttpUri.AbsoluteUri; _httpProxyAddress = proxyHttpUri.OriginalString;
// Set both environment variables since some tools support both casings (curl, wget) while others support only one (docker)
Environment.SetEnvironmentVariable("HTTP_PROXY", _httpProxyAddress); Environment.SetEnvironmentVariable("HTTP_PROXY", _httpProxyAddress);
@@ -101,7 +101,7 @@ namespace GitHub.Runner.Sdk
if (!string.IsNullOrEmpty(httpsProxyAddress) && Uri.TryCreate(httpsProxyAddress, UriKind.Absolute, out var proxyHttpsUri)) if (!string.IsNullOrEmpty(httpsProxyAddress) && Uri.TryCreate(httpsProxyAddress, UriKind.Absolute, out var proxyHttpsUri))
{ {
_httpsProxyAddress = proxyHttpsUri.AbsoluteUri; _httpsProxyAddress = proxyHttpsUri.OriginalString;
// Set both environment variables since some tools support both casings (curl, wget) while others support only one (docker)
Environment.SetEnvironmentVariable("HTTPS_PROXY", _httpsProxyAddress); Environment.SetEnvironmentVariable("HTTPS_PROXY", _httpsProxyAddress);

View File
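
The switch from AbsoluteUri to OriginalString matters because System.Uri normalizes the address (for example by appending a trailing slash to a host-only URL), so the exported proxy variables stopped matching what the user configured. A small illustration, assuming a proxy address read from configuration:

using System;

class ProxyEnvExample
{
    static void Main()
    {
        var configured = "http://proxy.local:8080";
        if (Uri.TryCreate(configured, UriKind.Absolute, out var uri))
        {
            Console.WriteLine(uri.AbsoluteUri);      // http://proxy.local:8080/  (normalized, trailing slash added)
            Console.WriteLine(uri.OriginalString);   // http://proxy.local:8080   (exactly what was supplied)

            // Exporting the original string means curl, wget and docker all see
            // the value the user actually set.
            Environment.SetEnvironmentVariable("HTTP_PROXY", uri.OriginalString);
        }
    }
}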

@@ -30,7 +30,7 @@ namespace GitHub.Runner.Sdk
// //
// For example, on an en-US box, this is required for loading the encoding for the // For example, on an en-US box, this is required for loading the encoding for the
// default console output code page '437'. Without loading the correct encoding for // default console output code page '437'. Without loading the correct encoding for
// code page IBM437, some characters cannot be translated correctly, e.g. write 'ç' // code page IBM437, some characters cannot be translated correctly, e.g. write 'ç'
// from powershell.exe. // from powershell.exe.
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance); Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
#endif #endif

View File

@@ -1,4 +1,5 @@
using GitHub.DistributedTask.Pipelines; using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi; using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util; using GitHub.Runner.Common.Util;
using GitHub.Runner.Worker.Container; using GitHub.Runner.Worker.Container;
@@ -183,12 +184,49 @@ namespace GitHub.Runner.Worker
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container) public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{ {
var allowUnsecureCommands = false;
bool.TryParse(Environment.GetEnvironmentVariable(Constants.Variables.Actions.AllowUnsupportedCommands), out allowUnsecureCommands);
// Apply environment from env context, env context contains job level env and action's env block
#if OS_WINDOWS
var envContext = context.ExpressionValues["env"] as DictionaryContextData;
#else
var envContext = context.ExpressionValues["env"] as CaseSensitiveDictionaryContextData;
#endif
if (!allowUnsecureCommands && envContext.ContainsKey(Constants.Variables.Actions.AllowUnsupportedCommands))
{
bool.TryParse(envContext[Constants.Variables.Actions.AllowUnsupportedCommands].ToString(), out allowUnsecureCommands);
}
if (!allowUnsecureCommands)
{
throw new Exception(String.Format(Constants.Runner.UnsupportedCommandMessageDisabled, this.Command));
}
if (!command.Properties.TryGetValue(SetEnvCommandProperties.Name, out string envName) || string.IsNullOrEmpty(envName)) if (!command.Properties.TryGetValue(SetEnvCommandProperties.Name, out string envName) || string.IsNullOrEmpty(envName))
{ {
throw new Exception("Required field 'name' is missing in ##[set-env] command."); throw new Exception("Required field 'name' is missing in ##[set-env] command.");
} }
context.EnvironmentVariables[envName] = command.Data;
foreach (var blocked in _setEnvBlockList)
{
if (string.Equals(blocked, envName, StringComparison.OrdinalIgnoreCase))
{
// Log Telemetry and let user know they shouldn't do this
var issue = new Issue()
{
Type = IssueType.Error,
Message = $"Can't update {blocked} environment variable using ::set-env:: command."
};
issue.Data[Constants.Runner.InternalTelemetryIssueDataKey] = $"{Constants.Runner.UnsupportedCommand}_{envName}";
context.AddIssue(issue);
return;
}
}
context.Global.EnvironmentVariables[envName] = command.Data;
context.SetEnvContext(envName, command.Data); context.SetEnvContext(envName, command.Data);
context.Debug($"{envName}='{command.Data}'"); context.Debug($"{envName}='{command.Data}'");
} }
@@ -197,6 +235,11 @@ namespace GitHub.Runner.Worker
{ {
public const String Name = "name"; public const String Name = "name";
} }
private string[] _setEnvBlockList =
{
"NODE_OPTIONS"
};
} }
public sealed class SetOutputCommandExtension : RunnerService, IActionCommandExtension public sealed class SetOutputCommandExtension : RunnerService, IActionCommandExtension
@@ -282,9 +325,28 @@ namespace GitHub.Runner.Worker
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container) public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{ {
var allowUnsecureCommands = false;
bool.TryParse(Environment.GetEnvironmentVariable(Constants.Variables.Actions.AllowUnsupportedCommands), out allowUnsecureCommands);
// Apply environment from env context, env context contains job level env and action's env block
#if OS_WINDOWS
var envContext = context.ExpressionValues["env"] as DictionaryContextData;
#else
var envContext = context.ExpressionValues["env"] as CaseSensitiveDictionaryContextData;
#endif
if (!allowUnsecureCommands && envContext.ContainsKey(Constants.Variables.Actions.AllowUnsupportedCommands))
{
bool.TryParse(envContext[Constants.Variables.Actions.AllowUnsupportedCommands].ToString(), out allowUnsecureCommands);
}
if (!allowUnsecureCommands)
{
throw new Exception(String.Format(Constants.Runner.UnsupportedCommandMessageDisabled, this.Command));
}
ArgUtil.NotNullOrEmpty(command.Data, "path"); ArgUtil.NotNullOrEmpty(command.Data, "path");
context.PrependPath.RemoveAll(x => string.Equals(x, command.Data, StringComparison.CurrentCulture)); context.Global.PrependPath.RemoveAll(x => string.Equals(x, command.Data, StringComparison.CurrentCulture));
context.PrependPath.Add(command.Data); context.Global.PrependPath.Add(command.Data);
} }
} }

View File
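
A minimal sketch of the guard both handlers now apply, with simplified names: the real code reads Constants.Variables.Actions.AllowUnsupportedCommands (the ACTIONS_ALLOW_UNSECURE_COMMANDS opt-in in the released runner) from the process environment or the job's env context, and even when the opt-in is set, set-env refuses variables on the block list such as NODE_OPTIONS:

using System;
using System.Collections.Generic;
using System.Linq;

static class UnsecureCommandGuard
{
    static readonly string[] SetEnvBlockList = { "NODE_OPTIONS" };

    public static void EnsureAllowed(string commandName, IDictionary<string, string> envContext)
    {
        // The opt-in can come from the process environment or the job/action env block.
        bool.TryParse(Environment.GetEnvironmentVariable("ACTIONS_ALLOW_UNSECURE_COMMANDS"), out var allowed);
        if (!allowed && envContext.TryGetValue("ACTIONS_ALLOW_UNSECURE_COMMANDS", out var fromEnvContext))
        {
            bool.TryParse(fromEnvContext, out allowed);
        }

        if (!allowed)
        {
            throw new Exception($"The '{commandName}' command is disabled. See the deprecation notice for migration details.");
        }
    }

    public static bool IsBlockedVariable(string envName) =>
        SetEnvBlockList.Any(b => string.Equals(b, envName, StringComparison.OrdinalIgnoreCase));
}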

@@ -66,7 +66,7 @@ namespace GitHub.Runner.Worker
// TODO: Deprecate the PREVIEW_ACTION_TOKEN // TODO: Deprecate the PREVIEW_ACTION_TOKEN
// Log even if we aren't using it to ensure users know. // Log even if we aren't using it to ensure users know.
if (!string.IsNullOrEmpty(executionContext.Variables.Get("PREVIEW_ACTION_TOKEN"))) if (!string.IsNullOrEmpty(executionContext.Global.Variables.Get("PREVIEW_ACTION_TOKEN")))
{ {
executionContext.Warning("The 'PREVIEW_ACTION_TOKEN' secret is deprecated. Please remove it from the repository's secrets"); executionContext.Warning("The 'PREVIEW_ACTION_TOKEN' secret is deprecated. Please remove it from the repository's secrets");
} }
@@ -75,7 +75,7 @@ namespace GitHub.Runner.Worker
IOUtil.DeleteDirectory(HostContext.GetDirectory(WellKnownDirectory.Actions), executionContext.CancellationToken); IOUtil.DeleteDirectory(HostContext.GetDirectory(WellKnownDirectory.Actions), executionContext.CancellationToken);
// todo: Remove when feature flag DistributedTask.NewActionMetadata is removed // todo: Remove when feature flag DistributedTask.NewActionMetadata is removed
var newActionMetadata = executionContext.Variables.GetBoolean("DistributedTask.NewActionMetadata") ?? false; var newActionMetadata = executionContext.Global.Variables.GetBoolean("DistributedTask.NewActionMetadata") ?? false;
var repositoryActions = new List<Pipelines.ActionStep>(); var repositoryActions = new List<Pipelines.ActionStep>();
@@ -395,7 +395,7 @@ namespace GitHub.Runner.Worker
Trace.Info($"Action cleanup plugin: {plugin.PluginTypeName}."); Trace.Info($"Action cleanup plugin: {plugin.PluginTypeName}.");
} }
} }
else if (definition.Data.Execution.ExecutionType == ActionExecutionType.Composite && !string.IsNullOrEmpty(Environment.GetEnvironmentVariable("TESTING_COMPOSITE_ACTIONS_ALPHA"))) else if (definition.Data.Execution.ExecutionType == ActionExecutionType.Composite)
{ {
var compositeAction = definition.Data.Execution as CompositeActionExecutionData; var compositeAction = definition.Data.Execution as CompositeActionExecutionData;
Trace.Info($"Load {compositeAction.Steps?.Count ?? 0} action steps."); Trace.Info($"Load {compositeAction.Steps?.Count ?? 0} action steps.");
@@ -468,7 +468,7 @@ namespace GitHub.Runner.Worker
ArgUtil.NotNull(setupInfo, nameof(setupInfo)); ArgUtil.NotNull(setupInfo, nameof(setupInfo));
ArgUtil.NotNullOrEmpty(setupInfo.Container.Image, nameof(setupInfo.Container.Image)); ArgUtil.NotNullOrEmpty(setupInfo.Container.Image, nameof(setupInfo.Container.Image));
executionContext.Output($"Pull down action image '{setupInfo.Container.Image}'"); executionContext.Output($"##[group]Pull down action image '{setupInfo.Container.Image}'");
// Pull down docker image with retry up to 3 times // Pull down docker image with retry up to 3 times
var dockerManger = HostContext.GetService<IDockerCommandManager>(); var dockerManger = HostContext.GetService<IDockerCommandManager>();
@@ -492,6 +492,7 @@ namespace GitHub.Runner.Worker
} }
} }
} }
executionContext.Output("##[endgroup]");
if (retryCount == 3 && pullExitCode != 0) if (retryCount == 3 && pullExitCode != 0)
{ {
@@ -511,7 +512,7 @@ namespace GitHub.Runner.Worker
ArgUtil.NotNull(setupInfo, nameof(setupInfo)); ArgUtil.NotNull(setupInfo, nameof(setupInfo));
ArgUtil.NotNullOrEmpty(setupInfo.Container.Dockerfile, nameof(setupInfo.Container.Dockerfile)); ArgUtil.NotNullOrEmpty(setupInfo.Container.Dockerfile, nameof(setupInfo.Container.Dockerfile));
executionContext.Output($"Build container for action use: '{setupInfo.Container.Dockerfile}'."); executionContext.Output($"##[group]Build container for action use: '{setupInfo.Container.Dockerfile}'.");
// Build docker image with retry up to 3 times // Build docker image with retry up to 3 times
var dockerManger = HostContext.GetService<IDockerCommandManager>(); var dockerManger = HostContext.GetService<IDockerCommandManager>();
@@ -541,6 +542,7 @@ namespace GitHub.Runner.Worker
} }
} }
} }
executionContext.Output("##[endgroup]");
if (retryCount == 3 && buildExitCode != 0) if (retryCount == 3 && buildExitCode != 0)
{ {
@@ -589,10 +591,12 @@ namespace GitHub.Runner.Worker
{ {
try try
{ {
actionDownloadInfos = await jobServer.ResolveActionDownloadInfoAsync(executionContext.Plan.ScopeIdentifier, executionContext.Plan.PlanType, executionContext.Plan.PlanId, new WebApi.ActionReferenceList { Actions = actionReferences }, executionContext.CancellationToken); actionDownloadInfos = await jobServer.ResolveActionDownloadInfoAsync(executionContext.Global.Plan.ScopeIdentifier, executionContext.Global.Plan.PlanType, executionContext.Global.Plan.PlanId, new WebApi.ActionReferenceList { Actions = actionReferences }, executionContext.CancellationToken);
break; break;
} }
catch (Exception ex) when (attempt < 3) catch (Exception ex)
{
if (attempt < 3)
{ {
executionContext.Output($"Failed to resolve action download info. Error: {ex.Message}"); executionContext.Output($"Failed to resolve action download info. Error: {ex.Message}");
executionContext.Debug(ex.ToString()); executionContext.Debug(ex.ToString());
@@ -603,6 +607,11 @@ namespace GitHub.Runner.Worker
await Task.Delay(backoff); await Task.Delay(backoff);
} }
} }
else
{
throw new WebApi.FailedToResolveActionDownloadInfoException("Failed to resolve action download info.", ex);
}
}
} }
ArgUtil.NotNull(actionDownloadInfos, nameof(actionDownloadInfos)); ArgUtil.NotNull(actionDownloadInfos, nameof(actionDownloadInfos));
@@ -947,7 +956,7 @@ namespace GitHub.Runner.Worker
if (string.IsNullOrEmpty(authToken)) if (string.IsNullOrEmpty(authToken))
{ {
// TODO: Deprecate the PREVIEW_ACTION_TOKEN // TODO: Deprecate the PREVIEW_ACTION_TOKEN
authToken = executionContext.Variables.Get("PREVIEW_ACTION_TOKEN"); authToken = executionContext.Global.Variables.Get("PREVIEW_ACTION_TOKEN");
} }
if (!string.IsNullOrEmpty(authToken)) if (!string.IsNullOrEmpty(authToken))
@@ -1046,7 +1055,7 @@ namespace GitHub.Runner.Worker
Trace.Info($"Action plugin: {(actionDefinitionData.Execution as PluginActionExecutionData).Plugin}, no more preparation."); Trace.Info($"Action plugin: {(actionDefinitionData.Execution as PluginActionExecutionData).Plugin}, no more preparation.");
return null; return null;
} }
else if (actionDefinitionData.Execution.ExecutionType == ActionExecutionType.Composite && !string.IsNullOrEmpty(Environment.GetEnvironmentVariable("TESTING_COMPOSITE_ACTIONS_ALPHA"))) else if (actionDefinitionData.Execution.ExecutionType == ActionExecutionType.Composite)
{ {
Trace.Info($"Action composite: {(actionDefinitionData.Execution as CompositeActionExecutionData).Steps}, no more preparation."); Trace.Info($"Action composite: {(actionDefinitionData.Execution as CompositeActionExecutionData).Steps}, no more preparation.");
return null; return null;

View File
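
A minimal sketch of the retry shape introduced around ResolveActionDownloadInfoAsync: up to three attempts with a delay between them, and the final failure wrapped in a dedicated exception so it is counted as an infrastructure failure rather than a user error. The delay policy and exception type below are stand-ins for the runner's own:

using System;
using System.Threading.Tasks;

static class RetryExample
{
    public static async Task<T> ResolveWithRetryAsync<T>(Func<Task<T>> resolve, Action<string> log)
    {
        for (var attempt = 1; ; attempt++)
        {
            try
            {
                return await resolve();
            }
            catch (Exception ex)
            {
                if (attempt < 3)
                {
                    log($"Failed to resolve action download info. Error: {ex.Message}");
                    await Task.Delay(TimeSpan.FromSeconds(attempt * 5)); // simple backoff between attempts
                }
                else
                {
                    // Surface the last error as an infrastructure-level failure.
                    throw new InvalidOperationException("Failed to resolve action download info.", ex);
                }
            }
        }
    }
}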

@@ -30,8 +30,6 @@ namespace GitHub.Runner.Worker
Dictionary<string, string> EvaluateContainerEnvironment(IExecutionContext executionContext, MappingToken token, IDictionary<string, PipelineContextData> extraExpressionValues); Dictionary<string, string> EvaluateContainerEnvironment(IExecutionContext executionContext, MappingToken token, IDictionary<string, PipelineContextData> extraExpressionValues);
string EvaluateDefaultInput(IExecutionContext executionContext, string inputName, TemplateToken token); string EvaluateDefaultInput(IExecutionContext executionContext, string inputName, TemplateToken token);
void SetAllCompositeOutputs(IExecutionContext parentExecutionContext, DictionaryContextData actionOutputs);
} }
public sealed class ActionManifestManager : RunnerService, IActionManifestManager public sealed class ActionManifestManager : RunnerService, IActionManifestManager
@@ -57,7 +55,7 @@ namespace GitHub.Runner.Worker
public ActionDefinitionData Load(IExecutionContext executionContext, string manifestFile) public ActionDefinitionData Load(IExecutionContext executionContext, string manifestFile)
{ {
var templateContext = CreateContext(executionContext); var templateContext = CreateTemplateContext(executionContext);
ActionDefinitionData actionDefinition = new ActionDefinitionData(); ActionDefinitionData actionDefinition = new ActionDefinitionData();
// Clean up file name real quick // Clean up file name real quick
@@ -79,9 +77,9 @@ namespace GitHub.Runner.Worker
// Add this file to the FileTable in executionContext if it hasn't been added already // Add this file to the FileTable in executionContext if it hasn't been added already
// we use > since fileID is 1 indexed // we use > since fileID is 1 indexed
if (fileId > executionContext.FileTable.Count) if (fileId > executionContext.Global.FileTable.Count)
{ {
executionContext.FileTable.Add(fileRelativePath); executionContext.Global.FileTable.Add(fileRelativePath);
} }
// Read the file // Read the file
@@ -107,20 +105,15 @@ namespace GitHub.Runner.Worker
break; break;
case "outputs": case "outputs":
if (!string.IsNullOrEmpty(Environment.GetEnvironmentVariable("TESTING_COMPOSITE_ACTIONS_ALPHA")))
{
actionOutputs = actionPair.Value.AssertMapping("outputs"); actionOutputs = actionPair.Value.AssertMapping("outputs");
break; break;
}
Trace.Info($"Ignore action property outputs. Outputs for a whole action is not supported yet.");
break;
case "description": case "description":
actionDefinition.Description = actionPair.Value.AssertString("description").Value; actionDefinition.Description = actionPair.Value.AssertString("description").Value;
break; break;
case "inputs": case "inputs":
ConvertInputs(templateContext, actionPair.Value, actionDefinition); ConvertInputs(actionPair.Value, actionDefinition);
break; break;
case "runs": case "runs":
@@ -137,7 +130,7 @@ namespace GitHub.Runner.Worker
// Evaluate Runs Last // Evaluate Runs Last
if (actionRunValueToken != null) if (actionRunValueToken != null)
{ {
actionDefinition.Execution = ConvertRuns(executionContext, templateContext, actionRunValueToken, actionOutputs); actionDefinition.Execution = ConvertRuns(executionContext, templateContext, actionRunValueToken, fileRelativePath, actionOutputs);
} }
} }
catch (Exception ex) catch (Exception ex)
@@ -170,34 +163,6 @@ namespace GitHub.Runner.Worker
return actionDefinition; return actionDefinition;
} }
public void SetAllCompositeOutputs(
IExecutionContext parentExecutionContext,
DictionaryContextData actionOutputs)
{
// Each pair is structured like this
// We ignore "description" for now
// {
// "the-output-name": {
// "description": "",
// "value": "the value"
// },
// ...
// }
foreach (var pair in actionOutputs)
{
var outputsName = pair.Key;
var outputsAttributes = pair.Value as DictionaryContextData;
outputsAttributes.TryGetValue("value", out var val);
var outputsValue = val as StringContextData;
// Set output in the whole composite scope.
if (!String.IsNullOrEmpty(outputsName) && !String.IsNullOrEmpty(outputsValue))
{
parentExecutionContext.SetOutput(outputsName, outputsValue, out _);
}
}
}
public DictionaryContextData EvaluateCompositeOutputs( public DictionaryContextData EvaluateCompositeOutputs(
IExecutionContext executionContext, IExecutionContext executionContext,
TemplateToken token, TemplateToken token,
@@ -207,19 +172,19 @@ namespace GitHub.Runner.Worker
if (token != null) if (token != null)
{ {
var context = CreateContext(executionContext, extraExpressionValues); var templateContext = CreateTemplateContext(executionContext, extraExpressionValues);
try try
{ {
token = TemplateEvaluator.Evaluate(context, "outputs", token, 0, null, omitHeader: true); token = TemplateEvaluator.Evaluate(templateContext, "outputs", token, 0, null, omitHeader: true);
context.Errors.Check(); templateContext.Errors.Check();
result = token.ToContextData().AssertDictionary("composite outputs"); result = token.ToContextData().AssertDictionary("composite outputs");
} }
catch (Exception ex) when (!(ex is TemplateValidationException)) catch (Exception ex) when (!(ex is TemplateValidationException))
{ {
context.Errors.Add(ex); templateContext.Errors.Add(ex);
} }
context.Errors.Check(); templateContext.Errors.Check();
} }
return result ?? new DictionaryContextData(); return result ?? new DictionaryContextData();
@@ -234,11 +199,11 @@ namespace GitHub.Runner.Worker
if (token != null) if (token != null)
{ {
var context = CreateContext(executionContext, extraExpressionValues); var templateContext = CreateTemplateContext(executionContext, extraExpressionValues);
try try
{ {
var evaluateResult = TemplateEvaluator.Evaluate(context, "container-runs-args", token, 0, null, omitHeader: true); var evaluateResult = TemplateEvaluator.Evaluate(templateContext, "container-runs-args", token, 0, null, omitHeader: true);
context.Errors.Check(); templateContext.Errors.Check();
Trace.Info($"Arguments evaluate result: {StringUtil.ConvertToJson(evaluateResult)}"); Trace.Info($"Arguments evaluate result: {StringUtil.ConvertToJson(evaluateResult)}");
@@ -255,10 +220,10 @@ namespace GitHub.Runner.Worker
catch (Exception ex) when (!(ex is TemplateValidationException)) catch (Exception ex) when (!(ex is TemplateValidationException))
{ {
Trace.Error(ex); Trace.Error(ex);
context.Errors.Add(ex); templateContext.Errors.Add(ex);
} }
context.Errors.Check(); templateContext.Errors.Check();
} }
return result; return result;
@@ -273,11 +238,11 @@ namespace GitHub.Runner.Worker
if (token != null) if (token != null)
{ {
var context = CreateContext(executionContext, extraExpressionValues); var templateContext = CreateTemplateContext(executionContext, extraExpressionValues);
try try
{ {
var evaluateResult = TemplateEvaluator.Evaluate(context, "container-runs-env", token, 0, null, omitHeader: true); var evaluateResult = TemplateEvaluator.Evaluate(templateContext, "container-runs-env", token, 0, null, omitHeader: true);
context.Errors.Check(); templateContext.Errors.Check();
Trace.Info($"Environments evaluate result: {StringUtil.ConvertToJson(evaluateResult)}"); Trace.Info($"Environments evaluate result: {StringUtil.ConvertToJson(evaluateResult)}");
@@ -299,10 +264,10 @@ namespace GitHub.Runner.Worker
catch (Exception ex) when (!(ex is TemplateValidationException)) catch (Exception ex) when (!(ex is TemplateValidationException))
{ {
Trace.Error(ex); Trace.Error(ex);
context.Errors.Add(ex); templateContext.Errors.Add(ex);
} }
context.Errors.Check(); templateContext.Errors.Check();
} }
return result; return result;
@@ -316,11 +281,11 @@ namespace GitHub.Runner.Worker
string result = ""; string result = "";
if (token != null) if (token != null)
{ {
var context = CreateContext(executionContext); var templateContext = CreateTemplateContext(executionContext);
try try
{ {
var evaluateResult = TemplateEvaluator.Evaluate(context, "input-default-context", token, 0, null, omitHeader: true); var evaluateResult = TemplateEvaluator.Evaluate(templateContext, "input-default-context", token, 0, null, omitHeader: true);
context.Errors.Check(); templateContext.Errors.Check();
Trace.Info($"Input '{inputName}': default value evaluate result: {StringUtil.ConvertToJson(evaluateResult)}"); Trace.Info($"Input '{inputName}': default value evaluate result: {StringUtil.ConvertToJson(evaluateResult)}");
@@ -330,16 +295,16 @@ namespace GitHub.Runner.Worker
catch (Exception ex) when (!(ex is TemplateValidationException)) catch (Exception ex) when (!(ex is TemplateValidationException))
{ {
Trace.Error(ex); Trace.Error(ex);
context.Errors.Add(ex); templateContext.Errors.Add(ex);
} }
context.Errors.Check(); templateContext.Errors.Check();
} }
return result; return result;
} }
private TemplateContext CreateContext( private TemplateContext CreateTemplateContext(
IExecutionContext executionContext, IExecutionContext executionContext,
IDictionary<string, PipelineContextData> extraExpressionValues = null) IDictionary<string, PipelineContextData> extraExpressionValues = null)
{ {
@@ -377,9 +342,9 @@ namespace GitHub.Runner.Worker
} }
// Add the file table from the Execution Context // Add the file table from the Execution Context
for (var i = 0; i < executionContext.FileTable.Count; i++) for (var i = 0; i < executionContext.Global.FileTable.Count; i++)
{ {
result.GetFileId(executionContext.FileTable[i]); result.GetFileId(executionContext.Global.FileTable[i]);
} }
return result; return result;
@@ -387,8 +352,9 @@ namespace GitHub.Runner.Worker
private ActionExecutionData ConvertRuns( private ActionExecutionData ConvertRuns(
IExecutionContext executionContext, IExecutionContext executionContext,
TemplateContext context, TemplateContext templateContext,
TemplateToken inputsToken, TemplateToken inputsToken,
String fileRelativePath,
MappingToken outputs = null) MappingToken outputs = null)
{ {
var runsMapping = inputsToken.AssertMapping("runs"); var runsMapping = inputsToken.AssertMapping("runs");
@@ -405,7 +371,7 @@ namespace GitHub.Runner.Worker
var postToken = default(StringToken); var postToken = default(StringToken);
var postEntrypointToken = default(StringToken); var postEntrypointToken = default(StringToken);
var postIfToken = default(StringToken); var postIfToken = default(StringToken);
var stepsLoaded = default(List<Pipelines.ActionStep>); var steps = default(List<Pipelines.Step>);
foreach (var run in runsMapping) foreach (var run in runsMapping)
{ {
@@ -452,14 +418,10 @@ namespace GitHub.Runner.Worker
preIfToken = run.Value.AssertString("pre-if"); preIfToken = run.Value.AssertString("pre-if");
break; break;
case "steps": case "steps":
if (!string.IsNullOrEmpty(Environment.GetEnvironmentVariable("TESTING_COMPOSITE_ACTIONS_ALPHA"))) var stepsToken = run.Value.AssertSequence("steps");
{ steps = PipelineTemplateConverter.ConvertToSteps(templateContext, stepsToken);
var steps = run.Value.AssertSequence("steps"); templateContext.Errors.Check();
var evaluator = executionContext.ToPipelineTemplateEvaluator();
stepsLoaded = evaluator.LoadCompositeSteps(steps);
break; break;
}
throw new Exception("You aren't supposed to be using Composite Actions yet!");
default: default:
Trace.Info($"Ignore run property {runsKey}."); Trace.Info($"Ignore run property {runsKey}.");
break; break;
@@ -472,7 +434,7 @@ namespace GitHub.Runner.Worker
{ {
if (string.IsNullOrEmpty(imageToken?.Value)) if (string.IsNullOrEmpty(imageToken?.Value))
{ {
throw new ArgumentNullException($"Image is not provided."); throw new ArgumentNullException($"You are using a Container Action but an image is not provided in {fileRelativePath}.");
} }
else else
{ {
@@ -493,7 +455,7 @@ namespace GitHub.Runner.Worker
{ {
if (string.IsNullOrEmpty(mainToken?.Value)) if (string.IsNullOrEmpty(mainToken?.Value))
{ {
throw new ArgumentNullException($"Entry javascript file is not provided."); throw new ArgumentNullException($"You are using a JavaScript Action but there is not an entry JavaScript file provided in {fileRelativePath}.");
} }
else else
{ {
@@ -507,18 +469,17 @@ namespace GitHub.Runner.Worker
}; };
} }
} }
else if (string.Equals(usingToken.Value, "composite", StringComparison.OrdinalIgnoreCase) && !string.IsNullOrEmpty(Environment.GetEnvironmentVariable("TESTING_COMPOSITE_ACTIONS_ALPHA"))) else if (string.Equals(usingToken.Value, "composite", StringComparison.OrdinalIgnoreCase))
{ {
if (stepsLoaded == null) if (steps == null)
{ {
// TODO: Add a more helpful error message + including file name, etc. to show user that it's because of their yaml file throw new ArgumentNullException($"You are using a composite action but there are no steps provided in {fileRelativePath}.");
throw new ArgumentNullException($"No steps provided.");
} }
else else
{ {
return new CompositeActionExecutionData() return new CompositeActionExecutionData()
{ {
Steps = stepsLoaded, Steps = steps.Cast<Pipelines.ActionStep>().ToList(),
Outputs = outputs Outputs = outputs
}; };
} }
@@ -540,7 +501,6 @@ namespace GitHub.Runner.Worker
} }
private void ConvertInputs( private void ConvertInputs(
TemplateContext context,
TemplateToken inputsToken, TemplateToken inputsToken,
ActionDefinitionData actionDefinition) ActionDefinitionData actionDefinition)
{ {

View File
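
For context, composite action outputs are declared as a mapping of output name to an object with description and value keys, as the removed SetAllCompositeOutputs comment showed. A minimal sketch of flattening that shape into plain name/value pairs, using ordinary dictionaries in place of the pipeline context-data types:

using System.Collections.Generic;

static class CompositeOutputs
{
    // outputs: { "my-output": { "description": "...", "value": "the value" }, ... }
    public static Dictionary<string, string> Flatten(Dictionary<string, Dictionary<string, string>> outputs)
    {
        var result = new Dictionary<string, string>();
        foreach (var pair in outputs)
        {
            var attributes = pair.Value;
            if (attributes.TryGetValue("value", out var value) && !string.IsNullOrEmpty(value))
            {
                result[pair.Key] = value; // "description" is ignored, matching the behavior above
            }
        }
        return result;
    }
}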

@@ -135,16 +135,33 @@ namespace GitHub.Runner.Worker
ExecutionContext.SetGitHubContext("event_path", workflowFile); ExecutionContext.SetGitHubContext("event_path", workflowFile);
} }
// Set GITHUB_ACTION_REPOSITORY if this Action is from a repository
if (Action.Reference is Pipelines.RepositoryPathReference repoPathReferenceAction &&
!string.Equals(repoPathReferenceAction.RepositoryType, Pipelines.PipelineConstants.SelfAlias, StringComparison.OrdinalIgnoreCase))
{
ExecutionContext.SetGitHubContext("action_repository", repoPathReferenceAction.Name);
ExecutionContext.SetGitHubContext("action_ref", repoPathReferenceAction.Ref);
}
else
{
ExecutionContext.SetGitHubContext("action_repository", null);
ExecutionContext.SetGitHubContext("action_ref", null);
}
// Setup container stephost for running inside the container. // Setup container stephost for running inside the container.
if (ExecutionContext.Container != null) if (ExecutionContext.Global.Container != null)
{ {
// Make sure required container is already created. // Make sure required container is already created.
ArgUtil.NotNullOrEmpty(ExecutionContext.Container.ContainerId, nameof(ExecutionContext.Container.ContainerId)); ArgUtil.NotNullOrEmpty(ExecutionContext.Global.Container.ContainerId, nameof(ExecutionContext.Global.Container.ContainerId));
var containerStepHost = HostContext.CreateService<IContainerStepHost>(); var containerStepHost = HostContext.CreateService<IContainerStepHost>();
containerStepHost.Container = ExecutionContext.Container; containerStepHost.Container = ExecutionContext.Global.Container;
stepHost = containerStepHost; stepHost = containerStepHost;
} }
// Setup File Command Manager
var fileCommandManager = HostContext.CreateService<IFileCommandManager>();
fileCommandManager.InitializeFiles(ExecutionContext, null);
// Load the inputs. // Load the inputs.
ExecutionContext.Debug("Loading inputs"); ExecutionContext.Debug("Loading inputs");
var templateEvaluator = ExecutionContext.ToPipelineTemplateEvaluator(); var templateEvaluator = ExecutionContext.ToPipelineTemplateEvaluator();
@@ -231,15 +248,23 @@ namespace GitHub.Runner.Worker
handlerData, handlerData,
inputs, inputs,
environment, environment,
ExecutionContext.Variables, ExecutionContext.Global.Variables,
actionDirectory: definition.Directory); actionDirectory: definition.Directory);
// Print out action details // Print out action details
handler.PrintActionDetails(Stage); handler.PrintActionDetails(Stage);
// Run the task. // Run the task.
try
{
await handler.RunAsync(Stage); await handler.RunAsync(Stage);
} }
finally
{
fileCommandManager.ProcessFiles(ExecutionContext, ExecutionContext.Global.Container);
}
}
public bool TryEvaluateDisplayName(DictionaryContextData contextData, IExecutionContext context) public bool TryEvaluateDisplayName(DictionaryContextData contextData, IExecutionContext context)
{ {

View File
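
A minimal sketch of the context population added above, with simplified stand-in types: when the step's action was fetched from another repository (anything other than the "self" alias), its repository name and ref are exposed to the workflow; for local actions the values are cleared so nothing leaks from a previous step:

using System;
using System.Collections.Generic;

// Hypothetical, simplified reference type for illustration; the real one is
// Pipelines.RepositoryPathReference.
record RepositoryPathReference(string RepositoryType, string Name, string Ref);

static class ActionContextExample
{
    public static void SetActionContext(IDictionary<string, string> github, object reference)
    {
        if (reference is RepositoryPathReference repo &&
            !string.Equals(repo.RepositoryType, "self", StringComparison.OrdinalIgnoreCase))
        {
            github["action_repository"] = repo.Name;   // e.g. "actions/checkout"
            github["action_ref"] = repo.Ref;           // e.g. "v2"
        }
        else
        {
            github["action_repository"] = null;
            github["action_ref"] = null;
        }
    }
}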

@@ -21,6 +21,11 @@ namespace GitHub.Runner.Worker.Container
{ {
} }
public ContainerInfo(IHostContext hostContext)
{
UpdateWebProxyEnv(hostContext.WebProxy);
}
public ContainerInfo(IHostContext hostContext, Pipelines.JobContainer container, bool isJobContainer = true, string networkAlias = null) public ContainerInfo(IHostContext hostContext, Pipelines.JobContainer container, bool isJobContainer = true, string networkAlias = null)
{ {
this.ContainerName = container.Alias; this.ContainerName = container.Alias;
@@ -34,6 +39,9 @@ namespace GitHub.Runner.Worker.Container
_environmentVariables = container.Environment; _environmentVariables = container.Environment;
this.IsJobContainer = isJobContainer; this.IsJobContainer = isJobContainer;
this.ContainerNetworkAlias = networkAlias; this.ContainerNetworkAlias = networkAlias;
this.RegistryAuthUsername = container.Credentials?.Username;
this.RegistryAuthPassword = container.Credentials?.Password;
this.RegistryServer = DockerUtil.ParseRegistryHostnameFromImageName(this.ContainerImage);
#if OS_WINDOWS #if OS_WINDOWS
_pathMappings.Add(new PathMapping(hostContext.GetDirectory(WellKnownDirectory.Work), "C:\\__w")); _pathMappings.Add(new PathMapping(hostContext.GetDirectory(WellKnownDirectory.Work), "C:\\__w"));
@@ -79,6 +87,9 @@ namespace GitHub.Runner.Worker.Container
public string ContainerWorkDirectory { get; set; } public string ContainerWorkDirectory { get; set; }
public string ContainerCreateOptions { get; private set; } public string ContainerCreateOptions { get; private set; }
public string ContainerRuntimePath { get; set; } public string ContainerRuntimePath { get; set; }
public string RegistryServer { get; set; }
public string RegistryAuthUsername { get; set; }
public string RegistryAuthPassword { get; set; }
public bool IsJobContainer { get; set; } public bool IsJobContainer { get; set; }
public IDictionary<string, string> ContainerEnvironmentVariables public IDictionary<string, string> ContainerEnvironmentVariables

View File

@@ -4,6 +4,7 @@ using System.IO;
using System.Linq; using System.Linq;
using System.Text.RegularExpressions; using System.Text.RegularExpressions;
using System.Threading; using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks; using System.Threading.Tasks;
using GitHub.Runner.Common; using GitHub.Runner.Common;
using GitHub.Runner.Sdk; using GitHub.Runner.Sdk;
@@ -17,6 +18,7 @@ namespace GitHub.Runner.Worker.Container
string DockerInstanceLabel { get; } string DockerInstanceLabel { get; }
Task<DockerVersion> DockerVersion(IExecutionContext context); Task<DockerVersion> DockerVersion(IExecutionContext context);
Task<int> DockerPull(IExecutionContext context, string image); Task<int> DockerPull(IExecutionContext context, string image);
Task<int> DockerPull(IExecutionContext context, string image, string configFileDirectory);
Task<int> DockerBuild(IExecutionContext context, string workingDirectory, string dockerFile, string dockerContext, string tag); Task<int> DockerBuild(IExecutionContext context, string workingDirectory, string dockerFile, string dockerContext, string tag);
Task<string> DockerCreate(IExecutionContext context, ContainerInfo container); Task<string> DockerCreate(IExecutionContext context, ContainerInfo container);
Task<int> DockerRun(IExecutionContext context, ContainerInfo container, EventHandler<ProcessDataReceivedEventArgs> stdoutDataReceived, EventHandler<ProcessDataReceivedEventArgs> stderrDataReceived); Task<int> DockerRun(IExecutionContext context, ContainerInfo container, EventHandler<ProcessDataReceivedEventArgs> stdoutDataReceived, EventHandler<ProcessDataReceivedEventArgs> stderrDataReceived);
@@ -31,6 +33,7 @@ namespace GitHub.Runner.Worker.Container
Task<int> DockerExec(IExecutionContext context, string containerId, string options, string command, List<string> outputs); Task<int> DockerExec(IExecutionContext context, string containerId, string options, string command, List<string> outputs);
Task<List<string>> DockerInspect(IExecutionContext context, string dockerObject, string options); Task<List<string>> DockerInspect(IExecutionContext context, string dockerObject, string options);
Task<List<PortMapping>> DockerPort(IExecutionContext context, string containerId); Task<List<PortMapping>> DockerPort(IExecutionContext context, string containerId);
Task<int> DockerLogin(IExecutionContext context, string configFileDirectory, string registry, string username, string password);
} }
public class DockerCommandManager : RunnerService, IDockerCommandManager public class DockerCommandManager : RunnerService, IDockerCommandManager
@@ -82,9 +85,18 @@ namespace GitHub.Runner.Worker.Container
return new DockerVersion(serverVersion, clientVersion); return new DockerVersion(serverVersion, clientVersion);
} }
public async Task<int> DockerPull(IExecutionContext context, string image) public Task<int> DockerPull(IExecutionContext context, string image)
{ {
return await ExecuteDockerCommandAsync(context, "pull", image, context.CancellationToken); return DockerPull(context, image, null);
}
public async Task<int> DockerPull(IExecutionContext context, string image, string configFileDirectory)
{
if (string.IsNullOrEmpty(configFileDirectory))
{
return await ExecuteDockerCommandAsync(context, $"pull", image, context.CancellationToken);
}
return await ExecuteDockerCommandAsync(context, $"--config {configFileDirectory} pull", image, context.CancellationToken);
} }
public async Task<int> DockerBuild(IExecutionContext context, string workingDirectory, string dockerFile, string dockerContext, string tag) public async Task<int> DockerBuild(IExecutionContext context, string workingDirectory, string dockerFile, string dockerContext, string tag)
@@ -346,6 +358,28 @@ namespace GitHub.Runner.Worker.Container
return DockerUtil.ParseDockerPort(portMappingLines); return DockerUtil.ParseDockerPort(portMappingLines);
} }
public Task<int> DockerLogin(IExecutionContext context, string configFileDirectory, string registry, string username, string password)
{
string args = $"--config {configFileDirectory} login {registry} -u {username} --password-stdin";
context.Command($"{DockerPath} {args}");
var input = Channel.CreateBounded<string>(new BoundedChannelOptions(1) { SingleReader = true, SingleWriter = true });
input.Writer.TryWrite(password);
var processInvoker = HostContext.CreateService<IProcessInvoker>();
return processInvoker.ExecuteAsync(
workingDirectory: context.GetGitHubContext("workspace"),
fileName: DockerPath,
arguments: args,
environment: null,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: false,
redirectStandardIn: input,
cancellationToken: context.CancellationToken);
}
private Task<int> ExecuteDockerCommandAsync(IExecutionContext context, string command, string options, CancellationToken cancellationToken = default(CancellationToken)) private Task<int> ExecuteDockerCommandAsync(IExecutionContext context, string command, string options, CancellationToken cancellationToken = default(CancellationToken))
{ {
return ExecuteDockerCommandAsync(context, command, options, null, cancellationToken); return ExecuteDockerCommandAsync(context, command, options, null, cancellationToken);

View File
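
A minimal sketch of the --password-stdin approach, using ProcessStartInfo in place of the runner's IProcessInvoker and bounded channel: the credential is written to the child process's standard input so it never appears in the argument list or process table:

using System.Diagnostics;
using System.Threading.Tasks;

static class DockerLoginExample
{
    // Runs: docker --config <dir> login <registry> -u <user> --password-stdin
    public static async Task<int> LoginAsync(string configDir, string registry, string user, string password)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "docker",
            Arguments = $"--config {configDir} login {registry} -u {user} --password-stdin",
            RedirectStandardInput = true,
            UseShellExecute = false,
        };

        using var process = Process.Start(psi);
        await process.StandardInput.WriteLineAsync(password); // password travels over stdin only
        process.StandardInput.Close();
        process.WaitForExit();
        return process.ExitCode;
    }
}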

@@ -45,5 +45,21 @@ namespace GitHub.Runner.Worker.Container
} }
return ""; return "";
} }
public static string ParseRegistryHostnameFromImageName(string name)
{
var nameSplit = name.Split('/');
// Single slash is implicitly from Dockerhub, unless first part has .tld or :port
if (nameSplit.Length == 2 && (nameSplit[0].Contains(":") || nameSplit[0].Contains(".")))
{
return nameSplit[0];
}
// All other non Dockerhub registries
else if (nameSplit.Length > 2)
{
return nameSplit[0];
}
return "";
}
} }
} }

View File
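
A few illustrative inputs for the hostname parser above; the expected results in the comments follow directly from the splitting rules (this snippet assumes the DockerUtil class shown here is in scope):

// Assumes: using System; using GitHub.Runner.Worker.Container;
var examples = new[]
{
    "ubuntu",                                  // -> ""   (Docker Hub, no registry segment)
    "library/ubuntu",                          // -> ""   (still Docker Hub)
    "localhost:5000/owner/image",              // -> "localhost:5000"
    "myregistry.example.com/owner/image",      // -> "myregistry.example.com"
    "docker.pkg.github.com/owner/repo/image",  // -> "docker.pkg.github.com"
};
foreach (var image in examples)
{
    Console.WriteLine($"{image} => '{DockerUtil.ParseRegistryHostnameFromImageName(image)}'");
}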

@@ -91,7 +91,10 @@ namespace GitHub.Runner.Worker
#endif #endif
// Check docker client/server version // Check docker client/server version
executionContext.Output("##[group]Checking docker version");
DockerVersion dockerVersion = await _dockerManger.DockerVersion(executionContext); DockerVersion dockerVersion = await _dockerManger.DockerVersion(executionContext);
executionContext.Output("##[endgroup]");
ArgUtil.NotNull(dockerVersion.ServerVersion, nameof(dockerVersion.ServerVersion)); ArgUtil.NotNull(dockerVersion.ServerVersion, nameof(dockerVersion.ServerVersion));
ArgUtil.NotNull(dockerVersion.ClientVersion, nameof(dockerVersion.ClientVersion)); ArgUtil.NotNull(dockerVersion.ClientVersion, nameof(dockerVersion.ClientVersion));
@@ -111,7 +114,7 @@ namespace GitHub.Runner.Worker
} }
// Clean up containers left by previous runs // Clean up containers left by previous runs
executionContext.Debug($"Delete stale containers from previous jobs"); executionContext.Output("##[group]Clean up resources from previous jobs");
var staleContainers = await _dockerManger.DockerPS(executionContext, $"--all --quiet --no-trunc --filter \"label={_dockerManger.DockerInstanceLabel}\""); var staleContainers = await _dockerManger.DockerPS(executionContext, $"--all --quiet --no-trunc --filter \"label={_dockerManger.DockerInstanceLabel}\"");
foreach (var staleContainer in staleContainers) foreach (var staleContainer in staleContainers)
{ {
@@ -122,18 +125,20 @@ namespace GitHub.Runner.Worker
} }
} }
executionContext.Debug($"Delete stale container networks from previous jobs");
int networkPruneExitCode = await _dockerManger.DockerNetworkPrune(executionContext); int networkPruneExitCode = await _dockerManger.DockerNetworkPrune(executionContext);
if (networkPruneExitCode != 0) if (networkPruneExitCode != 0)
{ {
executionContext.Warning($"Delete stale container networks failed, docker network prune fail with exit code {networkPruneExitCode}"); executionContext.Warning($"Delete stale container networks failed, docker network prune fail with exit code {networkPruneExitCode}");
} }
executionContext.Output("##[endgroup]");
// Create local docker network for this job to avoid port conflict when multiple runners run on same machine. // Create local docker network for this job to avoid port conflict when multiple runners run on same machine.
// All containers within a job join the same network // All containers within a job join the same network
executionContext.Output("##[group]Create local container network");
var containerNetwork = $"github_network_{Guid.NewGuid().ToString("N")}"; var containerNetwork = $"github_network_{Guid.NewGuid().ToString("N")}";
await CreateContainerNetworkAsync(executionContext, containerNetwork); await CreateContainerNetworkAsync(executionContext, containerNetwork);
executionContext.JobContext.Container["network"] = new StringContextData(containerNetwork); executionContext.JobContext.Container["network"] = new StringContextData(containerNetwork);
executionContext.Output("##[endgroup]");
foreach (var container in containers) foreach (var container in containers)
{ {
@@ -141,10 +146,12 @@ namespace GitHub.Runner.Worker
await StartContainerAsync(executionContext, container); await StartContainerAsync(executionContext, container);
} }
executionContext.Output("##[group]Waiting for all services to be ready");
foreach (var container in containers.Where(c => !c.IsJobContainer)) foreach (var container in containers.Where(c => !c.IsJobContainer))
{ {
await ContainerHealthcheck(executionContext, container); await ContainerHealthcheck(executionContext, container);
} }
executionContext.Output("##[endgroup]");
} }
public async Task StopContainersAsync(IExecutionContext executionContext, object data) public async Task StopContainersAsync(IExecutionContext executionContext, object data)
@@ -173,6 +180,10 @@ namespace GitHub.Runner.Worker
Trace.Info($"Container name: {container.ContainerName}"); Trace.Info($"Container name: {container.ContainerName}");
Trace.Info($"Container image: {container.ContainerImage}"); Trace.Info($"Container image: {container.ContainerImage}");
Trace.Info($"Container options: {container.ContainerCreateOptions}"); Trace.Info($"Container options: {container.ContainerCreateOptions}");
var groupName = container.IsJobContainer ? "Starting job container" : $"Starting {container.ContainerNetworkAlias} service container";
executionContext.Output($"##[group]{groupName}");
foreach (var port in container.UserPortMappings) foreach (var port in container.UserPortMappings)
{ {
Trace.Info($"User provided port: {port.Value}"); Trace.Info($"User provided port: {port.Value}");
@@ -187,12 +198,18 @@ namespace GitHub.Runner.Worker
} }
} }
// TODO: Add at a later date. There is currently no local package registry to test with
// UpdateRegistryAuthForGitHubToken(executionContext, container);
// Before pulling, generate client authentication if required
var configLocation = await ContainerRegistryLogin(executionContext, container);
// Pull down docker image with retry up to 3 times // Pull down docker image with retry up to 3 times
int retryCount = 0; int retryCount = 0;
int pullExitCode = 0; int pullExitCode = 0;
while (retryCount < 3) while (retryCount < 3)
{ {
pullExitCode = await _dockerManger.DockerPull(executionContext, container.ContainerImage); pullExitCode = await _dockerManger.DockerPull(executionContext, container.ContainerImage, configLocation);
if (pullExitCode == 0) if (pullExitCode == 0)
{ {
break; break;
@@ -209,6 +226,9 @@ namespace GitHub.Runner.Worker
} }
} }
// Remove credentials after pulling
ContainerRegistryLogout(configLocation);
if (retryCount == 3 && pullExitCode != 0) if (retryCount == 3 && pullExitCode != 0)
{ {
throw new InvalidOperationException($"Docker pull failed with exit code {pullExitCode}"); throw new InvalidOperationException($"Docker pull failed with exit code {pullExitCode}");
@@ -304,6 +324,7 @@ namespace GitHub.Runner.Worker
container.ContainerRuntimePath = DockerUtil.ParsePathFromConfigEnv(containerEnv); container.ContainerRuntimePath = DockerUtil.ParsePathFromConfigEnv(containerEnv);
executionContext.JobContext.Container["id"] = new StringContextData(container.ContainerId); executionContext.JobContext.Container["id"] = new StringContextData(container.ContainerId);
} }
executionContext.Output("##[endgroup]");
} }
private async Task StopContainerAsync(IExecutionContext executionContext, ContainerInfo container) private async Task StopContainerAsync(IExecutionContext executionContext, ContainerInfo container)
@@ -425,5 +446,83 @@ namespace GitHub.Runner.Worker
throw new InvalidOperationException($"Failed to initialize, {container.ContainerNetworkAlias} service is {serviceHealth}."); throw new InvalidOperationException($"Failed to initialize, {container.ContainerNetworkAlias} service is {serviceHealth}.");
} }
} }
private async Task<string> ContainerRegistryLogin(IExecutionContext executionContext, ContainerInfo container)
{
if (string.IsNullOrEmpty(container.RegistryAuthUsername) || string.IsNullOrEmpty(container.RegistryAuthPassword))
{
// No valid client config can be generated
return "";
}
var configLocation = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Temp), $".docker_{Guid.NewGuid()}");
try
{
var dirInfo = Directory.CreateDirectory(configLocation);
}
catch (Exception e)
{
throw new InvalidOperationException($"Failed to create directory to store registry client credentials: {e.Message}");
}
var loginExitCode = await _dockerManger.DockerLogin(
executionContext,
configLocation,
container.RegistryServer,
container.RegistryAuthUsername,
container.RegistryAuthPassword);
if (loginExitCode != 0)
{
throw new InvalidOperationException($"Docker login for '{container.RegistryServer}' failed with exit code {loginExitCode}");
}
return configLocation;
}
private void ContainerRegistryLogout(string configLocation)
{
try
{
if (!string.IsNullOrEmpty(configLocation) && Directory.Exists(configLocation))
{
Directory.Delete(configLocation, recursive: true);
}
}
catch (Exception e)
{
throw new InvalidOperationException($"Failed to remove directory containing Docker client credentials: {e.Message}");
}
}
private void UpdateRegistryAuthForGitHubToken(IExecutionContext executionContext, ContainerInfo container)
{
var registryIsTokenCompatible = container.RegistryServer.Equals("docker.pkg.github.com", StringComparison.OrdinalIgnoreCase);
if (!registryIsTokenCompatible)
{
return;
}
var registryMatchesWorkflow = false;
// REGISTRY/OWNER/REPO/IMAGE[:TAG]
var imageParts = container.ContainerImage.Split('/');
if (imageParts.Length != 4)
{
executionContext.Warning($"Could not identify owner and repo for container image {container.ContainerImage}. Skipping automatic token auth");
return;
}
var owner = imageParts[1];
var repo = imageParts[2];
var nwo = $"{owner}/{repo}";
if (nwo.Equals(executionContext.GetGitHubContext("repository"), StringComparison.OrdinalIgnoreCase))
{
registryMatchesWorkflow = true;
}
var registryCredentialsNotSupplied = string.IsNullOrEmpty(container.RegistryAuthUsername) && string.IsNullOrEmpty(container.RegistryAuthPassword);
if (registryCredentialsNotSupplied && registryMatchesWorkflow)
{
container.RegistryAuthUsername = executionContext.GetGitHubContext("actor");
container.RegistryAuthPassword = executionContext.GetGitHubContext("token");
}
}
} }
} }

View File
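
Putting the pieces together, a minimal sketch of the login/pull/logout lifecycle the provider follows when registry credentials are supplied, with the docker calls abstracted as delegates: a throwaway config directory holds the credentials for the duration of the pull and is always deleted afterwards. Paths and helper names are illustrative, not the runner's API:

using System;
using System.IO;
using System.Threading.Tasks;

static class PrivateImagePull
{
    // login/pull receive the config directory to pass to "docker --config <dir> ...".
    public static async Task PullAsync(Func<string, Task<int>> login, Func<string, Task<int>> pull,
                                       string username, string password)
    {
        // No credentials: pull anonymously with the default config.
        if (string.IsNullOrEmpty(username) || string.IsNullOrEmpty(password))
        {
            await pull("");
            return;
        }

        // Per-pull config directory keeps credentials out of ~/.docker/config.json.
        var configDir = Path.Combine(Path.GetTempPath(), $".docker_{Guid.NewGuid()}");
        Directory.CreateDirectory(configDir);
        try
        {
            if (await login(configDir) != 0)
            {
                throw new InvalidOperationException("Docker login failed");
            }
            await pull(configDir);
        }
        finally
        {
            // Remove the credentials whether or not the pull succeeded.
            Directory.Delete(configDir, recursive: true);
        }
    }
}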

@@ -86,9 +86,9 @@ namespace GitHub.Runner.Worker
executionContext.Debug("Zipping diagnostic files."); executionContext.Debug("Zipping diagnostic files.");
string buildNumber = executionContext.Variables.Build_Number ?? "UnknownBuildNumber"; string buildNumber = executionContext.Global.Variables.Build_Number ?? "UnknownBuildNumber";
string buildName = $"Build {buildNumber}"; string buildName = $"Build {buildNumber}";
string phaseName = executionContext.Variables.System_PhaseDisplayName ?? "UnknownPhaseName"; string phaseName = executionContext.Global.Variables.System_PhaseDisplayName ?? "UnknownPhaseName";
// zip the files // zip the files
string diagnosticsZipFileName = $"{buildName}-{phaseName}.zip"; string diagnosticsZipFileName = $"{buildName}-{phaseName}.zip";

View File

@@ -44,41 +44,33 @@ namespace GitHub.Runner.Worker
string ResultCode { get; set; } string ResultCode { get; set; }
TaskResult? CommandResult { get; set; } TaskResult? CommandResult { get; set; }
CancellationToken CancellationToken { get; } CancellationToken CancellationToken { get; }
List<ServiceEndpoint> Endpoints { get; } GlobalContext Global { get; }
TaskOrchestrationPlanReference Plan { get; }
PlanFeatures Features { get; }
Variables Variables { get; }
Dictionary<string, string> IntraActionState { get; } Dictionary<string, string> IntraActionState { get; }
IDictionary<String, IDictionary<String, String>> JobDefaults { get; }
Dictionary<string, VariableValue> JobOutputs { get; } Dictionary<string, VariableValue> JobOutputs { get; }
IDictionary<String, String> EnvironmentVariables { get; } ActionsEnvironmentReference ActionsEnvironment { get; }
IList<String> FileTable { get; }
StepsContext StepsContext { get; }
DictionaryContextData ExpressionValues { get; } DictionaryContextData ExpressionValues { get; }
IList<IFunctionInfo> ExpressionFunctions { get; } IList<IFunctionInfo> ExpressionFunctions { get; }
List<string> PrependPath { get; }
ContainerInfo Container { get; set; }
List<ContainerInfo> ServiceContainers { get; }
JobContext JobContext { get; } JobContext JobContext { get; }
// Only job level ExecutionContext has JobSteps // Only job level ExecutionContext has JobSteps
List<IStep> JobSteps { get; } Queue<IStep> JobSteps { get; }
// Only job level ExecutionContext has PostJobSteps // Only job level ExecutionContext has PostJobSteps
Stack<IStep> PostJobSteps { get; } Stack<IStep> PostJobSteps { get; }
bool EchoOnActionCommand { get; set; } bool EchoOnActionCommand { get; set; }
IExecutionContext FinalizeContext { get; set; } bool InsideComposite { get; }
ExecutionContext Root { get; }
// Initialize // Initialize
void InitializeJob(Pipelines.AgentJobRequestMessage message, CancellationToken token); void InitializeJob(Pipelines.AgentJobRequestMessage message, CancellationToken token);
void CancelToken(); void CancelToken();
IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null); IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool insideComposite = false, CancellationTokenSource cancellationTokenSource = null);
// logging // logging
bool WriteDebug { get; }
long Write(string tag, string message); long Write(string tag, string message);
void QueueAttachFile(string type, string name, string filePath); void QueueAttachFile(string type, string name, string filePath);
@@ -107,7 +99,7 @@ namespace GitHub.Runner.Worker
// others // others
void ForceTaskComplete(); void ForceTaskComplete();
void RegisterPostJobStep(IStep step); void RegisterPostJobStep(IStep step);
IStep RegisterNestedStep(IActionRunner step, DictionaryContextData inputsData, int location, Dictionary<string, string> envData, bool cleanUp = false); IStep CreateCompositeStep(string scopeName, IActionRunner step, DictionaryContextData inputsData, Dictionary<string, string> envData);
} }
public sealed class ExecutionContext : RunnerService, IExecutionContext public sealed class ExecutionContext : RunnerService, IExecutionContext
@@ -122,9 +114,6 @@ namespace GitHub.Runner.Worker
private event OnMatcherChanged _onMatcherChanged; private event OnMatcherChanged _onMatcherChanged;
// Regex used for checking if ScopeName meets the condition that shows that its id is null.
private readonly static Regex _generatedContextNamePattern = new Regex("^__[a-f0-9]{8}-[a-f0-9]{4}-[a-f0-9]{4}-[a-f0-9]{4}-[a-f0-9]{12}$", RegexOptions.Compiled | RegexOptions.CultureInvariant | RegexOptions.IgnoreCase);
private IssueMatcherConfig[] _matchers; private IssueMatcherConfig[] _matchers;
private IPagingLogger _logger; private IPagingLogger _logger;
@@ -147,24 +136,18 @@ namespace GitHub.Runner.Worker
public string ContextName { get; private set; } public string ContextName { get; private set; }
public Task ForceCompleted => _forceCompleted.Task; public Task ForceCompleted => _forceCompleted.Task;
public CancellationToken CancellationToken => _cancellationTokenSource.Token; public CancellationToken CancellationToken => _cancellationTokenSource.Token;
public List<ServiceEndpoint> Endpoints { get; private set; }
public TaskOrchestrationPlanReference Plan { get; private set; }
public Variables Variables { get; private set; }
public Dictionary<string, string> IntraActionState { get; private set; } public Dictionary<string, string> IntraActionState { get; private set; }
public IDictionary<String, IDictionary<String, String>> JobDefaults { get; private set; }
public Dictionary<string, VariableValue> JobOutputs { get; private set; } public Dictionary<string, VariableValue> JobOutputs { get; private set; }
public IDictionary<String, String> EnvironmentVariables { get; private set; }
public IList<String> FileTable { get; private set; } public ActionsEnvironmentReference ActionsEnvironment { get; private set; }
public StepsContext StepsContext { get; private set; }
public DictionaryContextData ExpressionValues { get; } = new DictionaryContextData(); public DictionaryContextData ExpressionValues { get; } = new DictionaryContextData();
public IList<IFunctionInfo> ExpressionFunctions { get; } = new List<IFunctionInfo>(); public IList<IFunctionInfo> ExpressionFunctions { get; } = new List<IFunctionInfo>();
public bool WriteDebug { get; private set; }
public List<string> PrependPath { get; private set; } // Shared pointer across job-level execution context and step-level execution contexts
public ContainerInfo Container { get; set; } public GlobalContext Global { get; private set; }
public List<ContainerInfo> ServiceContainers { get; private set; }
// Only job level ExecutionContext has JobSteps // Only job level ExecutionContext has JobSteps
public List<IStep> JobSteps { get; private set; } public Queue<IStep> JobSteps { get; private set; }
// Only job level ExecutionContext has PostJobSteps // Only job level ExecutionContext has PostJobSteps
public Stack<IStep> PostJobSteps { get; private set; } public Stack<IStep> PostJobSteps { get; private set; }
@@ -174,7 +157,7 @@ namespace GitHub.Runner.Worker
public bool EchoOnActionCommand { get; set; } public bool EchoOnActionCommand { get; set; }
public IExecutionContext FinalizeContext { get; set; } public bool InsideComposite { get; private set; }
public TaskResult? Result public TaskResult? Result
{ {
@@ -206,9 +189,7 @@ namespace GitHub.Runner.Worker
} }
} }
public PlanFeatures Features { get; private set; } public ExecutionContext Root
private ExecutionContext Root
{ {
get get
{ {
@@ -276,32 +257,15 @@ namespace GitHub.Runner.Worker
/// Helper function used in CompositeActionHandler::RunAsync to /// Helper function used in CompositeActionHandler::RunAsync to
/// add a child node, aka a step, to the current job to the Root.JobSteps based on the location. /// add a child node, aka a step, to the current job to the Root.JobSteps based on the location.
/// </summary> /// </summary>
public IStep RegisterNestedStep( public IStep CreateCompositeStep(
string scopeName,
IActionRunner step, IActionRunner step,
DictionaryContextData inputsData, DictionaryContextData inputsData,
int location, Dictionary<string, string> envData)
Dictionary<string, string> envData,
bool cleanUp = false)
{ {
// If the context name is empty and the scope name is empty, we would generate a unique scope name for this child in the following format: step.ExecutionContext = Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, step.Action.ContextName, logger: _logger, insideComposite: true, cancellationTokenSource: CancellationTokenSource.CreateLinkedTokenSource(_cancellationTokenSource.Token));
// "__<GUID>"
var safeContextName = !string.IsNullOrEmpty(ContextName) ? ContextName : $"__{Guid.NewGuid()}";
// Set Scope Name. Note, for our design, we consider each step in a composite action to have the same scope
// This makes it much simpler to handle their outputs at the end of the Composite Action
var childScopeName = !string.IsNullOrEmpty(ScopeName) ? $"{ScopeName}.{safeContextName}" : safeContextName;
var childContextName = !string.IsNullOrEmpty(step.Action.ContextName) ? step.Action.ContextName : $"__{Guid.NewGuid()}";
step.ExecutionContext = Root.CreateChild(_record.Id, step.DisplayName, _record.Id.ToString("N"), childScopeName, childContextName, logger: _logger);
step.ExecutionContext.ExpressionValues["inputs"] = inputsData; step.ExecutionContext.ExpressionValues["inputs"] = inputsData;
step.ExecutionContext.ExpressionValues["steps"] = Global.StepsContext.GetScope(step.ExecutionContext.GetFullyQualifiedContextName());
// Set Parent Attribute for Clean Up Step
if (cleanUp)
{
step.ExecutionContext.FinalizeContext = this;
}
// Add the composite action environment variables to each step. // Add the composite action environment variables to each step.
#if OS_WINDOWS #if OS_WINDOWS
@@ -315,23 +279,18 @@ namespace GitHub.Runner.Worker
} }
step.ExecutionContext.ExpressionValues["env"] = envContext; step.ExecutionContext.ExpressionValues["env"] = envContext;
Root.JobSteps.Insert(location, step);
return step; return step;
} }
public IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null) public IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool insideComposite = false, CancellationTokenSource cancellationTokenSource = null)
{ {
Trace.Entering(); Trace.Entering();
var child = new ExecutionContext(); var child = new ExecutionContext();
child.Initialize(HostContext); child.Initialize(HostContext);
child.Global = Global;
child.ScopeName = scopeName; child.ScopeName = scopeName;
child.ContextName = contextName; child.ContextName = contextName;
child.Features = Features;
child.Variables = Variables;
child.Endpoints = Endpoints;
child.Plan = Plan;
if (intraActionState == null) if (intraActionState == null)
{ {
child.IntraActionState = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase); child.IntraActionState = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
@@ -340,10 +299,6 @@ namespace GitHub.Runner.Worker
{ {
child.IntraActionState = intraActionState; child.IntraActionState = intraActionState;
} }
child.EnvironmentVariables = EnvironmentVariables;
child.JobDefaults = JobDefaults;
child.FileTable = FileTable;
child.StepsContext = StepsContext;
foreach (var pair in ExpressionValues) foreach (var pair in ExpressionValues)
{ {
child.ExpressionValues[pair.Key] = pair.Value; child.ExpressionValues[pair.Key] = pair.Value;
@@ -352,12 +307,8 @@ namespace GitHub.Runner.Worker
{ {
child.ExpressionFunctions.Add(item); child.ExpressionFunctions.Add(item);
} }
child._cancellationTokenSource = new CancellationTokenSource(); child._cancellationTokenSource = cancellationTokenSource ?? new CancellationTokenSource();
child.WriteDebug = WriteDebug;
child._parentExecutionContext = this; child._parentExecutionContext = this;
child.PrependPath = PrependPath;
child.Container = Container;
child.ServiceContainers = ServiceContainers;
child.EchoOnActionCommand = EchoOnActionCommand; child.EchoOnActionCommand = EchoOnActionCommand;
if (recordOrder != null) if (recordOrder != null)
@@ -378,6 +329,8 @@ namespace GitHub.Runner.Worker
child._logger.Setup(_mainTimelineId, recordId); child._logger.Setup(_mainTimelineId, recordId);
} }
child.InsideComposite = insideComposite;
return child; return child;
} }
@@ -434,10 +387,11 @@ namespace GitHub.Runner.Worker
_logger.End(); _logger.End();
if (!string.IsNullOrEmpty(ContextName)) // Skip if generated context name. Generated context names start with "__". After M271-ish the server will never send an empty context name.
if (!string.IsNullOrEmpty(ContextName) && !ContextName.StartsWith("__", StringComparison.Ordinal))
{ {
StepsContext.SetOutcome(ScopeName, ContextName, (Outcome ?? Result ?? TaskResult.Succeeded).ToActionResult()); Global.StepsContext.SetOutcome(ScopeName, ContextName, (Outcome ?? Result ?? TaskResult.Succeeded).ToActionResult());
StepsContext.SetConclusion(ScopeName, ContextName, (Result ?? TaskResult.Succeeded).ToActionResult()); Global.StepsContext.SetConclusion(ScopeName, ContextName, (Result ?? TaskResult.Succeeded).ToActionResult());
} }
return Result.Value; return Result.Value;
@@ -496,8 +450,8 @@ namespace GitHub.Runner.Worker
{ {
ArgUtil.NotNullOrEmpty(name, nameof(name)); ArgUtil.NotNullOrEmpty(name, nameof(name));
// if the ContextName follows the __GUID format which is set as the default value for ContextName if null for Composite Actions. // Skip if generated context name. Generated context names start with "__". After M271-ish the server will never send an empty context name.
if (String.IsNullOrEmpty(ContextName) || _generatedContextNamePattern.IsMatch(ContextName)) if (string.IsNullOrEmpty(ContextName) || ContextName.StartsWith("__", StringComparison.Ordinal))
{ {
reference = null; reference = null;
return; return;
@@ -505,7 +459,7 @@ namespace GitHub.Runner.Worker
// todo: restrict multiline? // todo: restrict multiline?
StepsContext.SetOutput(ScopeName, ContextName, name, value, out reference); Global.StepsContext.SetOutput(ScopeName, ContextName, name, value, out reference);
} }
public void SetTimeout(TimeSpan? timeout) public void SetTimeout(TimeSpan? timeout)
@@ -639,33 +593,38 @@ namespace GitHub.Runner.Worker
_cancellationTokenSource = CancellationTokenSource.CreateLinkedTokenSource(token); _cancellationTokenSource = CancellationTokenSource.CreateLinkedTokenSource(token);
Global = new GlobalContext();
// Plan // Plan
Plan = message.Plan; Global.Plan = message.Plan;
Features = PlanUtil.GetFeatures(message.Plan); Global.Features = PlanUtil.GetFeatures(message.Plan);
// Endpoints // Endpoints
Endpoints = message.Resources.Endpoints; Global.Endpoints = message.Resources.Endpoints;
// Variables // Variables
Variables = new Variables(HostContext, message.Variables); Global.Variables = new Variables(HostContext, message.Variables);
// Environment variables shared across all actions // Environment variables shared across all actions
EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer); Global.EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer);
// Job defaults shared across all actions // Job defaults shared across all actions
JobDefaults = new Dictionary<string, IDictionary<string, string>>(StringComparer.OrdinalIgnoreCase); Global.JobDefaults = new Dictionary<string, IDictionary<string, string>>(StringComparer.OrdinalIgnoreCase);
// Job Outputs // Job Outputs
JobOutputs = new Dictionary<string, VariableValue>(StringComparer.OrdinalIgnoreCase); JobOutputs = new Dictionary<string, VariableValue>(StringComparer.OrdinalIgnoreCase);
// Actions environment
ActionsEnvironment = message.ActionsEnvironment;
// Service container info // Service container info
ServiceContainers = new List<ContainerInfo>(); Global.ServiceContainers = new List<ContainerInfo>();
// Steps context (StepsRunner manages adding the scoped steps context) // Steps context (StepsRunner manages adding the scoped steps context)
StepsContext = new StepsContext(); Global.StepsContext = new StepsContext();
// File table // File table
FileTable = new List<String>(message.FileTable ?? new string[0]); Global.FileTable = new List<String>(message.FileTable ?? new string[0]);
// Expression values // Expression values
if (message.ContextData?.Count > 0) if (message.ContextData?.Count > 0)
@@ -676,15 +635,15 @@ namespace GitHub.Runner.Worker
} }
} }
ExpressionValues["secrets"] = Variables.ToSecretsContext(); ExpressionValues["secrets"] = Global.Variables.ToSecretsContext();
ExpressionValues["runner"] = new RunnerContext(); ExpressionValues["runner"] = new RunnerContext();
ExpressionValues["job"] = new JobContext(); ExpressionValues["job"] = new JobContext();
Trace.Info("Initialize GitHub context"); Trace.Info("Initialize GitHub context");
var githubAccessToken = new StringContextData(Variables.Get("system.github.token")); var githubAccessToken = new StringContextData(Global.Variables.Get("system.github.token"));
var base64EncodedToken = Convert.ToBase64String(Encoding.UTF8.GetBytes($"x-access-token:{githubAccessToken}")); var base64EncodedToken = Convert.ToBase64String(Encoding.UTF8.GetBytes($"x-access-token:{githubAccessToken}"));
HostContext.SecretMasker.AddValue(base64EncodedToken); HostContext.SecretMasker.AddValue(base64EncodedToken);
var githubJob = Variables.Get("system.github.job"); var githubJob = Global.Variables.Get("system.github.job");
var githubContext = new GitHubContext(); var githubContext = new GitHubContext();
githubContext["token"] = githubAccessToken; githubContext["token"] = githubAccessToken;
if (!string.IsNullOrEmpty(githubJob)) if (!string.IsNullOrEmpty(githubJob))
@@ -707,10 +666,10 @@ namespace GitHub.Runner.Worker
#endif #endif
// Prepend Path // Prepend Path
PrependPath = new List<string>(); Global.PrependPath = new List<string>();
// JobSteps for job ExecutionContext // JobSteps for job ExecutionContext
JobSteps = new List<IStep>(); JobSteps = new Queue<IStep>();
// PostJobSteps for job ExecutionContext // PostJobSteps for job ExecutionContext
PostJobSteps = new Stack<IStep>(); PostJobSteps = new Stack<IStep>();
@@ -733,10 +692,10 @@ namespace GitHub.Runner.Worker
_logger.Setup(_mainTimelineId, _record.Id); _logger.Setup(_mainTimelineId, _record.Id);
// Initialize 'echo on action command success' property, default to false, unless Step_Debug is set // Initialize 'echo on action command success' property, default to false, unless Step_Debug is set
EchoOnActionCommand = Variables.Step_Debug ?? false; EchoOnActionCommand = Global.Variables.Step_Debug ?? false;
// Verbosity (from GitHub.Step_Debug). // Verbosity (from GitHub.Step_Debug).
WriteDebug = Variables.Step_Debug ?? false; Global.WriteDebug = Global.Variables.Step_Debug ?? false;
// Hook up JobServerQueueThrottling event, we will log warning on server tarpit. // Hook up JobServerQueueThrottling event, we will log warning on server tarpit.
_jobServerQueue.JobServerQueueThrottling += JobServerQueueThrottling_EventReceived; _jobServerQueue.JobServerQueueThrottling += JobServerQueueThrottling_EventReceived;
@@ -764,7 +723,7 @@ namespace GitHub.Runner.Worker
} }
} }
_jobServerQueue.QueueWebConsoleLine(_record.Id, msg); _jobServerQueue.QueueWebConsoleLine(_record.Id, msg, totalLines);
return totalLines; return totalLines;
} }
@@ -937,6 +896,16 @@ namespace GitHub.Runner.Worker
// Otherwise individual overloads would need to be implemented (depending on the unit test). // Otherwise individual overloads would need to be implemented (depending on the unit test).
public static class ExecutionContextExtension public static class ExecutionContextExtension
{ {
public static string GetFullyQualifiedContextName(this IExecutionContext context)
{
if (!string.IsNullOrEmpty(context.ScopeName))
{
return $"{context.ScopeName}.{context.ContextName}";
}
return context.ContextName;
}
public static void Error(this IExecutionContext context, Exception ex) public static void Error(this IExecutionContext context, Exception ex)
{ {
context.Error(ex.Message); context.Error(ex.Message);
@@ -949,6 +918,12 @@ namespace GitHub.Runner.Worker
context.AddIssue(new Issue() { Type = IssueType.Error, Message = message }); context.AddIssue(new Issue() { Type = IssueType.Error, Message = message });
} }
// Do not add a format string overload. See comment on ExecutionContext.Write().
public static void InfrastructureError(this IExecutionContext context, string message)
{
context.AddIssue(new Issue() { Type = IssueType.Error, Message = message, IsInfrastructureIssue = true});
}
// Do not add a format string overload. See comment on ExecutionContext.Write(). // Do not add a format string overload. See comment on ExecutionContext.Write().
public static void Warning(this IExecutionContext context, string message) public static void Warning(this IExecutionContext context, string message)
{ {
@@ -975,7 +950,7 @@ namespace GitHub.Runner.Worker
// Do not add a format string overload. See comment on ExecutionContext.Write(). // Do not add a format string overload. See comment on ExecutionContext.Write().
public static void Debug(this IExecutionContext context, string message) public static void Debug(this IExecutionContext context, string message)
{ {
if (context.WriteDebug) if (context.Global.WriteDebug)
{ {
var multilines = message?.Replace("\r\n", "\n")?.Split("\n"); var multilines = message?.Replace("\r\n", "\n")?.Split("\n");
if (multilines != null) if (multilines != null)
@@ -1000,7 +975,7 @@ namespace GitHub.Runner.Worker
traceWriter = context.ToTemplateTraceWriter(); traceWriter = context.ToTemplateTraceWriter();
} }
var schema = PipelineTemplateSchemaFactory.GetSchema(); var schema = PipelineTemplateSchemaFactory.GetSchema();
return new PipelineTemplateEvaluator(traceWriter, schema, context.FileTable); return new PipelineTemplateEvaluator(traceWriter, schema, context.Global.FileTable);
} }
public static ObjectTemplating.ITraceWriter ToTemplateTraceWriter(this IExecutionContext context) public static ObjectTemplating.ITraceWriter ToTemplateTraceWriter(this IExecutionContext context)
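The new GetFullyQualifiedContextName extension and the "__" generated-name check drive how step outputs are scoped. A rough illustration with toy types; nothing here is the runner's own API beyond the naming rules shown in the diff above.

using System;

class ContextNameSketch
{
    public string ScopeName { get; set; }
    public string ContextName { get; set; }

    // Mirrors the extension method: "scope.context" when a scope exists,
    // otherwise just the context name.
    public string GetFullyQualifiedContextName()
        => string.IsNullOrEmpty(ScopeName) ? ContextName : $"{ScopeName}.{ContextName}";

    // Mirrors the generated-name check: names starting with "__" (e.g. "__<guid>")
    // never have outcome/conclusion/outputs recorded in the steps context.
    public bool IsGeneratedName()
        => string.IsNullOrEmpty(ContextName) || ContextName.StartsWith("__", StringComparison.Ordinal);

    static void Main()
    {
        var step = new ContextNameSketch { ScopeName = "my_composite", ContextName = "build" };
        Console.WriteLine(step.GetFullyQualifiedContextName()); // my_composite.build
        Console.WriteLine(step.IsGeneratedName());              // False
    }
}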


@@ -0,0 +1,262 @@
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Worker.Container;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Text;
namespace GitHub.Runner.Worker
{
[ServiceLocator(Default = typeof(FileCommandManager))]
public interface IFileCommandManager : IRunnerService
{
void InitializeFiles(IExecutionContext context, ContainerInfo container);
void ProcessFiles(IExecutionContext context, ContainerInfo container);
}
public sealed class FileCommandManager : RunnerService, IFileCommandManager
{
private const string _folderName = "_runner_file_commands";
private List<IFileCommandExtension> _commandExtensions;
private string _fileSuffix = String.Empty;
private string _fileCommandDirectory;
private Tracing _trace;
public override void Initialize(IHostContext hostContext)
{
base.Initialize(hostContext);
_trace = HostContext.GetTrace(nameof(FileCommandManager));
_fileCommandDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Temp), _folderName);
if (!Directory.Exists(_fileCommandDirectory))
{
Directory.CreateDirectory(_fileCommandDirectory);
}
var extensionManager = hostContext.GetService<IExtensionManager>();
_commandExtensions = extensionManager.GetExtensions<IFileCommandExtension>() ?? new List<IFileCommandExtension>();
}
public void InitializeFiles(IExecutionContext context, ContainerInfo container)
{
var oldSuffix = _fileSuffix;
_fileSuffix = Guid.NewGuid().ToString();
foreach (var fileCommand in _commandExtensions)
{
var oldPath = Path.Combine(_fileCommandDirectory, fileCommand.FilePrefix + oldSuffix);
if (oldSuffix != String.Empty && File.Exists(oldPath))
{
TryDeleteFile(oldPath);
}
var newPath = Path.Combine(_fileCommandDirectory, fileCommand.FilePrefix + _fileSuffix);
TryDeleteFile(newPath);
File.Create(newPath).Dispose();
var pathToSet = container != null ? container.TranslateToContainerPath(newPath) : newPath;
context.SetGitHubContext(fileCommand.ContextName, pathToSet);
}
}
public void ProcessFiles(IExecutionContext context, ContainerInfo container)
{
foreach (var fileCommand in _commandExtensions)
{
try
{
fileCommand.ProcessCommand(context, Path.Combine(_fileCommandDirectory, fileCommand.FilePrefix + _fileSuffix), container);
}
catch (Exception ex)
{
context.Error($"Unable to process file command '{fileCommand.ContextName}' successfully.");
context.Error(ex);
context.CommandResult = TaskResult.Failed;
}
}
}
private bool TryDeleteFile(string path)
{
if (!File.Exists(path))
{
return true;
}
try
{
File.Delete(path);
}
catch (Exception e)
{
_trace.Warning($"Unable to delete file {path} for reason: {e.ToString()}");
return false;
}
return true;
}
}
public interface IFileCommandExtension : IExtension
{
string ContextName { get; }
string FilePrefix { get; }
void ProcessCommand(IExecutionContext context, string filePath, ContainerInfo container);
}
public sealed class AddPathFileCommand : RunnerService, IFileCommandExtension
{
public string ContextName => "path";
public string FilePrefix => "add_path_";
public Type ExtensionType => typeof(IFileCommandExtension);
public void ProcessCommand(IExecutionContext context, string filePath, ContainerInfo container)
{
if (File.Exists(filePath))
{
var lines = File.ReadAllLines(filePath, Encoding.UTF8);
foreach (var line in lines)
{
if (line == string.Empty)
{
continue;
}
context.Global.PrependPath.RemoveAll(x => string.Equals(x, line, StringComparison.CurrentCulture));
context.Global.PrependPath.Add(line);
}
}
}
}
public sealed class SetEnvFileCommand : RunnerService, IFileCommandExtension
{
public string ContextName => "env";
public string FilePrefix => "set_env_";
public Type ExtensionType => typeof(IFileCommandExtension);
public void ProcessCommand(IExecutionContext context, string filePath, ContainerInfo container)
{
try
{
var text = File.ReadAllText(filePath) ?? string.Empty;
var index = 0;
var line = ReadLine(text, ref index);
while (line != null)
{
if (!string.IsNullOrEmpty(line))
{
var equalsIndex = line.IndexOf("=", StringComparison.Ordinal);
var heredocIndex = line.IndexOf("<<", StringComparison.Ordinal);
// Normal style NAME=VALUE
if (equalsIndex >= 0 && (heredocIndex < 0 || equalsIndex < heredocIndex))
{
var split = line.Split(new[] { '=' }, 2, StringSplitOptions.None);
if (string.IsNullOrEmpty(split[0]))
{
throw new Exception($"Invalid environment variable format '{line}'. Environment variable name must not be empty");
}
SetEnvironmentVariable(context, split[0], split[1]);
}
// Heredoc style NAME<<EOF
else if (heredocIndex >= 0 && (equalsIndex < 0 || heredocIndex < equalsIndex))
{
var split = line.Split(new[] { "<<" }, 2, StringSplitOptions.None);
if (string.IsNullOrEmpty(split[0]) || string.IsNullOrEmpty(split[1]))
{
throw new Exception($"Invalid environment variable format '{line}'. Environment variable name must not be empty and delimiter must not be empty");
}
var name = split[0];
var delimiter = split[1];
var startIndex = index; // Start index of the value (inclusive)
var endIndex = index; // End index of the value (exclusive)
var tempLine = ReadLine(text, ref index, out var newline);
while (!string.Equals(tempLine, delimiter, StringComparison.Ordinal))
{
if (tempLine == null)
{
throw new Exception($"Invalid environment variable value. Matching delimiter not found '{delimiter}'");
}
endIndex = index - newline.Length;
tempLine = ReadLine(text, ref index, out newline);
}
var value = endIndex > startIndex ? text.Substring(startIndex, endIndex - startIndex) : string.Empty;
SetEnvironmentVariable(context, name, value);
}
else
{
throw new Exception($"Invalid environment variable format '{line}'");
}
}
line = ReadLine(text, ref index);
}
}
catch (DirectoryNotFoundException)
{
context.Debug($"Environment variables file does not exist '{filePath}'");
}
catch (FileNotFoundException)
{
context.Debug($"Environment variables file does not exist '{filePath}'");
}
}
private static void SetEnvironmentVariable(
IExecutionContext context,
string name,
string value)
{
context.Global.EnvironmentVariables[name] = value;
context.SetEnvContext(name, value);
context.Debug($"{name}='{value}'");
}
private static string ReadLine(
string text,
ref int index)
{
return ReadLine(text, ref index, out _);
}
private static string ReadLine(
string text,
ref int index,
out string newline)
{
if (index >= text.Length)
{
newline = null;
return null;
}
var originalIndex = index;
var lfIndex = text.IndexOf("\n", index, StringComparison.Ordinal);
if (lfIndex < 0)
{
index = text.Length;
newline = null;
return text.Substring(originalIndex);
}
#if OS_WINDOWS
var crLFIndex = text.IndexOf("\r\n", index, StringComparison.Ordinal);
if (crLFIndex >= 0 && crLFIndex < lfIndex)
{
index = crLFIndex + 2; // Skip over CRLF
newline = "\r\n";
return text.Substring(originalIndex, crLFIndex - originalIndex);
}
#endif
index = lfIndex + 1; // Skip over LF
newline = "\n";
return text.Substring(originalIndex, lfIndex - originalIndex);
}
}
}
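For reference, SetEnvFileCommand above accepts both the plain NAME=VALUE form and the heredoc NAME<<DELIMITER form, reading the file that the env github context points at. A hedged illustration of the two formats; the variable names and values here are made up.

using System;
using System.IO;

class EnvFileSketch
{
    static void Main()
    {
        // Hypothetical contents a step might append to the file behind $GITHUB_ENV;
        // this is the input shape SetEnvFileCommand parses.
        var envFile = Path.GetTempFileName();
        File.WriteAllText(envFile,
            "SIMPLE_VAR=hello\n" +      // NAME=VALUE form
            "MULTILINE_VAR<<EOF\n" +    // heredoc form: NAME<<DELIMITER
            "line one\n" +
            "line two\n" +
            "EOF\n");                   // the matching delimiter line ends the value
        Console.WriteLine(File.ReadAllText(envFile));
    }
}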


@@ -6,20 +6,26 @@ namespace GitHub.Runner.Worker
{ {
public sealed class GitHubContext : DictionaryContextData, IEnvironmentContextData public sealed class GitHubContext : DictionaryContextData, IEnvironmentContextData
{ {
private readonly HashSet<string> _contextEnvWhitelist = new HashSet<string>(StringComparer.OrdinalIgnoreCase) private readonly HashSet<string> _contextEnvAllowlist = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{ {
"action", "action",
"action_path",
"action_ref",
"action_repository",
"actor", "actor",
"api_url", "api_url",
"base_ref", "base_ref",
"env",
"event_name", "event_name",
"event_path", "event_path",
"graphql_url", "graphql_url",
"head_ref", "head_ref",
"job", "job",
"path",
"ref", "ref",
"repository", "repository",
"repository_owner", "repository_owner",
"retention_days",
"run_id", "run_id",
"run_number", "run_number",
"server_url", "server_url",
@@ -32,11 +38,23 @@ namespace GitHub.Runner.Worker
{ {
foreach (var data in this) foreach (var data in this)
{ {
if (_contextEnvWhitelist.Contains(data.Key) && data.Value is StringContextData value) if (_contextEnvAllowlist.Contains(data.Key) && data.Value is StringContextData value)
{ {
yield return new KeyValuePair<string, string>($"GITHUB_{data.Key.ToUpperInvariant()}", value); yield return new KeyValuePair<string, string>($"GITHUB_{data.Key.ToUpperInvariant()}", value);
} }
} }
} }
public GitHubContext ShallowCopy()
{
var copy = new GitHubContext();
foreach (var pair in this)
{
copy[pair.Key] = pair.Value;
}
return copy;
}
} }
} }
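ShallowCopy lets each composite step carry its own github context (for per-step values such as action_path) without mutating the job-level one. A simplified sketch of the copy-then-override pattern, using plain dictionaries as stand-ins for GitHubContext; the paths and names are illustrative only.

using System;
using System.Collections.Generic;

class GitHubContextCopySketch
{
    static void Main()
    {
        // Stand-in for the job-level github context.
        var jobGithub = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            ["repository"] = "my-org/my-repo",
            ["action_path"] = "", // not meaningful at the job level
        };

        // Shallow copy per composite step, then override step-specific keys,
        // mirroring CompositeActionHandler's ShallowCopy + SetGitHubContext usage.
        var stepGithub = new Dictionary<string, string>(jobGithub, StringComparer.OrdinalIgnoreCase);
        stepGithub["action_path"] = "/home/runner/_work/_actions/my-org/my-action/v1";

        Console.WriteLine(jobGithub["action_path"]);  // still ""
        Console.WriteLine(stepGithub["action_path"]); // the composite action's directory
    }
}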


@@ -0,0 +1,24 @@
using System;
using System.Collections.Generic;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Worker.Container;
namespace GitHub.Runner.Worker
{
public sealed class GlobalContext
{
public ContainerInfo Container { get; set; }
public List<ServiceEndpoint> Endpoints { get; set; }
public IDictionary<String, String> EnvironmentVariables { get; set; }
public PlanFeatures Features { get; set; }
public IList<String> FileTable { get; set; }
public IDictionary<String, IDictionary<String, String>> JobDefaults { get; set; }
public TaskOrchestrationPlanReference Plan { get; set; }
public List<string> PrependPath { get; set; }
public List<ContainerInfo> ServiceContainers { get; set; }
public StepsContext StepsContext { get; set; }
public Variables Variables { get; set; }
public bool WriteDebug { get; set; }
}
}
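GlobalContext replaces the field-by-field copying that CreateChild used to do: the job-level ExecutionContext builds one GlobalContext in InitializeJob and every child reuses the same reference, so mutations (for example to PrependPath) are visible everywhere. A simplified sketch with toy types, not the runner's own classes.

using System.Collections.Generic;

class GlobalSketch
{
    public List<string> PrependPath { get; set; } = new List<string>();
    public IDictionary<string, string> EnvironmentVariables { get; set; }
        = new Dictionary<string, string>();
}

class ContextSketch
{
    public GlobalSketch Global { get; private set; }

    public void InitializeJob() => Global = new GlobalSketch();

    // Children share the pointer instead of copying each field.
    public ContextSketch CreateChild() => new ContextSketch { Global = this.Global };

    static void Main()
    {
        var job = new ContextSketch();
        job.InitializeJob();
        var step = job.CreateChild();
        step.Global.PrependPath.Add("/opt/custom/bin");
        // The addition is visible to the job and to every other step.
        System.Console.WriteLine(job.Global.PrependPath.Count); // 1
    }
}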


@@ -3,6 +3,7 @@ using System.Collections.Generic;
using System.IO; using System.IO;
using System.Linq; using System.Linq;
using System.Text; using System.Text;
using System.Threading;
using System.Threading.Tasks; using System.Threading.Tasks;
using GitHub.DistributedTask.ObjectTemplating.Tokens; using GitHub.DistributedTask.ObjectTemplating.Tokens;
using GitHub.DistributedTask.Pipelines.ContextData; using GitHub.DistributedTask.Pipelines.ContextData;
@@ -23,17 +24,13 @@ namespace GitHub.Runner.Worker.Handlers
{ {
public CompositeActionExecutionData Data { get; set; } public CompositeActionExecutionData Data { get; set; }
public Task RunAsync(ActionRunStage stage) public async Task RunAsync(ActionRunStage stage)
{ {
// Validate args. // Validate args.
Trace.Entering(); Trace.Entering();
ArgUtil.NotNull(ExecutionContext, nameof(ExecutionContext)); ArgUtil.NotNull(ExecutionContext, nameof(ExecutionContext));
ArgUtil.NotNull(Inputs, nameof(Inputs)); ArgUtil.NotNull(Inputs, nameof(Inputs));
ArgUtil.NotNull(Data.Steps, nameof(Data.Steps));
var githubContext = ExecutionContext.ExpressionValues["github"] as GitHubContext;
ArgUtil.NotNull(githubContext, nameof(githubContext));
var tempDirectory = HostContext.GetDirectory(WellKnownDirectory.Temp);
// Resolve action steps // Resolve action steps
var actionSteps = Data.Steps; var actionSteps = Data.Steps;
@@ -45,73 +42,241 @@ namespace GitHub.Runner.Worker.Handlers
inputsData[i.Key] = new StringContextData(i.Value); inputsData[i.Key] = new StringContextData(i.Value);
} }
// Add each composite action step to the front of the queue // Initialize Composite Steps List of Steps
int location = 0; var compositeSteps = new List<IStep>();
foreach (Pipelines.ActionStep aStep in actionSteps) // Temporary hack until after M271-ish. After M271-ish the server will never send an empty
// context name. Generated context names start with "__"
var childScopeName = ExecutionContext.GetFullyQualifiedContextName();
if (string.IsNullOrEmpty(childScopeName))
{ {
// Ex: childScopeName = $"__{Guid.NewGuid()}";
// runs: }
// using: "composite"
// steps:
// - uses: example/test-composite@v2 (a)
// - run echo hello world (b)
// - run echo hello world 2 (c)
//
// ethanchewy/test-composite/action.yaml
// runs:
// using: "composite"
// steps:
// - run echo hello world 3 (d)
// - run echo hello world 4 (e)
//
// Steps processed as follow:
// | a |
// | a | => | d |
// (Run step d)
// | a |
// | a | => | e |
// (Run step e)
// | a |
// (Run step a)
// | b |
// (Run step b)
// | c |
// (Run step c)
// Done.
foreach (Pipelines.ActionStep actionStep in actionSteps)
{
var actionRunner = HostContext.CreateService<IActionRunner>(); var actionRunner = HostContext.CreateService<IActionRunner>();
actionRunner.Action = aStep; actionRunner.Action = actionStep;
actionRunner.Stage = stage; actionRunner.Stage = stage;
actionRunner.Condition = aStep.Condition; actionRunner.Condition = actionStep.Condition;
var step = ExecutionContext.RegisterNestedStep(actionRunner, inputsData, location, Environment); var step = ExecutionContext.CreateCompositeStep(childScopeName, actionRunner, inputsData, Environment);
InitializeScope(step); // Shallow copy github context
var gitHubContext = step.ExecutionContext.ExpressionValues["github"] as GitHubContext;
ArgUtil.NotNull(gitHubContext, nameof(gitHubContext));
gitHubContext = gitHubContext.ShallowCopy();
step.ExecutionContext.ExpressionValues["github"] = gitHubContext;
location++; // Set GITHUB_ACTION_PATH
step.ExecutionContext.SetGitHubContext("action_path", ActionDirectory);
compositeSteps.Add(step);
} }
// Create a step that handles all the composite action steps' outputs try
Pipelines.ActionStep cleanOutputsStep = new Pipelines.ActionStep();
cleanOutputsStep.ContextName = ExecutionContext.ContextName;
// Use the same reference type as our composite steps.
cleanOutputsStep.Reference = Action;
var actionRunner2 = HostContext.CreateService<IActionRunner>();
actionRunner2.Action = cleanOutputsStep;
actionRunner2.Stage = ActionRunStage.Main;
actionRunner2.Condition = "always()";
ExecutionContext.RegisterNestedStep(actionRunner2, inputsData, location, Environment, true);
return Task.CompletedTask;
}
private void InitializeScope(IStep step)
{ {
var stepsContext = step.ExecutionContext.StepsContext; // This is where we run each step.
var scopeName = step.ExecutionContext.ScopeName; await RunStepsAsync(compositeSteps);
step.ExecutionContext.ExpressionValues["steps"] = stepsContext.GetScope(scopeName);
// Get the pointer of the correct "steps" object and pass it to the ExecutionContext so that we can process the outputs correctly
ExecutionContext.ExpressionValues["inputs"] = inputsData;
ExecutionContext.ExpressionValues["steps"] = ExecutionContext.Global.StepsContext.GetScope(ExecutionContext.GetFullyQualifiedContextName());
ProcessCompositeActionOutputs();
ExecutionContext.Global.StepsContext.ClearScope(childScopeName);
}
catch (Exception ex)
{
// Composite StepRunner should never throw exception out.
Trace.Error($"Caught exception from composite steps {nameof(CompositeActionHandler)}: {ex}");
ExecutionContext.Error(ex);
ExecutionContext.Result = TaskResult.Failed;
}
}
private void ProcessCompositeActionOutputs()
{
ArgUtil.NotNull(ExecutionContext, nameof(ExecutionContext));
// Evaluate the mapped outputs value
if (Data.Outputs != null)
{
// Evaluate the outputs in the steps context to easily retrieve the values
var actionManifestManager = HostContext.GetService<IActionManifestManager>();
// Format ExpressionValues to Dictionary<string, PipelineContextData>
var evaluateContext = new Dictionary<string, PipelineContextData>(StringComparer.OrdinalIgnoreCase);
foreach (var pair in ExecutionContext.ExpressionValues)
{
evaluateContext[pair.Key] = pair.Value;
}
// Get the evaluated composite outputs' values mapped to the output names
DictionaryContextData actionOutputs = actionManifestManager.EvaluateCompositeOutputs(ExecutionContext, Data.Outputs, evaluateContext);
// Set the outputs for the outputs object in the whole composite action
// Each pair is structured like this
// We ignore "description" for now
// {
// "the-output-name": {
// "description": "",
// "value": "the value"
// },
// ...
// }
foreach (var pair in actionOutputs)
{
var outputsName = pair.Key;
var outputsAttributes = pair.Value as DictionaryContextData;
outputsAttributes.TryGetValue("value", out var val);
if (val != null)
{
var outputsValue = val as StringContextData;
// Set output in the whole composite scope.
if (!String.IsNullOrEmpty(outputsValue))
{
ExecutionContext.SetOutput(outputsName, outputsValue, out _);
}
else
{
ExecutionContext.SetOutput(outputsName, "", out _);
}
}
}
}
}
private async Task RunStepsAsync(List<IStep> compositeSteps)
{
ArgUtil.NotNull(compositeSteps, nameof(compositeSteps));
// The parent StepsRunner already handles cancellation for the whole composite action step.
foreach (IStep step in compositeSteps)
{
Trace.Info($"Processing composite step: DisplayName='{step.DisplayName}'");
step.ExecutionContext.ExpressionValues["steps"] = ExecutionContext.Global.StepsContext.GetScope(step.ExecutionContext.ScopeName);
// Populate env context for each step
Trace.Info("Initialize Env context for step");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
// Global env
foreach (var pair in ExecutionContext.Global.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
// Overwrite with the env inherited from the outer step context
if (step.ExecutionContext.ExpressionValues.TryGetValue("env", out var envContextData))
{
#if OS_WINDOWS
var dict = envContextData as DictionaryContextData;
#else
var dict = envContextData as CaseSensitiveDictionaryContextData;
#endif
foreach (var pair in dict)
{
envContext[pair.Key] = pair.Value;
}
}
step.ExecutionContext.ExpressionValues["env"] = envContext;
var actionStep = step as IActionRunner;
try
{
// Evaluate and merge action's env block to env context
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator();
var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(actionStep.Action.Environment, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, Common.Util.VarUtil.EnvironmentVariableKeyComparer);
foreach (var env in actionEnvironment)
{
envContext[env.Key] = new StringContextData(env.Value ?? string.Empty);
}
}
catch (Exception ex)
{
// fail the step since there is an evaluate error.
Trace.Info("Caught exception in Composite Steps Runner from expression for step.env");
// evaluateStepEnvFailed = true;
step.ExecutionContext.Error(ex);
step.ExecutionContext.Complete(TaskResult.Failed);
}
await RunStepAsync(step);
// Directly after the step, check if the step has failed or cancelled
// If so, return that to the output
if (step.ExecutionContext.Result == TaskResult.Failed || step.ExecutionContext.Result == TaskResult.Canceled)
{
ExecutionContext.Result = step.ExecutionContext.Result;
break;
}
// TODO: Add compat for other types of steps.
}
// Completion Status handled by StepsRunner for the whole Composite Action Step
}
private async Task RunStepAsync(IStep step)
{
// Start the step.
Trace.Info("Starting the step.");
step.ExecutionContext.Debug($"Starting: {step.DisplayName}");
// TODO: Fix for Step Level Timeout Attributes for an individual Composite Run Step
// For now, we are not going to support this for an individual composite run step
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator();
await Common.Util.EncodingUtil.SetEncoding(HostContext, Trace, step.ExecutionContext.CancellationToken);
try
{
await step.RunAsync();
}
catch (OperationCanceledException ex)
{
if (step.ExecutionContext.CancellationToken.IsCancellationRequested &&
!ExecutionContext.Root.CancellationToken.IsCancellationRequested)
{
Trace.Error($"Caught timeout exception from step: {ex.Message}");
step.ExecutionContext.Error("The action has timed out.");
step.ExecutionContext.Result = TaskResult.Failed;
}
else
{
Trace.Error($"Caught cancellation exception from step: {ex}");
step.ExecutionContext.Error(ex);
step.ExecutionContext.Result = TaskResult.Canceled;
}
}
catch (Exception ex)
{
// Log the error and fail the step.
Trace.Error($"Caught exception from step: {ex}");
step.ExecutionContext.Error(ex);
step.ExecutionContext.Result = TaskResult.Failed;
}
// Merge execution context result with command result
if (step.ExecutionContext.CommandResult != null)
{
step.ExecutionContext.Result = Common.Util.TaskResultUtil.MergeTaskResults(step.ExecutionContext.Result, step.ExecutionContext.CommandResult.Value);
}
Trace.Info($"Step result: {step.ExecutionContext.Result}");
// Complete the step context.
step.ExecutionContext.Debug($"Finishing: {step.DisplayName}");
} }
} }
} }
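ProcessCompositeActionOutputs above walks a mapping where each output carries a description and a value, and only the value is propagated via SetOutput. A toy sketch of that shape using plain dictionaries in place of DictionaryContextData; the output name and value are made up.

using System;
using System.Collections.Generic;

class CompositeOutputsSketch
{
    static void Main()
    {
        // Shape produced by evaluating a composite action's `outputs:` mapping;
        // "description" is ignored, only "value" becomes the step output.
        var actionOutputs = new Dictionary<string, Dictionary<string, string>>
        {
            ["build-id"] = new Dictionary<string, string>
            {
                ["description"] = "Identifier of the produced build",
                ["value"] = "12345",
            },
        };

        foreach (var pair in actionOutputs)
        {
            pair.Value.TryGetValue("value", out var value);
            // The real handler calls ExecutionContext.SetOutput(pair.Key, value ?? "", out _).
            Console.WriteLine($"{pair.Key}={value ?? string.Empty}");
        }
    }
}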


@@ -1,53 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using GitHub.DistributedTask.ObjectTemplating.Schema;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker.Handlers
{
[ServiceLocator(Default = typeof(CompositeActionOutputHandler))]
public interface ICompositeActionOutputHandler : IHandler
{
CompositeActionExecutionData Data { get; set; }
}
public sealed class CompositeActionOutputHandler : Handler, ICompositeActionOutputHandler
{
public CompositeActionExecutionData Data { get; set; }
public Task RunAsync(ActionRunStage stage)
{
// Evaluate the mapped outputs value
if (Data.Outputs != null)
{
// Evaluate the outputs in the steps context to easily retrieve the values
var actionManifestManager = HostContext.GetService<IActionManifestManager>();
// Format ExpressionValues to Dictionary<string, PipelineContextData>
var evaluateContext = new Dictionary<string, PipelineContextData>(StringComparer.OrdinalIgnoreCase);
foreach (var pair in ExecutionContext.ExpressionValues)
{
evaluateContext[pair.Key] = pair.Value;
}
// Get the evluated composite outputs' values mapped to the outputs named
DictionaryContextData actionOutputs = actionManifestManager.EvaluateCompositeOutputs(ExecutionContext, Data.Outputs, evaluateContext);
// Set the outputs for the outputs object in the whole composite action
actionManifestManager.SetAllCompositeOutputs(ExecutionContext.FinalizeContext, actionOutputs);
}
return Task.CompletedTask;
}
}
}


@@ -49,8 +49,9 @@ namespace GitHub.Runner.Worker.Handlers
// ensure docker file exist // ensure docker file exist
var dockerFile = Path.Combine(ActionDirectory, Data.Image); var dockerFile = Path.Combine(ActionDirectory, Data.Image);
ArgUtil.File(dockerFile, nameof(Data.Image)); ArgUtil.File(dockerFile, nameof(Data.Image));
ExecutionContext.Output($"Dockerfile for action: '{dockerFile}'.");
ExecutionContext.Output($"##[group]Building docker image");
ExecutionContext.Output($"Dockerfile for action: '{dockerFile}'.");
var imageName = $"{dockerManger.DockerInstanceLabel}:{ExecutionContext.Id.ToString("N")}"; var imageName = $"{dockerManger.DockerInstanceLabel}:{ExecutionContext.Id.ToString("N")}";
var buildExitCode = await dockerManger.DockerBuild( var buildExitCode = await dockerManger.DockerBuild(
ExecutionContext, ExecutionContext,
@@ -58,6 +59,8 @@ namespace GitHub.Runner.Worker.Handlers
dockerFile, dockerFile,
Directory.GetParent(dockerFile).FullName, Directory.GetParent(dockerFile).FullName,
imageName); imageName);
ExecutionContext.Output("##[endgroup]");
if (buildExitCode != 0) if (buildExitCode != 0)
{ {
throw new InvalidOperationException($"Docker build failed with exit code {buildExitCode}"); throw new InvalidOperationException($"Docker build failed with exit code {buildExitCode}");
@@ -67,7 +70,7 @@ namespace GitHub.Runner.Worker.Handlers
} }
// run container // run container
var container = new ContainerInfo() var container = new ContainerInfo(HostContext)
{ {
ContainerImage = Data.Image, ContainerImage = Data.Image,
ContainerName = ExecutionContext.Id.ToString("N"), ContainerName = ExecutionContext.Id.ToString("N"),
@@ -158,16 +161,21 @@ namespace GitHub.Runner.Worker.Handlers
Directory.CreateDirectory(tempHomeDirectory); Directory.CreateDirectory(tempHomeDirectory);
this.Environment["HOME"] = tempHomeDirectory; this.Environment["HOME"] = tempHomeDirectory;
var tempFileCommandDirectory = Path.Combine(tempDirectory, "_runner_file_commands");
ArgUtil.Directory(tempFileCommandDirectory, nameof(tempFileCommandDirectory));
var tempWorkflowDirectory = Path.Combine(tempDirectory, "_github_workflow"); var tempWorkflowDirectory = Path.Combine(tempDirectory, "_github_workflow");
ArgUtil.Directory(tempWorkflowDirectory, nameof(tempWorkflowDirectory)); ArgUtil.Directory(tempWorkflowDirectory, nameof(tempWorkflowDirectory));
container.MountVolumes.Add(new MountVolume("/var/run/docker.sock", "/var/run/docker.sock")); container.MountVolumes.Add(new MountVolume("/var/run/docker.sock", "/var/run/docker.sock"));
container.MountVolumes.Add(new MountVolume(tempHomeDirectory, "/github/home")); container.MountVolumes.Add(new MountVolume(tempHomeDirectory, "/github/home"));
container.MountVolumes.Add(new MountVolume(tempWorkflowDirectory, "/github/workflow")); container.MountVolumes.Add(new MountVolume(tempWorkflowDirectory, "/github/workflow"));
container.MountVolumes.Add(new MountVolume(tempFileCommandDirectory, "/github/file_commands"));
container.MountVolumes.Add(new MountVolume(defaultWorkingDirectory, "/github/workspace")); container.MountVolumes.Add(new MountVolume(defaultWorkingDirectory, "/github/workspace"));
container.AddPathTranslateMapping(tempHomeDirectory, "/github/home"); container.AddPathTranslateMapping(tempHomeDirectory, "/github/home");
container.AddPathTranslateMapping(tempWorkflowDirectory, "/github/workflow"); container.AddPathTranslateMapping(tempWorkflowDirectory, "/github/workflow");
container.AddPathTranslateMapping(tempFileCommandDirectory, "/github/file_commands");
container.AddPathTranslateMapping(defaultWorkingDirectory, "/github/workspace"); container.AddPathTranslateMapping(defaultWorkingDirectory, "/github/workspace");
container.ContainerWorkDirectory = "/github/workspace"; container.ContainerWorkDirectory = "/github/workspace";
@@ -185,7 +193,7 @@ namespace GitHub.Runner.Worker.Handlers
} }
// Add Actions Runtime server info // Add Actions Runtime server info
var systemConnection = ExecutionContext.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase)); var systemConnection = ExecutionContext.Global.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
Environment["ACTIONS_RUNTIME_URL"] = systemConnection.Url.AbsoluteUri; Environment["ACTIONS_RUNTIME_URL"] = systemConnection.Url.AbsoluteUri;
Environment["ACTIONS_RUNTIME_TOKEN"] = systemConnection.Authorization.Parameters[EndpointAuthorizationParameters.AccessToken]; Environment["ACTIONS_RUNTIME_TOKEN"] = systemConnection.Authorization.Parameters[EndpointAuthorizationParameters.AccessToken];
if (systemConnection.Data.TryGetValue("CacheServerUrl", out var cacheUrl) && !string.IsNullOrEmpty(cacheUrl)) if (systemConnection.Data.TryGetValue("CacheServerUrl", out var cacheUrl) && !string.IsNullOrEmpty(cacheUrl))


@@ -148,14 +148,14 @@ namespace GitHub.Runner.Worker.Handlers
{ {
// Validate args. // Validate args.
Trace.Entering(); Trace.Entering();
ArgUtil.NotNull(ExecutionContext.PrependPath, nameof(ExecutionContext.PrependPath)); ArgUtil.NotNull(ExecutionContext.Global.PrependPath, nameof(ExecutionContext.Global.PrependPath));
if (ExecutionContext.PrependPath.Count == 0) if (ExecutionContext.Global.PrependPath.Count == 0)
{ {
return; return;
} }
// Prepend path. // Prepend path.
string prepend = string.Join(Path.PathSeparator.ToString(), ExecutionContext.PrependPath.Reverse<string>()); string prepend = string.Join(Path.PathSeparator.ToString(), ExecutionContext.Global.PrependPath.Reverse<string>());
var containerStepHost = StepHost as ContainerStepHost; var containerStepHost = StepHost as ContainerStepHost;
if (containerStepHost != null) if (containerStepHost != null)
{ {


@@ -67,19 +67,11 @@ namespace GitHub.Runner.Worker.Handlers
(handler as IRunnerPluginHandler).Data = data as PluginActionExecutionData; (handler as IRunnerPluginHandler).Data = data as PluginActionExecutionData;
} }
else if (data.ExecutionType == ActionExecutionType.Composite) else if (data.ExecutionType == ActionExecutionType.Composite)
{
if (executionContext.FinalizeContext == null)
{ {
handler = HostContext.CreateService<ICompositeActionHandler>(); handler = HostContext.CreateService<ICompositeActionHandler>();
(handler as ICompositeActionHandler).Data = data as CompositeActionExecutionData; (handler as ICompositeActionHandler).Data = data as CompositeActionExecutionData;
} }
else else
{
handler = HostContext.CreateService<ICompositeActionOutputHandler>();
(handler as ICompositeActionOutputHandler).Data = data as CompositeActionExecutionData;
}
}
else
{ {
// This should never happen. // This should never happen.
throw new NotSupportedException(data.ExecutionType.ToString()); throw new NotSupportedException(data.ExecutionType.ToString());


@@ -46,7 +46,7 @@ namespace GitHub.Runner.Worker.Handlers
} }
// Add Actions Runtime server info // Add Actions Runtime server info
var systemConnection = ExecutionContext.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase)); var systemConnection = ExecutionContext.Global.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
Environment["ACTIONS_RUNTIME_URL"] = systemConnection.Url.AbsoluteUri; Environment["ACTIONS_RUNTIME_URL"] = systemConnection.Url.AbsoluteUri;
Environment["ACTIONS_RUNTIME_TOKEN"] = systemConnection.Authorization.Parameters[EndpointAuthorizationParameters.AccessToken]; Environment["ACTIONS_RUNTIME_TOKEN"] = systemConnection.Authorization.Parameters[EndpointAuthorizationParameters.AccessToken];
if (systemConnection.Data.TryGetValue("CacheServerUrl", out var cacheUrl) && !string.IsNullOrEmpty(cacheUrl)) if (systemConnection.Data.TryGetValue("CacheServerUrl", out var cacheUrl) && !string.IsNullOrEmpty(cacheUrl))
@@ -113,7 +113,7 @@ namespace GitHub.Runner.Worker.Handlers
requireExitCodeZero: false, requireExitCodeZero: false,
outputEncoding: outputEncoding, outputEncoding: outputEncoding,
killProcessOnCancel: false, killProcessOnCancel: false,
inheritConsoleHandler: !ExecutionContext.Variables.Retain_Default_Encoding, inheritConsoleHandler: !ExecutionContext.Global.Variables.Retain_Default_Encoding,
cancellationToken: ExecutionContext.CancellationToken); cancellationToken: ExecutionContext.CancellationToken);
// Wait for either the node exit or force finish through ##vso command // Wait for either the node exit or force finish through ##vso command


@@ -31,7 +31,7 @@ namespace GitHub.Runner.Worker.Handlers
{ {
_executionContext = executionContext; _executionContext = executionContext;
_commandManager = commandManager; _commandManager = commandManager;
_container = container ?? executionContext.Container; _container = container ?? executionContext.Global.Container;
// Recursion failsafe (test override) // Recursion failsafe (test override)
var failsafeString = Environment.GetEnvironmentVariable("RUNNER_TEST_GET_REPOSITORY_PATH_FAILSAFE"); var failsafeString = Environment.GetEnvironmentVariable("RUNNER_TEST_GET_REPOSITORY_PATH_FAILSAFE");
@@ -41,7 +41,7 @@ namespace GitHub.Runner.Worker.Handlers
} }
// Determine the timeout // Determine the timeout
var timeoutStr = _executionContext.Variables.Get(_timeoutKey); var timeoutStr = _executionContext.Global.Variables.Get(_timeoutKey);
if (string.IsNullOrEmpty(timeoutStr) || if (string.IsNullOrEmpty(timeoutStr) ||
!TimeSpan.TryParse(timeoutStr, CultureInfo.InvariantCulture, out _timeout) || !TimeSpan.TryParse(timeoutStr, CultureInfo.InvariantCulture, out _timeout) ||
_timeout <= TimeSpan.Zero) _timeout <= TimeSpan.Zero)


@@ -23,6 +23,19 @@ namespace GitHub.Runner.Worker.Handlers
public override void PrintActionDetails(ActionRunStage stage) public override void PrintActionDetails(ActionRunStage stage)
{ {
// For composite actions, don't display the internal workings here; equivalent information is available in the debug logs
void writeDetails(string message)
{
if (ExecutionContext.InsideComposite)
{
ExecutionContext.Debug(message);
}
else
{
ExecutionContext.Output(message);
}
}
if (stage == ActionRunStage.Post) if (stage == ActionRunStage.Post)
{ {
throw new NotSupportedException("Script action should not have 'Post' job action."); throw new NotSupportedException("Script action should not have 'Post' job action.");
@@ -39,7 +52,7 @@ namespace GitHub.Runner.Worker.Handlers
firstLine = firstLine.Substring(0, firstNewLine); firstLine = firstLine.Substring(0, firstNewLine);
} }
ExecutionContext.Output($"##[group]Run {firstLine}"); writeDetails(ExecutionContext.InsideComposite ? $"Run {firstLine}" : $"##[group]Run {firstLine}");
} }
else else
{ {
@@ -50,20 +63,20 @@ namespace GitHub.Runner.Worker.Handlers
foreach (var line in multiLines) foreach (var line in multiLines)
{ {
// Bright Cyan color // Bright Cyan color
ExecutionContext.Output($"\x1b[36;1m{line}\x1b[0m"); writeDetails($"\x1b[36;1m{line}\x1b[0m");
} }
string argFormat; string argFormat;
string shellCommand; string shellCommand;
string shellCommandPath = null; string shellCommandPath = null;
bool validateShellOnHost = !(StepHost is ContainerStepHost); bool validateShellOnHost = !(StepHost is ContainerStepHost);
string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.PrependPath.Reverse<string>()); string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.Global.PrependPath.Reverse<string>());
string shell = null; string shell = null;
if (!Inputs.TryGetValue("shell", out shell) || string.IsNullOrEmpty(shell)) if (!Inputs.TryGetValue("shell", out shell) || string.IsNullOrEmpty(shell))
{ {
// TODO: figure out how defaults interact with template later // TODO: figure out how defaults interact with template later
// for now, we won't check job.defaults if we are inside a template. // for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults)) if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.Global.JobDefaults.TryGetValue("run", out var runDefaults))
{ {
runDefaults.TryGetValue("shell", out shell); runDefaults.TryGetValue("shell", out shell);
} }
@@ -109,23 +122,23 @@ namespace GitHub.Runner.Worker.Handlers
if (!string.IsNullOrEmpty(shellCommandPath)) if (!string.IsNullOrEmpty(shellCommandPath))
{ {
ExecutionContext.Output($"shell: {shellCommandPath} {argFormat}"); writeDetails($"shell: {shellCommandPath} {argFormat}");
} }
else else
{ {
ExecutionContext.Output($"shell: {shellCommand} {argFormat}"); writeDetails($"shell: {shellCommand} {argFormat}");
} }
if (this.Environment?.Count > 0) if (this.Environment?.Count > 0)
{ {
ExecutionContext.Output("env:"); writeDetails("env:");
foreach (var env in this.Environment) foreach (var env in this.Environment)
{ {
ExecutionContext.Output($" {env.Key}: {env.Value}"); writeDetails($" {env.Key}: {env.Value}");
} }
} }
ExecutionContext.Output("##[endgroup]"); writeDetails(ExecutionContext.InsideComposite ? "" : "##[endgroup]");
} }
public async Task RunAsync(ActionRunStage stage) public async Task RunAsync(ActionRunStage stage)
@@ -151,9 +164,7 @@ namespace GitHub.Runner.Worker.Handlers
string workingDirectory = null; string workingDirectory = null;
if (!Inputs.TryGetValue("workingDirectory", out workingDirectory)) if (!Inputs.TryGetValue("workingDirectory", out workingDirectory))
{ {
// TODO: figure out how defaults interact with template later if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.Global.JobDefaults.TryGetValue("run", out var runDefaults))
// for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults))
{ {
if (runDefaults.TryGetValue("working-directory", out workingDirectory)) if (runDefaults.TryGetValue("working-directory", out workingDirectory))
{ {
@@ -167,9 +178,7 @@ namespace GitHub.Runner.Worker.Handlers
string shell = null; string shell = null;
if (!Inputs.TryGetValue("shell", out shell) || string.IsNullOrEmpty(shell)) if (!Inputs.TryGetValue("shell", out shell) || string.IsNullOrEmpty(shell))
{ {
// TODO: figure out how defaults interact with template later if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.Global.JobDefaults.TryGetValue("run", out var runDefaults))
// for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults))
{ {
if (runDefaults.TryGetValue("shell", out shell)) if (runDefaults.TryGetValue("shell", out shell))
{ {
@@ -180,7 +189,7 @@ namespace GitHub.Runner.Worker.Handlers
var isContainerStepHost = StepHost is ContainerStepHost; var isContainerStepHost = StepHost is ContainerStepHost;
string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.PrependPath.Reverse<string>()); string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.Global.PrependPath.Reverse<string>());
string commandPath, argFormat, shellCommand; string commandPath, argFormat, shellCommand;
// Set up default command and arguments // Set up default command and arguments
if (string.IsNullOrEmpty(shell)) if (string.IsNullOrEmpty(shell))
@@ -232,7 +241,7 @@ namespace GitHub.Runner.Worker.Handlers
#if OS_WINDOWS #if OS_WINDOWS
// Normalize Windows line endings // Normalize Windows line endings
contents = contents.Replace("\r\n", "\n").Replace("\n", "\r\n"); contents = contents.Replace("\r\n", "\n").Replace("\n", "\r\n");
var encoding = ExecutionContext.Variables.Retain_Default_Encoding && Console.InputEncoding.CodePage != 65001 var encoding = ExecutionContext.Global.Variables.Retain_Default_Encoding && Console.InputEncoding.CodePage != 65001
? Console.InputEncoding ? Console.InputEncoding
: new UTF8Encoding(false); : new UTF8Encoding(false);
#else #else
@@ -285,7 +294,7 @@ namespace GitHub.Runner.Worker.Handlers
requireExitCodeZero: false, requireExitCodeZero: false,
outputEncoding: null, outputEncoding: null,
killProcessOnCancel: false, killProcessOnCancel: false,
inheritConsoleHandler: !ExecutionContext.Variables.Retain_Default_Encoding, inheritConsoleHandler: !ExecutionContext.Global.Variables.Retain_Default_Encoding,
cancellationToken: ExecutionContext.CancellationToken); cancellationToken: ExecutionContext.CancellationToken);
// Error // Error
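The step-header lines above now flow through a writeDetails callback instead of calling ExecutionContext.Output directly, and the closing ##[endgroup] marker is suppressed inside composite actions. The callback's definition is outside this hunk; the standalone C# sketch below (hypothetical Logger type, not the runner API) only illustrates the likely pattern of demoting detail lines to debug output when running inside a composite step.

using System;

class Logger
{
    // Stand-ins for ExecutionContext.Output / ExecutionContext.Debug.
    public void Output(string line) => Console.WriteLine(line);
    public void Debug(string line) => Console.WriteLine($"##[debug]{line}");
}

class ScriptDetailsDemo
{
    static void Main()
    {
        var logger = new Logger();
        bool insideComposite = true; // assumption: composite steps demote details to debug

        // Pick the sink once, then use it for every detail line.
        Action<string> writeDetails = insideComposite
            ? new Action<string>(logger.Debug)
            : new Action<string>(logger.Output);

        writeDetails("shell: /usr/bin/bash -e {0}");
        writeDetails("env:");
        writeDetails("  FOO: bar");
        // Group markers only make sense for top-level steps.
        writeDetails(insideComposite ? "" : "##[endgroup]");
    }
}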

View File

@@ -1,4 +1,4 @@
using System; using System;
using System.Collections.Generic; using System.Collections.Generic;
using System.Diagnostics; using System.Diagnostics;
using System.Globalization; using System.Globalization;
@@ -74,6 +74,10 @@ namespace GitHub.Runner.Worker
{ {
// print out HostName for self-hosted runner // print out HostName for self-hosted runner
context.Output($"Runner name: '{setting.AgentName}'"); context.Output($"Runner name: '{setting.AgentName}'");
if (message.Variables.TryGetValue("system.runnerGroupName", out VariableValue runnerGroupName))
{
context.Output($"Runner group name: '{runnerGroupName.Value}'");
}
context.Output($"Machine name: '{Environment.MachineName}'"); context.Output($"Machine name: '{Environment.MachineName}'");
} }
} }
@@ -162,7 +166,7 @@ namespace GitHub.Runner.Worker
var environmentVariables = templateEvaluator.EvaluateStepEnvironment(token, jobContext.ExpressionValues, jobContext.ExpressionFunctions, VarUtil.EnvironmentVariableKeyComparer); var environmentVariables = templateEvaluator.EvaluateStepEnvironment(token, jobContext.ExpressionValues, jobContext.ExpressionFunctions, VarUtil.EnvironmentVariableKeyComparer);
foreach (var pair in environmentVariables) foreach (var pair in environmentVariables)
{ {
context.EnvironmentVariables[pair.Key] = pair.Value ?? string.Empty; context.Global.EnvironmentVariables[pair.Key] = pair.Value ?? string.Empty;
context.SetEnvContext(pair.Key, pair.Value ?? string.Empty); context.SetEnvContext(pair.Key, pair.Value ?? string.Empty);
} }
} }
@@ -172,7 +176,7 @@ namespace GitHub.Runner.Worker
var container = templateEvaluator.EvaluateJobContainer(message.JobContainer, jobContext.ExpressionValues, jobContext.ExpressionFunctions); var container = templateEvaluator.EvaluateJobContainer(message.JobContainer, jobContext.ExpressionValues, jobContext.ExpressionFunctions);
if (container != null) if (container != null)
{ {
jobContext.Container = new Container.ContainerInfo(HostContext, container); jobContext.Global.Container = new Container.ContainerInfo(HostContext, container);
} }
// Evaluate the job service containers // Evaluate the job service containers
@@ -184,7 +188,7 @@ namespace GitHub.Runner.Worker
{ {
var networkAlias = pair.Key; var networkAlias = pair.Key;
var serviceContainer = pair.Value; var serviceContainer = pair.Value;
jobContext.ServiceContainers.Add(new Container.ContainerInfo(HostContext, serviceContainer, false, networkAlias)); jobContext.Global.ServiceContainers.Add(new Container.ContainerInfo(HostContext, serviceContainer, false, networkAlias));
} }
} }
@@ -195,14 +199,14 @@ namespace GitHub.Runner.Worker
var defaults = token.AssertMapping("defaults"); var defaults = token.AssertMapping("defaults");
if (defaults.Any(x => string.Equals(x.Key.AssertString("defaults key").Value, "run", StringComparison.OrdinalIgnoreCase))) if (defaults.Any(x => string.Equals(x.Key.AssertString("defaults key").Value, "run", StringComparison.OrdinalIgnoreCase)))
{ {
context.JobDefaults["run"] = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase); context.Global.JobDefaults["run"] = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
var defaultsRun = defaults.First(x => string.Equals(x.Key.AssertString("defaults key").Value, "run", StringComparison.OrdinalIgnoreCase)); var defaultsRun = defaults.First(x => string.Equals(x.Key.AssertString("defaults key").Value, "run", StringComparison.OrdinalIgnoreCase));
var jobDefaults = templateEvaluator.EvaluateJobDefaultsRun(defaultsRun.Value, jobContext.ExpressionValues, jobContext.ExpressionFunctions); var jobDefaults = templateEvaluator.EvaluateJobDefaultsRun(defaultsRun.Value, jobContext.ExpressionValues, jobContext.ExpressionFunctions);
foreach (var pair in jobDefaults) foreach (var pair in jobDefaults)
{ {
if (!string.IsNullOrEmpty(pair.Value)) if (!string.IsNullOrEmpty(pair.Value))
{ {
context.JobDefaults["run"][pair.Key] = pair.Value; context.Global.JobDefaults["run"][pair.Key] = pair.Value;
} }
} }
} }
@@ -216,15 +220,15 @@ namespace GitHub.Runner.Worker
preJobSteps.AddRange(prepareResult.ContainerSetupSteps); preJobSteps.AddRange(prepareResult.ContainerSetupSteps);
// Add start-container steps, record and stop-container steps // Add start-container steps, record and stop-container steps
if (jobContext.Container != null || jobContext.ServiceContainers.Count > 0) if (jobContext.Global.Container != null || jobContext.Global.ServiceContainers.Count > 0)
{ {
var containerProvider = HostContext.GetService<IContainerOperationProvider>(); var containerProvider = HostContext.GetService<IContainerOperationProvider>();
var containers = new List<Container.ContainerInfo>(); var containers = new List<Container.ContainerInfo>();
if (jobContext.Container != null) if (jobContext.Global.Container != null)
{ {
containers.Add(jobContext.Container); containers.Add(jobContext.Global.Container);
} }
containers.AddRange(jobContext.ServiceContainers); containers.AddRange(jobContext.Global.ServiceContainers);
preJobSteps.Add(new JobExtensionRunner(runAsync: containerProvider.StartContainersAsync, preJobSteps.Add(new JobExtensionRunner(runAsync: containerProvider.StartContainersAsync,
condition: $"{PipelineTemplateConstants.Success}()", condition: $"{PipelineTemplateConstants.Success}()",
@@ -296,7 +300,7 @@ namespace GitHub.Runner.Worker
{ {
ArgUtil.NotNull(actionStep, step.DisplayName); ArgUtil.NotNull(actionStep, step.DisplayName);
intraActionStates.TryGetValue(actionStep.Action.Id, out var intraActionState); intraActionStates.TryGetValue(actionStep.Action.Id, out var intraActionState);
actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, actionStep.Action.ScopeName, actionStep.Action.ContextName, intraActionState); actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, null, actionStep.Action.ContextName, intraActionState);
} }
} }
@@ -305,7 +309,7 @@ namespace GitHub.Runner.Worker
steps.AddRange(jobSteps); steps.AddRange(jobSteps);
// Prepare for orphan process cleanup // Prepare for orphan process cleanup
_processCleanup = jobContext.Variables.GetBoolean("process.clean") ?? true; _processCleanup = jobContext.Global.Variables.GetBoolean("process.clean") ?? true;
if (_processCleanup) if (_processCleanup)
{ {
// Set the RUNNER_TRACKING_ID env variable. // Set the RUNNER_TRACKING_ID env variable.
@@ -331,6 +335,14 @@ namespace GitHub.Runner.Worker
context.Result = TaskResult.Canceled; context.Result = TaskResult.Canceled;
throw; throw;
} }
catch (FailedToResolveActionDownloadInfoException ex)
{
// Log the error and fail the JobExtension Initialization.
Trace.Error($"Caught exception from JobExtenion Initialization: {ex}");
context.InfrastructureError(ex.Message);
context.Result = TaskResult.Failed;
throw;
}
catch (Exception ex) catch (Exception ex)
{ {
// Log the error and fail the JobExtension Initialization. // Log the error and fail the JobExtension Initialization.
@@ -361,6 +373,24 @@ namespace GitHub.Runner.Worker
context.Start(); context.Start();
context.Debug("Starting: Complete job"); context.Debug("Starting: Complete job");
Trace.Info("Initialize Env context");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
context.ExpressionValues["env"] = envContext;
foreach (var pair in context.Global.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
// Populate env context for each step
Trace.Info("Initialize steps context");
context.ExpressionValues["steps"] = context.Global.StepsContext.GetScope(context.ScopeName);
var templateEvaluator = context.ToPipelineTemplateEvaluator();
// Evaluate job outputs // Evaluate job outputs
if (message.JobOutputs != null && message.JobOutputs.Type != TokenType.Null) if (message.JobOutputs != null && message.JobOutputs.Type != TokenType.Null)
{ {
@@ -370,21 +400,7 @@ namespace GitHub.Runner.Worker
// Populate env context for each step // Populate env context for each step
Trace.Info("Initialize Env context for evaluating job outputs"); Trace.Info("Initialize Env context for evaluating job outputs");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
context.ExpressionValues["env"] = envContext;
foreach (var pair in context.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
Trace.Info("Initialize steps context for evaluating job outputs");
context.ExpressionValues["steps"] = context.StepsContext.GetScope(context.ScopeName);
var templateEvaluator = context.ToPipelineTemplateEvaluator();
var outputs = templateEvaluator.EvaluateJobOutput(message.JobOutputs, context.ExpressionValues, context.ExpressionFunctions); var outputs = templateEvaluator.EvaluateJobOutput(message.JobOutputs, context.ExpressionValues, context.ExpressionFunctions);
foreach (var output in outputs) foreach (var output in outputs)
{ {
@@ -413,7 +429,35 @@ namespace GitHub.Runner.Worker
} }
} }
if (context.Variables.GetBoolean(Constants.Variables.Actions.RunnerDebug) ?? false) // Evaluate environment data
if (jobContext.ActionsEnvironment?.Url != null && jobContext.ActionsEnvironment?.Url.Type != TokenType.Null)
{
try
{
context.Output($"Evaluate and set environment url");
var environmentUrlToken = templateEvaluator.EvaluateEnvironmentUrl(jobContext.ActionsEnvironment.Url, context.ExpressionValues, context.ExpressionFunctions);
var environmentUrl = environmentUrlToken.AssertString("environment.url");
if (!string.Equals(environmentUrl.Value, HostContext.SecretMasker.MaskSecrets(environmentUrl.Value)))
{
context.Warning($"Skip setting environment url as environment '{jobContext.ActionsEnvironment.Name}' may contain secret.");
}
else
{
context.Output($"Evaluated environment url: {environmentUrl}");
jobContext.ActionsEnvironment.Url = environmentUrlToken;
}
}
catch (Exception ex)
{
context.Result = TaskResult.Failed;
context.Error($"Failed to evaluate environment url");
context.Error(ex);
jobContext.Result = TaskResultUtil.MergeTaskResults(jobContext.Result, TaskResult.Failed);
}
}
if (context.Global.Variables.GetBoolean(Constants.Variables.Actions.RunnerDebug) ?? false)
{ {
Trace.Info("Support log upload starting."); Trace.Info("Support log upload starting.");
context.Output("Uploading runner diagnostic logs"); context.Output("Uploading runner diagnostic logs");

View File

@@ -99,7 +99,7 @@ namespace GitHub.Runner.Worker
return await CompleteJobAsync(jobServer, jobContext, message, TaskResult.Failed); return await CompleteJobAsync(jobServer, jobContext, message, TaskResult.Failed);
} }
if (jobContext.WriteDebug) if (jobContext.Global.WriteDebug)
{ {
jobContext.SetRunnerContext("debug", "1"); jobContext.SetRunnerContext("debug", "1");
} }
@@ -152,7 +152,7 @@ namespace GitHub.Runner.Worker
{ {
foreach (var step in jobSteps) foreach (var step in jobSteps)
{ {
jobContext.JobSteps.Add(step); jobContext.JobSteps.Enqueue(step);
} }
await stepsRunner.RunAsync(jobContext); await stepsRunner.RunAsync(jobContext);
@@ -209,14 +209,14 @@ namespace GitHub.Runner.Worker
// Clean TEMP after finish process jobserverqueue, since there might be a pending fileupload still use the TEMP dir. // Clean TEMP after finish process jobserverqueue, since there might be a pending fileupload still use the TEMP dir.
_tempDirectoryManager?.CleanupTempDirectory(); _tempDirectoryManager?.CleanupTempDirectory();
if (!jobContext.Features.HasFlag(PlanFeatures.JobCompletedPlanEvent)) if (!jobContext.Global.Features.HasFlag(PlanFeatures.JobCompletedPlanEvent))
{ {
Trace.Info($"Skip raise job completed event call from worker because Plan version is {message.Plan.Version}"); Trace.Info($"Skip raise job completed event call from worker because Plan version is {message.Plan.Version}");
return result; return result;
} }
Trace.Info("Raising job completed event."); Trace.Info("Raising job completed event.");
var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, result, jobContext.JobOutputs); var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, result, jobContext.JobOutputs, jobContext.ActionsEnvironment);
var completeJobRetryLimit = 5; var completeJobRetryLimit = 5;
var exceptions = new List<Exception>(); var exceptions = new List<Exception>();
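jobContext.JobSteps switches from list-style access (Add, steps[0], RemoveAt(0)) to a FIFO queue (Enqueue/Dequeue), both here and in StepsRunner further down. A trivial standalone sketch of the pattern, with strings standing in for runner step objects:

using System;
using System.Collections.Generic;

class StepQueueDemo
{
    static void Main()
    {
        // Before: List<T> indexed at [0] and trimmed with RemoveAt(0); after: a queue.
        var jobSteps = new Queue<string>();
        jobSteps.Enqueue("Set up job");
        jobSteps.Enqueue("Run actions/checkout@v2");
        jobSteps.Enqueue("Run build script");

        while (jobSteps.Count > 0)
        {
            // Dequeue takes from the front, preserving the order steps were added.
            var step = jobSteps.Dequeue();
            Console.WriteLine($"Processing step: DisplayName='{step}'");
        }
    }
}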

View File

@@ -100,12 +100,12 @@ namespace GitHub.Runner.Worker
RunnerActionPluginExecutionContext pluginContext = new RunnerActionPluginExecutionContext RunnerActionPluginExecutionContext pluginContext = new RunnerActionPluginExecutionContext
{ {
Inputs = inputs, Inputs = inputs,
Endpoints = context.Endpoints, Endpoints = context.Global.Endpoints,
Context = context.ExpressionValues Context = context.ExpressionValues
}; };
// variables // variables
foreach (var variable in context.Variables.AllVariables) foreach (var variable in context.Global.Variables.AllVariables)
{ {
pluginContext.Variables[variable.Name] = new VariableValue(variable.Value, variable.Secret); pluginContext.Variables[variable.Name] = new VariableValue(variable.Value, variable.Secret);
} }

View File

@@ -15,6 +15,14 @@ namespace GitHub.Runner.Worker
private static readonly Regex _propertyRegex = new Regex("^[a-zA-Z_][a-zA-Z0-9_]*$", RegexOptions.Compiled); private static readonly Regex _propertyRegex = new Regex("^[a-zA-Z_][a-zA-Z0-9_]*$", RegexOptions.Compiled);
private readonly DictionaryContextData _contextData = new DictionaryContextData(); private readonly DictionaryContextData _contextData = new DictionaryContextData();
public void ClearScope(string scopeName)
{
if (_contextData.TryGetValue(scopeName, out _))
{
_contextData[scopeName] = new DictionaryContextData();
}
}
public DictionaryContextData GetScope(string scopeName) public DictionaryContextData GetScope(string scopeName)
{ {
if (scopeName == null) if (scopeName == null)
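The added ClearScope resets an existing scope in the steps context without creating scopes that were never seen. A standalone sketch of that behaviour, using plain dictionaries as a stand-in for DictionaryContextData (type and member names hypothetical):

using System;
using System.Collections.Generic;

class StepsContextSketch
{
    // scope name -> (step context name -> value); rough stand-in for DictionaryContextData.
    private readonly Dictionary<string, Dictionary<string, string>> _contextData =
        new Dictionary<string, Dictionary<string, string>>();

    public Dictionary<string, string> GetScope(string scopeName)
    {
        if (!_contextData.TryGetValue(scopeName, out var scope))
        {
            scope = new Dictionary<string, string>();
            _contextData[scopeName] = scope;
        }
        return scope;
    }

    // Mirrors the added method: only an existing scope is replaced with an empty one.
    public void ClearScope(string scopeName)
    {
        if (_contextData.ContainsKey(scopeName))
        {
            _contextData[scopeName] = new Dictionary<string, string>();
        }
    }

    static void Main()
    {
        var steps = new StepsContextSketch();
        steps.GetScope("composite_1")["greet"] = "hello";

        steps.ClearScope("composite_1");                        // wipes stale entries
        Console.WriteLine(steps.GetScope("composite_1").Count); // 0

        steps.ClearScope("never_seen");                         // no-op for unknown scopes
    }
}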

View File

@@ -59,18 +59,18 @@ namespace GitHub.Runner.Worker
checkPostJobActions = true; checkPostJobActions = true;
while (jobContext.PostJobSteps.TryPop(out var postStep)) while (jobContext.PostJobSteps.TryPop(out var postStep))
{ {
jobContext.JobSteps.Add(postStep); jobContext.JobSteps.Enqueue(postStep);
} }
continue; continue;
} }
var step = jobContext.JobSteps[0]; var step = jobContext.JobSteps.Dequeue();
jobContext.JobSteps.RemoveAt(0);
Trace.Info($"Processing step: DisplayName='{step.DisplayName}'"); Trace.Info($"Processing step: DisplayName='{step.DisplayName}'");
ArgUtil.NotNull(step.ExecutionContext, nameof(step.ExecutionContext)); ArgUtil.NotNull(step.ExecutionContext, nameof(step.ExecutionContext));
ArgUtil.NotNull(step.ExecutionContext.Variables, nameof(step.ExecutionContext.Variables)); ArgUtil.NotNull(step.ExecutionContext.Global, nameof(step.ExecutionContext.Global));
ArgUtil.NotNull(step.ExecutionContext.Global.Variables, nameof(step.ExecutionContext.Global.Variables));
// Start // Start
step.ExecutionContext.Start(); step.ExecutionContext.Start();
@@ -82,7 +82,7 @@ namespace GitHub.Runner.Worker
step.ExecutionContext.ExpressionFunctions.Add(new FunctionInfo<SuccessFunction>(PipelineTemplateConstants.Success, 0, 0)); step.ExecutionContext.ExpressionFunctions.Add(new FunctionInfo<SuccessFunction>(PipelineTemplateConstants.Success, 0, 0));
step.ExecutionContext.ExpressionFunctions.Add(new FunctionInfo<HashFilesFunction>(PipelineTemplateConstants.HashFiles, 1, byte.MaxValue)); step.ExecutionContext.ExpressionFunctions.Add(new FunctionInfo<HashFilesFunction>(PipelineTemplateConstants.HashFiles, 1, byte.MaxValue));
step.ExecutionContext.ExpressionValues["steps"] = step.ExecutionContext.StepsContext.GetScope(step.ExecutionContext.ScopeName); step.ExecutionContext.ExpressionValues["steps"] = step.ExecutionContext.Global.StepsContext.GetScope(step.ExecutionContext.ScopeName);
// Populate env context for each step // Populate env context for each step
Trace.Info("Initialize Env context for step"); Trace.Info("Initialize Env context for step");
@@ -93,25 +93,11 @@ namespace GitHub.Runner.Worker
#endif #endif
// Global env // Global env
foreach (var pair in step.ExecutionContext.EnvironmentVariables) foreach (var pair in step.ExecutionContext.Global.EnvironmentVariables)
{ {
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty); envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
} }
// Stomps over with outside step env
if (step.ExecutionContext.ExpressionValues.TryGetValue("env", out var envContextData))
{
#if OS_WINDOWS
var dict = envContextData as DictionaryContextData;
#else
var dict = envContextData as CaseSensitiveDictionaryContextData;
#endif
foreach (var pair in dict)
{
envContext[pair.Key] = pair.Value;
}
}
step.ExecutionContext.ExpressionValues["env"] = envContext; step.ExecutionContext.ExpressionValues["env"] = envContext;
bool evaluateStepEnvFailed = false; bool evaluateStepEnvFailed = false;
@@ -300,40 +286,7 @@ namespace GitHub.Runner.Worker
step.ExecutionContext.SetTimeout(timeout); step.ExecutionContext.SetTimeout(timeout);
} }
#if OS_WINDOWS await EncodingUtil.SetEncoding(HostContext, Trace, step.ExecutionContext.CancellationToken);
try
{
if (Console.InputEncoding.CodePage != 65001)
{
using (var p = HostContext.CreateService<IProcessInvoker>())
{
// Use UTF8 code page
int exitCode = await p.ExecuteAsync(workingDirectory: HostContext.GetDirectory(WellKnownDirectory.Work),
fileName: WhichUtil.Which("chcp", true, Trace),
arguments: "65001",
environment: null,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: false,
redirectStandardIn: null,
inheritConsoleHandler: true,
cancellationToken: step.ExecutionContext.CancellationToken);
if (exitCode == 0)
{
Trace.Info("Successfully returned to code page 65001 (UTF8)");
}
else
{
Trace.Warning($"'chcp 65001' failed with exit code {exitCode}");
}
}
}
}
catch (Exception ex)
{
Trace.Warning($"'chcp 65001' failed with exception {ex.Message}");
}
#endif
try try
{ {
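The removed Windows-only block (switch the console to code page 65001 via chcp unless it is already UTF-8) is replaced by a single EncodingUtil.SetEncoding call whose body is not part of this compare. The sketch below is a hedged guess at what such a helper might wrap, based only on the inline code deleted above; it uses Process directly instead of the runner's IProcessInvoker and is not the runner's actual implementation.

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;
using System.Threading;
using System.Threading.Tasks;

static class EncodingUtilSketch
{
    public static async Task SetEncoding(CancellationToken cancellationToken)
    {
        // Nothing to do outside Windows, or when the console is already UTF-8 (code page 65001).
        if (!RuntimeInformation.IsOSPlatform(OSPlatform.Windows) ||
            Console.InputEncoding.CodePage == 65001)
        {
            return;
        }

        try
        {
            using var process = Process.Start(new ProcessStartInfo
            {
                FileName = "chcp.com",
                Arguments = "65001",
                UseShellExecute = false
            });
            await process.WaitForExitAsync(cancellationToken);
            Console.WriteLine(process.ExitCode == 0
                ? "Switched console to code page 65001 (UTF-8)"
                : $"'chcp 65001' failed with exit code {process.ExitCode}");
        }
        catch (Exception ex)
        {
            Console.WriteLine($"'chcp 65001' failed with exception {ex.Message}");
        }
    }

    static async Task Main() => await SetEncoding(CancellationToken.None);
}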

View File

@@ -108,19 +108,26 @@
} }
}, },
"composite-steps": { "composite-steps": {
"context": [
"github",
"strategy",
"matrix",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"sequence": { "sequence": {
"item-type": "any" "item-type": "composite-step"
}
},
"composite-step": {
"mapping": {
"properties": {
"name": "string-steps-context",
"id": "non-empty-string",
"run": {
"type": "string-steps-context",
"required": true
},
"env": "step-env",
"working-directory": "string-steps-context",
"shell": {
"type": "non-empty-string",
"required": true
}
}
} }
}, },
"container-runs-context": { "container-runs-context": {
@@ -157,6 +164,37 @@
"string": { "string": {
"require-non-empty": true "require-non-empty": true
} }
},
"string-steps-context": {
"context": [
"github",
"inputs",
"strategy",
"matrix",
"steps",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"string": {}
},
"step-env": {
"context": [
"github",
"inputs",
"strategy",
"matrix",
"steps",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
} }
} }
} }

View File

@@ -52,6 +52,7 @@ namespace GitHub.DistributedTask.ObjectTemplating
internal const String String = "string"; internal const String String = "string";
internal const String StringDefinition = "string-definition"; internal const String StringDefinition = "string-definition";
internal const String StringDefinitionProperties = "string-definition-properties"; internal const String StringDefinitionProperties = "string-definition-properties";
internal const String StringRunnerContextNoSecrets = "string-runner-context-no-secrets";
internal const String Structure = "structure"; internal const String Structure = "structure";
internal const String TemplateSchema = "template-schema"; internal const String TemplateSchema = "template-schema";
internal const String True = "true"; internal const String True = "true";

View File

@@ -24,7 +24,6 @@ namespace GitHub.DistributedTask.Pipelines
Environment = actionToClone.Environment?.Clone(); Environment = actionToClone.Environment?.Clone();
Inputs = actionToClone.Inputs?.Clone(); Inputs = actionToClone.Inputs?.Clone();
ContextName = actionToClone?.ContextName; ContextName = actionToClone?.ContextName;
ScopeName = actionToClone?.ScopeName;
DisplayNameToken = actionToClone.DisplayNameToken?.Clone(); DisplayNameToken = actionToClone.DisplayNameToken?.Clone();
} }
@@ -41,9 +40,6 @@ namespace GitHub.DistributedTask.Pipelines
[DataMember(EmitDefaultValue = false)] [DataMember(EmitDefaultValue = false)]
public TemplateToken DisplayNameToken { get; set; } public TemplateToken DisplayNameToken { get; set; }
[DataMember(EmitDefaultValue = false)]
public String ScopeName { get; set; }
[DataMember(EmitDefaultValue = false)] [DataMember(EmitDefaultValue = false)]
public String ContextName { get; set; } public String ContextName { get; set; }

View File

@@ -39,10 +39,10 @@ namespace GitHub.DistributedTask.Pipelines
DictionaryContextData contextData, DictionaryContextData contextData,
WorkspaceOptions workspaceOptions, WorkspaceOptions workspaceOptions,
IEnumerable<JobStep> steps, IEnumerable<JobStep> steps,
IEnumerable<ContextScope> scopes,
IList<String> fileTable, IList<String> fileTable,
TemplateToken jobOutputs, TemplateToken jobOutputs,
IList<TemplateToken> defaults) IList<TemplateToken> defaults,
ActionsEnvironmentReference actionsEnvironment)
{ {
this.MessageType = JobRequestMessageTypes.PipelineAgentJobRequest; this.MessageType = JobRequestMessageTypes.PipelineAgentJobRequest;
this.Plan = plan; this.Plan = plan;
@@ -55,16 +55,11 @@ namespace GitHub.DistributedTask.Pipelines
this.Resources = jobResources; this.Resources = jobResources;
this.Workspace = workspaceOptions; this.Workspace = workspaceOptions;
this.JobOutputs = jobOutputs; this.JobOutputs = jobOutputs;
this.ActionsEnvironment = actionsEnvironment;
m_variables = new Dictionary<String, VariableValue>(variables, StringComparer.OrdinalIgnoreCase); m_variables = new Dictionary<String, VariableValue>(variables, StringComparer.OrdinalIgnoreCase);
m_maskHints = new List<MaskHint>(maskHints); m_maskHints = new List<MaskHint>(maskHints);
m_steps = new List<JobStep>(steps); m_steps = new List<JobStep>(steps);
if (scopes != null)
{
m_scopes = new List<ContextScope>(scopes);
}
if (environmentVariables?.Count > 0) if (environmentVariables?.Count > 0)
{ {
m_environmentVariables = new List<TemplateToken>(environmentVariables); m_environmentVariables = new List<TemplateToken>(environmentVariables);
@@ -234,6 +229,13 @@ namespace GitHub.DistributedTask.Pipelines
} }
} }
[DataMember(EmitDefaultValue = false)]
public ActionsEnvironmentReference ActionsEnvironment
{
get;
set;
}
/// <summary> /// <summary>
/// Gets the collection of variables associated with the current context. /// Gets the collection of variables associated with the current context.
/// </summary> /// </summary>
@@ -261,18 +263,6 @@ namespace GitHub.DistributedTask.Pipelines
} }
} }
public IList<ContextScope> Scopes
{
get
{
if (m_scopes == null)
{
m_scopes = new List<ContextScope>();
}
return m_scopes;
}
}
/// <summary> /// <summary>
/// Gets the table of files used when parsing the pipeline (e.g. yaml files) /// Gets the table of files used when parsing the pipeline (e.g. yaml files)
/// </summary> /// </summary>
@@ -415,11 +405,6 @@ namespace GitHub.DistributedTask.Pipelines
m_maskHints = new List<MaskHint>(this.m_maskHints.Distinct()); m_maskHints = new List<MaskHint>(this.m_maskHints.Distinct());
} }
if (m_scopes?.Count == 0)
{
m_scopes = null;
}
if (m_variables?.Count == 0) if (m_variables?.Count == 0)
{ {
m_variables = null; m_variables = null;
@@ -447,9 +432,6 @@ namespace GitHub.DistributedTask.Pipelines
[DataMember(Name = "Steps", EmitDefaultValue = false)] [DataMember(Name = "Steps", EmitDefaultValue = false)]
private List<JobStep> m_steps; private List<JobStep> m_steps;
[DataMember(Name = "Scopes", EmitDefaultValue = false)]
private List<ContextScope> m_scopes;
[DataMember(Name = "Variables", EmitDefaultValue = false)] [DataMember(Name = "Variables", EmitDefaultValue = false)]
private IDictionary<String, VariableValue> m_variables; private IDictionary<String, VariableValue> m_variables;

View File

@@ -1,53 +0,0 @@
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Runtime.Serialization;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using Newtonsoft.Json;
namespace GitHub.DistributedTask.Pipelines
{
[DataContract]
[EditorBrowsable(EditorBrowsableState.Never)]
public sealed class ContextScope
{
[DataMember(EmitDefaultValue = false)]
public String Name { get; set; }
[IgnoreDataMember]
public String ContextName
{
get
{
var index = Name.LastIndexOf('.');
if (index >= 0)
{
return Name.Substring(index + 1);
}
return Name;
}
}
[IgnoreDataMember]
public String ParentName
{
get
{
var index = Name.LastIndexOf('.');
if (index >= 0)
{
return Name.Substring(0, index);
}
return String.Empty;
}
}
[DataMember(EmitDefaultValue = false)]
public TemplateToken Inputs { get; set; }
[DataMember(EmitDefaultValue = false)]
public TemplateToken Outputs { get; set; }
}
}

View File

@@ -56,5 +56,36 @@ namespace GitHub.DistributedTask.Pipelines
get; get;
set; set;
} }
/// <summary>
/// Gets or sets the credentials used for pulling the container image.
/// </summary>
public ContainerRegistryCredentials Credentials
{
get;
set;
}
}
[EditorBrowsable(EditorBrowsableState.Never)]
public sealed class ContainerRegistryCredentials
{
/// <summary>
/// Gets or sets the user to authenticate to a registry with
/// </summary>
public String Username
{
get;
set;
}
/// <summary>
/// Gets or sets the password to authenticate to a registry with
/// </summary>
public String Password
{
get;
set;
}
} }
} }
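How the worker consumes these credentials before pulling a private image is not part of this hunk. As a rough illustration only (hypothetical helper, not the runner's container provider), username/password pairs like this are typically fed to docker login with the password piped over stdin rather than placed on the command line:

using System;
using System.Diagnostics;
using System.Threading.Tasks;

static class RegistryLoginSketch
{
    // Log in to a container registry without exposing the password in the process arguments.
    public static async Task<int> LoginAsync(string registry, string username, string password)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "docker",
            UseShellExecute = false,
            RedirectStandardInput = true
        };
        psi.ArgumentList.Add("login");
        psi.ArgumentList.Add("--username");
        psi.ArgumentList.Add(username);
        psi.ArgumentList.Add("--password-stdin");
        psi.ArgumentList.Add(registry);

        using var process = Process.Start(psi);
        await process.StandardInput.WriteLineAsync(password);
        process.StandardInput.Close();
        await process.WaitForExitAsync();
        return process.ExitCode;
    }

    static async Task Main()
    {
        // Placeholder values; real ones would come from the evaluated `credentials` mapping.
        var password = Environment.GetEnvironmentVariable("REGISTRY_PASSWORD") ?? "";
        int exitCode = await LoginAsync("ghcr.io", "octocat", password);
        Console.WriteLine($"docker login exit code: {exitCode}");
    }
}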

View File

@@ -11,19 +11,19 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String BooleanStrategyContext = "boolean-strategy-context"; public const String BooleanStrategyContext = "boolean-strategy-context";
public const String CancelTimeoutMinutes = "cancel-timeout-minutes"; public const String CancelTimeoutMinutes = "cancel-timeout-minutes";
public const String Cancelled = "cancelled"; public const String Cancelled = "cancelled";
public const String Checkout = "checkout";
public const String Clean= "clean"; public const String Clean= "clean";
public const String Container = "container"; public const String Container = "container";
public const String ContinueOnError = "continue-on-error"; public const String ContinueOnError = "continue-on-error";
public const String Credentials = "credentials";
public const String Defaults = "defaults"; public const String Defaults = "defaults";
public const String Env = "env"; public const String Env = "env";
public const String Environment = "environment";
public const String Event = "event"; public const String Event = "event";
public const String EventPattern = "github.event"; public const String EventPattern = "github.event";
public const String Exclude = "exclude"; public const String Exclude = "exclude";
public const String FailFast = "fail-fast"; public const String FailFast = "fail-fast";
public const String Failure = "failure"; public const String Failure = "failure";
public const String FetchDepth = "fetch-depth"; public const String FetchDepth = "fetch-depth";
public const String GeneratedId = "generated-id";
public const String GitHub = "github"; public const String GitHub = "github";
public const String HashFiles = "hashFiles"; public const String HashFiles = "hashFiles";
public const String Id = "id"; public const String Id = "id";
@@ -36,7 +36,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String JobIfResult = "job-if-result"; public const String JobIfResult = "job-if-result";
public const String JobOutputs = "job-outputs"; public const String JobOutputs = "job-outputs";
public const String Jobs = "jobs"; public const String Jobs = "jobs";
public const String Labels = "labels";
public const String Lfs = "lfs"; public const String Lfs = "lfs";
public const String Matrix = "matrix"; public const String Matrix = "matrix";
public const String MaxParallel = "max-parallel"; public const String MaxParallel = "max-parallel";
@@ -48,28 +47,23 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String Options = "options"; public const String Options = "options";
public const String Outputs = "outputs"; public const String Outputs = "outputs";
public const String OutputsPattern = "needs.*.outputs"; public const String OutputsPattern = "needs.*.outputs";
public const String Password = "password";
public const String Path = "path"; public const String Path = "path";
public const String Pool = "pool"; public const String Pool = "pool";
public const String Ports = "ports"; public const String Ports = "ports";
public const String Result = "result"; public const String Result = "result";
public const String RunDisplayPrefix = "Run ";
public const String Run = "run"; public const String Run = "run";
public const String RunDisplayPrefix = "Run ";
public const String Runner = "runner"; public const String Runner = "runner";
public const String RunsOn = "runs-on"; public const String RunsOn = "runs-on";
public const String Scope = "scope";
public const String Scopes = "scopes";
public const String Secrets = "secrets"; public const String Secrets = "secrets";
public const String Services = "services"; public const String Services = "services";
public const String Shell = "shell"; public const String Shell = "shell";
public const String Skipped = "skipped"; public const String Skipped = "skipped";
public const String StepEnv = "step-env"; public const String StepEnv = "step-env";
public const String StepIfResult = "step-if-result"; public const String StepIfResult = "step-if-result";
public const String Steps = "steps";
public const String StepsInTemplate = "steps-in-template";
public const String StepsScopeInputs = "steps-scope-inputs";
public const String StepsScopeOutputs = "steps-scope-outputs";
public const String StepsTemplateRoot = "steps-template-root";
public const String StepWith = "step-with"; public const String StepWith = "step-with";
public const String Steps = "steps";
public const String Strategy = "strategy"; public const String Strategy = "strategy";
public const String StringStepsContext = "string-steps-context"; public const String StringStepsContext = "string-steps-context";
public const String StringStrategyContext = "string-strategy-context"; public const String StringStrategyContext = "string-strategy-context";
@@ -77,7 +71,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String Success = "success"; public const String Success = "success";
public const String Template = "template"; public const String Template = "template";
public const String TimeoutMinutes = "timeout-minutes"; public const String TimeoutMinutes = "timeout-minutes";
public const String Token = "token"; public const String Username = "username";
public const String Uses = "uses"; public const String Uses = "uses";
public const String VmImage = "vmImage"; public const String VmImage = "vmImage";
public const String Volumes = "volumes"; public const String Volumes = "volumes";

View File

@@ -1,6 +1,6 @@
using System; using System;
using System.Collections.Generic; using System.Collections.Generic;
using System.Globalization; using System.ComponentModel;
using System.Linq; using System.Linq;
using GitHub.DistributedTask.Expressions2; using GitHub.DistributedTask.Expressions2;
using GitHub.DistributedTask.Expressions2.Sdk; using GitHub.DistributedTask.Expressions2.Sdk;
@@ -14,8 +14,62 @@ using Newtonsoft.Json.Linq;
namespace GitHub.DistributedTask.Pipelines.ObjectTemplating namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
{ {
internal static class PipelineTemplateConverter [EditorBrowsable(EditorBrowsableState.Never)]
public static class PipelineTemplateConverter
{ {
public static List<Step> ConvertToSteps(
TemplateContext context,
TemplateToken steps)
{
var stepsSequence = steps.AssertSequence($"job {PipelineTemplateConstants.Steps}");
var result = new List<Step>();
var nameBuilder = new ReferenceNameBuilder();
foreach (var stepsItem in stepsSequence)
{
var step = ConvertToStep(context, stepsItem, nameBuilder);
if (step != null) // step == null means we hit an error during step conversion; there should be an error in context.errors
{
if (step.Enabled)
{
result.Add(step);
}
}
}
// Generate context name if empty
foreach (ActionStep step in result)
{
if (!String.IsNullOrEmpty(step.ContextName))
{
continue;
}
var name = default(string);
switch (step.Reference.Type)
{
case ActionSourceType.ContainerRegistry:
var containerReference = step.Reference as ContainerRegistryReference;
name = containerReference.Image;
break;
case ActionSourceType.Repository:
var repositoryReference = step.Reference as RepositoryPathReference;
name = !String.IsNullOrEmpty(repositoryReference.Name) ? repositoryReference.Name : PipelineConstants.SelfAlias;
break;
}
if (String.IsNullOrEmpty(name))
{
name = "run";
}
nameBuilder.AppendSegment($"__{name}");
step.ContextName = nameBuilder.Build();
}
return result;
}
internal static Boolean ConvertToIfResult( internal static Boolean ConvertToIfResult(
TemplateContext context, TemplateContext context,
TemplateToken ifResult) TemplateToken ifResult)
@@ -29,6 +83,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
var evaluationResult = EvaluationResult.CreateIntermediateResult(null, ifResult); var evaluationResult = EvaluationResult.CreateIntermediateResult(null, ifResult);
return evaluationResult.IsTruthy; return evaluationResult.IsTruthy;
} }
internal static Boolean? ConvertToStepContinueOnError( internal static Boolean? ConvertToStepContinueOnError(
TemplateContext context, TemplateContext context,
TemplateToken token, TemplateToken token,
@@ -154,6 +209,30 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return (Int32)numberToken.Value; return (Int32)numberToken.Value;
} }
internal static ContainerRegistryCredentials ConvertToContainerCredentials(TemplateToken token)
{
var credentials = token.AssertMapping(PipelineTemplateConstants.Credentials);
var result = new ContainerRegistryCredentials();
foreach (var credentialProperty in credentials)
{
var propertyName = credentialProperty.Key.AssertString($"{PipelineTemplateConstants.Credentials} key");
switch (propertyName.Value)
{
case PipelineTemplateConstants.Username:
result.Username = credentialProperty.Value.AssertString(PipelineTemplateConstants.Username).Value;
break;
case PipelineTemplateConstants.Password:
result.Password = credentialProperty.Value.AssertString(PipelineTemplateConstants.Password).Value;
break;
default:
propertyName.AssertUnexpectedValue($"{PipelineTemplateConstants.Credentials} key {propertyName}");
break;
}
}
return result;
}
internal static JobContainer ConvertToJobContainer( internal static JobContainer ConvertToJobContainer(
TemplateContext context, TemplateContext context,
TemplateToken value, TemplateToken value,
@@ -220,6 +299,9 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
} }
result.Volumes = volumeList; result.Volumes = volumeList;
break; break;
case PipelineTemplateConstants.Credentials:
result.Credentials = ConvertToContainerCredentials(containerPropertyPair.Value);
break;
default: default:
propertyName.AssertUnexpectedValue($"{PipelineTemplateConstants.Container} key"); propertyName.AssertUnexpectedValue($"{PipelineTemplateConstants.Container} key");
break; break;
@@ -264,32 +346,10 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return result; return result;
} }
//Note: originally was List<Step> but we need to change to List<ActionStep> to use the "Inputs" attribute
internal static List<ActionStep> ConvertToSteps(
TemplateContext context,
TemplateToken steps)
{
var stepsSequence = steps.AssertSequence($"job {PipelineTemplateConstants.Steps}");
var result = new List<ActionStep>();
foreach (var stepsItem in stepsSequence)
{
var step = ConvertToStep(context, stepsItem);
if (step != null) // step = null means we are hitting error during step conversion, there should be an error in context.errors
{
if (step.Enabled)
{
result.Add(step);
}
}
}
return result;
}
private static ActionStep ConvertToStep( private static ActionStep ConvertToStep(
TemplateContext context, TemplateContext context,
TemplateToken stepsItem) TemplateToken stepsItem,
ReferenceNameBuilder nameBuilder)
{ {
var step = stepsItem.AssertMapping($"{PipelineTemplateConstants.Steps} item"); var step = stepsItem.AssertMapping($"{PipelineTemplateConstants.Steps} item");
var continueOnError = default(ScalarToken); var continueOnError = default(ScalarToken);
@@ -299,7 +359,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
var ifToken = default(ScalarToken); var ifToken = default(ScalarToken);
var name = default(ScalarToken); var name = default(ScalarToken);
var run = default(ScalarToken); var run = default(ScalarToken);
var scope = default(StringToken);
var timeoutMinutes = default(ScalarToken); var timeoutMinutes = default(ScalarToken);
var uses = default(StringToken); var uses = default(StringToken);
var with = default(TemplateToken); var with = default(TemplateToken);
@@ -337,9 +396,12 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
case PipelineTemplateConstants.Id: case PipelineTemplateConstants.Id:
id = stepProperty.Value.AssertString($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Id}"); id = stepProperty.Value.AssertString($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Id}");
if (!NameValidation.IsValid(id.Value, true)) if (!String.IsNullOrEmpty(id.Value))
{ {
context.Error(id, $"Step id {id.Value} is invalid. Ids must start with a letter or '_' and contain only alphanumeric characters, '-', or '_'"); if (!nameBuilder.TryAddKnownName(id.Value, out var error))
{
context.Error(id, error);
}
} }
break; break;
@@ -367,10 +429,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
shell = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Shell}"); shell = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Shell}");
break; break;
case PipelineTemplateConstants.Scope:
scope = stepProperty.Value.AssertString($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Scope}");
break;
case PipelineTemplateConstants.Submodules: case PipelineTemplateConstants.Submodules:
submodules = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Submodules}"); submodules = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Submodules}");
break; break;
@@ -400,14 +458,12 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
} }
// Fixup the if-condition // Fixup the if-condition
var isDefaultScope = String.IsNullOrEmpty(scope?.Value); ifCondition = ConvertToIfCondition(context, ifToken, false);
ifCondition = ConvertToIfCondition(context, ifToken, false, isDefaultScope);
if (run != null) if (run != null)
{ {
var result = new ActionStep var result = new ActionStep
{ {
ScopeName = scope?.Value,
ContextName = id?.Value, ContextName = id?.Value,
ContinueOnError = continueOnError, ContinueOnError = continueOnError,
DisplayNameToken = name, DisplayNameToken = name,
@@ -439,7 +495,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
uses.AssertString($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Uses}"); uses.AssertString($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Uses}");
var result = new ActionStep var result = new ActionStep
{ {
ScopeName = scope?.Value,
ContextName = id?.Value, ContextName = id?.Value,
ContinueOnError = continueOnError, ContinueOnError = continueOnError,
DisplayNameToken = name, DisplayNameToken = name,
@@ -503,8 +558,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
private static String ConvertToIfCondition( private static String ConvertToIfCondition(
TemplateContext context, TemplateContext context,
TemplateToken token, TemplateToken token,
Boolean isJob, Boolean isJob)
Boolean isDefaultScope)
{ {
String condition; String condition;
if (token is null) if (token is null)
@@ -537,7 +591,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
} }
else else
{ {
namedValues = isDefaultScope ? s_stepNamedValues : s_stepInTemplateNamedValues; namedValues = s_stepNamedValues;
functions = s_stepConditionFunctions; functions = s_stepConditionFunctions;
} }
@@ -589,18 +643,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Env), new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Env),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Needs), new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Needs),
}; };
private static readonly INamedValueInfo[] s_stepInTemplateNamedValues = new INamedValueInfo[]
{
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Strategy),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Matrix),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Steps),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Inputs),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.GitHub),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Job),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Runner),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Env),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Needs),
};
private static readonly IFunctionInfo[] s_stepConditionFunctions = new IFunctionInfo[] private static readonly IFunctionInfo[] s_stepConditionFunctions = new IFunctionInfo[]
{ {
new FunctionInfo<NoOperation>(PipelineTemplateConstants.Always, 0, 0), new FunctionInfo<NoOperation>(PipelineTemplateConstants.Always, 0, 0),

View File

@@ -51,60 +51,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public Int32 MaxResultSize { get; set; } = 10 * 1024 * 1024; // 10 mb public Int32 MaxResultSize { get; set; } = 10 * 1024 * 1024; // 10 mb
public DictionaryContextData EvaluateStepScopeInputs(
TemplateToken token,
DictionaryContextData contextData,
IList<IFunctionInfo> expressionFunctions)
{
var result = default(DictionaryContextData);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(contextData, expressionFunctions);
try
{
token = TemplateEvaluator.Evaluate(context, PipelineTemplateConstants.StepsScopeInputs, token, 0, null, omitHeader: true);
context.Errors.Check();
result = token.ToContextData().AssertDictionary("steps scope inputs");
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result ?? new DictionaryContextData();
}
public DictionaryContextData EvaluateStepScopeOutputs(
TemplateToken token,
DictionaryContextData contextData,
IList<IFunctionInfo> expressionFunctions)
{
var result = default(DictionaryContextData);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(contextData, expressionFunctions);
try
{
token = TemplateEvaluator.Evaluate(context, PipelineTemplateConstants.StepsScopeOutputs, token, 0, null, omitHeader: true);
context.Errors.Check();
result = token.ToContextData().AssertDictionary("steps scope outputs");
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result ?? new DictionaryContextData();
}
public Boolean EvaluateStepContinueOnError( public Boolean EvaluateStepContinueOnError(
TemplateToken token, TemplateToken token,
DictionaryContextData contextData, DictionaryContextData contextData,
@@ -159,31 +105,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return result; return result;
} }
public List<ActionStep> LoadCompositeSteps(
TemplateToken token)
{
var result = default(List<ActionStep>);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(null, null, setMissingContext: false);
// TODO: we might want to to have a bool to prevent it from filling in with missing context w/ dummy variables
try
{
token = TemplateEvaluator.Evaluate(context, PipelineTemplateConstants.StepsInTemplate, token, 0, null, omitHeader: true);
context.Errors.Check();
result = PipelineTemplateConverter.ConvertToSteps(context, token);
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result;
}
public Dictionary<String, String> EvaluateStepEnvironment( public Dictionary<String, String> EvaluateStepEnvironment(
TemplateToken token, TemplateToken token,
DictionaryContextData contextData, DictionaryContextData contextData,
@@ -358,6 +279,33 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return result; return result;
} }
public TemplateToken EvaluateEnvironmentUrl(
TemplateToken token,
DictionaryContextData contextData,
IList<IFunctionInfo> expressionFunctions)
{
var result = default(TemplateToken);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(contextData, expressionFunctions);
try
{
token = TemplateEvaluator.Evaluate(context, TemplateConstants.StringRunnerContextNoSecrets, token, 0, null, omitHeader: true);
context.Errors.Check();
result = token.AssertString("environment.url");
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result;
}
public Dictionary<String, String> EvaluateJobDefaultsRun( public Dictionary<String, String> EvaluateJobDefaultsRun(
TemplateToken token, TemplateToken token,
DictionaryContextData contextData, DictionaryContextData contextData,
@@ -425,8 +373,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
private TemplateContext CreateContext( private TemplateContext CreateContext(
DictionaryContextData contextData, DictionaryContextData contextData,
IList<IFunctionInfo> expressionFunctions, IList<IFunctionInfo> expressionFunctions,
IEnumerable<KeyValuePair<String, Object>> expressionState = null, IEnumerable<KeyValuePair<String, Object>> expressionState = null)
bool setMissingContext = true)
{ {
var result = new TemplateContext var result = new TemplateContext
{ {
@@ -475,8 +422,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
// - Evaluating early when all referenced contexts are available, even though all allowed // - Evaluating early when all referenced contexts are available, even though all allowed
// contexts may not yet be available. For example, evaluating step display name can often // contexts may not yet be available. For example, evaluating step display name can often
// be performed early. // be performed early.
if (setMissingContext)
{
foreach (var name in s_expressionValueNames) foreach (var name in s_expressionValueNames)
{ {
if (!result.ExpressionValues.ContainsKey(name)) if (!result.ExpressionValues.ContainsKey(name))
@@ -491,7 +436,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
result.ExpressionFunctions.Add(new FunctionInfo<NoOperation>(name, 0, Int32.MaxValue)); result.ExpressionFunctions.Add(new FunctionInfo<NoOperation>(name, 0, Int32.MaxValue));
} }
} }
}
// Add state // Add state
if (expressionState != null) if (expressionState != null)

View File

@@ -0,0 +1,121 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Text;
using GitHub.DistributedTask.Pipelines.Validation;
namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
{
internal sealed class ReferenceNameBuilder
{
internal void AppendSegment(String value)
{
if (String.IsNullOrEmpty(value))
{
return;
}
if (m_name.Length == 0)
{
var first = value[0];
if ((first >= 'a' && first <= 'z') ||
(first >= 'A' && first <= 'Z') ||
first == '_')
{
// Legal first char
}
else if ((first >= '0' && first <= '9') || first == '-')
{
// Illegal first char, but legal char.
// Prepend "_".
m_name.Append("_");
}
else
{
// Illegal char
}
}
else
{
// Separator
m_name.Append(c_separator);
}
foreach (var c in value)
{
if ((c >= 'a' && c <= 'z') ||
(c >= 'A' && c <= 'Z') ||
(c >= '0' && c <= '9') ||
c == '_' ||
c == '-')
{
// Legal
m_name.Append(c);
}
else
{
// Illegal
m_name.Append("_");
}
}
}
internal String Build()
{
var original = m_name.Length > 0 ? m_name.ToString() : "job";
var attempt = 1;
var suffix = default(String);
while (true)
{
if (attempt == 1)
{
suffix = String.Empty;
}
else if (attempt < 1000)
{
suffix = String.Format(CultureInfo.InvariantCulture, "_{0}", attempt);
}
else
{
throw new InvalidOperationException("Unable to create a unique name");
}
var candidate = original.Substring(0, Math.Min(original.Length, PipelineConstants.MaxNodeNameLength - suffix.Length)) + suffix;
if (m_distinctNames.Add(candidate))
{
m_name.Clear();
return candidate;
}
attempt++;
}
}
internal Boolean TryAddKnownName(
String value,
out String error)
{
if (!NameValidation.IsValid(value, allowHyphens: true) && value.Length < PipelineConstants.MaxNodeNameLength)
{
error = $"The identifier '{value}' is invalid. IDs may only contain alphanumeric characters, '_', and '-'. IDs must start with a letter or '_' and and must be less than {PipelineConstants.MaxNodeNameLength} characters.";
return false;
}
else if (!m_distinctNames.Add(value))
{
error = $"The identifier '{value}' may not be used more than once within the same scope.";
return false;
}
else
{
error = null;
return true;
}
}
private const String c_separator = "_";
private readonly HashSet<String> m_distinctNames = new HashSet<String>(StringComparer.OrdinalIgnoreCase);
private readonly StringBuilder m_name = new StringBuilder();
}
}
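The converter above uses this builder in two ways: TryAddKnownName reserves explicit step ids, and AppendSegment plus Build generates a unique context name for steps that have no id. The class is internal, so the snippet below only compiles inside that assembly (or with InternalsVisibleTo); it is shown purely to illustrate the behaviour implied by the code above.

using System;
using GitHub.DistributedTask.Pipelines.ObjectTemplating;

class ReferenceNameDemo
{
    static void Main()
    {
        var builder = new ReferenceNameBuilder();

        // An explicit `id:` reserves that name up front.
        builder.TryAddKnownName("checkout", out _);

        // Steps without an id get a generated name derived from what they run.
        builder.AppendSegment("__run");
        Console.WriteLine(builder.Build());   // "__run"

        builder.AppendSegment("__run");
        Console.WriteLine(builder.Build());   // "__run_2" (numeric suffix keeps names unique)

        // Reusing a reserved id fails with a descriptive error.
        if (!builder.TryAddKnownName("checkout", out var error))
        {
            Console.WriteLine(error);
        }
    }
}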

View File

@@ -16,116 +16,6 @@
} }
}, },
"steps-template-root": {
"description": "Steps template file",
"mapping": {
"properties": {
"inputs": "steps-template-inputs",
"outputs": "steps-template-outputs",
"steps": "steps-in-template"
}
}
},
"steps-scope-inputs": {
"description": "Used when evaluating steps scope inputs",
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "steps-scope-input-value"
}
},
"steps-scope-input-value": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env"
],
"one-of": [
"string",
"sequence",
"mapping"
]
},
"steps-scope-outputs": {
"description": "Used when evaluating steps scope outputs",
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "steps-scope-output-value"
}
},
"steps-scope-output-value": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env"
],
"string": {}
},
"steps-template-inputs": {
"description": "Allowed inputs in a steps template",
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "steps-template-input-value"
}
},
"steps-template-input-value": {
"description": "Default input values for a steps template",
"context": [
"github",
"needs",
"strategy",
"matrix"
],
"one-of": [
"string",
"sequence",
"mapping"
]
},
"steps-template-outputs": {
"description": "Output mapping for a steps template",
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "steps-template-output-value"
}
},
"steps-template-output-value": {
"description": "Output values for a steps template",
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"job",
"runner",
"env"
],
"string": {}
},
"workflow-defaults": { "workflow-defaults": {
"mapping": { "mapping": {
"properties": { "properties": {
@@ -240,54 +130,27 @@
"matrix": { "matrix": {
"mapping": { "mapping": {
"properties": { "properties": {
"include": "matrix-include", "include": "matrix-filter",
"exclude": "matrix-exclude" "exclude": "matrix-filter"
}, },
"loose-key-type": "non-empty-string", "loose-key-type": "non-empty-string",
"loose-value-type": "sequence" "loose-value-type": "sequence"
} }
}, },
"matrix-include": { "matrix-filter": {
"sequence": { "sequence": {
"item-type": "matrix-include-item" "item-type": "matrix-filter-item"
} }
}, },
"matrix-include-item": { "matrix-filter-item": {
"mapping": { "mapping": {
"loose-key-type": "non-empty-string", "loose-key-type": "non-empty-string",
"loose-value-type": "any" "loose-value-type": "any"
} }
}, },
"matrix-exclude": {
"sequence": {
"item-type": "matrix-exclude-item"
}
},
"matrix-exclude-item": {
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "matrix-exclude-filter-item"
}
},
"matrix-exclude-filter-item": {
"one-of": [
"string",
"matrix-exclude-mapping-filter"
]
},
"matrix-exclude-mapping-filter": {
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "matrix-exclude-filter-item"
}
},
"runs-on": { "runs-on": {
"context": [ "context": [
"github", "github",
@@ -364,25 +227,10 @@
} }
}, },
"steps-in-template": {
"sequence": {
"item-type": "steps-item-in-template"
}
},
"steps-item": { "steps-item": {
"one-of": [ "one-of": [
"run-step", "run-step",
"regular-step", "regular-step"
"steps-template-reference"
]
},
"steps-item-in-template": {
"one-of": [
"run-step-in-template",
"regular-step-in-template",
"steps-template-reference-in-template"
] ]
}, },
@@ -405,25 +253,6 @@
} }
}, },
"run-step-in-template": {
"mapping": {
"properties": {
"name": "string-steps-context-in-template",
"id": "non-empty-string",
"if": "step-if-in-template",
"timeout-minutes": "number-steps-context-in-template",
"run": {
"type": "string-steps-context-in-template",
"required": true
},
"continue-on-error": "boolean-steps-context-in-template",
"env": "step-env-in-template",
"working-directory": "string-steps-context-in-template",
"shell": "non-empty-string"
}
}
},
"regular-step": { "regular-step": {
"mapping": { "mapping": {
"properties": { "properties": {
@@ -442,24 +271,6 @@
} }
}, },
"regular-step-in-template": {
"mapping": {
"properties": {
"name": "string-steps-context-in-template",
"id": "non-empty-string",
"if": "step-if-in-template",
"continue-on-error": "boolean-steps-context-in-template",
"timeout-minutes": "number-steps-context-in-template",
"uses": {
"type": "non-empty-string",
"required": true
},
"with": "step-with-in-template",
"env": "step-env-in-template"
}
}
},
"step-if": { "step-if": {
"context": [ "context": [
"github", "github",
@@ -479,26 +290,6 @@
"string": {} "string": {}
}, },
"step-if-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"steps",
"inputs",
"job",
"runner",
"env",
"always(0,0)",
"failure(0,0)",
"cancelled(0,0)",
"success(0,0)",
"hashFiles(1,255)"
],
"string": {}
},
"step-if-result": { "step-if-result": {
"context": [ "context": [
"github", "github",
@@ -524,89 +315,6 @@
] ]
}, },
"step-if-result-in-template": {
"context": [
"github",
"strategy",
"matrix",
"steps",
"inputs",
"job",
"runner",
"env",
"always(0,0)",
"failure(0,0)",
"cancelled(0,0)",
"success(0,0)",
"hashFiles(1,255)"
],
"one-of": [
"null",
"boolean",
"number",
"string",
"sequence",
"mapping"
]
},
"steps-template-reference": {
"mapping": {
"properties": {
"template": "non-empty-string",
"id": "non-empty-string",
"inputs": "steps-template-reference-inputs"
}
}
},
"steps-template-reference-in-template": {
"mapping": {
"properties": {
"template": "non-empty-string",
"id": "non-empty-string",
"inputs": "steps-template-reference-inputs-in-template"
}
}
},
"steps-template-reference-inputs": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"job",
"runner",
"env"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"steps-template-reference-inputs-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"step-env": { "step-env": {
"context": [ "context": [
"github", "github",
@@ -626,26 +334,6 @@
} }
}, },
"step-env-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"step-with": { "step-with": {
"context": [ "context": [
"github", "github",
@@ -665,26 +353,6 @@
} }
}, },
"step-with-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"container": { "container": {
"context": [ "context": [
"github", "github",
@@ -705,7 +373,8 @@
"options": "non-empty-string", "options": "non-empty-string",
"env": "container-env", "env": "container-env",
"ports": "sequence-of-non-empty-string", "ports": "sequence-of-non-empty-string",
"volumes": "sequence-of-non-empty-string" "volumes": "sequence-of-non-empty-string",
"credentials": "container-registry-credentials"
} }
} }
}, },
@@ -736,6 +405,20 @@
] ]
}, },
"container-registry-credentials": {
"context": [
"secrets",
"env",
"github"
],
"mapping": {
"properties": {
"username": "non-empty-string",
"password": "non-empty-string"
}
}
},
"container-env": { "container-env": {
"mapping": { "mapping": {
"loose-key-type": "non-empty-string", "loose-key-type": "non-empty-string",
@@ -801,23 +484,6 @@
"boolean": {} "boolean": {}
}, },
"boolean-steps-context-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"boolean": {}
},
"number-steps-context": { "number-steps-context": {
"context": [ "context": [
"github", "github",
@@ -834,23 +500,6 @@
"number": {} "number": {}
}, },
"number-steps-context-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"number": {}
},
"string-runner-context": { "string-runner-context": {
"context": [ "context": [
"github", "github",
@@ -866,6 +515,20 @@
"string": {} "string": {}
}, },
"string-runner-context-no-secrets": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"steps",
"job",
"runner",
"env"
],
"string": {}
},
"string-steps-context": { "string-steps-context": {
"context": [ "context": [
"github", "github",
@@ -880,23 +543,6 @@
"hashFiles(1,255)" "hashFiles(1,255)"
], ],
"string": {} "string": {}
},
"string-steps-context-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"string": {}
}
}
}

View File

@@ -0,0 +1,23 @@
using System.Runtime.Serialization;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
namespace GitHub.DistributedTask.WebApi
{
/// <summary>
/// Information about an environment parsed from YML with evaluated name, URL will be evaluated on runner
/// </summary>
[DataContract]
public class ActionsEnvironmentReference
{
public ActionsEnvironmentReference(string name)
{
Name = name;
}
[DataMember(EmitDefaultValue = false)]
public string Name { get; set; }
[DataMember(EmitDefaultValue = false)]
public TemplateToken Url { get; set; }
}
}
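A minimal sketch of how this type is meant to be populated, assuming a caller that has already evaluated the environment name while the URL is still held as an unevaluated template token (the variable names below are illustrative, not from the diff):
    // Hypothetical usage: Name is already evaluated, Url stays a TemplateToken
    // so the runner can expand expressions such as a step output later.
    var environment = new ActionsEnvironmentReference("production")
    {
        Url = urlTemplateToken // an unevaluated GitHub.DistributedTask.ObjectTemplating.Tokens.TemplateToken
    };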

View File

@@ -2458,4 +2458,23 @@ namespace GitHub.DistributedTask.WebApi
{
}
}
[Serializable]
public sealed class FailedToResolveActionDownloadInfoException : DistributedTaskException
{
public FailedToResolveActionDownloadInfoException(String message)
: base(message)
{
}
public FailedToResolveActionDownloadInfoException(String message, Exception innerException)
: base(message, innerException)
{
}
private FailedToResolveActionDownloadInfoException(SerializationInfo info, StreamingContext context)
: base(info, context)
{
}
}
}
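As a rough sketch of how this exception is intended to be used (the surrounding resolver code here is an assumption, not part of the diff), the action download path can wrap a failure so the service attributes it to infrastructure rather than to the user's workflow:
    // Hypothetical call site inside action download resolution.
    try
    {
        downloadInfo = await ResolveActionDownloadInfoAsync(actionReference); // assumed helper
    }
    catch (Exception ex)
    {
        throw new FailedToResolveActionDownloadInfoException($"Failed to resolve download info for '{actionReference}'.", ex);
    }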

View File

@@ -17,6 +17,7 @@ namespace GitHub.DistributedTask.WebApi
this.Type = issueToBeCloned.Type;
this.Category = issueToBeCloned.Category;
this.Message = issueToBeCloned.Message;
this.IsInfrastructureIssue = issueToBeCloned.IsInfrastructureIssue;
if (issueToBeCloned.m_data != null)
{
@@ -48,6 +49,13 @@ namespace GitHub.DistributedTask.WebApi
set;
}
[DataMember(Order = 4)]
public bool? IsInfrastructureIssue
{
get;
set;
}
public IDictionary<String, String> Data
{
get

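A small illustrative sketch (not from the diff) of how a worker component might use the new nullable flag; leaving it unset preserves the existing wire format for ordinary user-facing errors:
    // Hypothetical: mark an issue as an infrastructure failure, e.g. when
    // action download info could not be resolved.
    var issue = new Issue
    {
        Type = IssueType.Error,
        Message = "Failed to resolve action download info.",
        IsInfrastructureIssue = true
    };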
View File

@@ -131,6 +131,17 @@ namespace GitHub.DistributedTask.WebApi
this.Outputs = outputs;
}
public JobCompletedEvent(
Int64 requestId,
Guid jobId,
TaskResult result,
Dictionary<String, VariableValue> outputs,
ActionsEnvironmentReference actionsEnvironment)
: this(requestId, jobId, result, outputs)
{
this.ActionsEnvironment = actionsEnvironment;
}
[DataMember(EmitDefaultValue = false)]
public Int64 RequestId
{
@@ -151,6 +162,13 @@ namespace GitHub.DistributedTask.WebApi
get;
set;
}
[DataMember(EmitDefaultValue = false)]
public ActionsEnvironmentReference ActionsEnvironment
{
get;
set;
}
}
[DataContract]

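For illustration only (identifiers assumed, not taken from the diff), the new constructor overload lets the worker report the evaluated environment alongside the job result:
    // Hypothetical completion call from the worker.
    var completedEvent = new JobCompletedEvent(
        requestId,
        jobId,
        TaskResult.Succeeded,
        outputs,
        new ActionsEnvironmentReference("production") { Url = urlToken });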
View File

@@ -29,6 +29,7 @@ namespace GitHub.DistributedTask.WebApi
this.PoolType = referenceToBeCloned.PoolType;
this.Size = referenceToBeCloned.Size;
this.IsLegacy = referenceToBeCloned.IsLegacy;
this.IsInternal = referenceToBeCloned.IsInternal;
}
public TaskAgentPoolReference Clone()
@@ -67,6 +68,16 @@ namespace GitHub.DistributedTask.WebApi
set;
}
/// <summary>
/// Gets or sets a value indicating whether or not this pool is internal and can't be modified by users
/// </summary>
[DataMember]
public bool IsInternal
{
get;
set;
}
/// <summary>
/// Gets or sets the type of the pool
/// </summary>

View File

@@ -65,5 +65,15 @@ namespace GitHub.DistributedTask.WebApi
get;
set;
}
/// <summary>
/// Gets or sets whether to use FIPS compliant encryption scheme for job message key
/// </summary>
[DataMember]
public bool UseFipsEncryption
{
get;
set;
}
}
}

View File

@@ -92,6 +92,28 @@ namespace GitHub.DistributedTask.WebApi
cancellationToken);
}
public Task AppendTimelineRecordFeedAsync(
Guid scopeIdentifier,
String planType,
Guid planId,
Guid timelineId,
Guid recordId,
Guid stepId,
IList<String> lines,
long startLine,
CancellationToken cancellationToken = default(CancellationToken),
Object userState = null)
{
return AppendTimelineRecordFeedAsync(scopeIdentifier,
planType,
planId,
timelineId,
recordId,
new TimelineRecordFeedLinesWrapper(stepId, lines, startLine),
userState,
cancellationToken);
}
public async Task RaisePlanEventAsync<T>(
Guid scopeIdentifier,
String planType,

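A hedged sketch of calling the new overload (client and argument names assumed): passing the absolute starting line number lets the service keep console output ordered even if batches arrive out of order:
    // Hypothetical: push a batch of console lines whose first line number is known.
    await taskClient.AppendTimelineRecordFeedAsync(
        scopeIdentifier,
        planType,
        planId,
        timelineId,
        recordId,
        stepId,
        lines,              // IList<string> batch of output lines
        startLine: 42,      // absolute line number of lines[0] within the step's log
        cancellationToken: token);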
View File

@@ -20,6 +20,12 @@ namespace GitHub.DistributedTask.WebApi
this.Count = lines.Count;
}
public TimelineRecordFeedLinesWrapper(Guid stepId, IList<string> lines, Int64 startLine)
: this(stepId, lines)
{
this.StartLine = startLine;
}
[DataMember(Order = 0)]
public Int32 Count { get; private set; }
@@ -31,5 +37,8 @@ namespace GitHub.DistributedTask.WebApi
[DataMember(EmitDefaultValue = false)]
public Guid StepId { get; set; }
[DataMember (EmitDefaultValue = false)]
public Int64? StartLine { get; private set; }
}
}

View File

@@ -0,0 +1,29 @@
using System;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
[DataContract]
public sealed class TimelineRecordLogLine
{
public TimelineRecordLogLine(String line, long? lineNumber)
{
this.Line = line;
this.LineNumber = lineNumber;
}
[DataMember]
public String Line
{
get;
set;
}
[DataMember (EmitDefaultValue = false)]
public long? LineNumber
{
get;
set;
}
}
}

View File

@@ -13,5 +13,8 @@ namespace GitHub.DistributedTask.WebApi
[EnumMember]
Completed,
[EnumMember]
Delayed,
}
}

View File

@@ -130,55 +130,6 @@ namespace GitHub.Services.WebApi.Jwt
return credentials.SignatureAlgorithm;
}
public static ClaimsPrincipal ValidateToken(this JsonWebToken token, JsonWebTokenValidationParameters parameters)
{
ArgumentUtility.CheckForNull(token, nameof(token));
ArgumentUtility.CheckForNull(parameters, nameof(parameters));
ClaimsIdentity actorIdentity = ValidateActor(token, parameters);
ValidateLifetime(token, parameters);
ValidateAudience(token, parameters);
ValidateSignature(token, parameters);
ValidateIssuer(token, parameters);
ClaimsIdentity identity = new ClaimsIdentity("Federation", parameters.IdentityNameClaimType, ClaimTypes.Role);
if (actorIdentity != null)
{
identity.Actor = actorIdentity;
}
IEnumerable<Claim> claims = token.ExtractClaims();
foreach (Claim claim in claims)
{
identity.AddClaim(new Claim(claim.Type, claim.Value, claim.ValueType, token.Issuer));
}
return new ClaimsPrincipal(identity);
}
private static ClaimsIdentity ValidateActor(JsonWebToken token, JsonWebTokenValidationParameters parameters)
{
ArgumentUtility.CheckForNull(token, nameof(token));
ArgumentUtility.CheckForNull(parameters, nameof(parameters));
if (!parameters.ValidateActor)
{
return null;
}
//this recursive call with check the parameters
ClaimsPrincipal principal = token.Actor.ValidateToken(parameters.ActorValidationParameters);
if (!(principal?.Identity is ClaimsIdentity))
{
throw new ActorValidationException();
}
return (ClaimsIdentity)principal.Identity;
}
private static void ValidateLifetime(JsonWebToken token, JsonWebTokenValidationParameters parameters)
{
ArgumentUtility.CheckForNull(token, nameof(token));
@@ -241,59 +192,6 @@ namespace GitHub.Services.WebApi.Jwt
throw new InvalidAudienceException(); //validation exception;
}
private static void ValidateSignature(JsonWebToken token, JsonWebTokenValidationParameters parameters)
{
ArgumentUtility.CheckForNull(token, nameof(token));
ArgumentUtility.CheckForNull(parameters, nameof(parameters));
if (!parameters.ValidateSignature)
{
return;
}
string encodedData = token.EncodedToken;
string[] parts = encodedData.Split('.');
if (parts.Length != 3)
{
throw new InvalidTokenException(JwtResources.EncodedTokenDataMalformed()); //validation exception
}
if (string.IsNullOrEmpty(parts[2]))
{
throw new InvalidTokenException(JwtResources.SignatureNotFound()); //validation exception
}
if (token.Algorithm == JWTAlgorithm.None)
{
throw new InvalidTokenException(JwtResources.InvalidSignatureAlgorithm()); //validation exception
}
ArgumentUtility.CheckForNull(parameters.SigningCredentials, nameof(parameters.SigningCredentials));
//ArgumentUtility.CheckEnumerableForNullOrEmpty(parameters.SigningToken.SecurityKeys, nameof(parameters.SigningToken.SecurityKeys));
byte[] sourceInput = Encoding.UTF8.GetBytes(string.Format("{0}.{1}", parts[0], parts[1]));
byte[] sourceSignature = parts[2].FromBase64StringNoPadding();
try
{
if (parameters.SigningCredentials.VerifySignature(sourceInput, sourceSignature))
{
return;
}
}
catch (Exception)
{
//swallow exceptions here, we'll throw if nothing works...
}
throw new SignatureValidationException(); //valiation exception
}
private static void ValidateIssuer(JsonWebToken token, JsonWebTokenValidationParameters parameters)
{
ArgumentUtility.CheckForNull(token, nameof(token));

View File

@@ -1,7 +1,6 @@
using System;
using System.Collections.Generic;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using GitHub.Services.Common;
using GitHub.Services.WebApi.Jwt;
@@ -75,7 +74,6 @@ namespace GitHub.Services.WebApi
{
throw new InvalidOperationException();
}
return GetSignature(input);
}
@@ -86,48 +84,13 @@ namespace GitHub.Services.WebApi
/// <returns>A blob of data representing the signature of the input data</returns>
protected abstract Byte[] GetSignature(Byte[] input);
/// <summary>
/// Verifies the signature of the input data, returning true if the signature is valid.
/// </summary>
/// <param name="input">The data which should be signed</param>
/// <param name="signature">The signature which should be verified</param>
/// <returns>True if the provided signature matches the current signing token; otherwise, false</returns>
public abstract Boolean VerifySignature(Byte[] input, Byte[] signature);
/// <summary>
/// Creates a new <c>VssSigningCredentials</c> instance using the specified <paramref name="certificate"/> instance
/// as the signing key.
/// </summary>
/// <param name="certificate">The certificate which contains the key used for signing and verification</param>
/// <returns>A new <c>VssSigningCredentials</c> instance which uses the specified certificate for signing</returns>
public static VssSigningCredentials Create(X509Certificate2 certificate)
{
ArgumentUtility.CheckForNull(certificate, nameof(certificate));
if (certificate.HasPrivateKey)
{
var rsa = certificate.GetRSAPrivateKey();
if (rsa == null)
{
throw new SignatureAlgorithmUnsupportedException(certificate.SignatureAlgorithm.FriendlyName);
}
if (rsa.KeySize < c_minKeySize)
{
throw new InvalidCredentialsException(JwtResources.SigningTokenKeyTooSmall());
}
}
return new X509Certificate2SigningToken(certificate);
}
/// <summary>
/// Creates a new <c>VssSigningCredentials</c> instance using the specified <paramref name="factory"/>
/// callback function to retrieve the signing key.
/// </summary>
/// <param name="factory">The factory which creates <c>RSA</c> keys used for signing and verification</param>
/// <returns>A new <c>VssSigningCredentials</c> instance which uses the specified provider for signing</returns>
public static VssSigningCredentials Create(Func<RSA> factory)
public static VssSigningCredentials Create(Func<RSA> factory, bool requireFipsCryptography)
{
ArgumentUtility.CheckForNull(factory, nameof(factory));
@@ -143,22 +106,12 @@ namespace GitHub.Services.WebApi
throw new InvalidCredentialsException(JwtResources.SigningTokenKeyTooSmall());
}
return new RSASigningToken(factory, rsa.KeySize);
}
}
/// <summary>
/// Creates a new <c>VssSigningCredentials</c> instance using the specified <paramref name="key"/> as the signing
/// key. The returned signing token performs symmetric key signing and verification.
/// </summary>
/// <param name="rsa">The key used for signing and verification</param>
/// <returns>A new <c>VssSigningCredentials</c> instance which uses the specified key for signing</returns>
public static VssSigningCredentials Create(Byte[] key)
{
ArgumentUtility.CheckForNull(key, nameof(key));
// Probably should have validation here, but there was none previously
return new SymmetricKeySigningToken(key);
}
if (requireFipsCryptography)
{
return new RSASigningToken(factory, rsa.KeySize, RSASignaturePadding.Pss);
}
return new RSASigningToken(factory, rsa.KeySize, RSASignaturePadding.Pkcs1);
}
}
private const Int32 c_minKeySize = 2048;
@@ -166,57 +119,6 @@ namespace GitHub.Services.WebApi
#region Concrete Implementations
private class SymmetricKeySigningToken : VssSigningCredentials
{
public SymmetricKeySigningToken(Byte[] key)
{
m_key = new Byte[key.Length];
Buffer.BlockCopy(key, 0, m_key, 0, m_key.Length);
}
public override Boolean CanSignData
{
get
{
return true;
}
}
public override Int32 KeySize
{
get
{
return m_key.Length * 8;
}
}
public override JWTAlgorithm SignatureAlgorithm
{
get
{
return JWTAlgorithm.HS256;
}
}
protected override Byte[] GetSignature(Byte[] input)
{
using (var hash = new HMACSHA256(m_key))
{
return hash.ComputeHash(input);
}
}
public override Boolean VerifySignature(
Byte[] input,
Byte[] signature)
{
var computedSignature = SignData(input);
return SecureCompare.TimeInvariantEquals(computedSignature, signature);
}
private readonly Byte[] m_key;
}
private abstract class AsymmetricKeySigningToken : VssSigningCredentials
{
protected abstract Boolean HasPrivateKey();
@@ -244,70 +146,14 @@ namespace GitHub.Services.WebApi
private Boolean? m_hasPrivateKey;
}
private class X509Certificate2SigningToken : AsymmetricKeySigningToken, IJsonWebTokenHeaderProvider
{
public X509Certificate2SigningToken(X509Certificate2 certificate)
{
m_certificate = certificate;
}
public override Int32 KeySize
{
get
{
return m_certificate.GetRSAPublicKey().KeySize;
}
}
public override DateTime ValidFrom
{
get
{
return m_certificate.NotBefore;
}
}
public override DateTime ValidTo
{
get
{
return m_certificate.NotAfter;
}
}
public override Boolean VerifySignature(
Byte[] input,
Byte[] signature)
{
var rsa = m_certificate.GetRSAPublicKey();
return rsa.VerifyData(input, signature, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
}
protected override Byte[] GetSignature(Byte[] input)
{
var rsa = m_certificate.GetRSAPrivateKey();
return rsa.SignData(input, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
}
protected override Boolean HasPrivateKey()
{
return m_certificate.HasPrivateKey;
}
void IJsonWebTokenHeaderProvider.SetHeaders(IDictionary<String, Object> headers)
{
headers[JsonWebTokenHeaderParameters.X509CertificateThumbprint] = m_certificate.GetCertHash().ToBase64StringNoPadding();
}
private readonly X509Certificate2 m_certificate;
}
private class RSASigningToken : AsymmetricKeySigningToken
{
public RSASigningToken(
Func<RSA> factory,
Int32 keySize)
Int32 keySize,
RSASignaturePadding signaturePadding)
{
m_signaturePadding = signaturePadding;
m_keySize = keySize;
m_factory = factory;
}
@@ -324,7 +170,7 @@ namespace GitHub.Services.WebApi
{
using (var rsa = m_factory())
{
return rsa.SignData(input, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
return rsa.SignData(input, HashAlgorithmName.SHA256, m_signaturePadding);
}
}
@@ -344,18 +190,9 @@ namespace GitHub.Services.WebApi
}
}
public override Boolean VerifySignature(
Byte[] input,
Byte[] signature)
{
using (var rsa = m_factory())
{
return rsa.VerifyData(input, signature, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
}
}
private readonly Int32 m_keySize;
private readonly Func<RSA> m_factory;
private readonly RSASignaturePadding m_signaturePadding;
}
#endregion

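A brief sketch of the reworked factory shape (the RSA setup below is illustrative and assumes the existing public SignData entry point): when requireFipsCryptography is true the token signs with RSA-PSS padding, otherwise it keeps the previous PKCS#1 v1.5 behavior:
    // Hypothetical: build signing credentials from an RSA key factory.
    // rsaParameters is assumed to hold the runner's stored RSA key material.
    VssSigningCredentials signingCredentials = VssSigningCredentials.Create(
        () => RSA.Create(rsaParameters),     // factory invoked (and disposed) per signing operation
        requireFipsCryptography: true);      // true => RSASignaturePadding.Pss, false => Pkcs1

    byte[] signature = signingCredentials.SignData(payloadBytes);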
View File

@@ -126,5 +126,23 @@ namespace GitHub.Runner.Common.Tests.Worker.Container
Assert.NotNull(result5);
Assert.Equal("/foo/bar:/baz", result5);
}
[Theory]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
[InlineData("dockerhub/repo", "")]
[InlineData("localhost/doesnt_work", "")]
[InlineData("localhost:port/works", "localhost:port")]
[InlineData("host.tld/works", "host.tld")]
[InlineData("ghcr.io/owner/image", "ghcr.io")]
[InlineData("gcr.io/project/image", "gcr.io")]
[InlineData("myregistry.azurecr.io/namespace/image", "myregistry.azurecr.io")]
[InlineData("account.dkr.ecr.region.amazonaws.com/image", "account.dkr.ecr.region.amazonaws.com")]
[InlineData("docker.pkg.github.com/owner/repo/image", "docker.pkg.github.com")]
public void ParseRegistryHostnameFromImageName(string input, string expected)
{
var actual = DockerUtil.ParseRegistryHostnameFromImageName(input);
Assert.Equal(expected, actual);
}
}
}

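The test data above implies the parsing rule: the first path segment counts as a registry hostname only when it contains a '.' or a ':'. A minimal sketch consistent with those cases follows; it is an illustration of the rule, not the runner's actual implementation:
    public static string ParseRegistryHostnameFromImageName(string imageName)
    {
        // "ghcr.io/owner/image" -> "ghcr.io"; "dockerhub/repo" -> "" (Docker Hub default registry)
        var firstSegment = imageName.Split('/')[0];
        if (imageName.Contains('/') && (firstSegment.Contains('.') || firstSegment.Contains(':')))
        {
            return firstSegment;
        }
        return string.Empty;
    }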
View File

@@ -39,10 +39,12 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
private string _expectedToken = "expectedToken"; private string _expectedToken = "expectedToken";
private string _expectedServerUrl = "https://codedev.ms"; private string _expectedServerUrl = "https://codedev.ms";
private string _expectedAgentName = "expectedAgentName"; private string _expectedAgentName = "expectedAgentName";
private string _expectedPoolName = "poolName"; private string _defaultRunnerGroupName = "defaultRunnerGroup";
private string _secondRunnerGroupName = "secondRunnerGroup";
private string _expectedAuthType = "pat"; private string _expectedAuthType = "pat";
private string _expectedWorkFolder = "_work"; private string _expectedWorkFolder = "_work";
private int _expectedPoolId = 1; private int _defaultRunnerGroupId = 1;
private int _secondRunnerGroupId = 2;
private RSACryptoServiceProvider rsa = null; private RSACryptoServiceProvider rsa = null;
private RunnerSettings _configMgrAgentSettings = new RunnerSettings(); private RunnerSettings _configMgrAgentSettings = new RunnerSettings();
@@ -97,7 +99,7 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
_serviceControlManager.Setup(x => x.GenerateScripts(It.IsAny<RunnerSettings>()));
#endif
var expectedPools = new List<TaskAgentPool>() { new TaskAgentPool(_expectedPoolName) { Id = _expectedPoolId } };
var expectedPools = new List<TaskAgentPool>() { new TaskAgentPool(_defaultRunnerGroupName) { Id = _defaultRunnerGroupId, IsInternal = true }, new TaskAgentPool(_secondRunnerGroupName) { Id = _secondRunnerGroupId } };
_runnerServer.Setup(x => x.GetAgentPoolsAsync(It.IsAny<string>(), It.IsAny<TaskAgentPoolType>())).Returns(Task.FromResult(expectedPools));
var expectedAgents = new List<TaskAgent>();
@@ -155,7 +157,7 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
"configure", "configure",
"--url", _expectedServerUrl, "--url", _expectedServerUrl,
"--name", _expectedAgentName, "--name", _expectedAgentName,
"--pool", _expectedPoolName, "--runnergroup", _secondRunnerGroupName,
"--work", _expectedWorkFolder, "--work", _expectedWorkFolder,
"--auth", _expectedAuthType, "--auth", _expectedAuthType,
"--token", _expectedToken, "--token", _expectedToken,
@@ -175,7 +177,7 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
Assert.NotNull(s);
Assert.True(s.ServerUrl.Equals(_expectedServerUrl));
Assert.True(s.AgentName.Equals(_expectedAgentName));
Assert.True(s.PoolId.Equals(_expectedPoolId));
Assert.True(s.PoolId.Equals(_secondRunnerGroupId));
Assert.True(s.WorkFolder.Equals(_expectedWorkFolder));
// validate GetAgentPoolsAsync gets called twice with automation pool type

View File

@@ -1,4 +1,4 @@
using GitHub.Runner.Common.Util;
using System.Collections.Generic;
using System.IO;
using System.Linq;
@@ -127,11 +127,11 @@ namespace GitHub.Runner.Common.Tests
Environment.SetEnvironmentVariable("no_proxy", "github.com, google.com,"); Environment.SetEnvironmentVariable("no_proxy", "github.com, google.com,");
var proxy = new RunnerWebProxy(); var proxy = new RunnerWebProxy();
Assert.Equal("http://127.0.0.1:8888/", proxy.HttpProxyAddress); Assert.Equal("http://127.0.0.1:8888", proxy.HttpProxyAddress);
Assert.Null(proxy.HttpProxyUsername); Assert.Null(proxy.HttpProxyUsername);
Assert.Null(proxy.HttpProxyPassword); Assert.Null(proxy.HttpProxyPassword);
Assert.Equal("http://user:pass@127.0.0.1:9999/", proxy.HttpsProxyAddress); Assert.Equal("http://user:pass@127.0.0.1:9999", proxy.HttpsProxyAddress);
Assert.Equal("user", proxy.HttpsProxyUsername); Assert.Equal("user", proxy.HttpsProxyUsername);
Assert.Equal("pass", proxy.HttpsProxyPassword); Assert.Equal("pass", proxy.HttpsProxyPassword);
@@ -161,11 +161,11 @@ namespace GitHub.Runner.Common.Tests
Environment.SetEnvironmentVariable("NO_PROXY", "github.com, google.com,"); Environment.SetEnvironmentVariable("NO_PROXY", "github.com, google.com,");
var proxy = new RunnerWebProxy(); var proxy = new RunnerWebProxy();
Assert.Equal("http://127.0.0.1:7777/", proxy.HttpProxyAddress); Assert.Equal("http://127.0.0.1:7777", proxy.HttpProxyAddress);
Assert.Null(proxy.HttpProxyUsername); Assert.Null(proxy.HttpProxyUsername);
Assert.Null(proxy.HttpProxyPassword); Assert.Null(proxy.HttpProxyPassword);
Assert.Equal("http://user:pass@127.0.0.1:8888/", proxy.HttpsProxyAddress); Assert.Equal("http://user:pass@127.0.0.1:8888", proxy.HttpsProxyAddress);
Assert.Equal("user", proxy.HttpsProxyUsername); Assert.Equal("user", proxy.HttpsProxyUsername);
Assert.Equal("pass", proxy.HttpsProxyPassword); Assert.Equal("pass", proxy.HttpsProxyPassword);
@@ -218,19 +218,19 @@ namespace GitHub.Runner.Common.Tests
Environment.SetEnvironmentVariable("no_proxy", "github.com, google.com,"); Environment.SetEnvironmentVariable("no_proxy", "github.com, google.com,");
var proxy = new RunnerWebProxy(); var proxy = new RunnerWebProxy();
Assert.Equal("http://user1@127.0.0.1:8888/", proxy.HttpProxyAddress); Assert.Equal("http://user1@127.0.0.1:8888", proxy.HttpProxyAddress);
Assert.Equal("user1", proxy.HttpProxyUsername); Assert.Equal("user1", proxy.HttpProxyUsername);
Assert.Null(proxy.HttpProxyPassword); Assert.Null(proxy.HttpProxyPassword);
var cred = proxy.Credentials.GetCredential(new Uri("http://user1@127.0.0.1:8888/"), "Basic"); var cred = proxy.Credentials.GetCredential(new Uri("http://user1@127.0.0.1:8888"), "Basic");
Assert.Equal("user1", cred.UserName); Assert.Equal("user1", cred.UserName);
Assert.Equal(string.Empty, cred.Password); Assert.Equal(string.Empty, cred.Password);
Assert.Equal("http://user2:pass@127.0.0.1:9999/", proxy.HttpsProxyAddress); Assert.Equal("http://user2:pass@127.0.0.1:9999", proxy.HttpsProxyAddress);
Assert.Equal("user2", proxy.HttpsProxyUsername); Assert.Equal("user2", proxy.HttpsProxyUsername);
Assert.Equal("pass", proxy.HttpsProxyPassword); Assert.Equal("pass", proxy.HttpsProxyPassword);
cred = proxy.Credentials.GetCredential(new Uri("http://user2:pass@127.0.0.1:9999/"), "Basic"); cred = proxy.Credentials.GetCredential(new Uri("http://user2:pass@127.0.0.1:9999"), "Basic");
Assert.Equal("user2", cred.UserName); Assert.Equal("user2", cred.UserName);
Assert.Equal("pass", cred.Password); Assert.Equal("pass", cred.Password);
@@ -256,19 +256,19 @@ namespace GitHub.Runner.Common.Tests
Environment.SetEnvironmentVariable("no_proxy", "github.com, google.com,"); Environment.SetEnvironmentVariable("no_proxy", "github.com, google.com,");
var proxy = new RunnerWebProxy(); var proxy = new RunnerWebProxy();
Assert.Equal("http://user1:pass1%40@127.0.0.1:8888/", proxy.HttpProxyAddress); Assert.Equal("http://user1:pass1%40@127.0.0.1:8888", proxy.HttpProxyAddress);
Assert.Equal("user1", proxy.HttpProxyUsername); Assert.Equal("user1", proxy.HttpProxyUsername);
Assert.Equal("pass1@", proxy.HttpProxyPassword); Assert.Equal("pass1@", proxy.HttpProxyPassword);
var cred = proxy.Credentials.GetCredential(new Uri("http://user1:pass1%40@127.0.0.1:8888/"), "Basic"); var cred = proxy.Credentials.GetCredential(new Uri("http://user1:pass1%40@127.0.0.1:8888"), "Basic");
Assert.Equal("user1", cred.UserName); Assert.Equal("user1", cred.UserName);
Assert.Equal("pass1@", cred.Password); Assert.Equal("pass1@", cred.Password);
Assert.Equal("http://user2:pass2%40@127.0.0.1:9999/", proxy.HttpsProxyAddress); Assert.Equal("http://user2:pass2%40@127.0.0.1:9999", proxy.HttpsProxyAddress);
Assert.Equal("user2", proxy.HttpsProxyUsername); Assert.Equal("user2", proxy.HttpsProxyUsername);
Assert.Equal("pass2@", proxy.HttpsProxyPassword); Assert.Equal("pass2@", proxy.HttpsProxyPassword);
cred = proxy.Credentials.GetCredential(new Uri("http://user2:pass2%40@127.0.0.1:9999/"), "Basic"); cred = proxy.Credentials.GetCredential(new Uri("http://user2:pass2%40@127.0.0.1:9999"), "Basic");
Assert.Equal("user2", cred.UserName); Assert.Equal("user2", cred.UserName);
Assert.Equal("pass2@", cred.Password); Assert.Equal("pass2@", cred.Password);
@@ -405,6 +405,36 @@ namespace GitHub.Runner.Common.Tests
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void WebProxyFromEnvironmentVariablesWithPort80()
{
try
{
Environment.SetEnvironmentVariable("http_proxy", "http://127.0.0.1:80");
Environment.SetEnvironmentVariable("https_proxy", "http://user:pass@127.0.0.1:80");
Environment.SetEnvironmentVariable("no_proxy", "github.com, google.com,");
var proxy = new RunnerWebProxy();
Assert.Equal("http://127.0.0.1:80", Environment.GetEnvironmentVariable("http_proxy"));
Assert.Null(proxy.HttpProxyUsername);
Assert.Null(proxy.HttpProxyPassword);
Assert.Equal("http://user:pass@127.0.0.1:80", Environment.GetEnvironmentVariable("https_proxy"));
Assert.Equal("user", proxy.HttpsProxyUsername);
Assert.Equal("pass", proxy.HttpsProxyPassword);
Assert.Equal(2, proxy.NoProxyList.Count);
Assert.Equal("github.com", proxy.NoProxyList[0].Host);
Assert.Equal("google.com", proxy.NoProxyList[1].Host);
}
finally
{
CleanProxyEnv();
}
}
private void CleanProxyEnv()
{
Environment.SetEnvironmentVariable("http_proxy", null);

View File

@@ -60,6 +60,7 @@ namespace GitHub.Runner.Common.Tests
{
typeof(IActionCommandExtension),
typeof(IExecutionContext),
typeof(IFileCommandExtension),
typeof(IHandler),
typeof(IJobExtension),
typeof(IStep),

View File

@@ -96,7 +96,7 @@ namespace GitHub.Runner.Common.Tests.Worker
hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}"); hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
}); });
_ec.Setup(x => x.EnvironmentVariables).Returns(new Dictionary<string, string>()); _ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[stop-commands]stopToken", null)); Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[stop-commands]stopToken", null));
Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null)); Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
@@ -119,8 +119,6 @@ namespace GitHub.Runner.Common.Tests.Worker
return 1;
});
_ec.SetupAllProperties();
Assert.False(_ec.Object.EchoOnActionCommand);
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::on", null));
@@ -204,8 +202,6 @@ namespace GitHub.Runner.Common.Tests.Worker
return 1;
});
_ec.SetupAllProperties();
// Echo commands below are considered "processed", but are invalid
// 1. Invalid echo value
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::invalid", null));
@@ -287,6 +283,8 @@ namespace GitHub.Runner.Common.Tests.Worker
// Execution context
_ec = new Mock<IExecutionContext>();
_ec.SetupAllProperties();
_ec.Setup(x => x.Global).Returns(new GlobalContext());
// Command manager
_commandManager = new ActionCommandManager();

View File

@@ -133,7 +133,7 @@ namespace GitHub.Runner.Common.Tests.Worker
Reference = new Pipelines.RepositoryPathReference()
{
Name = ActionName,
Ref = "master",
Ref = "main",
RepositoryType = "GitHub"
}
}
@@ -141,7 +141,7 @@ namespace GitHub.Runner.Common.Tests.Worker
// Return a valid action from GHES via mock
const string ApiUrl = "https://ghes.example.com/api/v3";
string expectedArchiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "master");
string expectedArchiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "main");
string archiveFile = await CreateRepoArchive();
using var stream = File.OpenRead(archiveFile);
var mockClientHandler = new Mock<HttpClientHandler>();
@@ -159,10 +159,10 @@ namespace GitHub.Runner.Common.Tests.Worker
await _actionManager.PrepareActionsAsync(_ec.Object, actions);
//Assert
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master.completed");
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main.completed");
Assert.True(File.Exists(watermarkFile));
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master", "action.yml");
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main", "action.yml");
Assert.True(File.Exists(actionYamlFile));
_hc.GetTrace().Info(File.ReadAllText(actionYamlFile));
}
@@ -191,7 +191,7 @@ namespace GitHub.Runner.Common.Tests.Worker
Reference = new Pipelines.RepositoryPathReference()
{
Name = ActionName,
Ref = "master",
Ref = "main",
RepositoryType = "GitHub"
}
}
@@ -199,8 +199,8 @@ namespace GitHub.Runner.Common.Tests.Worker
// Return a valid action from GHES via mock
const string ApiUrl = "https://ghes.example.com/api/v3";
string builtInArchiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "master");
string dotcomArchiveLink = GetLinkToActionArchive("https://api.github.com", ActionName, "master");
string builtInArchiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "main");
string dotcomArchiveLink = GetLinkToActionArchive("https://api.github.com", ActionName, "main");
string archiveFile = await CreateRepoArchive();
using var stream = File.OpenRead(archiveFile);
var mockClientHandler = new Mock<HttpClientHandler>();
@@ -220,10 +220,10 @@ namespace GitHub.Runner.Common.Tests.Worker
await _actionManager.PrepareActionsAsync(_ec.Object, actions);
//Assert
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master.completed");
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main.completed");
Assert.True(File.Exists(watermarkFile));
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master", "action.yml");
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main", "action.yml");
Assert.True(File.Exists(actionYamlFile));
_hc.GetTrace().Info(File.ReadAllText(actionYamlFile));
}
@@ -252,7 +252,7 @@ namespace GitHub.Runner.Common.Tests.Worker
Reference = new Pipelines.RepositoryPathReference()
{
Name = ActionName,
Ref = "master",
Ref = "main",
RepositoryType = "GitHub"
}
}
@@ -260,7 +260,7 @@ namespace GitHub.Runner.Common.Tests.Worker
// Return a valid action from GHES via mock
const string ApiUrl = "https://ghes.example.com/api/v3";
string archiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "master");
string archiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "main");
string archiveFile = await CreateRepoArchive();
using var stream = File.OpenRead(archiveFile);
var mockClientHandler = new Mock<HttpClientHandler>();
@@ -280,10 +280,10 @@ namespace GitHub.Runner.Common.Tests.Worker
//Assert
await Assert.ThrowsAsync<ActionNotFoundException>(action);
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master.completed");
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main.completed");
Assert.False(File.Exists(watermarkFile));
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master", "action.yml");
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main", "action.yml");
Assert.False(File.Exists(actionYamlFile));
}
finally
@@ -846,7 +846,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
var traceFile = Path.GetTempFileName();
File.Copy(_hc.TraceFileName, traceFile, true);
Assert.Contains("Entry javascript file is not provided.", File.ReadAllText(traceFile));
Assert.Contains("You are using a JavaScript Action but there is not an entry JavaScript file provided in", File.ReadAllText(traceFile));
}
}
finally
@@ -1278,7 +1278,7 @@ runs:
"; ";
Pipelines.ActionStep instance; Pipelines.ActionStep instance;
string directory; string directory;
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master"); directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "main");
string file = Path.Combine(directory, Constants.Path.ActionManifestYamlFile); string file = Path.Combine(directory, Constants.Path.ActionManifestYamlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file)); Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, Content); File.WriteAllText(file, Content);
@@ -1288,7 +1288,7 @@ runs:
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "master",
Ref = "main",
RepositoryType = Pipelines.RepositoryTypes.GitHub
}
};
@@ -2466,7 +2466,7 @@ runs:
{
var traceFile = Path.GetTempFileName();
File.Copy(_hc.TraceFileName, traceFile, true);
Assert.Contains("Entry javascript file is not provided.", File.ReadAllText(traceFile));
Assert.Contains("You are using a JavaScript Action but there is not an entry JavaScript file provided in", File.ReadAllText(traceFile));
}
}
finally
@@ -2898,7 +2898,7 @@ runs:
"; ";
Pipelines.ActionStep instance; Pipelines.ActionStep instance;
string directory; string directory;
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master"); directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "main");
string file = Path.Combine(directory, Constants.Path.ActionManifestYamlFile); string file = Path.Combine(directory, Constants.Path.ActionManifestYamlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file)); Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, Content); File.WriteAllText(file, Content);
@@ -2908,7 +2908,7 @@ runs:
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "master",
Ref = "main",
RepositoryType = Pipelines.RepositoryTypes.GitHub
}
};
@@ -3453,7 +3453,7 @@ runs:
private void CreateAction(string yamlContent, out Pipelines.ActionStep instance, out string directory)
{
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master");
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "main");
string file = Path.Combine(directory, Constants.Path.ActionManifestYmlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, yamlContent);
@@ -3463,7 +3463,7 @@ runs:
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "master",
Ref = "main",
RepositoryType = Pipelines.RepositoryTypes.GitHub
}
};
@@ -3481,7 +3481,7 @@ runs:
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "master",
Ref = "main",
RepositoryType = Pipelines.PipelineConstants.SelfAlias
}
};
@@ -3575,17 +3575,18 @@ runs:
_workFolder = _hc.GetDirectory(WellKnownDirectory.Work);
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Global).Returns(new GlobalContext());
_ec.Setup(x => x.CancellationToken).Returns(_ecTokenSource.Token);
var variables = new Dictionary<string, VariableValue>();
if (newActionMetadata)
{
variables["DistributedTask.NewActionMetadata"] = "true";
}
_ec.Setup(x => x.Variables).Returns(new Variables(_hc, variables));
_ec.Object.Global.Variables = new Variables(_hc, variables);
_ec.Setup(x => x.ExpressionValues).Returns(new DictionaryContextData());
_ec.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());
_ec.Setup(x => x.FileTable).Returns(new List<String>());
_ec.Object.Global.FileTable = new List<String>();
_ec.Setup(x => x.Plan).Returns(new TaskOrchestrationPlanReference());
_ec.Object.Global.Plan = new TaskOrchestrationPlanReference();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { _hc.GetTrace().Info($"[{tag}]{message}"); });
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>())).Callback((Issue issue, string message) => { _hc.GetTrace().Info($"[{issue.Type}]{issue.Message ?? message}"); });
_ec.Setup(x => x.GetGitHubContext("workspace")).Returns(Path.Combine(_workFolder, "actions", "actions"));

View File

@@ -717,7 +717,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Object.ExpressionValues["github"] = new DictionaryContextData _ec.Object.ExpressionValues["github"] = new DictionaryContextData
{ {
{ "ref", new StringContextData("refs/heads/master") }, { "ref", new StringContextData("refs/heads/main") },
}; };
_ec.Object.ExpressionValues["strategy"] = new DictionaryContextData(); _ec.Object.ExpressionValues["strategy"] = new DictionaryContextData();
_ec.Object.ExpressionValues["matrix"] = new DictionaryContextData(); _ec.Object.ExpressionValues["matrix"] = new DictionaryContextData();
@@ -737,7 +737,7 @@ namespace GitHub.Runner.Common.Tests.Worker
result = actionManifest.EvaluateDefaultInput(_ec.Object, "testInput", new BasicExpressionToken(null, null, null, "github.ref"));
//Assert
Assert.Equal("refs/heads/master", result);
Assert.Equal("refs/heads/main", result);
}
finally
{
@@ -754,12 +754,16 @@ namespace GitHub.Runner.Common.Tests.Worker
_hc = new TestHostContext(this, name);
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.WriteDebug).Returns(true);
_ec.Setup(x => x.Global)
.Returns(new GlobalContext
{
FileTable = new List<String>(),
Variables = new Variables(_hc, new Dictionary<string, VariableValue>()),
WriteDebug = true,
});
_ec.Setup(x => x.CancellationToken).Returns(_ecTokenSource.Token);
_ec.Setup(x => x.Variables).Returns(new Variables(_hc, new Dictionary<string, VariableValue>()));
_ec.Setup(x => x.ExpressionValues).Returns(new DictionaryContextData());
_ec.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());
_ec.Setup(x => x.FileTable).Returns(new List<String>());
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { _hc.GetTrace().Info($"{tag}{message}"); });
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>())).Callback((Issue issue, string message) => { _hc.GetTrace().Info($"[{issue.Type}]{issue.Message ?? message}"); });
}

View File

@@ -32,6 +32,8 @@ namespace GitHub.Runner.Common.Tests.Worker
private TestHostContext _hc;
private ActionRunner _actionRunner;
private IActionManifestManager _actionManifestManager;
private Mock<IFileCommandManager> _fileCommandManager;
private DictionaryContextData _context = new DictionaryContextData();
[Fact]
@@ -331,6 +333,66 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Verify(x => x.AddIssue(It.Is<Issue>(s => s.Message.Contains("Unexpected input(s) 'invalid1', 'invalid2'")), It.IsAny<string>()), Times.Once);
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async void SetGitHubContextActionRepoRef()
{
//Arrange
Setup();
var actionId = Guid.NewGuid();
var actionInputs = new MappingToken(null, null, null);
actionInputs.Add(new StringToken(null, null, null, "input1"), new StringToken(null, null, null, "test1"));
actionInputs.Add(new StringToken(null, null, null, "input2"), new StringToken(null, null, null, "test2"));
var action = new Pipelines.ActionStep()
{
Name = "action",
Id = actionId,
Reference = new Pipelines.RepositoryPathReference()
{
Name = "actions/test",
Ref = "master"
},
Inputs = actionInputs
};
_actionRunner.Action = action;
Dictionary<string, string> finialInputs = new Dictionary<string, string>();
_handlerFactory.Setup(x => x.Create(It.IsAny<IExecutionContext>(), It.IsAny<ActionStepDefinitionReference>(), It.IsAny<IStepHost>(), It.IsAny<ActionExecutionData>(), It.IsAny<Dictionary<string, string>>(), It.IsAny<Dictionary<string, string>>(), It.IsAny<Variables>(), It.IsAny<string>()))
.Callback((IExecutionContext executionContext, Pipelines.ActionStepDefinitionReference actionReference, IStepHost stepHost, ActionExecutionData data, Dictionary<string, string> inputs, Dictionary<string, string> environment, Variables runtimeVariables, string taskDirectory) =>
{
finialInputs = inputs;
})
.Returns(new Mock<IHandler>().Object);
//Act
await _actionRunner.RunAsync();
//Assert
_ec.Verify(x => x.SetGitHubContext("action_repository", "actions/test"), Times.Once);
_ec.Verify(x => x.SetGitHubContext("action_ref", "master"), Times.Once);
action = new Pipelines.ActionStep()
{
Name = "action",
Id = actionId,
Reference = new Pipelines.ScriptReference(),
Inputs = actionInputs
};
_actionRunner.Action = action;
_hc.EnqueueInstance<IDefaultStepHost>(_defaultStepHost.Object);
_hc.EnqueueInstance(_fileCommandManager.Object);
//Act
await _actionRunner.RunAsync();
//Assert
_ec.Verify(x => x.SetGitHubContext("action_repository", null), Times.Once);
_ec.Verify(x => x.SetGitHubContext("action_ref", null), Times.Once);
}
private void Setup([CallerMemberName] string name = "") private void Setup([CallerMemberName] string name = "")
{ {
_ecTokenSource?.Dispose(); _ecTokenSource?.Dispose();
@@ -362,6 +424,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_handlerFactory = new Mock<IHandlerFactory>();
_defaultStepHost = new Mock<IDefaultStepHost>();
_actionManifestManager = new ActionManifestManager();
_fileCommandManager = new Mock<IFileCommandManager>();
_actionManifestManager.Initialize(_hc);
var githubContext = new GitHubContext();
@@ -375,15 +438,16 @@ namespace GitHub.Runner.Common.Tests.Worker
#endif
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Global).Returns(new GlobalContext());
_ec.Setup(x => x.ExpressionValues).Returns(_context);
_ec.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());
_ec.Setup(x => x.IntraActionState).Returns(new Dictionary<string, string>());
_ec.Setup(x => x.EnvironmentVariables).Returns(new Dictionary<string, string>());
_ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
_ec.Setup(x => x.FileTable).Returns(new List<String>());
_ec.Object.Global.FileTable = new List<String>();
_ec.Setup(x => x.SetGitHubContext(It.IsAny<string>(), It.IsAny<string>()));
_ec.Setup(x => x.GetGitHubContext(It.IsAny<string>())).Returns("{\"foo\":\"bar\"}");
_ec.Setup(x => x.CancellationToken).Returns(_ecTokenSource.Token);
_ec.Setup(x => x.Variables).Returns(new Variables(_hc, new Dictionary<string, VariableValue>()));
_ec.Object.Global.Variables = new Variables(_hc, new Dictionary<string, VariableValue>());
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { _hc.GetTrace().Info($"[{tag}]{message}"); });
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>())).Callback((Issue issue, string message) => { _hc.GetTrace().Info($"[{issue.Type}]{issue.Message ?? message}"); });
@@ -393,6 +457,8 @@ namespace GitHub.Runner.Common.Tests.Worker
_hc.EnqueueInstance<IDefaultStepHost>(_defaultStepHost.Object); _hc.EnqueueInstance<IDefaultStepHost>(_defaultStepHost.Object);
_hc.EnqueueInstance(_fileCommandManager.Object);
// Instance to test. // Instance to test.
_actionRunner = new ActionRunner(); _actionRunner = new ActionRunner();
_actionRunner.Initialize(_hc); _actionRunner.Initialize(_hc);
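
The change above follows a pattern repeated throughout these tests: per-job state such as EnvironmentVariables, FileTable and Variables now lives on a single GlobalContext exposed through IExecutionContext.Global, so a test stubs Global once and then assigns real collections to it instead of stubbing each property. A minimal sketch of that pattern, using only the members visible in this diff (the snippet itself is illustrative, not the repository's code):

using System.Collections.Generic;
using Moq;
using GitHub.Runner.Worker;

// Stub Global once with a real GlobalContext...
var ec = new Mock<IExecutionContext>();
ec.Setup(x => x.Global).Returns(new GlobalContext());

// ...then seed shared job state directly on it; the code under test and the test
// now read and write the same dictionaries.
ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
ec.Object.Global.FileTable = new List<string>();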

View File

@@ -116,7 +116,7 @@ namespace GitHub.Runner.Common.Tests.Worker
var pagingLogger = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
- jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>())).Callback((Guid id, string msg) => { hc.GetTrace().Info(msg); });
+ jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(),It.IsAny<long>())).Callback((Guid id, string msg, long? lineNumber) => { hc.GetTrace().Info(msg); });
hc.EnqueueInstance(pagingLogger.Object);
hc.SetSingleton(jobServerQueue.Object);
@@ -137,7 +137,7 @@ namespace GitHub.Runner.Common.Tests.Worker
ec.Complete();
- jobServerQueue.Verify(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>()), Times.Exactly(10));
+ jobServerQueue.Verify(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<long?>()), Times.Exactly(10));
}
}
@@ -171,7 +171,7 @@ namespace GitHub.Runner.Common.Tests.Worker
var pagingLogger5 = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
- jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>())).Callback((Guid id, string msg) => { hc.GetTrace().Info(msg); });
+ jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<long?>())).Callback((Guid id, string msg, long? lineNumber) => { hc.GetTrace().Info(msg); });
var actionRunner1 = new ActionRunner();
actionRunner1.Initialize(hc);
@@ -269,7 +269,7 @@ namespace GitHub.Runner.Common.Tests.Worker
var pagingLogger5 = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
- jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>())).Callback((Guid id, string msg) => { hc.GetTrace().Info(msg); });
+ jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<long?>())).Callback((Guid id, string msg, long? lineNumber) => { hc.GetTrace().Info(msg); });
var actionRunner1 = new ActionRunner();
actionRunner1.Initialize(hc);
@@ -357,20 +357,20 @@ namespace GitHub.Runner.Common.Tests.Worker
// Act.
jobContext.InitializeJob(jobRequest, CancellationToken.None);
- jobContext.StepsContext.SetConclusion(null, "step1", ActionResult.Success);
- var conclusion1 = (jobContext.StepsContext.GetScope(null)["step1"] as DictionaryContextData)["conclusion"].ToString();
+ jobContext.Global.StepsContext.SetConclusion(null, "step1", ActionResult.Success);
+ var conclusion1 = (jobContext.Global.StepsContext.GetScope(null)["step1"] as DictionaryContextData)["conclusion"].ToString();
Assert.Equal(conclusion1, conclusion1.ToLowerInvariant());
- jobContext.StepsContext.SetOutcome(null, "step2", ActionResult.Cancelled);
- var outcome1 = (jobContext.StepsContext.GetScope(null)["step2"] as DictionaryContextData)["outcome"].ToString();
+ jobContext.Global.StepsContext.SetOutcome(null, "step2", ActionResult.Cancelled);
+ var outcome1 = (jobContext.Global.StepsContext.GetScope(null)["step2"] as DictionaryContextData)["outcome"].ToString();
Assert.Equal(outcome1, outcome1.ToLowerInvariant());
- jobContext.StepsContext.SetConclusion(null, "step3", ActionResult.Failure);
- var conclusion2 = (jobContext.StepsContext.GetScope(null)["step3"] as DictionaryContextData)["conclusion"].ToString();
+ jobContext.Global.StepsContext.SetConclusion(null, "step3", ActionResult.Failure);
+ var conclusion2 = (jobContext.Global.StepsContext.GetScope(null)["step3"] as DictionaryContextData)["conclusion"].ToString();
Assert.Equal(conclusion2, conclusion2.ToLowerInvariant());
- jobContext.StepsContext.SetOutcome(null, "step4", ActionResult.Skipped);
- var outcome2 = (jobContext.StepsContext.GetScope(null)["step4"] as DictionaryContextData)["outcome"].ToString();
+ jobContext.Global.StepsContext.SetOutcome(null, "step4", ActionResult.Skipped);
+ var outcome2 = (jobContext.Global.StepsContext.GetScope(null)["step4"] as DictionaryContextData)["outcome"].ToString();
Assert.Equal(outcome2, outcome2.ToLowerInvariant());
jobContext.JobContext.Status = ActionResult.Success;

View File

@@ -8,6 +8,7 @@ using System.Runtime.CompilerServices;
using System.Threading.Tasks;
using Xunit;
using System.Threading;
+ using GitHub.DistributedTask.ObjectTemplating.Tokens;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Common.Tests.Worker
@@ -281,5 +282,70 @@ namespace GitHub.Runner.Common.Tests.Worker
Times.Never);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void EnsureFinalizeJobRunsIfMessageHasNoEnvironmentUrl()
{
using (TestHostContext hc = CreateTestContext())
{
var jobExtension = new JobExtension();
jobExtension.Initialize(hc);
_message.ActionsEnvironment = new ActionsEnvironmentReference("production");
_jobEc = new Runner.Worker.ExecutionContext {Result = TaskResult.Succeeded};
_jobEc.Initialize(hc);
_jobEc.InitializeJob(_message, _tokenSource.Token);
jobExtension.FinalizeJob(_jobEc, _message, DateTime.UtcNow);
Assert.Equal(TaskResult.Succeeded, _jobEc.Result);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void EnsureFinalizeJobHandlesNullEnvironmentUrl()
{
using (TestHostContext hc = CreateTestContext())
{
var jobExtension = new JobExtension();
jobExtension.Initialize(hc);
_message.ActionsEnvironment = new ActionsEnvironmentReference("production")
{
Url = null
};
_jobEc = new Runner.Worker.ExecutionContext {Result = TaskResult.Succeeded};
_jobEc.Initialize(hc);
_jobEc.InitializeJob(_message, _tokenSource.Token);
jobExtension.FinalizeJob(_jobEc, _message, DateTime.UtcNow);
Assert.Equal(TaskResult.Succeeded, _jobEc.Result);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void EnsureFinalizeJobHandlesNullEnvironment()
{
using (TestHostContext hc = CreateTestContext())
{
var jobExtension = new JobExtension();
jobExtension.Initialize(hc);
_message.ActionsEnvironment = null;
_jobEc = new Runner.Worker.ExecutionContext {Result = TaskResult.Succeeded};
_jobEc.Initialize(hc);
_jobEc.InitializeJob(_message, _tokenSource.Token);
jobExtension.FinalizeJob(_jobEc, _message, DateTime.UtcNow);
Assert.Equal(TaskResult.Succeeded, _jobEc.Result);
}
}
}
}
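
The three tests above pin down how job finalization must treat the deployment environment attached to the job message: it may carry a URL, carry a null URL, or be absent entirely, and FinalizeJob has to succeed in all three cases. A hypothetical sketch of the null-safe access this contract implies (the types and names below are stand-ins; only the nullability behavior comes from the tests, not the runner's actual JobExtension code):

using System;

Console.WriteLine(TryGetEnvironmentUrl(null) ?? "(no environment)");
Console.WriteLine(TryGetEnvironmentUrl(new EnvironmentReferenceSketch { Name = "production" }) ?? "(no url)");
Console.WriteLine(TryGetEnvironmentUrl(new EnvironmentReferenceSketch { Name = "production", Url = "https://example.test" }));

// Either the whole environment reference or just its Url may be missing; both must be tolerated.
static string TryGetEnvironmentUrl(EnvironmentReferenceSketch environment)
{
    return environment?.Url;
}

// Stand-in for the ActionsEnvironmentReference the tests above construct.
class EnvironmentReferenceSketch
{
    public string Name { get; set; }
    public string Url { get; set; }
}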

View File

@@ -955,12 +955,13 @@ namespace GitHub.Runner.Common.Tests.Worker
_variables = new Variables(hostContext, new Dictionary<string, DTWebApi.VariableValue>());
_executionContext = new Mock<IExecutionContext>();
- _executionContext.Setup(x => x.WriteDebug)
- .Returns(true);
- _executionContext.Setup(x => x.Variables)
- .Returns(_variables);
- _executionContext.Setup(x => x.Container)
- .Returns(jobContainer);
+ _executionContext.Setup(x => x.Global)
+ .Returns(new GlobalContext
+ {
+ Container = jobContainer,
+ Variables = _variables,
+ WriteDebug = true,
+ });
_executionContext.Setup(x => x.GetMatchers())
.Returns(matchers?.Matchers ?? new List<IssueMatcherConfig>());
_executionContext.Setup(x => x.Add(It.IsAny<OnMatcherChanged>()))

View File

@@ -203,6 +203,7 @@ namespace GitHub.Runner.Common.Tests.Worker
// Setup the execution context.
_ec = new Mock<IExecutionContext>();
+ _ec.Setup(x => x.Global).Returns(new GlobalContext());
GitHubContext githubContext = new GitHubContext();
_ec.Setup(x => x.GetGitHubContext("repository")).Returns("actions/runner");

View File

@@ -0,0 +1,390 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Runtime.CompilerServices;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Container;
using GitHub.Runner.Worker.Handlers;
using Moq;
using Xunit;
using DTWebApi = GitHub.DistributedTask.WebApi;
namespace GitHub.Runner.Common.Tests.Worker
{
public sealed class SetEnvFileCommandL0
{
private Mock<IExecutionContext> _executionContext;
private List<Tuple<DTWebApi.Issue, string>> _issues;
private string _rootDirectory;
private SetEnvFileCommand _setEnvFileCommand;
private ITraceWriter _trace;
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_DirectoryNotFound()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "directory-not-found", "env");
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(0, _executionContext.Object.Global.EnvironmentVariables.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_NotFound()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "file-not-found");
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(0, _executionContext.Object.Global.EnvironmentVariables.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_EmptyFile()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "empty-file");
var content = new List<string>();
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(0, _executionContext.Object.Global.EnvironmentVariables.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
"MY_ENV=MY VALUE",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal("MY VALUE", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple_SkipEmptyLines()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
string.Empty,
"MY_ENV=my value",
string.Empty,
"MY_ENV_2=my second value",
string.Empty,
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(2, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal("my value", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal("my second value", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple_EmptyValue()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple-empty-value");
var content = new List<string>
{
"MY_ENV=",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal(string.Empty, _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple_MultipleValues()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
"MY_ENV=my value",
"MY_ENV_2=",
"MY_ENV_3=my third value",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(3, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal("my value", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal(string.Empty, _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
Assert.Equal("my third value", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_3"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple_SpecialCharacters()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
"MY_ENV==abc",
"MY_ENV_2=def=ghi",
"MY_ENV_3=jkl=",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(3, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal("=abc", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal("def=ghi", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
Assert.Equal("jkl=", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_3"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<EOF",
"line one",
"line two",
"line three",
"EOF",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal($"line one{Environment.NewLine}line two{Environment.NewLine}line three", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_EmptyValue()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<EOF",
"EOF",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal(string.Empty, _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_SkipEmptyLines()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
string.Empty,
"MY_ENV<<EOF",
"hello",
"world",
"EOF",
string.Empty,
"MY_ENV_2<<EOF",
"HELLO",
"AGAIN",
"EOF",
string.Empty,
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(2, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal($"hello{Environment.NewLine}world", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal($"HELLO{Environment.NewLine}AGAIN", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_SpecialCharacters()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<=EOF",
"hello",
"one",
"=EOF",
"MY_ENV_2<<<EOF",
"hello",
"two",
"<EOF",
"MY_ENV_3<<EOF",
"hello",
string.Empty,
"three",
string.Empty,
"EOF",
"MY_ENV_4<<EOF",
"hello=four",
"EOF",
"MY_ENV_5<<EOF",
" EOF",
"EOF",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(5, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal($"hello{Environment.NewLine}one", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal($"hello{Environment.NewLine}two", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
Assert.Equal($"hello{Environment.NewLine}{Environment.NewLine}three{Environment.NewLine}", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_3"]);
Assert.Equal($"hello=four", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_4"]);
Assert.Equal($" EOF", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_5"]);
}
}
#if OS_WINDOWS
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_PreservesNewline()
{
using (var hostContext = Setup())
{
var newline = "\n";
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<EOF",
"hello",
"world",
"EOF",
};
WriteContent(envFile, content, newline: newline);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal($"hello{newline}world", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
#endif
private void WriteContent(
string path,
List<string> content,
string newline = null)
{
if (string.IsNullOrEmpty(newline))
{
newline = Environment.NewLine;
}
var encoding = new UTF8Encoding(true); // Emit BOM
var contentStr = string.Join(newline, content);
File.WriteAllText(path, contentStr, encoding);
}
private TestHostContext Setup([CallerMemberName] string name = "")
{
_issues = new List<Tuple<DTWebApi.Issue, string>>();
var hostContext = new TestHostContext(this, name);
// Trace
_trace = hostContext.GetTrace();
// Directory for test data
var workDirectory = hostContext.GetDirectory(WellKnownDirectory.Work);
ArgUtil.NotNullOrEmpty(workDirectory, nameof(workDirectory));
Directory.CreateDirectory(workDirectory);
_rootDirectory = Path.Combine(workDirectory, nameof(SetEnvFileCommandL0));
Directory.CreateDirectory(_rootDirectory);
// Execution context
_executionContext = new Mock<IExecutionContext>();
_executionContext.Setup(x => x.Global)
.Returns(new GlobalContext
{
EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer),
WriteDebug = true,
});
_executionContext.Setup(x => x.AddIssue(It.IsAny<DTWebApi.Issue>(), It.IsAny<string>()))
.Callback((DTWebApi.Issue issue, string logMessage) =>
{
_issues.Add(new Tuple<DTWebApi.Issue, string>(issue, logMessage));
var message = !string.IsNullOrEmpty(logMessage) ? logMessage : issue.Message;
_trace.Info($"Issue '{issue.Type}': {message}");
});
_executionContext.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Callback((string tag, string message) =>
{
_trace.Info($"{tag}{message}");
});
// SetEnvFileCommand
_setEnvFileCommand = new SetEnvFileCommand();
_setEnvFileCommand.Initialize(hostContext);
return hostContext;
}
}
}
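
The new SetEnvFileCommandL0 tests above effectively document the file format the env file command accepts: one NAME=VALUE entry per line, blank lines between entries skipped, empty values allowed, and a heredoc form NAME<<DELIMITER whose body runs until a line equal to the delimiter and is joined back together with newlines. Below is a minimal sketch of those parsing semantics only; it is not the runner's SetEnvFileCommand (for instance it always joins heredoc bodies with Environment.NewLine, whereas the Windows-only test above shows the real command preserving the file's own newline style):

using System;
using System.Collections.Generic;

var parsed = ParseEnvLines(new[]
{
    "MY_ENV=my value",
    "",
    "MY_DOC<<EOF",
    "line one",
    "line two",
    "EOF",
});
Console.WriteLine(parsed["MY_ENV"]); // "my value"
Console.WriteLine(parsed["MY_DOC"]); // "line one" + newline + "line two"

static Dictionary<string, string> ParseEnvLines(IReadOnlyList<string> lines)
{
    var result = new Dictionary<string, string>();
    for (int i = 0; i < lines.Count; i++)
    {
        var line = lines[i];
        if (string.IsNullOrEmpty(line))
        {
            continue; // blank lines between entries are ignored
        }

        var heredocIndex = line.IndexOf("<<", StringComparison.Ordinal);
        var equalsIndex = line.IndexOf('=');
        if (heredocIndex > 0 && (equalsIndex < 0 || heredocIndex < equalsIndex))
        {
            // Heredoc form: NAME<<DELIMITER, body lines, then a line equal to DELIMITER.
            var name = line.Substring(0, heredocIndex);
            var delimiter = line.Substring(heredocIndex + 2);
            var body = new List<string>();
            while (++i < lines.Count && lines[i] != delimiter)
            {
                body.Add(lines[i]); // blank lines inside the body are preserved
            }
            result[name] = string.Join(Environment.NewLine, body);
        }
        else if (equalsIndex > 0)
        {
            // Simple form: NAME=VALUE; the value may be empty or contain further '=' characters.
            result[line.Substring(0, equalsIndex)] = line.Substring(equalsIndex + 1);
        }
    }

    return result;
}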

View File

@@ -38,11 +38,13 @@ namespace GitHub.Runner.Common.Tests.Worker
};
_ec = new Mock<IExecutionContext>();
_ec.SetupAllProperties();
- _ec.Setup(x => x.Variables).Returns(_variables);
+ _ec.Setup(x => x.Global).Returns(new GlobalContext { WriteDebug = true });
+ _ec.Object.Global.Variables = _variables;
+ _ec.Object.Global.EnvironmentVariables = _env;
_contexts = new DictionaryContextData();
_jobContext = new JobContext();
- _contexts["github"] = new DictionaryContextData();
+ _contexts["github"] = new GitHubContext();
_contexts["runner"] = new DictionaryContextData();
_contexts["job"] = _jobContext;
_ec.Setup(x => x.ExpressionValues).Returns(_contexts);
@@ -50,7 +52,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Setup(x => x.JobContext).Returns(_jobContext);
_stepContext = new StepsContext();
- _ec.Setup(x => x.StepsContext).Returns(_stepContext);
+ _ec.Object.Global.StepsContext = _stepContext;
_ec.Setup(x => x.PostJobSteps).Returns(new Stack<IStep>());
@@ -80,7 +82,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(variableSet.Select(x => x.Object).ToList()));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(variableSet.Select(x => x.Object).ToList()));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -115,7 +117,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(variableSet.Select(x => x.Object).ToList()));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(variableSet.Select(x => x.Object).ToList()));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -154,7 +156,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(variableSet.Steps.Select(x => x.Object).ToList()));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(variableSet.Steps.Select(x => x.Object).ToList()));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -208,7 +210,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(variableSet.Steps.Select(x => x.Object).ToList()));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(variableSet.Steps.Select(x => x.Object).ToList()));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -287,7 +289,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(variableSet.Steps.Select(x => x.Object).ToList()));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(variableSet.Steps.Select(x => x.Object).ToList()));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -330,7 +332,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(variableSet.Step.Select(x => x.Object).ToList()));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(variableSet.Step.Select(x => x.Object).ToList()));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -361,7 +363,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(variableSet.Select(x => x.Object).ToList()));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(variableSet.Select(x => x.Object).ToList()));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -391,7 +393,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(variableSet.Select(x => x.Object).ToList()));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(variableSet.Select(x => x.Object).ToList()));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -417,7 +419,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(new[] { step1.Object }));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -455,7 +457,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(new[] { step1.Object, step2.Object }));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -493,7 +495,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(new[] { step1.Object, step2.Object }));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -524,7 +526,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(new[] { step1.Object, step2.Object, step3.Object }));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object, step3.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -560,7 +562,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Object.Result = null;
- _ec.Setup(x => x.JobSteps).Returns(new List<IStep>(new[] { step1.Object, step2.Object, step3.Object }));
+ _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object, step3.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
@@ -599,13 +601,15 @@ namespace GitHub.Runner.Common.Tests.Worker
// Setup the step execution context.
var stepContext = new Mock<IExecutionContext>();
stepContext.SetupAllProperties();
- stepContext.Setup(x => x.WriteDebug).Returns(true);
- stepContext.Setup(x => x.Variables).Returns(_variables);
- stepContext.Setup(x => x.EnvironmentVariables).Returns(_env);
- stepContext.Setup(x => x.ExpressionValues).Returns(new DictionaryContextData());
+ stepContext.Setup(x => x.Global).Returns(() => _ec.Object.Global);
+ var expressionValues = new DictionaryContextData();
+ foreach (var pair in _ec.Object.ExpressionValues)
+ {
+ expressionValues[pair.Key] = pair.Value;
+ }
+ stepContext.Setup(x => x.ExpressionValues).Returns(expressionValues);
stepContext.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());
stepContext.Setup(x => x.JobContext).Returns(_jobContext);
- stepContext.Setup(x => x.StepsContext).Returns(_stepContext);
stepContext.Setup(x => x.ContextName).Returns(step.Object.Action.ContextName);
stepContext.Setup(x => x.Complete(It.IsAny<TaskResult?>(), It.IsAny<string>(), It.IsAny<string>()))
.Callback((TaskResult? r, string currentOperation, string resultCode) =>
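
One recurring change in the StepsRunner tests above is that the mocked JobSteps collection is now seeded as a Queue<IStep> rather than a List<IStep>. This suggests the runner drains steps from the front of a queue, which also allows further steps to be enqueued while the job is already running; the following self-contained toy loop illustrates that consumption pattern (an assumption about the motivation for the data-structure choice, not the runner's StepsRunner code):

using System;
using System.Collections.Generic;

// Toy "steps" queue: items can be appended while the loop is already draining it,
// which a plain List being enumerated would not allow without extra bookkeeping.
var steps = new Queue<string>(new[] { "step1", "step2" });
while (steps.Count > 0)
{
    var current = steps.Dequeue();
    Console.WriteLine($"running {current}");

    if (current == "step1")
    {
        steps.Enqueue("extra-step-added-mid-run");
    }
}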

View File

@@ -17,7 +17,7 @@ LAYOUT_DIR="$SCRIPT_DIR/../_layout"
DOWNLOAD_DIR="$SCRIPT_DIR/../_downloads/netcore2x"
PACKAGE_DIR="$SCRIPT_DIR/../_package"
DOTNETSDK_ROOT="$SCRIPT_DIR/../_dotnetsdk"
- DOTNETSDK_VERSION="3.1.100"
+ DOTNETSDK_VERSION="3.1.302"
DOTNETSDK_INSTALLDIR="$DOTNETSDK_ROOT/$DOTNETSDK_VERSION"
RUNNER_VERSION=$(cat runnerversion)

View File

@@ -1,5 +1,5 @@
{
"sdk": {
- "version": "3.1.100"
+ "version": "3.1.302"
}
}

Some files were not shown because too many files have changed in this diff.