Mirror of https://github.com/actions/runner.git (synced 2025-12-10 12:36:23 +00:00)

Comparing commits: users/eric ... v2.165.1 (24 commits)

Commit SHAs:
a0a590fb48, 87a232c477, a3c2479a29, c45aebc9ab, b676ab3d33, 0a6bac355d,
eb78d19b17, 17970ad1f9, 2e0e8eb822, 2a506cc556, 43dd34820b, 746c9d9ec0,
fa2ecfcc4c, c59c0e2ded, 7a382facb3, e9ae42693f, 9cafe8c028, 1484c3fb03,
53d632706d, d6179242ca, 0da38a6924, b19e5d7924, 80ac4a8964, 02639a2092
.github/workflows/build.yml (vendored, 8 additions)

@@ -43,6 +43,14 @@ jobs:
     steps:
     - uses: actions/checkout@v1

+    # Set Path workaround for https://github.com/actions/virtual-environments/issues/263
+    - run: |
+        echo "::add-path::C:\Program Files\Git\mingw64\bin"
+        echo "::add-path::C:\Program Files\Git\usr\bin"
+        echo "::add-path::C:\Program Files\Git\bin"
+      if: matrix.os == 'windows-latest'
+      name: "Temp step to Set Path for Windows"
+
     # Build runner layout
     - name: Build & Layout Release
       run: |
.gitignore (vendored, 1 addition)

@@ -2,6 +2,7 @@
 **/bin
 **/obj
 **/libs
+**/lib

 # editors
 **/*.xproj
@@ -1,4 +1,5 @@
 # ADR 263: Self Hosted Runner Proxies

 **Date**: 2019-11-13

 **Status**: Accepted
@@ -9,8 +10,7 @@
 - While there is not a standard convention, many applications support setting proxies via the environment variables `http_proxy`, `https_proxy`, and `no_proxy`, such as curl, wget, perl, python, docker, git, R, etc.
 - Some of these applications use `HTTPS_PROXY` versus `https_proxy`, but most understand or primarily support the lowercase variant

-## Decisions
+## Decision

 We will update the Runner to use the conventional environment variables for proxies: `http_proxy`, `https_proxy` and `no_proxy` if they are set.
 These are described in detail below:
docs/adrs/0276-problem-matchers.md (new file, 263 lines)

# ADR 0276: Problem Matchers

**Date** 2019-06-05

**Status** Accepted

## Context

Compilation failures during a CI build should surface good error messages.

For example, the actual compile errors from the TypeScript compiler should bubble up as issues in the UI, not simply "tsc exited with exit code 1".

VSCode has an extensible model for solving this type of problem. VSCode allows users to configure which problem matchers to apply when scanning output. For example, a user can apply the `tsc` problem matcher to get a rich error output experience in VSCode when compiling their TypeScript project.

The problem-matcher concept fits well with "setup" actions. For example, the `setup-nodejs` action will download Node.js, add it to the PATH, and register the `tsc` problem matcher. For the duration of the job, the `tsc` problem matcher will be applied against the output.

## Decision

### Registration

#### Using `##` command

`##[add-matcher]path-to-problem-matcher-config.json`

Using a `##` command allows for flexibility:
- Ad hoc scripts can register problem matchers
- Problem matchers can be conditionally registered

Note: if a matcher with the same name is registered a second time, it will clobber the first instance.

#### Unregister using `##` command

A way out for rare cases where scoping is a problem.

`##[remove-matcher]owner`

For this to be usable, the `owner` needs to be discoverable. Therefore, debug-print the owner on registration.

### Single-line matcher

Consider the output:

```
[...]

Build FAILED.

"C:\temp\problemmatcher\myproject\ConsoleApp1\ConsoleApp1.sln" (default target) (1) ->
"C:\temp\problemmatcher\myproject\ConsoleApp1\ConsoleApp1\ConsoleApp1.csproj" (default target) (2) ->
"C:\temp\problemmatcher\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj" (default target) (3) ->
(CoreCompile target) ->
  Class1.cs(16,24): warning CS0612: 'ClassLibrary1.Helpers.MyHelper.Name' is obsolete [C:\temp\problemmatcher\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj]


"C:\temp\problemmatcher\myproject\ConsoleApp1\ConsoleApp1.sln" (default target) (1) ->
"C:\temp\problemmatcher\myproject\ConsoleApp1\ConsoleApp1\ConsoleApp1.csproj" (default target) (2) ->
"C:\temp\problemmatcher\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj" (default target) (3) ->
(CoreCompile target) ->
  Helpers\MyHelper.cs(16,30): error CS1002: ; expected [C:\temp\problemmatcher\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj]

    1 Warning(s)
    1 Error(s)
```

The match configuration below uses a regular expression to discover problem lines, and the match groups are mapped into issue properties.

```json
"owner": "msbuild",
"pattern": [
  {
    "regexp": "^\\s*([^:]+)\\((\\d+),(\\d+)\\): (error|warning) ([^:]+): (.*) \\[(.+)\\]$",
    "file": 1,
    "line": 2,
    "column": 3,
    "severity": 4,
    "code": 5,
    "message": 6,
    "fromPath": 7
  }
]
```

The above output and match configuration produce the following matches:

```
line: Class1.cs(16,24): warning CS0612: 'ClassLibrary1.Helpers.MyHelper.Name' is obsolete [C:\myrepo\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj]
file: Class1.cs
line: 16
column: 24
severity: warning
code: CS0612
message: 'ClassLibrary1.Helpers.MyHelper.Name' is obsolete
fromPath: C:\myrepo\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj
```

```
line: Helpers\MyHelper.cs(16,30): error CS1002: ; expected [C:\myrepo\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj]
file: Helpers\MyHelper.cs
line: 16
column: 30
severity: error
code: CS1002
message: ; expected
fromPath: C:\myrepo\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj
```

Additionally, the line will appear red in the web UI (prefixed with `##[error]`).

Note: an error does not imply task failure. Exit codes communicate failure.

Note: strip color codes when evaluating regular expressions.
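For illustration only, a minimal TypeScript sketch of how a single-line matcher could be applied, using the msbuild regular expression and property indices from the configuration above (the runner's real implementation is in its C# worker; the interface and function names here are hypothetical):

```typescript
// Sketch: apply a single-line problem matcher and map capture groups to issue properties.
interface Issue {
  file?: string
  line?: number
  column?: number
  severity?: string
  code?: string
  message?: string
  fromPath?: string
}

// Regex copied from the msbuild matcher configuration above.
const msbuildRegexp = /^\s*([^:]+)\((\d+),(\d+)\): (error|warning) ([^:]+): (.*) \[(.+)\]$/

function matchLine(outputLine: string): Issue | undefined {
  const groups = msbuildRegexp.exec(outputLine)
  if (!groups) return undefined
  // The numbers in the matcher config ("file": 1, "line": 2, ...) are capture-group indices.
  return {
    file: groups[1],
    line: Number(groups[2]),
    column: Number(groups[3]),
    severity: groups[4],
    code: groups[5],
    message: groups[6],
    fromPath: groups[7]
  }
}

// Example: produces { file: 'Class1.cs', line: 16, column: 24, severity: 'warning', ... }
console.log(matchLine(
  "Class1.cs(16,24): warning CS0612: 'X' is obsolete [C:\\repo\\ClassLibrary1.csproj]"
))
```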
### Multi-line matcher

Consider the output below from ESLint in stylish mode. The file name is printed once, yet multiple error lines are printed.

```
test.js
  1:0   error  Missing "use strict" statement                 strict
  5:10  error  'addOne' is defined but never used             no-unused-vars

✖ 2 problems (2 errors, 0 warnings)
```

The match configuration below uses multiple regular expressions, one per line.

The last pattern of a multi-line matcher can specify the `loop` property, which allows multiple errors to be discovered.

```json
"owner": "eslint-stylish",
"pattern": [
  {
    "regexp": "^([^\\s].*)$",
    "file": 1
  },
  {
    "regexp": "^\\s+(\\d+):(\\d+)\\s+(error|warning|info)\\s+(.*)\\s\\s+(.*)$",
    "line": 1,
    "column": 2,
    "severity": 3,
    "message": 4,
    "code": 5,
    "loop": true
  }
]
```

The above output and match configuration produce two matches:

```
line: 1:0  error  Missing "use strict" statement  strict
file: test.js
line: 1
column: 0
severity: error
message: Missing "use strict" statement
code: strict
```

```
line: 5:10  error  'addOne' is defined but never used  no-unused-vars
file: test.js
line: 5
column: 10
severity: error
message: 'addOne' is defined but never used
code: no-unused-vars
```

Note: in the above example, only the error lines will appear red in the web UI. The "file" line will not appear red.
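A minimal sketch of how a multi-line matcher with `loop` could process that output, assuming the two eslint-stylish patterns above (illustrative only; not the runner's actual C# implementation):

```typescript
// Sketch: first pattern captures the file line, the looping last pattern
// keeps producing issues until a line no longer matches.
const filePattern = /^([^\s].*)$/
const errorPattern = /^\s+(\d+):(\d+)\s+(error|warning|info)\s+(.*)\s\s+(.*)$/

function matchEslintStylish(lines: string[]): object[] {
  const issues: object[] = []
  let currentFile: string | undefined
  for (const line of lines) {
    const err = currentFile ? errorPattern.exec(line) : null
    if (err) {
      // "loop": true -> keep matching error lines against the remembered file.
      issues.push({
        file: currentFile,
        line: Number(err[1]),
        column: Number(err[2]),
        severity: err[3],
        message: err[4].trimEnd(), // drop column-alignment padding
        code: err[5]
      })
      continue
    }
    const file = filePattern.exec(line)
    currentFile = file ? file[1] : undefined
  }
  return issues
}

// For the sample output above this produces the two issues shown in the matches.
```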
### Other details

#### Configuration `owner`

The `owner` can be used to stomp over (re-register) or remove a matcher.

#### Rooting the file

The goal of the file information is to provide a hyperlink in the UI.

Solving this problem means:
- Rooting the file when unrooted:
  - Use the `fromPath` if specified (assume file path)
  - Use the `github.workspace` (where the repo is cloned on disk)
- Match against a repository to determine the relative path within the repo

This is a place where we diverge from VSCode. VSCode task configurations are specific to the local workspace (the workspace root is known or can be specified). We're solving a more generic problem, so we need more information - specifically the `fromPath` property - in order to accurately root the path.

In order to avoid creating inaccurate hyperlinks on the error issues, the agent will verify the file exists and is in the main repository. Otherwise, omit the file property from the error issue and debug-trace what happened.

#### Supported severity levels

Ordinal ignore case:

- `warning`
- `error`

Coalesce empty with "error". For any other values, omit logging an issue and debug-trace what happened.

#### Default severity level

Problem matchers are unable to interpret severity strings other than `warning` and `error`. The `severity` match group expects `warning` or `error` (case insensitive).

However, some tools indicate error/warning in different ways. For example, `flake8` uses codes like `E100`, `W200`, and `F300` (error, warning, and fatal, respectively).

Therefore, allow a property `severity`, sibling to `owner`, which identifies the default severity for the problem matcher. This allows two problem matchers to be registered - one for warnings and one for errors.

For example, given the following `flake8` output:

```
./bootcamp/settings.py:156:80: E501 line too long (94 > 79 characters)
./bootcamp/settings.py:165:5: F403 'from local_settings import *' used; unable to detect undefined names
```

Two problem matchers can be used:

```json
{
  "problemMatcher": [
    {
      "owner": "flake8",
      "pattern": [
        {
          "regexp": "^(.+):(\\d+):(\\d+): ([EF]\\d+) (.+)$",
          "file": 1,
          "line": 2,
          "column": 3,
          "code": 4,
          "message": 5
        }
      ]
    },
    {
      "owner": "flake8-warnings",
      "severity": "warning",
      "pattern": [
        {
          "regexp": "^(.+):(\\d+):(\\d+): (W\\d+) (.+)$",
          "file": 1,
          "line": 2,
          "column": 3,
          "code": 4,
          "message": 5
        }
      ]
    }
  ]
}
```

#### Mitigate regular expression denial of service (ReDoS)

If a matcher exceeds a 1-second timeout when processing a line, retry; allow up to three attempts total.
After three unsuccessful attempts, warn and eject the matcher. The matcher will not run again for the duration of the job.

### Where we diverge from VSCode

- We added the `fromPath` concept for rooting paths. This is done differently in VSCode, since a task is the scope (the root path is well known). For us, the job is the scope.
- VSCode allows additional activation info for background tasks that are always running (recompile on files changed). It allows regular expressions to define when the matcher scope begins and ends. This is an interesting concept that we could leverage to help solve our scoping problem.

## Consequences

- Setup actions should register problem matchers
docs/adrs/0277-run-action-shell-options.md (new file, 93 lines)

# ADR 0277: Run action shell option

**Date** 2019-07-09

**Status** Accepted

## Context

Run actions execute scripts using a platform-specific shell: `bash -eo pipefail` on non-Windows, and `cmd.exe /c /d /s` on Windows.

The `shell` option overrides this to allow different flags or completely different shells/interpreters.

A small example:
```yml
jobs:
  bash-job:
    actions:
    - run: echo "Hello"
      shell: bash
  python-job:
    actions:
    - run: print("Hello")
      shell: python {0}
```

## Decision

___

### Shell option

The keyword being used is `shell`.

`shell` can be either:

1. Builtins / explicitly supported keywords. It is useful to support at least `cmd` and `powershell` on Windows, because `cmd my_cmd_script` and `powershell my_ps1_script` are not valid the same way many Linux/cross-platform interpreters are, e.g. `bash myscript` or `python myscript`. Those tools (and potentially others) also require the correct file extension to run, or must be run in a particular way to get the exit codes consistently, so we must have first-class knowledge about them. We provide default templates for these keywords as follows:
   - `cmd`: Default is `%ComSpec% /D /E:ON /V:OFF /S /C "CALL "{0}""` where the script name is automatically appended with `.cmd` and substituted for `{0}`
     - Note this is equivalent to the default Windows behavior if no shell option is given
   - `pwsh`: Default is `pwsh -command "& '{0}'"` where the script is automatically appended with `.ps1`
   - `powershell`: Default is `powershell -command "& '{0}'"` where the script is automatically appended with `.ps1`
   - `bash`: Uses `bash --noprofile --norc -eo pipefail {0}`
     - The default behavior on non-Windows if no shell is given is to attempt this first
   - `sh`: Uses `sh -e {0}`
     - This is the default behavior on non-Windows if no shell is given AND `bash` (see above) was not located on the PATH
   - `python`: `python {0}`
   - **NOTE**: The exact command run may vary by machine. We only provide default arguments and a command format for the listed shells. While the above behavior is expected on hosted machines, private runners may vary. For example, `sh` (or other commands) may actually be a link to `/bin/dash`, `/bin/bash`, or other

2. A template string: `command [...options] {0} [...more_options]`
   - As above, the file name of the temporary script will be templated in. This gives users more control to place options at any location relative to the script path
   - The first whitespace-delimited word of the string will be interpreted as the command
   - e.g. `python {0} arg1 arg2` or similar can be used if passing args is needed. Some shells will require other options after the filename for various reasons

Note that (1) simply provides defaults that are executed with the same mechanism as (2). That is (see the sketch after this list):
- A temporary script file is generated, and the path to that file is templated into the string at `{0}`
- The first word of the formatted string is assumed to be a command, and we attempt to locate its full path
- The fully qualified path to the command, plus the remaining arguments, is executed
  - e.g. `shell: bash` expands to `/bin/bash --noprofile --norc -eo pipefail /runner/_layout/_work/_temp/f8d4fb2b-19d9-47e6-a786-4cc538d52761.sh` on my private runner
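A minimal TypeScript sketch of that expansion mechanism, using the default templates listed above (the function name and the shell table are hypothetical; the real logic lives in the runner's C# script handlers):

```typescript
// Sketch: expand a `shell` value plus a temp script path into a command line.
const wellKnownShells: Record<string, {template: string; extension: string}> = {
  bash: {template: 'bash --noprofile --norc -eo pipefail {0}', extension: '.sh'},
  sh: {template: 'sh -e {0}', extension: '.sh'},
  pwsh: {template: `pwsh -command "& '{0}'"`, extension: '.ps1'},
  python: {template: 'python {0}', extension: '.py'}
}

function expandShell(shell: string, scriptPath: string): {command: string; args: string[]} {
  // Builtin keyword -> use the default template; otherwise treat the value itself as the template.
  const builtin = wellKnownShells[shell]
  const template = builtin ? builtin.template : shell
  // The temporary script path is substituted for {0}.
  const formatted = template.replace('{0}', scriptPath)
  // The first whitespace-delimited word is the command; the rest are its arguments.
  // (Naive split for illustration; quoted arguments are not handled in this sketch.)
  const [command, ...args] = formatted.split(/\s+/)
  return {command, args}
}

// Example: expandShell('bash', '/runner/_work/_temp/abc.sh')
// -> { command: 'bash', args: ['--noprofile', '--norc', '-eo', 'pipefail', '/runner/_work/_temp/abc.sh'] }
```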
At this time, **THE LIST OF WELL-KNOWN SHELL OPTIONS IS**:
- cmd - Windows (hosted vs2017, vs2019) only
- powershell - Windows (hosted vs2017, vs2019) only
- sh - All hosted platforms
- pwsh - All hosted platforms
- bash - All hosted platforms
- python - All hosted platforms. Can use setup-python to configure which python will be used

___

### Containers

For container jobs, `shell` should just work the same as above, transparently. We will simply `exec` the command in the job container, passing the same arguments in.

___

### Exit codes / Error action preference

For builtin shells, we provide defaults that make the most sense for CI, running within Actions, and being executed by our runner.

bash/sh:
- Fail-fast behavior using `set -eo pipefail` is the default for the `bash` and `sh` builtins, and by default when no option is given on non-Windows platforms
- Users can opt out of fail-fast and take full control easily by providing a template string for the shell option, e.g. `bash {0}`
- sh-like shells exit with the exit code of the last command executed in a script, and this is our default behavior. The runner therefore reports the step status as fail/succeed based on this exit code

powershell/pwsh (see the wrapping sketch after this list):
- Fail-fast behavior when possible. For the `pwsh` and `powershell` builtins, we will prepend `$ErrorActionPreference = 'stop'` to the script contents
- We append `if ((Test-Path -LiteralPath variable:\LASTEXITCODE)) { exit $LASTEXITCODE }` to PowerShell scripts so that action statuses reflect the script's last exit code
- Users can always opt out by not using the builtins and providing a shell option like `pwsh -File {0}` or `powershell -Command "& '{0}'"`, depending on need
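A minimal sketch of that wrapping, assuming exactly the prepended and appended lines quoted above (the function name is hypothetical; the real logic is in the runner's PowerShell script handler):

```typescript
// Sketch: wrap user script contents for the pwsh/powershell builtins so that
// errors stop execution and the step status reflects the last exit code.
function wrapPowerShellScript(userScript: string): string {
  const prologue = "$ErrorActionPreference = 'stop'"
  const epilogue =
    "if ((Test-Path -LiteralPath variable:\\LASTEXITCODE)) { exit $LASTEXITCODE }"
  return [prologue, userScript, epilogue].join('\n')
}

// The wrapped contents would then be written to a temporary .ps1 file and run via
// the default template, e.g. `pwsh -command "& '<temp>.ps1'"`.
```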
cmd:
- There doesn't seem to be a way to fully opt in to fail-fast behavior other than writing your script to check each error code and respond accordingly, so we can't actually provide that behavior by default; it is completely up to the user to write this behavior into their script
- cmd.exe exits (returns the error code to the runner) with the errorlevel of the last program it executed. This is internally consistent with the previous default behavior (sh, pwsh) and is the cmd.exe default, so we keep that behavior

## Consequences

Valid `shell` options will depend on the hosted images. We will need to maintain tight image compatibility.

First-class support for a shell will require a major version schema change to modify. We cannot remove or modify the behavior of a well-known supported option. However, adding first-class support for new shells is backwards compatible. For instance, we can add a well-known `python` option, because non-well-known options would have always needed to include `{0}`, e.g. `python {0}`.
docs/adrs/0278-env-context.md (new file, 60 lines)

# ADR 0278: Env Context

**Date**: 2019-09-30

**Status**: Accepted

## Context

Users want to reference workflow variables defined in the workflow YAML file in an action's inputs, display name, and condition.

## Decision

### Add an `env` context in the runner

The runner will create and populate the `env` context for every job execution using the following logic (a sketch of the layering appears after the example):
1. On job start, create the `env` context with any environment variables in the job message; these are the env values defined at the job/workflow level `env` section of the customer's YAML file.
2. Update the `env` context when the customer uses `::set-env::` to set env at the runner level.
3. Update the `env` context with the step's `env` block before each step runs.

The `env` context is only available in the runner; customers can't use the `env` context in any server-evaluated part, just like the `runner` context.

Example YAML:
```yaml
env:
  env1: 10
  env2: 20
  env3: 30
jobs:
  build:
    env:
      env1: 100
      env2: 200
    runs-on: ubuntu-latest
    steps:
    - run: |
        echo ${{ env.env1 }} // 1000
        echo $env1 // 1000
        echo $env2 // 200
        echo $env3 // 30
      if: env.env2 == 200 // true
      name: ${{ env.env1 }}_${{ env.env2 }} //1000_200
      env:
        env1: 1000
```
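A minimal sketch of that layering order using plain dictionaries (illustrative only; the runner implements this in C#, and the function names here are hypothetical):

```typescript
// Sketch: how the env context is layered for the example above.
type EnvContext = Record<string, string>

// 1. Job start: workflow-level env overridden by job-level env (from the job message).
function initialEnvContext(workflowEnv: EnvContext, jobEnv: EnvContext): EnvContext {
  return {...workflowEnv, ...jobEnv}
}

// 2. `::set-env::` updates the context at runtime.
function applySetEnv(ctx: EnvContext, name: string, value: string): EnvContext {
  return {...ctx, [name]: value}
}

// 3. Before each step, the step's own env block is layered on top.
function envForStep(ctx: EnvContext, stepEnv: EnvContext): EnvContext {
  return {...ctx, ...stepEnv}
}

// For the example: workflow env -> job env -> step env yields
// env1 = '1000', env2 = '200', env3 = '30', matching the echoed values.
const ctx = initialEnvContext({env1: '10', env2: '20', env3: '30'}, {env1: '100', env2: '200'})
console.log(envForStep(ctx, {env1: '1000'}))
```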
### Don't populate the `env` context with environment variables from the runner machine

With a job container or container action, the `env` context may not have the value the customer wants, and will cause confusion.

Example:
```yaml
build:
  runs-on: ubuntu-latest <- $USER=runner on the hosted machine
  container: ubuntu:16.04 <- $USER=root in the container
  steps:
  - run: echo ${{env.USER}} <- what should the customer expect this output to be? runner/root
  - uses: docker://ubuntu:18.04
    with:
      args: echo ${{env.USER}} <- what should the customer expect this output to be? runner/root
```
docs/adrs/0279-hashFiles-expression-function.md (new file, 71 lines)

# ADR 0279: HashFiles Expression Function

**Date**: 2019-09-30

**Status**: Accepted

## Context

The first-party action `actions/cache` needs an input which is an explicit `key` used for restoring and saving the cache. For package caching, the most common `key` might be the hash of the contents of all `package-lock.json` files under the `node_modules` folder.

There are several different ways to get the hash `key` input for the `actions/cache` action.

1. The customer calculates the `key` themselves in a separate step. Customers won't like this, since it requires an extra step just to use the cache feature.
```yaml
steps:
- run: |
    hash=some_linux_hash_method(file1, file2, file3)
    echo ::set-output name=hash::$hash
  id: createHash
- uses: actions/cache@v1
  with:
    key: ${{ steps.createHash.outputs.hash }}
```

2. Make the `key` input of `actions/cache` follow a certain convention to calculate the hash. This limits the `key` input to a certain format the customer may not want.
```yaml
steps:
- uses: actions/cache@v1
  with:
    key: ${{ runner.os }}|${{ github.workspace }}|**/package-lock.json
```

## Decision

### Add a hashFiles() function to the expression engine for calculating files' hashes

`hashFiles()` will only be allowed on the runner side, since it needs to read files on disk; using `hashFiles()` in any server-side evaluated expression will cause a runtime error.

`hashFiles()` will only support hashing files under `$GITHUB_WORKSPACE`, since the expression is evaluated on the runner; if the customer uses a job container or container action, the runner won't have access to the file system inside the container.

`hashFiles()` will only take one parameter:
- `hashFiles('**/package-lock.json')` // Search files under $GITHUB_WORKSPACE and calculate a hash for them

**Question: Do we need to support more than one match pattern?**
Ex: `hashFiles('**/package-lock.json', '!toolkit/core/package-lock.json', '!toolkit/io/package-lock.json')`
Answer: Only support a single match pattern for GA; we can always add more later.

This will give customers a better experience with the `actions/cache` action's input:
```yaml
steps:
- uses: actions/cache@v1
  with:
    key: ${{hashFiles('**/package-lock.json')}}-${{github.ref}}-${{runner.os}}
```

For the search pattern, we will use basic globbing (`*`, `?`, and `[]`) and globstar (`**`).

Additional pattern details:
- Root relative paths with `github.workspace` (the main repo)
- Make `*` match files that start with `.`
- Case insensitive on Windows
- Accept `\` or `/` path separators on Windows

Hashing logic (a minimal sketch follows the list):
1. Get all files under `$GITHUB_WORKSPACE`.
2. Use the search pattern to filter down to the files that match. (The search pattern applies only to file paths, not folder paths.)
3. Sort all matched files by full file path in alphabetical order.
4. Use the SHA-256 algorithm to hash each matched file and store the hash result.
5. Use SHA-256 to hash all stored per-file hash results to get the final 64-character hash result.
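A minimal TypeScript sketch of those five steps, assuming the `@actions/glob` package used elsewhere in this change (the shipped implementation is `src/Misc/expressionFunc/hashFiles/src/hashFiles.ts`, shown later in this diff):

```typescript
import * as crypto from 'crypto'
import * as fs from 'fs'
import * as glob from '@actions/glob'
import * as path from 'path'

// Sketch of the hashing steps above: glob under the workspace, sort the matches,
// hash each file with SHA-256, then hash the concatenated per-file digests.
async function hashFiles(pattern: string): Promise<string> {
  const workspace = process.env.GITHUB_WORKSPACE || process.cwd()
  const globber = await glob.create(pattern)
  const files = (await globber.glob())
    .filter(f => f.startsWith(`${workspace}${path.sep}`) && fs.statSync(f).isFile())
    .sort() // full path, alphabetical order
  const result = crypto.createHash('sha256')
  for (const file of files) {
    const fileHash = crypto.createHash('sha256').update(fs.readFileSync(file)).digest()
    result.update(fileHash)
  }
  return result.digest('hex') // 64 hex characters
}
```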
**Question: Should we include the folder structure info in the hash?**
Answer: No
docs/adrs/0280-command-input-echoing.md (new file, 30 lines)

# ADR 0280: Echoing of Command Input

**Date**: 2019-11-04

**Status**: Accepted

## Context

Command echoing as a default behavior tends to clutter the user logs, so we want to switch to a system where users have to opt in to see this information.

Command inputs will still be echoed if there is an error processing the command, so the end user has more context on why the command failed, which helps with troubleshooting.

Echo output in the user logs can be explicitly controlled by the new commands `::echo::on` and `::echo::off`. By default, echoing is enabled if the `ACTIONS_STEP_DEBUG` secret is enabled; otherwise echoing is disabled.

## Decision

- The only commands that currently echo output are:
  - `remove-matcher`
  - `add-matcher`
  - `add-path`
- These will no longer echo the command if it is processed successfully
- All commands echo the input when any of these conditions is fulfilled (a sketch of this rule appears after the list):
  1. When the command fails with an error
  2. When `::echo::on` is set
  3. When `ACTIONS_STEP_DEBUG` is set and echoing hasn't been explicitly disabled with `::echo::off`
- There are a few commands that won't be echoed even when echo is enabled. These are (as of 2019-11-04):
  - `add-mask`
  - `debug`
  - `warning`
  - `error`
- The commands above will not echo, either because echoing the command would leak secrets (e.g. `add-mask`) or because it would not add any additional troubleshooting information to the logs (e.g. `debug`). It's expected that future commands will follow these "echo-suppressing" guidelines as well. Echo-suppressed commands are still free to output other information to the logs, as deemed fit.
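A minimal sketch of the echo decision described above (illustrative only; the runner implements this in its C# command processor, and the names here are hypothetical):

```typescript
// Sketch: should a workflow command's input be echoed to the user log?
interface EchoState {
  echoCommand?: boolean // set by ::echo::on / ::echo::off, undefined if never set
  stepDebug: boolean    // true when the ACTIONS_STEP_DEBUG secret is enabled
}

// Commands that are never echoed, even when echo is enabled.
const neverEcho = new Set(['add-mask', 'debug', 'warning', 'error'])

function shouldEcho(command: string, failed: boolean, state: EchoState): boolean {
  if (neverEcho.has(command)) return false
  if (failed) return true                      // condition 1: processing failed
  if (state.echoCommand === true) return true  // condition 2: ::echo::on
  // condition 3: debug secret set and echoing not explicitly disabled
  return state.stepDebug && state.echoCommand !== false
}
```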
docs/adrs/0297-base64-masking-trailing-characters.md (new file, 48 lines)

# ADR 0297: Base64 Masking Trailing Characters

**Date** 2020-01-21

**Status** Proposed

## Context

The runner registers a number of value encoders, which mask various encodings of a provided secret. Currently, we register 3 base64 encoders:
- The base64-encoded secret
- The secret with the first character removed, then base64 encoded
- The secret with the first two characters removed, then base64 encoded

This gives us good coverage across the board for secrets and secrets with a prefix (i.e. `base64($user:$pass)`).

However, we don't have great coverage for cases where the secret has a string appended to it before it is base64 encoded (i.e. `base64($pass\n)`).

Most notably, we've seen this as a result of user error, where a user accidentally appends a newline or space character before encoding their secret in base64.

## Decision

### Trim end characters

We are going to modify all existing base64 encoders to trim information before registering it as a secret (see the sketch below).
We will trim:
- `=` from the end of all base64 strings. This is a padding character that contains no information.
  - Based on the number of `=`'s at the end of a base64 string, a malicious user could predict the length of the original secret modulo 3.
  - If a user saw `***==`, they would know the secret could be 1, 4, 7, 10, ... characters long.
- If a string contains `=`, we will also trim the last non-padding character from the base64 secret.
  - This character can change if a string is appended to the secret before the encoding.
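A minimal sketch of that trimming rule (illustrative only; the real encoders are the C# `ValueEncoders` registered in the host context later in this diff):

```typescript
// Sketch: produce the trimmed base64 variant of a secret for masking.
function base64Trimmed(secret: string): string {
  let encoded = Buffer.from(secret, 'utf8').toString('base64')
  if (encoded.includes('=')) {
    // Strip the '=' padding, then drop the last non-padding character,
    // since that character can change when a suffix is appended before encoding.
    encoded = encoded.replace(/=+$/, '')
    encoded = encoded.slice(0, -1)
  }
  return encoded
}

// Example: 'secret'  -> 'c2VjcmV0'     (no padding, nothing trimmed)
//          'secret1' -> 'c2VjcmV0MQ==' -> trimmed to 'c2VjcmV0M'
```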
### Register a fourth encoder

We will also add back the original base64-encoded-secret encoder, for four total encoders:
- The base64-encoded secret
- The base64-encoded secret, trimmed
- The secret with the first character removed, then base64 encoded and trimmed
- The secret with the first two characters removed, then base64 encoded and trimmed

This allows us to fully cover the most common scenario, where a user base64 encodes their secret and expects the entire thing to be masked.
This will result in us only revealing length or bit information when a prefix or suffix is added to a secret before encoding.

## Consequences

- In the case where a secret has a prefix or suffix added before base64 encoding, we may now reveal up to 20 bits of information and the length of the original string modulo 3, rather than the original 16 bits and no length information
- Secrets with a suffix appended before encoding will now be masked across the board. Previously they were only masked if the secret was a multiple of 3 characters long
- Performance will suffer in a negligible way
@@ -1,17 +1,30 @@
 ## Features
-- Remove runner flow: Change from PAT to "deletion token" in prompt (#225)
-- Expose github.run_id and github.run_number to action runtime env. (#224)
+- Expose whether debug is on/off via RUNNER_DEBUG. (#253)
+- Upload log on runner when worker gets killed due to cancellation timeout. (#255)
+- Update config.sh/cmd --help documentation (#282)
+- Set http_proxy and related env vars for job/service containers (#304)
+- Set both http_proxy and HTTP_PROXY env for runner/worker processes. (#298)

 ## Bugs
-- Clean up error messages for container scenarios (#221)
-- Pick shell from prependpath (#231)
+- Verify runner Windows service has started successfully after configuration (#236)
+- Detect source file path in L0 without using env. (#257)
+- Handle escaped '%' in commands data section (#200)
+- Allow container to be null/empty during matrix expansion (#266)
+- Translate problem matcher file to host path (#272)
+- Change hashFiles() expression function to use @actions/glob. (#268)
+- Default post-job action's condition to always(). (#293)
+- Support action.yaml file as action's entry file (#288)
+- Trace javascript action exit code to debug instead of user logs (#290)
+- Change prompt message when removing a runner to line up with GitHub.com UI (#303)
+- Include step.env as part of env context. (#300)
+- Update Base64 Encoders to deal with suffixes (#284)

 ## Misc
-- Runner code cleanup (#218 #227, #228, #229, #230)
-- Consume dotnet core 3.1 in runner. (#213)
+- Move .sln file under ./src (#238)
+- Treat warnings as errors during compile (#249)

 ## Windows x64
-We recommend configuring the runner under "<DRIVE>:\actions-runner". This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows
+We recommend configuring the runner in a root folder of the Windows drive (e.g. "C:\actions-runner"). This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows
 ```
 // Create a folder under the drive root
 mkdir \actions-runner ; cd \actions-runner
@@ -19,7 +32,7 @@ mkdir \actions-runner ; cd \actions-runner
 Invoke-WebRequest -Uri https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-win-x64-<RUNNER_VERSION>.zip -OutFile actions-runner-win-x64-<RUNNER_VERSION>.zip
 // Extract the installer
 Add-Type -AssemblyName System.IO.Compression.FileSystem ;
-[System.IO.Compression.ZipFile]::ExtractToDirectory("$HOME\Downloads\actions-runner-win-x64-<RUNNER_VERSION>.zip", "$PWD")
+[System.IO.Compression.ZipFile]::ExtractToDirectory("$PWD\actions-runner-win-x64-<RUNNER_VERSION>.zip", "$PWD")
 ```

 ## OSX
@@ -28,7 +41,7 @@ Add-Type -AssemblyName System.IO.Compression.FileSystem ;
 // Create a folder
 mkdir actions-runner && cd actions-runner
 // Download the latest runner package
-curl -O https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-osx-x64-<RUNNER_VERSION>.tar.gz
+curl -O -L https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-osx-x64-<RUNNER_VERSION>.tar.gz
 // Extract the installer
 tar xzf ./actions-runner-osx-x64-<RUNNER_VERSION>.tar.gz
 ```
@@ -39,7 +52,7 @@ tar xzf ./actions-runner-osx-x64-<RUNNER_VERSION>.tar.gz
 // Create a folder
 mkdir actions-runner && cd actions-runner
 // Download the latest runner package
-curl -O https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz
+curl -O -L https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz
 // Extract the installer
 tar xzf ./actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz
 ```
@@ -50,7 +63,7 @@ tar xzf ./actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz
 // Create a folder
 mkdir actions-runner && cd actions-runner
 // Download the latest runner package
-curl -O https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz
+curl -O -L https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz
 // Extract the installer
 tar xzf ./actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz
 ```
@@ -61,7 +74,7 @@ tar xzf ./actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz
 // Create a folder
 mkdir actions-runner && cd actions-runner
 // Download the latest runner package
-curl -O https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-arm-<RUNNER_VERSION>.tar.gz
+curl -O -L https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-arm-<RUNNER_VERSION>.tar.gz
 // Extract the installer
 tar xzf ./actions-runner-linux-arm-<RUNNER_VERSION>.tar.gz
 ```
src/Misc/expressionFunc/hashFiles/.eslintignore (new file, 3 lines)

dist/
lib/
node_modules/

src/Misc/expressionFunc/hashFiles/.eslintrc.json (new file, 59 lines)

{
  "plugins": ["jest", "@typescript-eslint"],
  "extends": ["plugin:github/es6"],
  "parser": "@typescript-eslint/parser",
  "parserOptions": {
    "ecmaVersion": 9,
    "sourceType": "module",
    "project": "./tsconfig.json"
  },
  "rules": {
    "eslint-comments/no-use": "off",
    "import/no-namespace": "off",
    "no-console": "off",
    "no-unused-vars": "off",
    "@typescript-eslint/no-unused-vars": "error",
    "@typescript-eslint/explicit-member-accessibility": ["error", {"accessibility": "no-public"}],
    "@typescript-eslint/no-require-imports": "error",
    "@typescript-eslint/array-type": "error",
    "@typescript-eslint/await-thenable": "error",
    "@typescript-eslint/ban-ts-ignore": "error",
    "camelcase": "off",
    "@typescript-eslint/camelcase": "error",
    "@typescript-eslint/class-name-casing": "error",
    "@typescript-eslint/explicit-function-return-type": ["error", {"allowExpressions": true}],
    "@typescript-eslint/func-call-spacing": ["error", "never"],
    "@typescript-eslint/generic-type-naming": ["error", "^[A-Z][A-Za-z]*$"],
    "@typescript-eslint/no-array-constructor": "error",
    "@typescript-eslint/no-empty-interface": "error",
    "@typescript-eslint/no-explicit-any": "error",
    "@typescript-eslint/no-extraneous-class": "error",
    "@typescript-eslint/no-for-in-array": "error",
    "@typescript-eslint/no-inferrable-types": "error",
    "@typescript-eslint/no-misused-new": "error",
    "@typescript-eslint/no-namespace": "error",
    "@typescript-eslint/no-non-null-assertion": "warn",
    "@typescript-eslint/no-object-literal-type-assertion": "error",
    "@typescript-eslint/no-unnecessary-qualifier": "error",
    "@typescript-eslint/no-unnecessary-type-assertion": "error",
    "@typescript-eslint/no-useless-constructor": "error",
    "@typescript-eslint/no-var-requires": "error",
    "@typescript-eslint/prefer-for-of": "warn",
    "@typescript-eslint/prefer-function-type": "warn",
    "@typescript-eslint/prefer-includes": "error",
    "@typescript-eslint/prefer-interface": "error",
    "@typescript-eslint/prefer-string-starts-ends-with": "error",
    "@typescript-eslint/promise-function-async": "error",
    "@typescript-eslint/require-array-sort-compare": "error",
    "@typescript-eslint/restrict-plus-operands": "error",
    "semi": "off",
    "@typescript-eslint/semi": ["error", "never"],
    "@typescript-eslint/type-annotation-spacing": "error",
    "@typescript-eslint/unbound-method": "error"
  },
  "env": {
    "node": true,
    "es6": true,
    "jest/globals": true
  }
}

src/Misc/expressionFunc/hashFiles/.prettierignore (new file, 3 lines)

dist/
lib/
node_modules/

src/Misc/expressionFunc/hashFiles/.prettierrc.json (new file, 11 lines)

{
  "printWidth": 80,
  "tabWidth": 2,
  "useTabs": false,
  "semi": false,
  "singleQuote": true,
  "trailingComma": "none",
  "bracketSpacing": false,
  "arrowParens": "avoid",
  "parser": "typescript"
}

src/Misc/expressionFunc/hashFiles/README.md (new file, 1 line)

To update hashFiles under `Misc/layoutbin`, run `npm install && npm run all`
src/Misc/expressionFunc/hashFiles/package-lock.json (generated, new file, 2347 lines)
File diff suppressed because it is too large.
src/Misc/expressionFunc/hashFiles/package.json (new file, 35 lines)

{
  "name": "hashFiles",
  "version": "1.0.0",
  "description": "GitHub Actions HashFiles() expression function",
  "main": "lib/hashFiles.js",
  "scripts": {
    "build": "tsc",
    "format": "prettier --write **/*.ts",
    "format-check": "prettier --check **/*.ts",
    "lint": "eslint src/**/*.ts",
    "pack": "ncc build -o ../../layoutbin/hashFiles",
    "all": "npm run build && npm run format && npm run lint && npm run pack"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/actions/runner.git"
  },
  "keywords": [
    "actions"
  ],
  "author": "GitHub Actions",
  "license": "MIT",
  "dependencies": {
    "@actions/glob": "^0.1.0"
  },
  "devDependencies": {
    "@types/node": "^12.7.12",
    "@typescript-eslint/parser": "^2.8.0",
    "@zeit/ncc": "^0.20.5",
    "eslint": "^5.16.0",
    "eslint-plugin-github": "^2.0.0",
    "prettier": "^1.19.1",
    "typescript": "^3.6.4"
  }
}
src/Misc/expressionFunc/hashFiles/src/hashFiles.ts (new file, 55 lines)

import * as glob from '@actions/glob'
import * as crypto from 'crypto'
import * as fs from 'fs'
import * as stream from 'stream'
import * as util from 'util'
import * as path from 'path'

async function run(): Promise<void> {
  // arg0 -> node
  // arg1 -> hashFiles.js
  // env[followSymbolicLinks] = true/null
  // env[patterns] -> glob patterns
  let followSymbolicLinks = false
  const matchPatterns = process.env.patterns || ''
  if (process.env.followSymbolicLinks === 'true') {
    console.log('Follow symbolic links')
    followSymbolicLinks = true
  }

  console.log(`Match Pattern: ${matchPatterns}`)
  let hasMatch = false
  const githubWorkspace = process.cwd()
  const result = crypto.createHash('sha256')
  let count = 0
  const globber = await glob.create(matchPatterns, {followSymbolicLinks})
  for await (const file of globber.globGenerator()) {
    console.log(file)
    if (!file.startsWith(`${githubWorkspace}${path.sep}`)) {
      console.log(`Ignore '${file}' since it is not under GITHUB_WORKSPACE.`)
      continue
    }
    if (fs.statSync(file).isDirectory()) {
      console.log(`Skip directory '${file}'.`)
      continue
    }
    // Hash each matched file and feed its digest into the aggregate hash.
    const hash = crypto.createHash('sha256')
    const pipeline = util.promisify(stream.pipeline)
    await pipeline(fs.createReadStream(file), hash)
    result.write(hash.digest())
    count++
    if (!hasMatch) {
      hasMatch = true
    }
  }
  result.end()

  if (hasMatch) {
    console.log(`Find ${count} files to hash.`)
    // The final digest is emitted on stderr between __OUTPUT__ markers so the runner can pick it up.
    console.error(`__OUTPUT__${result.digest('hex')}__OUTPUT__`)
  } else {
    console.error(`__OUTPUT____OUTPUT__`)
  }
}

run()
src/Misc/expressionFunc/hashFiles/tsconfig.json (new file, 12 lines)

{
  "compilerOptions": {
    "target": "es6",        /* Specify ECMAScript target version: 'ES3' (default), 'ES5', 'ES2015', 'ES2016', 'ES2017', 'ES2018', 'ES2019' or 'ESNEXT'. */
    "module": "commonjs",   /* Specify module code generation: 'none', 'commonjs', 'amd', 'system', 'umd', 'es2015', or 'ESNext'. */
    "outDir": "./lib",      /* Redirect output structure to the directory. */
    "rootDir": "./src",     /* Specify the root directory of input files. Use to control the output directory structure with --outDir. */
    "strict": true,         /* Enable all strict type-checking options. */
    "noImplicitAny": true,  /* Raise error on expressions and declarations with an implied 'any' type. */
    "esModuleInterop": true /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */
  },
  "exclude": ["node_modules", "**/*.test.ts"]
}
src/Misc/layoutbin/hashFiles/index.js (new file, 2623 lines)
File diff suppressed because it is too large.
@@ -3,8 +3,6 @@
   <packageSources>
     <!--To inherit the global NuGet package sources remove the <clear/> line below -->
     <clear />
-    <add key="dotnet-core" value="https://www.myget.org/F/dotnet-core/api/v3/index.json" />
-    <add key="dotnet-buildtools" value="https://www.myget.org/F/dotnet-buildtools/api/v3/index.json" />
     <add key="api.nuget.org" value="https://api.nuget.org/v3/index.json" />
   </packageSources>
 </configuration>

@@ -162,7 +162,8 @@ namespace GitHub.Runner.Common
         public static class Path
         {
             public static readonly string ActionsDirectory = "_actions";
-            public static readonly string ActionManifestFile = "action.yml";
+            public static readonly string ActionManifestYmlFile = "action.yml";
+            public static readonly string ActionManifestYamlFile = "action.yaml";
             public static readonly string BinDirectory = "bin";
             public static readonly string DiagDirectory = "_diag";
             public static readonly string ExternalsDirectory = "externals";

@@ -83,6 +83,7 @@ namespace GitHub.Runner.Common
             _loadContext.Unloading += LoadContext_Unloading;

             this.SecretMasker.AddValueEncoder(ValueEncoders.Base64StringEscape);
+            this.SecretMasker.AddValueEncoder(ValueEncoders.Base64StringEscapeTrimmed);
             this.SecretMasker.AddValueEncoder(ValueEncoders.Base64StringEscapeShift1);
             this.SecretMasker.AddValueEncoder(ValueEncoders.Base64StringEscapeShift2);
             this.SecretMasker.AddValueEncoder(ValueEncoders.ExpressionStringEscape);

@@ -100,7 +100,7 @@ namespace GitHub.Runner.Common
         {
             EndPage();
             _byteCount = 0;
-            _dataFileName = Path.Combine(_pagesFolder, $"{_timelineRecordId}_{++_pageCount}.log");
+            _dataFileName = Path.Combine(_pagesFolder, $"{_timelineId}_{_timelineRecordId}_{++_pageCount}.log");
             _pageData = new FileStream(_dataFileName, FileMode.CreateNew);
             _pageWriter = new StreamWriter(_pageData, System.Text.Encoding.UTF8);
         }
@@ -190,7 +190,7 @@ namespace GitHub.Runner.Listener
         {
             return GetArgOrPrompt(
                 name: Constants.Runner.CommandLine.Args.Token,
-                description: "Enter runner deletion token:",
+                description: "Enter runner remove token:",
                 defaultValue: string.Empty,
                 validator: Validators.NonEmptyValidator);
         }
@@ -291,7 +291,7 @@ namespace GitHub.Runner.Listener
             if (!string.IsNullOrEmpty(result))
             {
                 // After read the arg from input commandline args, remove it from Arg dictionary,
                 // This will help if bad arg value passed through CommandLine arg, when ConfigurationManager ask CommandSetting the second time,
                 // It will prompt for input instead of continue use the bad input.
                 _trace.Info($"Remove {name} from Arg dictionary.");
                 RemoveArg(name);
@@ -752,27 +752,32 @@ namespace GitHub.Runner.Listener
                 foreach (var log in logs)
                 {
                     var logName = Path.GetFileNameWithoutExtension(log);
+                    var logNameParts = logName.Split('_', StringSplitOptions.RemoveEmptyEntries);
+                    if (logNameParts.Length != 3)
+                    {
+                        Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_GUID_INT'.");
+                        continue;
+                    }
                     var logPageSeperator = logName.IndexOf('_');
                     var logRecordId = Guid.Empty;
                     var pageNumber = 0;
-                    if (logPageSeperator < 0)
+
+                    if (!Guid.TryParse(logNameParts[0], out Guid timelineId) || timelineId != timeline.Id)
                     {
-                        Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_INT'.");
+                        Trace.Warning($"log file '{log}' is not belongs to current job");
                         continue;
                     }
-                    else
-                    {
-                        if (!Guid.TryParse(logName.Substring(0, logPageSeperator), out logRecordId))
-                        {
-                            Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_INT'.");
-                            continue;
-                        }
-
-                        if (!int.TryParse(logName.Substring(logPageSeperator + 1), out pageNumber))
-                        {
-                            Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_INT'.");
-                            continue;
-                        }
+
+                    if (!Guid.TryParse(logNameParts[1], out logRecordId))
+                    {
+                        Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_GUID_INT'.");
+                        continue;
+                    }
+
+                    if (!int.TryParse(logNameParts[2], out pageNumber))
+                    {
+                        Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_GUID_INT'.");
+                        continue;
                     }

                     var record = timeline.Records.FirstOrDefault(x => x.Id == logRecordId);
@@ -451,16 +451,38 @@ namespace GitHub.Runner.Listener
             ext = "sh";
 #endif
             _term.WriteLine($@"
-Commands:,
+Commands:
 .{separator}config.{ext}         Configures the runner
 .{separator}config.{ext} remove  Unconfigures the runner
 .{separator}run.{ext}            Runs the runner interactively. Does not require any options.
+
 Options:
+ --help     Prints the help for each command
  --version  Prints the runner version
  --commit   Prints the runner commit
- --help     Prints the help for each command
-");
+
+Config Options:
+ --unattended     Disable interactive prompts for missing arguments. Defaults will be used for missing options
+ --url string     Repository to add the runner to. Required if unattended
+ --token string   Registration token. Required if unattended
+ --name string    Name of the runner to configure (default {Environment.MachineName ?? "myrunner"})
+ --work string    Relative runner work directory (default {Constants.Path.WorkDirectory})
+ --replace        Replace any existing runner with the same name (default false)");
+#if OS_WINDOWS
+            _term.WriteLine($@" --runasservice  Run the runner as a service");
+            _term.WriteLine($@" --windowslogonaccount string   Account to run the service as. Requires runasservice");
+            _term.WriteLine($@" --windowslogonpassword string  Password for the service account. Requires runasservice");
+#endif
+            _term.WriteLine($@"
+Examples:
+ Configure a runner non-interactively:
+  .{separator}config.{ext} --unattended --url <url> --token <token>
+ Configure a runner non-interactively, replacing any existing runner with the same name:
+  .{separator}config.{ext} --unattended --url <url> --token <token> --replace [--name <name>]");
+#if OS_WINDOWS
+            _term.WriteLine($@" Configure a runner to run as a service:");
+            _term.WriteLine($@" .{separator}config.{ext} --url <url> --token <token> --runasservice");
+#endif
         }
     }
 }
@@ -21,6 +21,7 @@ namespace GitHub.Runner.Sdk
         private string _httpsProxyAddress;
         private string _httpsProxyUsername;
         private string _httpsProxyPassword;
+        private string _noProxyString;

         private readonly List<ByPassInfo> _noProxyList = new List<ByPassInfo>();
         private readonly HashSet<string> _noProxyUnique = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
@@ -33,6 +34,7 @@ namespace GitHub.Runner.Sdk
         public string HttpsProxyAddress => _httpsProxyAddress;
         public string HttpsProxyUsername => _httpsProxyUsername;
         public string HttpsProxyPassword => _httpsProxyPassword;
+        public string NoProxyString => _noProxyString;

         public List<ByPassInfo> NoProxyList => _noProxyList;

@@ -71,6 +73,10 @@ namespace GitHub.Runner.Sdk
             {
                 _httpProxyAddress = proxyHttpUri.AbsoluteUri;
+
+                // Set both environment variables since there are tools support both casing (curl, wget) and tools support only one casing (docker)
+                Environment.SetEnvironmentVariable("HTTP_PROXY", _httpProxyAddress);
+                Environment.SetEnvironmentVariable("http_proxy", _httpProxyAddress);

                 // the proxy url looks like http://[user:pass@]127.0.0.1:8888
                 var userInfo = Uri.UnescapeDataString(proxyHttpUri.UserInfo).Split(':', 2, StringSplitOptions.RemoveEmptyEntries);
                 if (userInfo.Length == 2)
@@ -97,6 +103,10 @@ namespace GitHub.Runner.Sdk
             {
                 _httpsProxyAddress = proxyHttpsUri.AbsoluteUri;
+
+                // Set both environment variables since there are tools support both casing (curl, wget) and tools support only one casing (docker)
+                Environment.SetEnvironmentVariable("HTTPS_PROXY", _httpsProxyAddress);
+                Environment.SetEnvironmentVariable("https_proxy", _httpsProxyAddress);

                 // the proxy url looks like http://[user:pass@]127.0.0.1:8888
                 var userInfo = Uri.UnescapeDataString(proxyHttpsUri.UserInfo).Split(':', 2, StringSplitOptions.RemoveEmptyEntries);
                 if (userInfo.Length == 2)
@@ -121,6 +131,12 @@ namespace GitHub.Runner.Sdk

             if (!string.IsNullOrEmpty(noProxyList))
             {
+                _noProxyString = noProxyList;
+
+                // Set both environment variables since there are tools support both casing (curl, wget) and tools support only one casing (docker)
+                Environment.SetEnvironmentVariable("NO_PROXY", noProxyList);
+                Environment.SetEnvironmentVariable("no_proxy", noProxyList);
+
                 var noProxyListSplit = noProxyList.Split(',', StringSplitOptions.RemoveEmptyEntries);
                 foreach (string noProxy in noProxyListSplit)
                 {
@@ -33,7 +33,7 @@ namespace GitHub.Runner.Worker
     {
         private const int _defaultFileStreamBufferSize = 4096;

         //81920 is the default used by System.IO.Stream.CopyTo and is under the large object heap threshold (85k).
         private const int _defaultCopyBufferSize = 81920;

         private readonly Dictionary<Guid, ContainerInfo> _cachedActionContainers = new Dictionary<Guid, ContainerInfo>();
@@ -198,14 +198,21 @@ namespace GitHub.Runner.Worker
                 Trace.Info($"Load action that reference repository from '{actionDirectory}'");
                 definition.Directory = actionDirectory;

-                string manifestFile = Path.Combine(actionDirectory, "action.yml");
+                string manifestFile = Path.Combine(actionDirectory, Constants.Path.ActionManifestYmlFile);
+                string manifestFileYaml = Path.Combine(actionDirectory, Constants.Path.ActionManifestYamlFile);
                 string dockerFile = Path.Combine(actionDirectory, "Dockerfile");
                 string dockerFileLowerCase = Path.Combine(actionDirectory, "dockerfile");
-                if (File.Exists(manifestFile))
+                if (File.Exists(manifestFile) || File.Exists(manifestFileYaml))
                 {
                     var manifestManager = HostContext.GetService<IActionManifestManager>();
-                    definition.Data = manifestManager.Load(executionContext, manifestFile);
+                    if (File.Exists(manifestFile))
+                    {
+                        definition.Data = manifestManager.Load(executionContext, manifestFile);
+                    }
+                    else
+                    {
+                        definition.Data = manifestManager.Load(executionContext, manifestFileYaml);
+                    }
                     Trace.Verbose($"Action friendly name: '{definition.Data.Name}'");
                     Trace.Verbose($"Action description: '{definition.Data.Description}'");

@@ -314,7 +321,7 @@ namespace GitHub.Runner.Worker
                 else
                 {
                     var fullPath = IOUtil.ResolvePath(actionDirectory, "."); // resolve full path without access filesystem.
-                    throw new NotSupportedException($"Can't find 'action.yml' or 'Dockerfile' under '{fullPath}'. Did you forget to run actions/checkout before running your local action?");
+                    throw new NotSupportedException($"Can't find 'action.yml', 'action.yaml' or 'Dockerfile' under '{fullPath}'. Did you forget to run actions/checkout before running your local action?");
                 }
             }
             else if (action.Reference.Type == Pipelines.ActionSourceType.Script)
@@ -477,7 +484,7 @@ namespace GitHub.Runner.Worker

             int retryCount = 0;

             // Allow up to 20 * 60s for any action to be downloaded from github graph.
             int timeoutSeconds = 20 * 60;
             while (retryCount < 3)
             {
@@ -655,12 +662,21 @@ namespace GitHub.Runner.Worker
                 // find the docker file or action.yml file
                 var dockerFile = Path.Combine(actionEntryDirectory, "Dockerfile");
                 var dockerFileLowerCase = Path.Combine(actionEntryDirectory, "dockerfile");
-                var actionManifest = Path.Combine(actionEntryDirectory, "action.yml");
-                if (File.Exists(actionManifest))
+                var actionManifest = Path.Combine(actionEntryDirectory, Constants.Path.ActionManifestYmlFile);
+                var actionManifestYaml = Path.Combine(actionEntryDirectory, Constants.Path.ActionManifestYamlFile);
+                if (File.Exists(actionManifest) || File.Exists(actionManifestYaml))
                 {
                     executionContext.Debug($"action.yml for action: '{actionManifest}'.");
                     var manifestManager = HostContext.GetService<IActionManifestManager>();
-                    var actionDefinitionData = manifestManager.Load(executionContext, actionManifest);
+                    ActionDefinitionData actionDefinitionData = null;
+                    if (File.Exists(actionManifest))
+                    {
+                        actionDefinitionData = manifestManager.Load(executionContext, actionManifest);
+                    }
+                    else
+                    {
+                        actionDefinitionData = manifestManager.Load(executionContext, actionManifestYaml);
+                    }

                     if (actionDefinitionData.Execution.ExecutionType == ActionExecutionType.Container)
                     {
@@ -720,7 +736,7 @@ namespace GitHub.Runner.Worker
                 else
                 {
                     var fullPath = IOUtil.ResolvePath(actionEntryDirectory, "."); // resolve full path without access filesystem.
-                    throw new InvalidOperationException($"Can't find 'action.yml' or 'Dockerfile' under '{fullPath}'. Did you forget to run actions/checkout before running your local action?");
+                    throw new InvalidOperationException($"Can't find 'action.yml', 'action.yaml' or 'Dockerfile' under '{fullPath}'. Did you forget to run actions/checkout before running your local action?");
                 }
             }
         }
@@ -341,7 +341,7 @@ namespace GitHub.Runner.Worker
                     EntryPoint = entrypointToken?.Value,
                     Environment = envToken,
                     Cleanup = postEntrypointToken?.Value,
-                    CleanupCondition = postIfToken?.Value
+                    CleanupCondition = postIfToken?.Value ?? "always()"
                 };
             }
         }
@@ -357,7 +357,7 @@ namespace GitHub.Runner.Worker
                 {
                     Script = mainToken.Value,
                     Cleanup = postToken?.Value,
-                    CleanupCondition = postIfToken?.Value
+                    CleanupCondition = postIfToken?.Value ?? "always()"
                 };
             }
         }
@@ -179,17 +179,15 @@ namespace GitHub.Runner.Worker
             ExecutionContext.Debug("Loading env");
             var environment = new Dictionary<String, String>(VarUtil.EnvironmentVariableKeyComparer);

-            // Apply environment set using ##[set-env] first since these are job level env
-            foreach (var env in ExecutionContext.EnvironmentVariables)
+#if OS_WINDOWS
+            var envContext = ExecutionContext.ExpressionValues["env"] as DictionaryContextData;
+#else
+            var envContext = ExecutionContext.ExpressionValues["env"] as CaseSensitiveDictionaryContextData;
+#endif
+            // Apply environment from env context, env context contains job level env and action's evn block
+            foreach (var env in envContext)
             {
-                environment[env.Key] = env.Value ?? string.Empty;
-            }
-
-            // Apply action's env block later.
-            var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(Action.Environment, ExecutionContext.ExpressionValues, VarUtil.EnvironmentVariableKeyComparer);
-            foreach (var env in actionEnvironment)
-            {
-                environment[env.Key] = env.Value ?? string.Empty;
+                environment[env.Key] = env.Value.ToString();
             }

             // Apply action's intra-action state at last
@@ -2,9 +2,9 @@
 using System.Collections.Generic;
 using System.IO;
 using GitHub.Runner.Common.Util;
-using Pipelines = GitHub.DistributedTask.Pipelines;
 using GitHub.Runner.Common;
 using GitHub.Runner.Sdk;
+using Pipelines = GitHub.DistributedTask.Pipelines;

 namespace GitHub.Runner.Worker.Container
 {
@@ -19,7 +19,6 @@ namespace GitHub.Runner.Worker.Container

         public ContainerInfo()
         {
-
         }

         public ContainerInfo(IHostContext hostContext, Pipelines.JobContainer container, bool isJobContainer = true, string networkAlias = null)
@@ -64,6 +63,8 @@ namespace GitHub.Runner.Worker.Container
                     UserMountVolumes[volume] = volume;
                 }
             }
+
+            UpdateWebProxyEnv(hostContext.WebProxy);
         }

         public string ContainerId { get; set; }
@@ -223,6 +224,26 @@ namespace GitHub.Runner.Worker.Container
             {
                 _pathMappings.Insert(0, new PathMapping(hostCommonPath, containerCommonPath));
             }
+
+        private void UpdateWebProxyEnv(RunnerWebProxy webProxy)
+        {
+            // Set common forms of proxy variables if configured in Runner and not set directly by container.env
+            if (!String.IsNullOrEmpty(webProxy.HttpProxyAddress))
+            {
+                ContainerEnvironmentVariables.TryAdd("HTTP_PROXY", webProxy.HttpProxyAddress);
+                ContainerEnvironmentVariables.TryAdd("http_proxy", webProxy.HttpProxyAddress);
+            }
+            if (!String.IsNullOrEmpty(webProxy.HttpsProxyAddress))
+            {
+                ContainerEnvironmentVariables.TryAdd("HTTPS_PROXY", webProxy.HttpsProxyAddress);
+                ContainerEnvironmentVariables.TryAdd("https_proxy", webProxy.HttpsProxyAddress);
+            }
+            if (!String.IsNullOrEmpty(webProxy.NoProxyString))
+            {
+                ContainerEnvironmentVariables.TryAdd("NO_PROXY", webProxy.NoProxyString);
+                ContainerEnvironmentVariables.TryAdd("no_proxy", webProxy.NoProxyString);
+            }
+        }
     }

     public class MountVolume
@@ -20,6 +20,7 @@ using System.Text;
 using System.Collections;
 using ObjectTemplating = GitHub.DistributedTask.ObjectTemplating;
 using Pipelines = GitHub.DistributedTask.Pipelines;
+using GitHub.DistributedTask.Expressions2;

 namespace GitHub.Runner.Worker
 {
@@ -568,6 +569,12 @@ namespace GitHub.Runner.Worker
                 }
             }

+            // Expression functions
+            if (Variables.GetBoolean("System.HashFilesV2") == true)
+            {
+                ExpressionConstants.UpdateFunction<Handlers.HashFiles>("hashFiles", 1, byte.MaxValue);
+            }
+
             // Expression values
             if (message.ContextData?.Count > 0)
             {
126 src/Runner.Worker/ExpressionFunctions/HashFiles.cs Normal file
@@ -0,0 +1,126 @@
+using System;
+using System.IO;
+using GitHub.DistributedTask.Expressions2.Sdk;
+using GitHub.DistributedTask.Pipelines.ContextData;
+using GitHub.DistributedTask.Pipelines.ObjectTemplating;
+using GitHub.Runner.Sdk;
+using System.Reflection;
+using System.Threading;
+using System.Collections.Generic;
+
+namespace GitHub.Runner.Worker.Handlers
+{
+    public class FunctionTrace : ITraceWriter
+    {
+        private GitHub.DistributedTask.Expressions2.ITraceWriter _trace;
+
+        public FunctionTrace(GitHub.DistributedTask.Expressions2.ITraceWriter trace)
+        {
+            _trace = trace;
+        }
+        public void Info(string message)
+        {
+            _trace.Info(message);
+        }
+
+        public void Verbose(string message)
+        {
+            _trace.Info(message);
+        }
+    }
+
+    public sealed class HashFiles : Function
+    {
+        protected sealed override Object EvaluateCore(
+            EvaluationContext context,
+            out ResultMemory resultMemory)
+        {
+            resultMemory = null;
+            var templateContext = context.State as DistributedTask.ObjectTemplating.TemplateContext;
+            ArgUtil.NotNull(templateContext, nameof(templateContext));
+            templateContext.ExpressionValues.TryGetValue(PipelineTemplateConstants.GitHub, out var githubContextData);
+            ArgUtil.NotNull(githubContextData, nameof(githubContextData));
+            var githubContext = githubContextData as DictionaryContextData;
+            ArgUtil.NotNull(githubContext, nameof(githubContext));
+            githubContext.TryGetValue(PipelineTemplateConstants.Workspace, out var workspace);
+            var workspaceData = workspace as StringContextData;
+            ArgUtil.NotNull(workspaceData, nameof(workspaceData));
+
+            string githubWorkspace = workspaceData.Value;
+            bool followSymlink = false;
+            List<string> patterns = new List<string>();
+            var firstParameter = true;
+            foreach (var parameter in Parameters)
+            {
+                var parameterString = parameter.Evaluate(context).ConvertToString();
+                if (firstParameter)
+                {
+                    firstParameter = false;
+                    if (parameterString.StartsWith("--"))
+                    {
+                        if (string.Equals(parameterString, "--follow-symbolic-links", StringComparison.OrdinalIgnoreCase))
+                        {
+                            followSymlink = true;
+                            continue;
+                        }
+                        else
+                        {
+                            throw new ArgumentOutOfRangeException($"Invalid glob option {parameterString}, avaliable option: '--follow-symbolic-links'.");
+                        }
+                    }
+                }
+
+                patterns.Add(parameterString);
+            }
+
+            context.Trace.Info($"Search root directory: '{githubWorkspace}'");
+            context.Trace.Info($"Search pattern: '{string.Join(", ", patterns)}'");
+
+            string binDir = Path.GetDirectoryName(Assembly.GetEntryAssembly().Location);
+            string runnerRoot = new DirectoryInfo(binDir).Parent.FullName;
+
+            string node = Path.Combine(runnerRoot, "externals", "node12", "bin", $"node{IOUtil.ExeExtension}");
+            string hashFilesScript = Path.Combine(binDir, "hashFiles");
+            var hashResult = string.Empty;
+            var p = new ProcessInvoker(new FunctionTrace(context.Trace));
+            p.ErrorDataReceived += ((_, data) =>
+            {
+                if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
+                {
+                    hashResult = data.Data.Substring(10, data.Data.Length - 20);
+                    context.Trace.Info($"Hash result: '{hashResult}'");
+                }
+                else
+                {
+                    context.Trace.Info(data.Data);
+                }
+            });
+
+            p.OutputDataReceived += ((_, data) =>
+            {
+                context.Trace.Info(data.Data);
+            });
+
+            var env = new Dictionary<string, string>();
+            if (followSymlink)
+            {
+                env["followSymbolicLinks"] = "true";
+            }
+            env["patterns"] = string.Join(Environment.NewLine, patterns);
+
+            int exitCode = p.ExecuteAsync(workingDirectory: githubWorkspace,
+                                          fileName: node,
+                                          arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
+                                          environment: env,
+                                          requireExitCodeZero: false,
+                                          cancellationToken: new CancellationTokenSource(TimeSpan.FromSeconds(120)).Token).GetAwaiter().GetResult();
+
+            if (exitCode != 0)
+            {
+                throw new InvalidOperationException($"hashFiles('{ExpressionUtility.StringEscape(string.Join(", ", patterns))}') failed. Fail to hash files under directory '{githubWorkspace}'");
+            }
+
+            return hashResult;
+        }
+    }
+}
@@ -189,13 +189,13 @@ namespace GitHub.Runner.Worker.Handlers
                 container.ContainerEnvironmentVariables[variable.Key] = container.TranslateToContainerPath(variable.Value);
             }

-            using (var stdoutManager = new OutputManager(ExecutionContext, ActionCommandManager))
-            using (var stderrManager = new OutputManager(ExecutionContext, ActionCommandManager))
+            using (var stdoutManager = new OutputManager(ExecutionContext, ActionCommandManager, container))
+            using (var stderrManager = new OutputManager(ExecutionContext, ActionCommandManager, container))
             {
                 var runExitCode = await dockerManger.DockerRun(ExecutionContext, container, stdoutManager.OnDataReceived, stderrManager.OnDataReceived);
+                ExecutionContext.Debug($"Docker Action run completed with exit code {runExitCode}");
                 if (runExitCode != 0)
                 {
-                    ExecutionContext.Error($"Docker run failed with exit code {runExitCode}");
                     ExecutionContext.Result = TaskResult.Failed;
                 }
             }
@@ -122,9 +122,9 @@ namespace GitHub.Runner.Worker.Handlers
             else
             {
                 var exitCode = await step;
+                ExecutionContext.Debug($"Node Action run completed with exit code {exitCode}");
                 if (exitCode != 0)
                 {
-                    ExecutionContext.Error($"Node run failed with exit code {exitCode}");
                     ExecutionContext.Result = TaskResult.Failed;
                 }
             }
@@ -6,6 +6,7 @@ using System.Linq;
 using System.Text.RegularExpressions;
 using GitHub.Runner.Common;
 using GitHub.Runner.Sdk;
+using GitHub.Runner.Worker.Container;
 using DTWebApi = GitHub.DistributedTask.WebApi;

 namespace GitHub.Runner.Worker.Handlers
@@ -17,6 +18,7 @@ namespace GitHub.Runner.Worker.Handlers
         private const string _timeoutKey = "GITHUB_ACTIONS_RUNNER_ISSUE_MATCHER_TIMEOUT";
         private static readonly Regex _colorCodeRegex = new Regex(@"\x0033\[[0-9;]*m?", RegexOptions.Compiled | RegexOptions.CultureInvariant);
         private readonly IActionCommandManager _commandManager;
+        private readonly ContainerInfo _container;
         private readonly IExecutionContext _executionContext;
         private readonly int _failsafe = 50;
         private readonly object _matchersLock = new object();
@@ -25,10 +27,11 @@ namespace GitHub.Runner.Worker.Handlers
         // Mapping that indicates whether a directory belongs to the workflow repository
         private readonly Dictionary<string, string> _directoryMap = new Dictionary<string, string>();

-        public OutputManager(IExecutionContext executionContext, IActionCommandManager commandManager)
+        public OutputManager(IExecutionContext executionContext, IActionCommandManager commandManager, ContainerInfo container = null)
         {
             _executionContext = executionContext;
             _commandManager = commandManager;
+            _container = container ?? executionContext.Container;

             // Recursion failsafe (test override)
             var failsafeString = Environment.GetEnvironmentVariable("RUNNER_TEST_GET_REPOSITORY_PATH_FAILSAFE");
@@ -257,6 +260,7 @@ namespace GitHub.Runner.Worker.Handlers
                 if (!string.IsNullOrWhiteSpace(match.File))
                 {
                     var file = match.File;
+                    var translate = _container != null;

                     // Root using fromPath
                     if (!string.IsNullOrWhiteSpace(match.FromPath) && !Path.IsPathFullyQualified(file))
@@ -275,11 +279,19 @@ namespace GitHub.Runner.Worker.Handlers
                         ArgUtil.NotNullOrEmpty(workspace, "workspace");

                         file = Path.Combine(workspace, file);
+                        translate = false;
                     }

                     // Remove relative pathing and normalize slashes
                     file = Path.GetFullPath(file);
+
+                    // Translate to host
+                    if (translate)
+                    {
+                        file = _container.TranslateToHostPath(file);
+                        file = Path.GetFullPath(file);
+                    }

                     // Check whether the file exists
                     if (File.Exists(file))
                     {
@@ -76,15 +76,38 @@ namespace GitHub.Runner.Worker
                 // Start
                 step.ExecutionContext.Start();

-                // Set GITHUB_ACTION
-                if (step is IActionRunner actionStep)
-                {
-                    step.ExecutionContext.SetGitHubContext("action", actionStep.Action.Name);
-                }
-
                 // Initialize scope
                 if (InitializeScope(step, scopeInputs))
                 {
+                    // Populate env context for each step
+                    Trace.Info("Initialize Env context for step");
+#if OS_WINDOWS
+                    var envContext = new DictionaryContextData();
+#else
+                    var envContext = new CaseSensitiveDictionaryContextData();
+#endif
+                    step.ExecutionContext.ExpressionValues["env"] = envContext;
+                    foreach (var pair in step.ExecutionContext.EnvironmentVariables)
+                    {
+                        envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
+                    }
+
+                    if (step is IActionRunner actionStep)
+                    {
+                        // Set GITHUB_ACTION
+                        step.ExecutionContext.SetGitHubContext("action", actionStep.Action.Name);
+
+                        // Evaluate and merge action's env block to env context
+                        var templateTrace = step.ExecutionContext.ToTemplateTraceWriter();
+                        var schema = new PipelineTemplateSchemaFactory().CreateSchema();
+                        var templateEvaluator = new PipelineTemplateEvaluator(templateTrace, schema);
+                        var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(actionStep.Action.Environment, step.ExecutionContext.ExpressionValues, VarUtil.EnvironmentVariableKeyComparer);
+                        foreach (var env in actionEnvironment)
+                        {
+                            envContext[env.Key] = new StringContextData(env.Value ?? string.Empty);
+                        }
+                    }
+
                     var expressionManager = HostContext.GetService<IExpressionManager>();
                     try
                     {
@@ -5,7 +5,7 @@ using GitHub.DistributedTask.Expressions2.Sdk.Functions;

 namespace GitHub.DistributedTask.Expressions2
 {
-    internal static class ExpressionConstants
+    public static class ExpressionConstants
     {
         static ExpressionConstants()
         {
@@ -24,6 +24,12 @@ namespace GitHub.DistributedTask.Expressions2
             WellKnownFunctions.Add(name, new FunctionInfo<T>(name, minParameters, maxParameters));
         }

+        public static void UpdateFunction<T>(String name, Int32 minParameters, Int32 maxParameters)
+            where T : Function, new()
+        {
+            WellKnownFunctions[name] = new FunctionInfo<T>(name, minParameters, maxParameters);
+        }
+
         internal static readonly String False = "false";
         internal static readonly String Infinity = "Infinity";
         internal static readonly Int32 MaxDepth = 50;
@@ -16,6 +16,11 @@ namespace GitHub.DistributedTask.Logging
         {
             return Convert.ToBase64String(Encoding.UTF8.GetBytes(value));
         }
+
+        public static String Base64StringEscapeTrimmed(String value)
+        {
+            return TrimBase64End(Convert.ToBase64String(Encoding.UTF8.GetBytes(value)));
+        }

         // Base64 is 6 bits -> char
         // A byte is 8 bits
@@ -67,15 +72,15 @@ namespace GitHub.DistributedTask.Logging
             {
                 var shiftArray = new byte[bytes.Length - shift];
                 Array.Copy(bytes, shift, shiftArray, 0, bytes.Length - shift);
-                return Convert.ToBase64String(shiftArray);
+                return TrimBase64End(Convert.ToBase64String(shiftArray));
             }
             else
             {
-                return Convert.ToBase64String(bytes);
+                return TrimBase64End(Convert.ToBase64String(bytes));
             }
         }

-        private static String UriDataEscape(
+        public static String UriDataEscape(
             String value,
             Int32 maxSegmentSize)
         {
@@ -103,5 +108,26 @@ namespace GitHub.DistributedTask.Logging

             return result.ToString();
         }
+
+        private static String TrimBase64End(String value)
+        {
+            if (String.IsNullOrEmpty(value))
+            {
+                return String.Empty;
+            }
+            if (value.EndsWith('='))
+            {
+                var trimmed = value.TrimEnd('=');
+                if (trimmed.Length > 1)
+                {
+                    // If a base64 string ends in '=' it indicates that the base 64 character is only using 2 or 4 of the six bytes and will change if another character is added
+                    // For example 'ab' is 'YWI=' in base 64
+                    // 'abc' is 'YWJj'
+                    // We need to detect YW, not YWI so we trim the last character ('I')
+                    return trimmed.Substring(0, trimmed.Length - 1);
+                }
+            }
+            return value;
+        }
     }
 }
@@ -35,6 +35,19 @@ namespace GitHub.DistributedTask.Pipelines.ContextData
             throw new ArgumentException($"Unexpected type '{value?.GetType().Name}' encountered while reading '{objectDescription}'. The type '{nameof(DictionaryContextData)}' was expected.");
         }

+        [EditorBrowsable(EditorBrowsableState.Never)]
+        public static CaseSensitiveDictionaryContextData AssertCaseSensitiveDictionary(
+            this PipelineContextData value,
+            String objectDescription)
+        {
+            if (value is CaseSensitiveDictionaryContextData dictionary)
+            {
+                return dictionary;
+            }
+
+            throw new ArgumentException($"Unexpected type '{value?.GetType().Name}' encountered while reading '{objectDescription}'. The type '{nameof(CaseSensitiveDictionaryContextData)}' was expected.");
+        }
+
         [EditorBrowsable(EditorBrowsableState.Never)]
         public static StringContextData AssertString(
             this PipelineContextData value,
@@ -93,12 +93,21 @@ namespace GitHub.Runner.Common.Tests
                 Assert.Equal("123***123", _hc.SecretMasker.MaskSecrets("123Pass%20word%20123%21123"));
                 Assert.Equal("123***123", _hc.SecretMasker.MaskSecrets("123Pass<word>123!123"));
                 Assert.Equal("123***123", _hc.SecretMasker.MaskSecrets("123Pass''word''123!123"));
-                Assert.Equal("OlBh***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($":Password123!"))));
-                Assert.Equal("YTpQ***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"a:Password123!"))));
+                Assert.Equal("OlBh***Q==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($":Password123!"))));
+                Assert.Equal("YTpQ***E=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"a:Password123!"))));
                 Assert.Equal("YWI6***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"ab:Password123!"))));
-                Assert.Equal("YWJjOlBh***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abc:Password123!"))));
-                Assert.Equal("YWJjZDpQ***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abcd:Password123!"))));
+                Assert.Equal("YWJjOlBh***Q==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abc:Password123!"))));
+                Assert.Equal("YWJjZDpQ***E=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abcd:Password123!"))));
                 Assert.Equal("YWJjZGU6***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abcde:Password123!"))));
+                Assert.Equal("***Og==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:"))));
+                Assert.Equal("***OmE=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:a"))));
+                Assert.Equal("***OmFi", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:ab"))));
+                Assert.Equal("***OmFiYw==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:abc"))));
+                Assert.Equal("***OmFiY2Q=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:abcd"))));
+                Assert.Equal("***OmFiY2Rl", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:abcde"))));
+                Assert.Equal("OlBh***To=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($":Password123!:"))));
+                Assert.Equal("YTpQ***E6YQ==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"a:Password123!:a"))));
+                Assert.Equal("YWJjOlBh***Tph", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abc:Password123!:a"))));
             }
             finally
             {
@@ -498,7 +498,7 @@ namespace GitHub.Runner.Common.Tests
             _promptManager
                 .Setup(x => x.ReadValue(
                     Constants.Runner.CommandLine.Args.Token, // argName
-                    "Enter runner deletion token:", // description
+                    "Enter runner remove token:", // description
                     true, // secret
                     string.Empty, // defaultValue
                     Validators.NonEmptyValidator, // validator
@@ -373,6 +373,45 @@ namespace GitHub.Runner.Common.Tests.Worker
             }
         }

+        [Fact]
+        [Trait("Level", "L0")]
+        [Trait("Category", "Worker")]
+        public async void PrepareActions_RepositoryActionWithActionYamlFile_DockerHubImage()
+        {
+            try
+            {
+                //Arrange
+                Setup();
+                var actionId = Guid.NewGuid();
+                var actions = new List<Pipelines.ActionStep>
+                {
+                    new Pipelines.ActionStep()
+                    {
+                        Name = "action",
+                        Id = actionId,
+                        Reference = new Pipelines.RepositoryPathReference()
+                        {
+                            Name = "TingluoHuang/runner_L0",
+                            Ref = "RepositoryActionWithActionYamlFile_DockerHubImage",
+                            RepositoryType = "GitHub"
+                        }
+                    }
+                };
+
+                var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "RepositoryActionWithActionYamlFile_DockerHubImage");
+
+                //Act
+                var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
+
+                Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId);
+                Assert.Equal("ubuntu:18.04", (steps[0].Data as ContainerSetupInfo).Container.Image);
+            }
+            finally
+            {
+                Teardown();
+            }
+        }
+
         [Fact]
         [Trait("Level", "L0")]
         [Trait("Category", "Worker")]
@@ -672,7 +711,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'GitHub'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -772,7 +811,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'GitHub'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -871,7 +910,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'GitHub'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -925,6 +964,87 @@ runs:
             }
         }

+        [Fact]
+        [Trait("Level", "L0")]
+        [Trait("Category", "Worker")]
+        public void LoadsNodeActionDefinitionYaml()
+        {
+            try
+            {
+                // Arrange.
+                Setup();
+                const string Content = @"
+# Container action
+name: 'Hello World'
+description: 'Greet the world and record the time'
+author: 'GitHub'
+inputs:
+  greeting: # id of input
+    description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
+    required: true
+    default: 'Hello'
+  entryPoint: # id of input
+    description: 'optional docker entrypoint overwrite.'
+    required: false
+outputs:
+  time: # id of output
+    description: 'The time we did the greeting'
+icon: 'hello.svg' # vector art to display in the GitHub Marketplace
+color: 'green' # optional, decorates the entry in the GitHub Marketplace
+runs:
+  using: 'node12'
+  main: 'task.js'
+";
+                Pipelines.ActionStep instance;
+                string directory;
+                directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master");
+                string file = Path.Combine(directory, Constants.Path.ActionManifestYamlFile);
+                Directory.CreateDirectory(Path.GetDirectoryName(file));
+                File.WriteAllText(file, Content);
+                instance = new Pipelines.ActionStep()
+                {
+                    Id = Guid.NewGuid(),
+                    Reference = new Pipelines.RepositoryPathReference()
+                    {
+                        Name = "GitHub/actions",
+                        Ref = "master",
+                        RepositoryType = Pipelines.RepositoryTypes.GitHub
+                    }
+                };
+
+                // Act.
+                Definition definition = _actionManager.LoadAction(_ec.Object, instance);
+
+                // Assert.
+                Assert.NotNull(definition);
+                Assert.Equal(directory, definition.Directory);
+                Assert.NotNull(definition.Data);
+                Assert.NotNull(definition.Data.Inputs); // inputs
+                Dictionary<string, string> inputDefaults = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
+                foreach (var input in definition.Data.Inputs)
+                {
+                    var name = input.Key.AssertString("key").Value;
+                    var value = input.Value.AssertScalar("value").ToString();
+
+                    _hc.GetTrace().Info($"Default: {name} = {value}");
+                    inputDefaults[name] = value;
+                }
+
+                Assert.Equal(2, inputDefaults.Count);
+                Assert.True(inputDefaults.ContainsKey("greeting"));
+                Assert.Equal("Hello", inputDefaults["greeting"]);
+                Assert.True(string.IsNullOrEmpty(inputDefaults["entryPoint"]));
+                Assert.NotNull(definition.Data.Execution); // execution
+
+                Assert.NotNull((definition.Data.Execution as NodeJSActionExecutionData));
+                Assert.Equal("task.js", (definition.Data.Execution as NodeJSActionExecutionData).Script);
+            }
+            finally
+            {
+                Teardown();
+            }
+        }
+
         [Fact]
         [Trait("Level", "L0")]
         [Trait("Category", "Worker")]
@@ -940,7 +1060,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'GitHub'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -1039,7 +1159,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'GitHub'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -1137,7 +1257,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'GitHub'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -1205,7 +1325,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'GitHub'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -1276,7 +1396,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'GitHub'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -1376,7 +1496,7 @@ runs:
 name: 'Hello World'
 description: 'Greet the world and record the time'
 author: 'Test Corporation'
 inputs:
   greeting: # id of input
     description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
     required: true
@@ -1433,7 +1553,7 @@ runs:
         private void CreateAction(string yamlContent, out Pipelines.ActionStep instance, out string directory)
         {
             directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master");
-            string file = Path.Combine(directory, Constants.Path.ActionManifestFile);
+            string file = Path.Combine(directory, Constants.Path.ActionManifestYmlFile);
             Directory.CreateDirectory(Path.GetDirectoryName(file));
             File.WriteAllText(file, yamlContent);
             instance = new Pipelines.ActionStep()
@@ -1451,7 +1571,7 @@ runs:
         private void CreateSelfRepoAction(string yamlContent, out Pipelines.ActionStep instance, out string directory)
         {
             directory = Path.Combine(_workFolder, "actions", "actions");
-            string file = Path.Combine(directory, Constants.Path.ActionManifestFile);
+            string file = Path.Combine(directory, Constants.Path.ActionManifestYmlFile);
             Directory.CreateDirectory(Path.GetDirectoryName(file));
             File.WriteAllText(file, yamlContent);
             instance = new Pipelines.ActionStep()
@@ -109,6 +109,52 @@ namespace GitHub.Runner.Common.Tests.Worker
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
[Fact]
|
||||||
|
[Trait("Level", "L0")]
|
||||||
|
[Trait("Category", "Worker")]
|
||||||
|
public void Load_ContainerAction_Dockerfile_Post_DefaultCondition()
|
||||||
|
{
|
||||||
|
try
|
||||||
|
{
|
||||||
|
//Arrange
|
||||||
|
Setup();
|
||||||
|
|
||||||
|
var actionManifest = new ActionManifestManager();
|
||||||
|
actionManifest.Initialize(_hc);
|
||||||
|
|
||||||
|
//Act
|
||||||
|
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "dockerfileaction_cleanup_default.yml"));
|
||||||
|
|
||||||
|
//Assert
|
||||||
|
|
||||||
|
Assert.Equal("Hello World", result.Name);
|
||||||
|
Assert.Equal("Greet the world and record the time", result.Description);
|
||||||
|
Assert.Equal(2, result.Inputs.Count);
|
||||||
|
Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
|
||||||
|
Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
|
||||||
|
Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
|
||||||
|
Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
|
||||||
|
|
||||||
|
Assert.Equal(ActionExecutionType.Container, result.Execution.ExecutionType);
|
||||||
|
|
||||||
|
var containerAction = result.Execution as ContainerActionExecutionData;
|
||||||
|
|
||||||
|
Assert.Equal("Dockerfile", containerAction.Image);
|
||||||
|
Assert.Equal("main.sh", containerAction.EntryPoint);
|
||||||
|
Assert.Equal("cleanup.sh", containerAction.Cleanup);
|
||||||
|
Assert.Equal("always()", containerAction.CleanupCondition);
|
||||||
|
Assert.Equal("bzz", containerAction.Arguments[0].ToString());
|
||||||
|
Assert.Equal("Token", containerAction.Environment[0].Key.ToString());
|
||||||
|
Assert.Equal("foo", containerAction.Environment[0].Value.ToString());
|
||||||
|
Assert.Equal("Url", containerAction.Environment[1].Key.ToString());
|
||||||
|
Assert.Equal("bar", containerAction.Environment[1].Value.ToString());
|
||||||
|
}
|
||||||
|
finally
|
||||||
|
{
|
||||||
|
Teardown();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
[Fact]
|
[Fact]
|
||||||
[Trait("Level", "L0")]
|
[Trait("Level", "L0")]
|
||||||
[Trait("Category", "Worker")]
|
[Trait("Category", "Worker")]
|
||||||
@@ -319,6 +365,50 @@ namespace GitHub.Runner.Common.Tests.Worker
             }
         }
 
+        [Fact]
+        [Trait("Level", "L0")]
+        [Trait("Category", "Worker")]
+        public void Load_NodeAction_Cleanup_DefaultCondition()
+        {
+            try
+            {
+                //Arrange
+                Setup();
+
+                var actionManifest = new ActionManifestManager();
+                actionManifest.Initialize(_hc);
+
+                //Act
+                var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "nodeaction_cleanup_default.yml"));
+
+                //Assert
+                Assert.Equal("Hello World", result.Name);
+                Assert.Equal("Greet the world and record the time", result.Description);
+                Assert.Equal(2, result.Inputs.Count);
+                Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
+                Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
+                Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
+                Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
+                Assert.Equal(1, result.Deprecated.Count);
+
+                Assert.True(result.Deprecated.ContainsKey("greeting"));
+                result.Deprecated.TryGetValue("greeting", out string value);
+                Assert.Equal("This property has been deprecated", value);
+
+                Assert.Equal(ActionExecutionType.NodeJS, result.Execution.ExecutionType);
+
+                var nodeAction = result.Execution as NodeJSActionExecutionData;
+
+                Assert.Equal("main.js", nodeAction.Script);
+                Assert.Equal("cleanup.js", nodeAction.Cleanup);
+                Assert.Equal("always()", nodeAction.CleanupCondition);
+            }
+            finally
+            {
+                Teardown();
+            }
+        }
+
         [Fact]
         [Trait("Level", "L0")]
         [Trait("Category", "Worker")]
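Both manifest tests added above end by asserting `CleanupCondition == "always()"`: the fixture YAML declares a `post:` (node) or `post-entrypoint:` (docker) step with no explicit post condition, and the loader is expected to fall back to `always()` so the cleanup step runs regardless of how the main step finished. A minimal sketch of that defaulting rule, using made-up names rather than the runner's `ActionManifestManager` internals:

```csharp
using System;

// Illustrative only -- not the runner's code. Shows the defaulting rule the
// *_DefaultCondition tests above assert: a post/cleanup step with no declared
// condition falls back to "always()".
static class CleanupConditionDefaults
{
    public const string Default = "always()";

    // Hypothetical helper: use the declared condition if present, else the default.
    public static string Resolve(string declaredCondition) =>
        string.IsNullOrEmpty(declaredCondition) ? Default : declaredCondition;
}

class Program
{
    static void Main()
    {
        Console.WriteLine(CleanupConditionDefaults.Resolve(null));        // always()
        Console.WriteLine(CleanupConditionDefaults.Resolve("success()")); // success()
    }
}
```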
@@ -314,6 +314,12 @@ namespace GitHub.Runner.Common.Tests.Worker
             githubContext.Add("event", JToken.Parse("{\"foo\":\"bar\"}").ToPipelineContextData());
             _context.Add("github", githubContext);
 
+#if OS_WINDOWS
+            _context["env"] = new DictionaryContextData();
+#else
+            _context["env"] = new CaseSensitiveDictionaryContextData();
+#endif
+
             _ec = new Mock<IExecutionContext>();
             _ec.Setup(x => x.ExpressionValues).Returns(_context);
             _ec.Setup(x => x.IntraActionState).Returns(new Dictionary<string, string>());
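A note on the `#if OS_WINDOWS` split added above: environment variable names are case-insensitive on Windows but case-sensitive on Linux and macOS, so the test seeds the `env` expression context with a case-insensitive dictionary on Windows and a case-sensitive one elsewhere (`DictionaryContextData` and `CaseSensitiveDictionaryContextData` are the runner's own context types). The same distinction with plain BCL dictionaries, as a quick illustration:

```csharp
using System;
using System.Collections.Generic;

class EnvCaseSensitivityDemo
{
    static void Main()
    {
        // Windows-style lookup: "PATH" and "Path" refer to the same entry.
        var windowsEnv = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            ["Path"] = @"C:\tools"
        };
        Console.WriteLine(windowsEnv.ContainsKey("PATH")); // True

        // Linux/macOS-style lookup: the key must match exactly.
        var linuxEnv = new Dictionary<string, string>(StringComparer.Ordinal)
        {
            ["Path"] = "/usr/local/bin"
        };
        Console.WriteLine(linuxEnv.ContainsKey("PATH")); // False
        Console.WriteLine(linuxEnv.ContainsKey("Path")); // True
    }
}
```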
@@ -8,6 +8,7 @@ using System.Threading.Tasks;
 using System.Runtime.CompilerServices;
 using GitHub.Runner.Sdk;
 using GitHub.Runner.Worker;
+using GitHub.Runner.Worker.Container;
 using GitHub.Runner.Worker.Handlers;
 using Moq;
 using Xunit;
@@ -748,6 +749,130 @@ namespace GitHub.Runner.Common.Tests.Worker
             Environment.SetEnvironmentVariable("RUNNER_TEST_GET_REPOSITORY_PATH_FAILSAFE", "");
         }
 
+#if OS_LINUX
+        [Fact]
+        [Trait("Level", "L0")]
+        [Trait("Category", "Worker")]
+        public async void MatcherFile_JobContainer()
+        {
+            var matchers = new IssueMatchersConfig
+            {
+                Matchers =
+                {
+                    new IssueMatcherConfig
+                    {
+                        Owner = "my-matcher-1",
+                        Patterns = new[]
+                        {
+                            new IssuePatternConfig
+                            {
+                                Pattern = @"(.+): (.+)",
+                                File = 1,
+                                Message = 2,
+                            },
+                        },
+                    },
+                },
+            };
+            var container = new ContainerInfo();
+            using (var hostContext = Setup(matchers: matchers, jobContainer: container))
+            using (_outputManager)
+            {
+                // Setup github.workspace, github.repository
+                var workDirectory = hostContext.GetDirectory(WellKnownDirectory.Work);
+                ArgUtil.NotNullOrEmpty(workDirectory, nameof(workDirectory));
+                Directory.CreateDirectory(workDirectory);
+                var workspaceDirectory = Path.Combine(workDirectory, "workspace");
+                Directory.CreateDirectory(workspaceDirectory);
+                _executionContext.Setup(x => x.GetGitHubContext("workspace")).Returns(workspaceDirectory);
+                _executionContext.Setup(x => x.GetGitHubContext("repository")).Returns("my-org/workflow-repo");
+
+                // Setup a git repository
+                await CreateRepository(hostContext, workspaceDirectory, "https://github.com/my-org/workflow-repo");
+
+                // Create test files
+                var file = Path.Combine(workspaceDirectory, "some-file.txt");
+                File.WriteAllText(file, "");
+
+                // Add translation path
+                container.AddPathTranslateMapping(workspaceDirectory, "/container/path/to/workspace");
+
+                // Process
+                Process($"/container/path/to/workspace/some-file.txt: some error 1");
+                Process($"some-file.txt: some error 2");
+
+                Assert.Equal(2, _issues.Count);
+
+                Assert.Equal("some error 1", _issues[0].Item1.Message);
+                Assert.Equal("some-file.txt", _issues[0].Item1.Data["file"]);
+
+                Assert.Equal("some error 2", _issues[1].Item1.Message);
+                Assert.Equal("some-file.txt", _issues[1].Item1.Data["file"]);
+            }
+        }
+
+        [Fact]
+        [Trait("Level", "L0")]
+        [Trait("Category", "Worker")]
+        public async void MatcherFile_StepContainer()
+        {
+            var matchers = new IssueMatchersConfig
+            {
+                Matchers =
+                {
+                    new IssueMatcherConfig
+                    {
+                        Owner = "my-matcher-1",
+                        Patterns = new[]
+                        {
+                            new IssuePatternConfig
+                            {
+                                Pattern = @"(.+): (.+)",
+                                File = 1,
+                                Message = 2,
+                            },
+                        },
+                    },
+                },
+            };
+            var container = new ContainerInfo();
+            using (var hostContext = Setup(matchers: matchers, stepContainer: container))
+            using (_outputManager)
+            {
+                // Setup github.workspace, github.repository
+                var workDirectory = hostContext.GetDirectory(WellKnownDirectory.Work);
+                ArgUtil.NotNullOrEmpty(workDirectory, nameof(workDirectory));
+                Directory.CreateDirectory(workDirectory);
+                var workspaceDirectory = Path.Combine(workDirectory, "workspace");
+                Directory.CreateDirectory(workspaceDirectory);
+                _executionContext.Setup(x => x.GetGitHubContext("workspace")).Returns(workspaceDirectory);
+                _executionContext.Setup(x => x.GetGitHubContext("repository")).Returns("my-org/workflow-repo");
+
+                // Setup a git repository
+                await CreateRepository(hostContext, workspaceDirectory, "https://github.com/my-org/workflow-repo");
+
+                // Create test files
+                var file = Path.Combine(workspaceDirectory, "some-file.txt");
+                File.WriteAllText(file, "");
+
+                // Add translation path
+                container.AddPathTranslateMapping(workspaceDirectory, "/container/path/to/workspace");
+
+                // Process
+                Process($"/container/path/to/workspace/some-file.txt: some error 1");
+                Process($"some-file.txt: some error 2");
+
+                Assert.Equal(2, _issues.Count);
+
+                Assert.Equal("some error 1", _issues[0].Item1.Message);
+                Assert.Equal("some-file.txt", _issues[0].Item1.Data["file"]);
+
+                Assert.Equal("some error 2", _issues[1].Item1.Message);
+                Assert.Equal("some-file.txt", _issues[1].Item1.Data["file"]);
+            }
+        }
+#endif
+
         [Fact]
         [Trait("Level", "L0")]
         [Trait("Category", "Worker")]
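For context on the two container tests above: output produced inside a job or step container reports container paths, while the problem-matcher `file` field should resolve against the repository checked out on the host. `AddPathTranslateMapping` registers the host-to-container mapping, and the expectation is that `/container/path/to/workspace/some-file.txt` comes back as `some-file.txt` relative to the repo. A rough, self-contained sketch of that kind of prefix translation (illustrative only; the runner's `ContainerInfo`/`OutputManager` hold the real logic):

```csharp
using System;
using System.Collections.Generic;

// Illustrative only: a tiny stand-in for the container-to-host path mapping
// the tests above register via AddPathTranslateMapping.
class PathTranslator
{
    private readonly List<(string Host, string Container)> _mappings = new List<(string, string)>();

    public void Add(string hostPath, string containerPath) => _mappings.Add((hostPath, containerPath));

    // Rewrite a container path back to a host path when a registered prefix matches;
    // otherwise return the input unchanged (as with the bare "some-file.txt" case).
    public string ToHostPath(string path)
    {
        foreach (var (host, container) in _mappings)
        {
            if (path.StartsWith(container, StringComparison.Ordinal))
            {
                return host + path.Substring(container.Length);
            }
        }
        return path;
    }
}

class Demo
{
    static void Main()
    {
        var translator = new PathTranslator();
        translator.Add("/home/runner/work/workspace", "/container/path/to/workspace");
        Console.WriteLine(translator.ToHostPath("/container/path/to/workspace/some-file.txt"));
        // -> /home/runner/work/workspace/some-file.txt, which then resolves to some-file.txt in the repo
    }
}
```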
@@ -806,7 +931,9 @@ namespace GitHub.Runner.Common.Tests.Worker
 
         private TestHostContext Setup(
             [CallerMemberName] string name = "",
-            IssueMatchersConfig matchers = null)
+            IssueMatchersConfig matchers = null,
+            ContainerInfo jobContainer = null,
+            ContainerInfo stepContainer = null)
         {
             matchers?.Validate();
 
@@ -824,6 +951,8 @@ namespace GitHub.Runner.Common.Tests.Worker
                 .Returns(true);
             _executionContext.Setup(x => x.Variables)
                 .Returns(_variables);
+            _executionContext.Setup(x => x.Container)
+                .Returns(jobContainer);
             _executionContext.Setup(x => x.GetMatchers())
                 .Returns(matchers?.Matchers ?? new List<IssueMatcherConfig>());
             _executionContext.Setup(x => x.Add(It.IsAny<OnMatcherChanged>()))
@@ -856,7 +985,7 @@ namespace GitHub.Runner.Common.Tests.Worker
                 return false;
             });
 
-            _outputManager = new OutputManager(_executionContext.Object, _commandManager.Object);
+            _outputManager = new OutputManager(_executionContext.Object, _commandManager.Object, stepContainer);
             return hostContext;
         }
 
@@ -19,6 +19,7 @@ namespace GitHub.Runner.Common.Tests.Worker
         private Mock<IExecutionContext> _ec;
         private StepsRunner _stepsRunner;
         private Variables _variables;
+        private Dictionary<string, string> _env;
         private DictionaryContextData _contexts;
         private JobContext _jobContext;
         private StepsContext _stepContext;
@@ -32,6 +33,11 @@ namespace GitHub.Runner.Common.Tests.Worker
             _variables = new Variables(
                 hostContext: hc,
                 copy: variablesToCopy);
+            _env = new Dictionary<string, string>()
+            {
+                {"env1", "1"},
+                {"test", "github_actions"}
+            };
             _ec = new Mock<IExecutionContext>();
             _ec.SetupAllProperties();
             _ec.Setup(x => x.Variables).Returns(_variables);
@@ -64,9 +70,9 @@ namespace GitHub.Runner.Common.Tests.Worker
             // Arrange.
             var variableSets = new[]
             {
-                new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
-                new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
-                new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") }
+                new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
+                new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
+                new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") }
             };
             foreach (var variableSet in variableSets)
             {
@@ -96,12 +102,12 @@ namespace GitHub.Runner.Common.Tests.Worker
             // Arrange.
             var variableSets = new[]
             {
-                new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "success()") },
-                new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "success() || failure()") },
-                new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "always()") },
-                new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "success()", true) },
-                new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "success() || failure()", true) },
-                new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "always()", true) }
+                new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "success()") },
+                new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
+                new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "always()") },
+                new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "success()", true) },
+                new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "success() || failure()", true) },
+                new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "always()", true) }
             };
             foreach (var variableSet in variableSets)
             {
@@ -133,12 +139,12 @@ namespace GitHub.Runner.Common.Tests.Worker
             {
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
                     Expected = false,
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
                     Expected = true,
                 },
             };
@@ -172,27 +178,27 @@ namespace GitHub.Runner.Common.Tests.Worker
             {
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
                     Expected = TaskResult.Succeeded,
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
                     Expected = TaskResult.Failed,
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
                     Expected = TaskResult.Failed,
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "always()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "always()") },
                     Expected = TaskResult.Failed,
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "always()", true) },
+                    Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "always()", true) },
                     Expected = TaskResult.Succeeded,
                 },
             };
@@ -226,47 +232,47 @@ namespace GitHub.Runner.Common.Tests.Worker
             {
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
                     Expected = TaskResult.Failed
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
                     Expected = TaskResult.Failed
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
                     Expected = TaskResult.Failed
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Failed, "success()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Failed, "success()") },
                     Expected = TaskResult.Failed
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Succeeded, "success()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Succeeded, "success()") },
                     Expected = TaskResult.Succeeded
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Failed, "success()", continueOnError: true) },
+                    Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true) },
                     Expected = TaskResult.Succeeded
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Succeeded, "success() || failure()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
                     Expected = TaskResult.Succeeded
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "success()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "success()") },
                     Expected = TaskResult.Failed
                 },
                 new
                 {
-                    Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
+                    Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
                     Expected = TaskResult.Succeeded
                 },
                 // Abandoned
@@ -304,17 +310,17 @@ namespace GitHub.Runner.Common.Tests.Worker
             {
                 new
                 {
-                    Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
+                    Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
                     Expected = false
                 },
                 new
                 {
-                    Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
+                    Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
                     Expected = true
                 },
                 new
                 {
-                    Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
+                    Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
                     Expected = true
                 }
             };
@@ -345,9 +351,9 @@ namespace GitHub.Runner.Common.Tests.Worker
             // Arrange.
             var variableSets = new[]
             {
-                new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
-                new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
-                new[] { CreateStep(TaskResult.Canceled, "success()"), CreateStep(TaskResult.Succeeded, "always()") }
+                new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
+                new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
+                new[] { CreateStep(hc, TaskResult.Canceled, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") }
             };
             foreach (var variableSet in variableSets)
             {
@@ -381,8 +387,8 @@ namespace GitHub.Runner.Common.Tests.Worker
             // Arrange.
             var variableSets = new[]
             {
-                new[] { CreateStep(TaskResult.Succeeded, "success()") },
-                new[] { CreateStep(TaskResult.Succeeded, "success()") },
+                new[] { CreateStep(hc, TaskResult.Succeeded, "success()") },
+                new[] { CreateStep(hc, TaskResult.Succeeded, "success()") },
             };
             foreach (var variableSet in variableSets)
             {
@@ -399,18 +405,134 @@ namespace GitHub.Runner.Common.Tests.Worker
             }
         }
 
-        private Mock<IStep> CreateStep(TaskResult result, string condition, Boolean continueOnError = false)
+        [Fact]
+        [Trait("Level", "L0")]
+        [Trait("Category", "Worker")]
+        public async Task StepEnvOverrideJobEnvContext()
+        {
+            using (TestHostContext hc = CreateTestContext())
+            {
+                // Arrange.
+                var env1 = new MappingToken(null, null, null);
+                env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
+                env1.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "env.test"));
+                var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1);
+
+                _ec.Object.Result = null;
+
+                _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object }));
+
+                // Act.
+                await _stepsRunner.RunAsync(jobContext: _ec.Object);
+
+                // Assert.
+                Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
+
+#if OS_WINDOWS
+                Assert.Equal("100", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("100"));
+                Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("github_actions"));
+#else
+                Assert.Equal("100", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("100"));
+                Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("github_actions"));
+#endif
+            }
+        }
+
+        [Fact]
+        [Trait("Level", "L0")]
+        [Trait("Category", "Worker")]
+        public async Task PopulateEnvContextForEachStep()
+        {
+            using (TestHostContext hc = CreateTestContext())
+            {
+                // Arrange.
+                var env1 = new MappingToken(null, null, null);
+                env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
+                env1.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "env.test"));
+                var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1);
+
+                var env2 = new MappingToken(null, null, null);
+                env2.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "1000"));
+                env2.Add(new StringToken(null, null, null, "env3"), new BasicExpressionToken(null, null, null, "env.test"));
+                var step2 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env2);
+
+                _ec.Object.Result = null;
+
+                _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object }));
+
+                // Act.
+                await _stepsRunner.RunAsync(jobContext: _ec.Object);
+
+                // Assert.
+                Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
+#if OS_WINDOWS
+                Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
+                Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env3"].AssertString("github_actions"));
+                Assert.False(_ec.Object.ExpressionValues["env"].AssertDictionary("env").ContainsKey("env2"));
+#else
+                Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
+                Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env3"].AssertString("github_actions"));
+                Assert.False(_ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env").ContainsKey("env2"));
+#endif
+            }
+        }
+
+        [Fact]
+        [Trait("Level", "L0")]
+        [Trait("Category", "Worker")]
+        public async Task PopulateEnvContextAfterSetupStepsContext()
+        {
+            using (TestHostContext hc = CreateTestContext())
+            {
+                // Arrange.
+                var env1 = new MappingToken(null, null, null);
+                env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
+                var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1, name: "foo", setOutput: true);
+
+                var env2 = new MappingToken(null, null, null);
+                env2.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "1000"));
+                env2.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "steps.foo.outputs.test"));
+                var step2 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env2);
+
+                _ec.Object.Result = null;
+
+                _ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object }));
+
+                // Act.
+                await _stepsRunner.RunAsync(jobContext: _ec.Object);
+
+                // Assert.
+                Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
+#if OS_WINDOWS
+                Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
+                Assert.Equal("something", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("something"));
+#else
+                Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
+                Assert.Equal("something", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("something"));
+#endif
+            }
+        }
+
+        private Mock<IActionRunner> CreateStep(TestHostContext hc, TaskResult result, string condition, Boolean continueOnError = false, MappingToken env = null, string name = "Test", bool setOutput = false)
         {
             // Setup the step.
-            var step = new Mock<IStep>();
+            var step = new Mock<IActionRunner>();
             step.Setup(x => x.Condition).Returns(condition);
             step.Setup(x => x.ContinueOnError).Returns(new BooleanToken(null, null, null, continueOnError));
-            step.Setup(x => x.RunAsync()).Returns(Task.CompletedTask);
+            step.Setup(x => x.Action)
+                .Returns(new DistributedTask.Pipelines.ActionStep()
+                {
+                    Name = name,
+                    Id = Guid.NewGuid(),
+                    Environment = env
+                });
 
             // Setup the step execution context.
             var stepContext = new Mock<IExecutionContext>();
             stepContext.SetupAllProperties();
+            stepContext.Setup(x => x.WriteDebug).Returns(true);
             stepContext.Setup(x => x.Variables).Returns(_variables);
+            stepContext.Setup(x => x.EnvironmentVariables).Returns(_env);
             stepContext.Setup(x => x.ExpressionValues).Returns(_contexts);
             stepContext.Setup(x => x.JobContext).Returns(_jobContext);
             stepContext.Setup(x => x.StepsContext).Returns(_stepContext);
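Taken together, the three tests added above describe how the `env` expression context is expected to be assembled per step: the job-level variables backing `_env` form the base, each step's own `env:` mapping is layered on top (so `env1` resolves to `100` or `1000` rather than the job-level `1`), and expressions such as `env.test` or `steps.foo.outputs.test` are evaluated against whatever contexts exist when that step starts. A minimal sketch of that layering with plain dictionaries, assuming a simple copy-then-override merge rather than the runner's actual context plumbing:

```csharp
using System;
using System.Collections.Generic;

class StepEnvLayeringDemo
{
    // Hypothetical helper: copy the job-level env, then let step-level entries win.
    static Dictionary<string, string> BuildStepEnv(
        IDictionary<string, string> jobEnv,
        IDictionary<string, string> stepEnv)
    {
        var merged = new Dictionary<string, string>(jobEnv, StringComparer.Ordinal);
        foreach (var pair in stepEnv)
        {
            merged[pair.Key] = pair.Value; // the step-level value overrides the job-level one
        }
        return merged;
    }

    static void Main()
    {
        var jobEnv = new Dictionary<string, string> { ["env1"] = "1", ["test"] = "github_actions" };
        var stepEnv = new Dictionary<string, string> { ["env1"] = "100", ["env2"] = "github_actions" };

        var merged = BuildStepEnv(jobEnv, stepEnv);
        Console.WriteLine(merged["env1"]); // 100 -- step-level value wins over the job-level "1"
        Console.WriteLine(merged["test"]); // github_actions -- job-level value still visible
    }
}
```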
@@ -422,13 +544,24 @@ namespace GitHub.Runner.Common.Tests.Worker
                     stepContext.Object.Result = r;
                 }
             });
+            var trace = hc.GetTrace();
+            stepContext.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { trace.Info($"[{tag}]{message}"); });
             stepContext.Object.Result = result;
             step.Setup(x => x.ExecutionContext).Returns(stepContext.Object);
 
+            if (setOutput)
+            {
+                step.Setup(x => x.RunAsync()).Callback(() => { _stepContext.SetOutput(null, name, "test", "something", out string reference); }).Returns(Task.CompletedTask);
+            }
+            else
+            {
+                step.Setup(x => x.RunAsync()).Returns(Task.CompletedTask);
+            }
+
             return step;
         }
 
-        private string FormatSteps(IEnumerable<Mock<IStep>> steps)
+        private string FormatSteps(IEnumerable<Mock<IActionRunner>> steps)
         {
             return String.Join(
                 " ; ",
26
src/Test/TestData/dockerfileaction_cleanup_default.yml
Normal file
@@ -0,0 +1,26 @@
+name: 'Hello World'
+description: 'Greet the world and record the time'
+author: 'Test Corporation'
+inputs:
+  greeting: # id of input
+    description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
+    required: true
+    default: 'Hello'
+  entryPoint: # id of input
+    description: 'optional docker entrypoint overwrite.'
+    required: false
+outputs:
+  time: # id of output
+    description: 'The time we did the greeting'
+icon: 'hello.svg' # vector art to display in the GitHub Marketplace
+color: 'green' # optional, decorates the entry in the GitHub Marketplace
+runs:
+  using: 'docker'
+  image: 'Dockerfile'
+  args:
+    - 'bzz'
+  entrypoint: 'main.sh'
+  env:
+    Token: foo
+    Url: bar
+  post-entrypoint: 'cleanup.sh'
21
src/Test/TestData/nodeaction_cleanup_default.yml
Normal file
@@ -0,0 +1,21 @@
+name: 'Hello World'
+description: 'Greet the world and record the time'
+author: 'Test Corporation'
+inputs:
+  greeting: # id of input
+    description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
+    required: true
+    default: 'Hello'
+    deprecationMessage: 'This property has been deprecated'
+  entryPoint: # id of input
+    description: 'optional docker entrypoint overwrite.'
+    required: false
+outputs:
+  time: # id of output
+    description: 'The time we did the greeting'
+icon: 'hello.svg' # vector art to display in the GitHub Marketplace
+color: 'green' # optional, decorates the entry in the GitHub Marketplace
+runs:
+  using: 'node12'
+  main: 'main.js'
+  post: 'cleanup.js'
@@ -1 +1 @@
-2.164.0
+2.165.0