Compare commits


1 Commit

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| TingluoHuang | eed84f6cf3 | c | 2020-06-07 22:48:57 -04:00 |
92 changed files with 1465 additions and 3680 deletions

View File

@@ -1,10 +1,9 @@
name: Runner CI
on:
workflow_dispatch:
push:
branches:
- main
- master
- releases/*
paths-ignore:
- '**.md'

View File

@@ -1,14 +1,13 @@
name: Runner CD
on:
workflow_dispatch:
push:
paths:
- releaseVersion
jobs:
check:
if: startsWith(github.ref, 'refs/heads/releases/') || github.ref == 'refs/heads/main'
if: startsWith(github.ref, 'refs/heads/releases/') || github.ref == 'refs/heads/master'
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2

View File

@@ -6,7 +6,7 @@
[![Actions Status](https://github.com/actions/runner/workflows/Runner%20CI/badge.svg)](https://github.com/actions/runner/actions)
The runner is the application that runs a job from a GitHub Actions workflow. It is used by GitHub Actions in the [hosted virtual environments](https://github.com/actions/virtual-environments), or you can [self-host the runner](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-self-hosted-runners) in your own environment.
The runner is the application that runs a job from a GitHub Actions workflow. The runner can run on the [hosted machine pools](https://github.com/actions/virtual-environments) or run on [self-hosted environments](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-self-hosted-runners).
## Get Started

View File

@@ -1,378 +0,0 @@
# ADR 0549: Composite Run Steps
**Date**: 2020-06-17
**Status**: Accepted
## Context
Customers want to be able to compose actions from actions (ex: https://github.com/actions/runner/issues/438)
An important step towards meeting this goal is to build in functionality that lets an action simply execute any number of steps.
### Guiding Principles
The workflow author should not need to know the internal workings of the composite action (for example, `default.shell` and `default.workingDir` should not be inherited from the workflow file into the action file). When deciding how to design certain parts of composite run steps, we want to think one logical step away from the consumer.
A composite action is treated as **one** individual job step (this is known as encapsulation).
## Decision
**In this ADR, we only support running multiple run steps in an Action.** In doing so, we build in support for mapping and flowing inputs, outputs, and env variables (for example, all nested steps have access to their parent's input variables, and nested steps can overwrite those input variables).
### Composite Run Steps Features
At the top action level, this feature supports:
- name
- description
- inputs
- runs
- outputs
At the run step level, this feature supports:
- name
- id
- run
- env
- shell
- working-directory
At the run step level, this feature **does not support**:
- timeout-minutes
- secrets
- conditionals (needs, if, etc.)
- continue-on-error
### Steps
Example `workflow.yml`
```yaml
jobs:
build:
runs-on: self-hosted
steps:
- id: step1
uses: actions/setup-python@v1
- id: step2
uses: actions/setup-node@v2
- uses: actions/checkout@v2
- uses: user/composite@v1
- name: workflow step 1
run: echo hello world 3
- name: workflow step 2
run: echo hello world 4
```
Example `user/composite/action.yml`
```yaml
runs:
using: "composite"
steps:
- run: pip install -r requirements.txt
shell: bash
- run: npm install
shell: bash
```
Example Output
```
[npm installation output]
[pip requirements output]
echo hello world 3
echo hello world 4
```
We add a token called "composite" that allows the Runner to process composite actions. When an action declares `using: composite`, the Runner processes the `steps` attribute, converts this template code to a list of steps, and finally runs each run step sequentially. If any step fails and there are no `if` conditions defined, the whole composite action job fails.
### Defaults
We will not support "defaults" in a composite action.
### Shell and Working-directory
For each run step in a composite action, the action author can set the `shell` and `working-directory` attributes for that step. The `shell` attribute is **required** for each run step: the action author cannot know which operating system the workflow author is running on, so we explicitly prevent unknown behavior by requiring that each run step has a shell **set by the action author**. On the other hand, `working-directory` is optional. Moreover, the composite action author can map values from `inputs` into a step's `shell` and `working-directory` attributes.
For example,
`action.yml`
```yaml
inputs:
shell_1:
description: 'Your name'
default: 'pwsh'
steps:
- run: echo 1
shell: ${{ inputs.shell_1 }}
```
Note that the workflow file and the action file are treated as separate entities. **So the workflow `defaults` will never change the `shell` and `working-directory` values of the run steps in a composite action.** Also note that `defaults` in a workflow only apply to run steps, not to "uses" steps (steps that use an action).
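As a minimal sketch of this separation (both files below are hypothetical), a workflow-level `defaults` block only affects run steps written in the workflow file itself; every run step inside the composite action still needs its own explicit `shell`:
```yaml
# workflow.yml (hypothetical)
defaults:
  run:
    shell: pwsh                    # applies only to run steps in this workflow file
jobs:
  build:
    runs-on: self-hosted
    steps:
      - uses: user/composite@v1    # the action's run steps do not inherit `pwsh`
---
# user/composite/action.yml (hypothetical)
runs:
  using: "composite"
  steps:
    - run: echo "explicit shell still required here"
      shell: bash                  # required; never inherited from workflow defaults
```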
### Running Local Scripts
Example 'workflow.yml':
```yaml
jobs:
build:
runs-on: self-hosted
steps:
- uses: user/composite@v1
```
Example `user/composite/action.yml`:
```yaml
runs:
using: "composite"
steps:
- run: chmod +x ${{ github.action_path }}/test/script2.sh
shell: bash
- run: chmod +x $GITHUB_ACTION_PATH/script.sh
shell: bash
- run: ${{ github.action_path }}/test/script2.sh
shell: bash
- run: $GITHUB_ACTION_PATH/script.sh
shell: bash
```
Where `user/composite` has the file structure:
```
.
+-- action.yml
+-- script.sh
+-- test
| +-- script2.sh
```
Users can run scripts located in their action folder by prepending the relative path and script name with `$GITHUB_ACTION_PATH` or `${{ github.action_path }}`, which contain the path the composite action was downloaded to and where those files live. Note that you'll have to `chmod` each script before running it if you did not check your script files into your GitHub repo with the executable bit turned on.
### Inputs
Example `workflow.yml`:
```yaml
steps:
- id: foo
uses: user/composite@v1
with:
your_name: "Octocat"
```
Example `user/composite/action.yml`:
```yaml
inputs:
your_name:
description: 'Your name'
default: 'Ethan'
runs:
using: "composite"
steps:
- run: echo hello ${{ inputs.your_name }}
shell: bash
```
Example Output:
```
hello Octocat
```
Each input variable of the composite action is only visible within the composite action's own scope.
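For example, here is a minimal sketch (reusing the hypothetical `user/composite` action above) of why the scope matters: the `inputs` context only exists inside `user/composite/action.yml`, so a later step in the workflow file cannot read it:
```yaml
steps:
  - id: foo
    uses: user/composite@v1
    with:
      your_name: "Octocat"
  # `inputs` is scoped to user/composite/action.yml, so this expression
  # has no value in the workflow file
  - run: echo "name is '${{ inputs.your_name }}'"
    shell: bash
```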
### Outputs
Example `workflow.yml`:
```yaml
...
steps:
- id: foo
uses: user/composite@v1
- run: echo random-number ${{ steps.foo.outputs.random-number }}
shell: bash
```
Example `user/composite/action.yml`:
```yaml
outputs:
random-number:
description: "Random number"
value: ${{ steps.random-number-generator.outputs.random-id }}
runs:
using: "composite"
steps:
- id: random-number-generator
run: echo "::set-output name=random-id::$(echo $RANDOM)"
shell: bash
```
Example Output:
```
::set-output name=random-id::43243
random-number 43243
```
Each of the output variables from the composite action is viewable from the workflow file that uses the composite action. In other words, each child action's outputs are viewable only by its parent, using dot notation (e.g. `steps.foo.outputs.random-number`).
Moreover, step output ids are only accessible within the scope where they are defined. In the example above, the `workflow.yml` file does not have access to the internal output id (i.e. `random-id`). We do this because we don't want to require the workflow author to know the internal workings of the composite action.
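To make the scoping concrete, here is a minimal sketch (assuming the `user/composite` action defined above): the mapped output is visible to the workflow, while the internal step output id is not:
```yaml
steps:
  - id: foo
    uses: user/composite@v1
  # Works: `random-number` is declared under `outputs:` in action.yml
  - run: echo ${{ steps.foo.outputs.random-number }}
    shell: bash
  # Not visible: `random-id` belongs to the internal step
  # `random-number-generator` and is not exposed by the action
  - run: echo ${{ steps.foo.outputs.random-id }}
    shell: bash
```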
### Context
Similar to the workflow file, the composite action has access to the [same context objects](https://help.github.com/en/actions/reference/context-and-expression-syntax-for-github-actions#contexts) (ex: `github`, `env`, `strategy`).
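For instance, a minimal sketch of a composite run step reading the `github` context (the echoed fields are just illustrative):
```yaml
runs:
  using: "composite"
  steps:
    - run: echo "Running for ${{ github.repository }} on ${{ github.ref }}"
      shell: bash
```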
### Environment
In a composite action, you'll only be able to use `::set-env::` to set environment variables, just like you can in other actions.
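For example, a minimal sketch of a composite run step exporting an environment variable with the `::set-env::` workflow command (the variable name is hypothetical); later steps in the job can then read it:
```yaml
runs:
  using: "composite"
  steps:
    - run: echo "::set-env name=FAVORITE_FRUIT::apple"   # hypothetical variable
      shell: bash
    - run: echo "I like $FAVORITE_FRUIT"                  # visible to later steps
      shell: bash
```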
### Secrets
**We will not support "Secrets" in a composite action for now. This functionality will be focused on in a future ADR.**
We'll pass the secrets from the composite action's parents (e.g. the workflow file) to the composite action. Secrets can be created in the composite action with the secrets context. In the action's YAML, we'll automatically mask the secret.
### If Condition
**If and needs conditions will not be supported in the composite run steps feature. They will be supported later in a new feature.**
Old reasoning:
Example `workflow.yml`:
```yaml
steps:
- run: exit 1
- uses: user/composite@v1 # <--- this will run, as it's marked as always running
if: always()
```
Example `user/composite/action.yml`:
```yaml
runs:
using: "composite"
steps:
- run: echo "just succeeding"
shell: bash
- run: echo "I will run, as my current scope is succeeding"
shell: bash
if: success()
- run: exit 1
shell: bash
- run: echo "I will not run, as my current scope is now failing"
shell: bash
```
**We will not support "if Condition" in a composite action for now. This functionality will be focused on in a future ADR.**
See the paragraph below for a rudimentary approach (thank you to @cybojenix for the idea, example, and explanation for this approach):
The `if` condition in the parent (in the example above, this is the `workflow.yml`) determines whether or not the composite action runs at all. So, our composite action will run, since its `if` condition is `always()`.
**Note that the if condition on the parent does not propagate to the rest of its children.**
In the child action (in this example, the `action.yml`), each step starts with a clean slate (in other words, no imposed if conditions). Following the same logic, `echo "I will run, as my current scope is succeeding"` will run, since its `if` condition checks that the previous steps **within this composite action** have not failed. `run: echo "I will not run, as my current scope is now failing"` will not run, since the previous step resulted in an error and, by default, a step's `if` expression is `success()` when no condition is set.
What if a step has `cancelled()`? We do the opposite of our approach above if `cancelled()` is used for any of our composite run steps: we run any step that has this condition only if the workflow is cancelled.
### Timeout-minutes
Example `workflow.yml`:
```yaml
steps:
- id: bar
uses: user/test@v1
timeout-minutes: 50
```
Example `user/composite/action.yml`:
```yaml
runs:
using: "composite"
steps:
- id: foo1
run: echo test 1
timeout-minutes: 10
shell: bash
- id: foo2
run: echo test 2
shell: bash
- id: foo3
run: echo test 3
timeout-minutes: 10
shell: bash
```
**We will not support "timeout-minutes" in a composite action for now. This functionality will be focused on in a future ADR.**
A composite action in its entirety is a job. You can set `timeout-minutes` for the whole composite action or for its steps, as long as the sum of the `timeout-minutes` of the composite action steps that have the attribute is less than or equal to the `timeout-minutes` of the composite action. There is no default `timeout-minutes` for a composite action step.
If the time taken by the steps, in combination or individually, exceeds the whole composite action's `timeout-minutes`, the whole job fails (1). If an individual step exceeds its own `timeout-minutes` but the total time used, including this step, is still below the overall composite action `timeout-minutes`, only that step fails and the remaining steps run according to their own `timeout-minutes` (they still abide by condition (1)).
For reference, in the example above, if the composite step `foo1` takes 11 minutes to run, that step fails but the remaining steps, `foo2` and `foo3`, proceed as long as their total runtime together with the failed `foo1` step stays below the composite action's `timeout-minutes` (50 minutes). If the composite step `foo2` takes 51 minutes to run, the whole composite action job fails.
The rationale is that users can configure their steps with the `if` condition to control how steps rely on each other. Given the additional capabilities offered by combining `timeout-minutes` and/or `if`, we wanted the `timeout-minutes` condition to be as dumb as possible and not affect other steps.
[Usage limits still apply](https://help.github.com/en/actions/reference/workflow-syntax-for-github-actions?query=if%28%29#usage-limits)
### Continue-on-error
Example `workflow.yml`:
```yaml
steps:
- run: exit 1
- id: bar
uses: user/test@v1
continue-on-error: false
- id: foo
run: echo "Hello World" <------- This step will not run
```
Example `user/composite/action.yml`:
```yaml
runs:
using: "composite"
steps:
- run: exit 1
continue-on-error: true
shell: bash
- run: echo "Hello World 2" <----- This step will run
shell: bash
```
**We will not support "continue-on-error" in a composite action for now. This functionality will be focused on in a future ADR.**
If any of the steps fail in the composite action and `continue-on-error` is set to `false` for the whole composite action step in the workflow file, then the workflow steps below it will not run. On the flip side, if `continue-on-error` is set to `true` for the whole composite action step in the workflow file, the next job step will still run.
For the composite action steps, the same logic applies. In this example, `"Hello World 2"` will be output because the previous step has `continue-on-error` set to `true`, even though that previous step errored.
### Visualizing Composite Action in the GitHub Actions UI
We want all the composite action's steps to be condensed into the original composite action node.
Here is a visual representation of the [first example](#Steps):
```
| composite_action_node |
| echo hello world 1 |
| echo hello world 2 |
| echo hello world 3 |
| echo hello world 4 |
```
## Consequences
This ADR lays the framework for eventually supporting nested composite actions within composite actions. It allows users to run multiple run steps within a GitHub composite action, with support for inputs, outputs, environment, and context in any of those steps; per-step `if`, `timeout-minutes`, and `continue-on-error` attributes are deferred to future work, as described in the sections above.

View File

@@ -23,7 +23,7 @@ An ADR is an Architectural Decision Record. This allows consensus on the direct
### Required Dev Dependencies
![Win](res/win_sm.png) ![*nix](res/linux_sm.png) Git for Windows and Linux [Install Here](https://git-scm.com/downloads) (needed for dev sh script)
![Win](res/win_sm.png) Git for Windows [Install Here](https://git-scm.com/downloads) (needed for dev sh script)
### To Build, Test, Layout
@@ -43,7 +43,6 @@ Sample developer flow:
```bash
git clone https://github.com/actions/runner
cd runner
cd ./src
./dev.(sh/cmd) layout # the runner that built from source is in {root}/_layout
<make code changes>
@@ -51,23 +50,10 @@ cd ./src
./dev.(sh/cmd) test # run all unit tests before git commit/push
```
View logs:
```bash
cd runner/_layout/_diag
ls
cat (Runner/Worker)_TIMESTAMP.log # view your log file
```
Run Runner:
```bash
cd runner/_layout
./run.sh # run your custom runner
```
### Editors
[Using Visual Studio Code](https://code.visualstudio.com/)
[Using Visual Studio](https://code.visualstudio.com/docs)
[Using Visual Studio 2019](https://www.visualstudio.com/vs/)
### Styling

View File

@@ -1,9 +1,12 @@
## Features
- N/A
## Bugs
- Raise error for set-env, block set node_options using `::set-env::` (#784)
- Handle `jq` returns "null" if the field does not exist in create-latest-svc.sh (#478)
- Switch GITHUB_URL to GITHUB_SERVER_URL (#482)
- Fix problem matcher for GHES (#488)
- Fix container action inputs validation warning (#490)
- Fix post step display name (#490)
- Fix worker crash due to exception from evaluating step.env (#490)
## Misc
- N/A

View File

@@ -1 +1 @@
2.273.6
<Update to ./src/runnerversion when creating release>

View File

@@ -69,8 +69,6 @@
.PARAMETER ProxyUseDefaultCredentials
Default: false
Use default credentials, when using proxy address.
.PARAMETER ProxyBypassList
If set with ProxyAddress, will provide the list of comma separated urls that will bypass the proxy
.PARAMETER SkipNonVersionedFiles
Default: false
Skips installing non-versioned files if they already exist, such as dotnet.exe.
@@ -98,7 +96,6 @@ param(
[string]$FeedCredential,
[string]$ProxyAddress,
[switch]$ProxyUseDefaultCredentials,
[string[]]$ProxyBypassList=@(),
[switch]$SkipNonVersionedFiles,
[switch]$NoCdn
)
@@ -122,27 +119,11 @@ $VersionRegEx="/\d+\.\d+[^/]+/"
$OverrideNonVersionedFiles = !$SkipNonVersionedFiles
function Say($str) {
try
{
Write-Host "dotnet-install: $str"
}
catch
{
# Some platforms cannot utilize Write-Host (Azure Functions, for instance). Fall back to Write-Output
Write-Output "dotnet-install: $str"
}
Write-Host "dotnet-install: $str"
}
function Say-Verbose($str) {
try
{
Write-Verbose "dotnet-install: $str"
}
catch
{
# Some platforms cannot utilize Write-Verbose (Azure Functions, for instance). Fall back to Write-Output
Write-Output "dotnet-install: $str"
}
Write-Verbose "dotnet-install: $str"
}
function Say-Invocation($Invocation) {
@@ -173,16 +154,7 @@ function Invoke-With-Retry([ScriptBlock]$ScriptBlock, [int]$MaxAttempts = 3, [in
function Get-Machine-Architecture() {
Say-Invocation $MyInvocation
# On PS x86, PROCESSOR_ARCHITECTURE reports x86 even on x64 systems.
# To get the correct architecture, we need to use PROCESSOR_ARCHITEW6432.
# PS x64 doesn't define this, so we fall back to PROCESSOR_ARCHITECTURE.
# Possible values: amd64, x64, x86, arm64, arm
if( $ENV:PROCESSOR_ARCHITEW6432 -ne $null )
{
return $ENV:PROCESSOR_ARCHITEW6432
}
# possible values: amd64, x64, x86, arm64, arm
return $ENV:PROCESSOR_ARCHITECTURE
}
@@ -195,7 +167,7 @@ function Get-CLIArchitecture-From-Architecture([string]$Architecture) {
{ $_ -eq "x86" } { return "x86" }
{ $_ -eq "arm" } { return "arm" }
{ $_ -eq "arm64" } { return "arm64" }
default { throw "Architecture '$Architecture' not supported. If you think this is a bug, report it at https://github.com/dotnet/install-scripts/issues" }
default { throw "Architecture not supported. If you think this is a bug, report it at https://github.com/dotnet/sdk/issues" }
}
}
@@ -256,11 +228,7 @@ function GetHTTPResponse([Uri] $Uri)
if($ProxyAddress) {
$HttpClientHandler = New-Object System.Net.Http.HttpClientHandler
$HttpClientHandler.Proxy = New-Object System.Net.WebProxy -Property @{
Address=$ProxyAddress;
UseDefaultCredentials=$ProxyUseDefaultCredentials;
BypassList = $ProxyBypassList;
}
$HttpClientHandler.Proxy = New-Object System.Net.WebProxy -Property @{Address=$ProxyAddress;UseDefaultCredentials=$ProxyUseDefaultCredentials}
$HttpClient = New-Object System.Net.Http.HttpClient -ArgumentList $HttpClientHandler
}
else {
@@ -395,20 +363,17 @@ function Get-Specific-Version-From-Version([string]$AzureFeed, [string]$Channel,
function Get-Download-Link([string]$AzureFeed, [string]$SpecificVersion, [string]$CLIArchitecture) {
Say-Invocation $MyInvocation
# If anything fails in this lookup it will default to $SpecificVersion
$SpecificProductVersion = Get-Product-Version -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion
if ($Runtime -eq "dotnet") {
$PayloadURL = "$AzureFeed/Runtime/$SpecificVersion/dotnet-runtime-$SpecificProductVersion-win-$CLIArchitecture.zip"
$PayloadURL = "$AzureFeed/Runtime/$SpecificVersion/dotnet-runtime-$SpecificVersion-win-$CLIArchitecture.zip"
}
elseif ($Runtime -eq "aspnetcore") {
$PayloadURL = "$AzureFeed/aspnetcore/Runtime/$SpecificVersion/aspnetcore-runtime-$SpecificProductVersion-win-$CLIArchitecture.zip"
$PayloadURL = "$AzureFeed/aspnetcore/Runtime/$SpecificVersion/aspnetcore-runtime-$SpecificVersion-win-$CLIArchitecture.zip"
}
elseif ($Runtime -eq "windowsdesktop") {
$PayloadURL = "$AzureFeed/Runtime/$SpecificVersion/windowsdesktop-runtime-$SpecificProductVersion-win-$CLIArchitecture.zip"
$PayloadURL = "$AzureFeed/Runtime/$SpecificVersion/windowsdesktop-runtime-$SpecificVersion-win-$CLIArchitecture.zip"
}
elseif (-not $Runtime) {
$PayloadURL = "$AzureFeed/Sdk/$SpecificVersion/dotnet-sdk-$SpecificProductVersion-win-$CLIArchitecture.zip"
$PayloadURL = "$AzureFeed/Sdk/$SpecificVersion/dotnet-sdk-$SpecificVersion-win-$CLIArchitecture.zip"
}
else {
throw "Invalid value for `$Runtime"
@@ -416,7 +381,7 @@ function Get-Download-Link([string]$AzureFeed, [string]$SpecificVersion, [string
Say-Verbose "Constructed primary named payload URL: $PayloadURL"
return $PayloadURL, $SpecificProductVersion
return $PayloadURL
}
function Get-LegacyDownload-Link([string]$AzureFeed, [string]$SpecificVersion, [string]$CLIArchitecture) {
@@ -437,51 +402,6 @@ function Get-LegacyDownload-Link([string]$AzureFeed, [string]$SpecificVersion, [
return $PayloadURL
}
function Get-Product-Version([string]$AzureFeed, [string]$SpecificVersion) {
Say-Invocation $MyInvocation
if ($Runtime -eq "dotnet") {
$ProductVersionTxtURL = "$AzureFeed/Runtime/$SpecificVersion/productVersion.txt"
}
elseif ($Runtime -eq "aspnetcore") {
$ProductVersionTxtURL = "$AzureFeed/aspnetcore/Runtime/$SpecificVersion/productVersion.txt"
}
elseif ($Runtime -eq "windowsdesktop") {
$ProductVersionTxtURL = "$AzureFeed/Runtime/$SpecificVersion/productVersion.txt"
}
elseif (-not $Runtime) {
$ProductVersionTxtURL = "$AzureFeed/Sdk/$SpecificVersion/productVersion.txt"
}
else {
throw "Invalid value '$Runtime' specified for `$Runtime"
}
Say-Verbose "Checking for existence of $ProductVersionTxtURL"
try {
$productVersionResponse = GetHTTPResponse($productVersionTxtUrl)
if ($productVersionResponse.StatusCode -eq 200) {
$productVersion = $productVersionResponse.Content.ReadAsStringAsync().Result.Trim()
if ($productVersion -ne $SpecificVersion)
{
Say "Using alternate version $productVersion found in $ProductVersionTxtURL"
}
return $productVersion
}
else {
Say-Verbose "Got StatusCode $($productVersionResponse.StatusCode) trying to get productVersion.txt at $productVersionTxtUrl, so using default value of $SpecificVersion"
$productVersion = $SpecificVersion
}
} catch {
Say-Verbose "Could not read productVersion.txt at $productVersionTxtUrl, so using default value of $SpecificVersion (Exception: '$($_.Exception.Message)' )"
$productVersion = $SpecificVersion
}
return $productVersion
}
function Get-User-Share-Path() {
Say-Invocation $MyInvocation
@@ -637,7 +557,7 @@ function Prepend-Sdk-InstallRoot-To-Path([string]$InstallRoot, [string]$BinFolde
$CLIArchitecture = Get-CLIArchitecture-From-Architecture $Architecture
$SpecificVersion = Get-Specific-Version-From-Version -AzureFeed $AzureFeed -Channel $Channel -Version $Version -JSonFile $JSonFile
$DownloadLink, $EffectiveVersion = Get-Download-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
$DownloadLink = Get-Download-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
$LegacyDownloadLink = Get-LegacyDownload-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
$InstallRoot = Resolve-Installation-Path $InstallDir
@@ -663,11 +583,6 @@ if ($DryRun) {
}
}
Say "Repeatable invocation: $RepeatableCommand"
if ($SpecificVersion -ne $EffectiveVersion)
{
Say "NOTE: Due to finding a version manifest with this runtime, it would actually install with version '$EffectiveVersion'"
}
exit 0
}
@@ -691,12 +606,6 @@ else {
throw "Invalid value for `$Runtime"
}
if ($SpecificVersion -ne $EffectiveVersion)
{
Say "Performing installation checks for effective version: $EffectiveVersion"
$SpecificVersion = $EffectiveVersion
}
# Check if the SDK version is already installed.
$isAssetInstalled = Is-Dotnet-Package-Installed -InstallRoot $InstallRoot -RelativePathToPackage $dotnetPackageRelativePath -SpecificVersion $SpecificVersion
if ($isAssetInstalled) {
@@ -775,11 +684,12 @@ Prepend-Sdk-InstallRoot-To-Path -InstallRoot $InstallRoot -BinFolderRelativePath
Say "Installation finished"
exit 0
# SIG # Begin signature block
# MIIjkgYJKoZIhvcNAQcCoIIjgzCCI38CAQExDzANBglghkgBZQMEAgEFADB5Bgor
# MIIjkQYJKoZIhvcNAQcCoIIjgjCCI34CAQExDzANBglghkgBZQMEAgEFADB5Bgor
# BgEEAYI3AgEEoGswaTA0BgorBgEEAYI3AgEeMCYCAwEAAAQQH8w7YFlLCE63JNLG
# KX7zUQIBAAIBAAIBAAIBAAIBADAxMA0GCWCGSAFlAwQCAQUABCAdMJOqDPFy5F1i
# HBXPyOE4hGkUv5EGyQzmS901lRr+baCCDYEwggX/MIID56ADAgECAhMzAAABh3IX
# KX7zUQIBAAIBAAIBAAIBAAIBADAxMA0GCWCGSAFlAwQCAQUABCAwp4UsNdAkvwY3
# VhbuN9D6NGOz+qNqW2+62YubWa4qJaCCDYEwggX/MIID56ADAgECAhMzAAABh3IX
# chVZQMcJAAAAAAGHMA0GCSqGSIb3DQEBCwUAMH4xCzAJBgNVBAYTAlVTMRMwEQYD
# VQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNy
# b3NvZnQgQ29ycG9yYXRpb24xKDAmBgNVBAMTH01pY3Jvc29mdCBDb2RlIFNpZ25p
@@ -851,119 +761,119 @@ exit 0
# xw4o7t5lL+yX9qFcltgA1qFGvVnzl6UJS0gQmYAf0AApxbGbpT9Fdx41xtKiop96
# eiL6SJUfq/tHI4D1nvi/a7dLl+LrdXga7Oo3mXkYS//WsyNodeav+vyL6wuA6mk7
# r/ww7QRMjt/fdW1jkT3RnVZOT7+AVyKheBEyIXrvQQqxP/uozKRdwaGIm1dxVk5I
# RcBCyZt2WwqASGv9eZ/BvW1taslScxMNelDNMYIVZzCCFWMCAQEwgZUwfjELMAkG
# RcBCyZt2WwqASGv9eZ/BvW1taslScxMNelDNMYIVZjCCFWICAQEwgZUwfjELMAkG
# A1UEBhMCVVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQx
# HjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEoMCYGA1UEAxMfTWljcm9z
# b2Z0IENvZGUgU2lnbmluZyBQQ0EgMjAxMQITMwAAAYdyF3IVWUDHCQAAAAABhzAN
# BglghkgBZQMEAgEFAKCBrjAZBgkqhkiG9w0BCQMxDAYKKwYBBAGCNwIBBDAcBgor
# BgEEAYI3AgELMQ4wDAYKKwYBBAGCNwIBFTAvBgkqhkiG9w0BCQQxIgQgGfshXxhl
# 7+O9cl90lOU62gZCBmJzcomUxEL8+XyoDYQwQgYKKwYBBAGCNwIBDDE0MDKgFIAS
# BgEEAYI3AgELMQ4wDAYKKwYBBAGCNwIBFTAvBgkqhkiG9w0BCQQxIgQga11B1DE+
# y9z0lmEO+MC+bhXPKfWALB7Snkn7G/wCUncwQgYKKwYBBAGCNwIBDDE0MDKgFIAS
# AE0AaQBjAHIAbwBzAG8AZgB0oRqAGGh0dHA6Ly93d3cubWljcm9zb2Z0LmNvbTAN
# BgkqhkiG9w0BAQEFAASCAQCPVhcZxxdIzkFdrv/FCW737QgR8fCO1/oRXwhigOyQ
# P2MF39fIYsVXuzVnO8pYZZOeW04kMECcWf9420okd4lXP7Xc5m+5UrqPuN1UgNle
# hhwLBiXuZaAfllBMWMeQi7DZmg7XW8Yay9TAbc2XSTGQ8foDxPllKFbdPvvQ2DRy
# VRLyNNQQEo3IuHHa0nnVNaL2PUYJf0udMCdGkxIMbApAYcitJLSwMLqMzrMkrvS9
# ubm7CgigsKRJ3cZtCtFFMUkMsstoVuKLFtu69OvOfgLy1qmKotE6EnF7xudV+qAA
# a+UxGVT715tK5kgb5eTr1K2NdWRj517oANQNOjR/m6OPoYIS8TCCEu0GCisGAQQB
# gjcDAwExghLdMIIS2QYJKoZIhvcNAQcCoIISyjCCEsYCAQMxDzANBglghkgBZQME
# AgEFADCCAVUGCyqGSIb3DQEJEAEEoIIBRASCAUAwggE8AgEBBgorBgEEAYRZCgMB
# MDEwDQYJYIZIAWUDBAIBBQAEIHYNJoLIl+IWj/Npb6r479Guw3UW/q0/jJhqKgHm
# xq1NAgZfdIY1B90YEzIwMjAxMDE0MTcxOTIwLjg5NVowBIACAfSggdSkgdEwgc4x
# CzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRt
# b25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKTAnBgNVBAsTIE1p
# Y3Jvc29mdCBPcGVyYXRpb25zIFB1ZXJ0byBSaWNvMSYwJAYDVQQLEx1UaGFsZXMg
# VFNTIEVTTjo2MEJDLUUzODMtMjYzNTElMCMGA1UEAxMcTWljcm9zb2Z0IFRpbWUt
# U3RhbXAgU2VydmljZaCCDkQwggT1MIID3aADAgECAhMzAAABJt+6SyK5goIHAAAA
# AAEmMA0GCSqGSIb3DQEBCwUAMHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNo
# aW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29y
# cG9yYXRpb24xJjAkBgNVBAMTHU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEw
# MB4XDTE5MTIxOTAxMTQ1OVoXDTIxMDMxNzAxMTQ1OVowgc4xCzAJBgNVBAYTAlVT
# MRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQK
# ExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKTAnBgNVBAsTIE1pY3Jvc29mdCBPcGVy
# YXRpb25zIFB1ZXJ0byBSaWNvMSYwJAYDVQQLEx1UaGFsZXMgVFNTIEVTTjo2MEJD
# LUUzODMtMjYzNTElMCMGA1UEAxMcTWljcm9zb2Z0IFRpbWUtU3RhbXAgU2Vydmlj
# ZTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAJ4wvoacTvMNlXQTtfF/
# Cx5Ol3X0fcjUNMvjLgTmO5+WHYJFbp725P3+qvFKDRQHWEI1Sz0gB24urVDIjXjB
# h5NVNJVMQJI2tltv7M4/4IbhZJb3xzQW7LolEoZYUZanBTUuyly9osCg4o5joViT
# 2GtmyxK+Fv5kC20l2opeaeptd/E7ceDAFRM87hiNCsK/KHyC+8+swnlg4gTOey6z
# QqhzgNsG6HrjLBuDtDs9izAMwS2yWT0T52QA9h3Q+B1C9ps2fMKMe+DHpG+0c61D
# 94Yh6cV2XHib4SBCnwIFZAeZE2UJ4qPANSYozI8PH+E5rCT3SVqYvHou97HsXvP2
# I3MCAwEAAaOCARswggEXMB0GA1UdDgQWBBRJq6wfF7B+mEKN0VimX8ajNA5hQTAf
# BgNVHSMEGDAWgBTVYzpcijGQ80N7fEYbxTNoWoVtVTBWBgNVHR8ETzBNMEugSaBH
# hkVodHRwOi8vY3JsLm1pY3Jvc29mdC5jb20vcGtpL2NybC9wcm9kdWN0cy9NaWNU
# aW1TdGFQQ0FfMjAxMC0wNy0wMS5jcmwwWgYIKwYBBQUHAQEETjBMMEoGCCsGAQUF
# BzAChj5odHRwOi8vd3d3Lm1pY3Jvc29mdC5jb20vcGtpL2NlcnRzL01pY1RpbVN0
# YVBDQV8yMDEwLTA3LTAxLmNydDAMBgNVHRMBAf8EAjAAMBMGA1UdJQQMMAoGCCsG
# AQUFBwMIMA0GCSqGSIb3DQEBCwUAA4IBAQBAlvudaOlv9Cfzv56bnX41czF6tLtH
# LB46l6XUch+qNN45ZmOTFwLot3JjwSrn4oycQ9qTET1TFDYd1QND0LiXmKz9OqBX
# ai6S8XdyCQEZvfL82jIAs9pwsAQ6XvV9jNybPStRgF/sOAM/Deyfmej9Tg9FcRwX
# ank2qgzdZZNb8GoEze7f1orcTF0Q89IUXWIlmwEwQFYF1wjn87N4ZxL9Z/xA2m/R
# 1zizFylWP/mpamCnVfZZLkafFLNUNVmcvc+9gM7vceJs37d3ydabk4wR6ObR34sW
# aLppmyPlsI1Qq5Lu6bJCWoXzYuWpkoK6oEep1gML6SRC3HKVS3UscZhtMIIGcTCC
# BFmgAwIBAgIKYQmBKgAAAAAAAjANBgkqhkiG9w0BAQsFADCBiDELMAkGA1UEBhMC
# VVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNV
# BAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEyMDAGA1UEAxMpTWljcm9zb2Z0IFJv
# b3QgQ2VydGlmaWNhdGUgQXV0aG9yaXR5IDIwMTAwHhcNMTAwNzAxMjEzNjU1WhcN
# MjUwNzAxMjE0NjU1WjB8MQswCQYDVQQGEwJVUzETMBEGA1UECBMKV2FzaGluZ3Rv
# bjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0IENvcnBvcmF0
# aW9uMSYwJAYDVQQDEx1NaWNyb3NvZnQgVGltZS1TdGFtcCBQQ0EgMjAxMDCCASIw
# DQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKkdDbx3EYo6IOz8E5f1+n9plGt0
# VBDVpQoAgoX77XxoSyxfxcPlYcJ2tz5mK1vwFVMnBDEfQRsalR3OCROOfGEwWbEw
# RA/xYIiEVEMM1024OAizQt2TrNZzMFcmgqNFDdDq9UeBzb8kYDJYYEbyWEeGMoQe
# dGFnkV+BVLHPk0ySwcSmXdFhE24oxhr5hoC732H8RsEnHSRnEnIaIYqvS2SJUGKx
# Xf13Hz3wV3WsvYpCTUBR0Q+cBj5nf/VmwAOWRH7v0Ev9buWayrGo8noqCjHw2k4G
# kbaICDXoeByw6ZnNPOcvRLqn9NxkvaQBwSAJk3jN/LzAyURdXhacAQVPIk0CAwEA
# AaOCAeYwggHiMBAGCSsGAQQBgjcVAQQDAgEAMB0GA1UdDgQWBBTVYzpcijGQ80N7
# fEYbxTNoWoVtVTAZBgkrBgEEAYI3FAIEDB4KAFMAdQBiAEMAQTALBgNVHQ8EBAMC
# AYYwDwYDVR0TAQH/BAUwAwEB/zAfBgNVHSMEGDAWgBTV9lbLj+iiXGJo0T2UkFvX
# zpoYxDBWBgNVHR8ETzBNMEugSaBHhkVodHRwOi8vY3JsLm1pY3Jvc29mdC5jb20v
# cGtpL2NybC9wcm9kdWN0cy9NaWNSb29DZXJBdXRfMjAxMC0wNi0yMy5jcmwwWgYI
# KwYBBQUHAQEETjBMMEoGCCsGAQUFBzAChj5odHRwOi8vd3d3Lm1pY3Jvc29mdC5j
# b20vcGtpL2NlcnRzL01pY1Jvb0NlckF1dF8yMDEwLTA2LTIzLmNydDCBoAYDVR0g
# AQH/BIGVMIGSMIGPBgkrBgEEAYI3LgMwgYEwPQYIKwYBBQUHAgEWMWh0dHA6Ly93
# d3cubWljcm9zb2Z0LmNvbS9QS0kvZG9jcy9DUFMvZGVmYXVsdC5odG0wQAYIKwYB
# BQUHAgIwNB4yIB0ATABlAGcAYQBsAF8AUABvAGwAaQBjAHkAXwBTAHQAYQB0AGUA
# bQBlAG4AdAAuIB0wDQYJKoZIhvcNAQELBQADggIBAAfmiFEN4sbgmD+BcQM9naOh
# IW+z66bM9TG+zwXiqf76V20ZMLPCxWbJat/15/B4vceoniXj+bzta1RXCCtRgkQS
# +7lTjMz0YBKKdsxAQEGb3FwX/1z5Xhc1mCRWS3TvQhDIr79/xn/yN31aPxzymXlK
# kVIArzgPF/UveYFl2am1a+THzvbKegBvSzBEJCI8z+0DpZaPWSm8tv0E4XCfMkon
# /VWvL/625Y4zu2JfmttXQOnxzplmkIz/amJ/3cVKC5Em4jnsGUpxY517IW3DnKOi
# PPp/fZZqkHimbdLhnPkd/DjYlPTGpQqWhqS9nhquBEKDuLWAmyI4ILUl5WTs9/S/
# fmNZJQ96LjlXdqJxqgaKD4kWumGnEcua2A5HmoDF0M2n0O99g/DhO3EJ3110mCII
# YdqwUB5vvfHhAN/nMQekkzr3ZUd46PioSKv33nJ+YWtvd6mBy6cJrDm77MbL2IK0
# cs0d9LiFAR6A+xuJKlQ5slvayA1VmXqHczsI5pgt6o3gMy4SKfXAL1QnIffIrE7a
# KLixqduWsqdCosnPGUFN4Ib5KpqjEWYw07t0MkvfY3v1mYovG8chr1m1rtxEPJdQ
# cdeh0sVV42neV8HR3jDA/czmTfsNv11P6Z0eGTgvvM9YBS7vDaBQNdrvCScc1bN+
# NR4Iuto229Nfj950iEkSoYIC0jCCAjsCAQEwgfyhgdSkgdEwgc4xCzAJBgNVBAYT
# AlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYD
# VQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKTAnBgNVBAsTIE1pY3Jvc29mdCBP
# cGVyYXRpb25zIFB1ZXJ0byBSaWNvMSYwJAYDVQQLEx1UaGFsZXMgVFNTIEVTTjo2
# MEJDLUUzODMtMjYzNTElMCMGA1UEAxMcTWljcm9zb2Z0IFRpbWUtU3RhbXAgU2Vy
# dmljZaIjCgEBMAcGBSsOAwIaAxUACmcyOWmZxErpq06B8dy6oMZ6//yggYMwgYCk
# fjB8MQswCQYDVQQGEwJVUzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMH
# UmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMSYwJAYDVQQD
# Ex1NaWNyb3NvZnQgVGltZS1TdGFtcCBQQ0EgMjAxMDANBgkqhkiG9w0BAQUFAAIF
# AOMxeOgwIhgPMjAyMDEwMTQxNzE3MjhaGA8yMDIwMTAxNTE3MTcyOFowdzA9Bgor
# BgEEAYRZCgQBMS8wLTAKAgUA4zF46AIBADAKAgEAAgIQPAIB/zAHAgEAAgIRZDAK
# AgUA4zLKaAIBADA2BgorBgEEAYRZCgQCMSgwJjAMBgorBgEEAYRZCgMCoAowCAIB
# AAIDB6EgoQowCAIBAAIDAYagMA0GCSqGSIb3DQEBBQUAA4GBALEDKhtH6no+VBWb
# KHscN3Q0bphy1tgMhLZ0UBYpPSgcrPnF36tX3nswRAci3gLdgc77hjn2Zc6UyVJk
# WhFguWv6KoyTunGPejS/fPIGKm1CXQnEV/JUvt1EAf7YRpHImfjZBhNXbVyV61gy
# fEGA6fNNgbI+57xQJCZqdKBYX3EFMYIDDTCCAwkCAQEwgZMwfDELMAkGA1UEBhMC
# VVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNV
# BAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEmMCQGA1UEAxMdTWljcm9zb2Z0IFRp
# bWUtU3RhbXAgUENBIDIwMTACEzMAAAEm37pLIrmCggcAAAAAASYwDQYJYIZIAWUD
# BAIBBQCgggFKMBoGCSqGSIb3DQEJAzENBgsqhkiG9w0BCRABBDAvBgkqhkiG9w0B
# CQQxIgQgmfmj5y7wRFTyeI0TaXaljaCJoRQMvGBEAXsAQuY3ZOcwgfoGCyqGSIb3
# DQEJEAIvMYHqMIHnMIHkMIG9BCA2/c/vnr1ecAzvapOWZ2xGfAkzrkfpGcrvMW07
# CQl1DzCBmDCBgKR+MHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9u
# BgkqhkiG9w0BAQEFAASCAQBIgx+sFXkLXf7Xbx7opCD3uhpQGEQ4x/LsqTax0bu1
# GC/cxiI+dodUz+T4hKj1ZQyUH0Zlce32GutY048O9tkr7fQyuohoFUgChdIATEOY
# qAIESFbDT07i7khJfO2pewlhgM+A5ClvBa8HAvV0wOd+2IVgv3pgow1LEJm0/5NB
# E3IFA+hFrqiWALOY0uUep4H20EHMrbqw3YoV3EodIkTj3fC76q4K/bF84EZLUgjY
# e4rmXac8n7A9qR18QzGl8usEJej4OHU4nlUT1J734m+AWIFmfb/Zr2MyXED0V4q4
# Vbmw3O7xD9STeNYrn5RjPmGPEN04akHxhNUSqLIc9vxQoYIS8DCCEuwGCisGAQQB
# gjcDAwExghLcMIIS2AYJKoZIhvcNAQcCoIISyTCCEsUCAQMxDzANBglghkgBZQME
# AgEFADCCAVQGCyqGSIb3DQEJEAEEoIIBQwSCAT8wggE7AgEBBgorBgEEAYRZCgMB
# MDEwDQYJYIZIAWUDBAIBBQAEIPPK1A0D1n7ZEdgTjKPY4sWiOMtohMqGpFvG55NY
# SFHeAgZepuJh/dEYEjIwMjAwNTI5MTYyNzE1LjMxWjAEgAIB9KCB1KSB0TCBzjEL
# MAkGA1UEBhMCVVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1v
# bmQxHjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEpMCcGA1UECxMgTWlj
# cm9zb2Z0IE9wZXJhdGlvbnMgUHVlcnRvIFJpY28xJjAkBgNVBAsTHVRoYWxlcyBU
# U1MgRVNOOjYwQkMtRTM4My0yNjM1MSUwIwYDVQQDExxNaWNyb3NvZnQgVGltZS1T
# dGFtcCBTZXJ2aWNloIIORDCCBPUwggPdoAMCAQICEzMAAAEm37pLIrmCggcAAAAA
# ASYwDQYJKoZIhvcNAQELBQAwfDELMAkGA1UEBhMCVVMxEzARBgNVBAgTCldhc2hp
# bmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jw
# b3JhdGlvbjEmMCQGA1UEAxMdTWljcm9zb2Z0IFRpbWUtU3RhbXAgUENBIDIwMTAw
# HhcNMTkxMjE5MDExNDU5WhcNMjEwMzE3MDExNDU5WjCBzjELMAkGA1UEBhMCVVMx
# EzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNVBAoT
# FU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEpMCcGA1UECxMgTWljcm9zb2Z0IE9wZXJh
# dGlvbnMgUHVlcnRvIFJpY28xJjAkBgNVBAsTHVRoYWxlcyBUU1MgRVNOOjYwQkMt
# RTM4My0yNjM1MSUwIwYDVQQDExxNaWNyb3NvZnQgVGltZS1TdGFtcCBTZXJ2aWNl
# MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAnjC+hpxO8w2VdBO18X8L
# Hk6XdfR9yNQ0y+MuBOY7n5YdgkVunvbk/f6q8UoNFAdYQjVLPSAHbi6tUMiNeMGH
# k1U0lUxAkja2W2/szj/ghuFklvfHNBbsuiUShlhRlqcFNS7KXL2iwKDijmOhWJPY
# a2bLEr4W/mQLbSXail5p6m138Ttx4MAVEzzuGI0Kwr8ofIL7z6zCeWDiBM57LrNC
# qHOA2wboeuMsG4O0Oz2LMAzBLbJZPRPnZAD2HdD4HUL2mzZ8wox74Mekb7RzrUP3
# hiHpxXZceJvhIEKfAgVkB5kTZQnio8A1JijMjw8f4TmsJPdJWpi8ei73sexe8/Yj
# cwIDAQABo4IBGzCCARcwHQYDVR0OBBYEFEmrrB8XsH6YQo3RWKZfxqM0DmFBMB8G
# A1UdIwQYMBaAFNVjOlyKMZDzQ3t8RhvFM2hahW1VMFYGA1UdHwRPME0wS6BJoEeG
# RWh0dHA6Ly9jcmwubWljcm9zb2Z0LmNvbS9wa2kvY3JsL3Byb2R1Y3RzL01pY1Rp
# bVN0YVBDQV8yMDEwLTA3LTAxLmNybDBaBggrBgEFBQcBAQROMEwwSgYIKwYBBQUH
# MAKGPmh0dHA6Ly93d3cubWljcm9zb2Z0LmNvbS9wa2kvY2VydHMvTWljVGltU3Rh
# UENBXzIwMTAtMDctMDEuY3J0MAwGA1UdEwEB/wQCMAAwEwYDVR0lBAwwCgYIKwYB
# BQUHAwgwDQYJKoZIhvcNAQELBQADggEBAECW+51o6W/0J/O/npudfjVzMXq0u0cs
# HjqXpdRyH6o03jlmY5MXAui3cmPBKufijJxD2pMRPVMUNh3VA0PQuJeYrP06oFdq
# LpLxd3IJARm98vzaMgCz2nCwBDpe9X2M3Js9K1GAX+w4Az8N7J+Z6P1OD0VxHBdq
# eTaqDN1lk1vwagTN7t/WitxMXRDz0hRdYiWbATBAVgXXCOfzs3hnEv1n/EDab9HX
# OLMXKVY/+alqYKdV9lkuRp8Us1Q1WZy9z72Azu9x4mzft3fJ1puTjBHo5tHfixZo
# ummbI+WwjVCrku7pskJahfNi5amSgrqgR6nWAwvpJELccpVLdSxxmG0wggZxMIIE
# WaADAgECAgphCYEqAAAAAAACMA0GCSqGSIb3DQEBCwUAMIGIMQswCQYDVQQGEwJV
# UzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UE
# ChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMTIwMAYDVQQDEylNaWNyb3NvZnQgUm9v
# dCBDZXJ0aWZpY2F0ZSBBdXRob3JpdHkgMjAxMDAeFw0xMDA3MDEyMTM2NTVaFw0y
# NTA3MDEyMTQ2NTVaMHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9u
# MRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRp
# b24xJjAkBgNVBAMTHU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEwAhMzAAAB
# Jt+6SyK5goIHAAAAAAEmMCIEIJ7sOcZ9sNFABAvIMRs2kk0cZhB239DZbXCLYMT8
# frPMMA0GCSqGSIb3DQEBCwUABIIBAEiXebYdQ9DIz74YpfQ9FBaLHiSfD3s+jO7x
# 1noNe0HIdZaX/Asow0OqsEMzZanOpa3yO8BJskKoDJW9pU//xqCzV1W5FzoOT4Qs
# ZJpG0R5f/eHqMMeRBVUPn1FfT4pQVcHfRHOW/I3hWC0G4SeVwU/L9d8JLSQKzl39
# 8bMFbtLJWxUJMM4Vp8Tf+cR7ShZdsK9w88QokR9xbuQgn6jsqhOuyw+dUGrwEI7h
# GCdUmsT614oSgdnuUBf/g1aew0e3ulmZYYQ2QLKqnDXuqUIFnPtWFB90h++mdlFg
# fvIEusNgYkb2kl5xQfxm3wynbxtP249vWF4GACZtqqSj3tcQ+xQ=
# b24xJjAkBgNVBAMTHU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEwMIIBIjAN
# BgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAqR0NvHcRijog7PwTl/X6f2mUa3RU
# ENWlCgCChfvtfGhLLF/Fw+Vhwna3PmYrW/AVUycEMR9BGxqVHc4JE458YTBZsTBE
# D/FgiIRUQwzXTbg4CLNC3ZOs1nMwVyaCo0UN0Or1R4HNvyRgMlhgRvJYR4YyhB50
# YWeRX4FUsc+TTJLBxKZd0WETbijGGvmGgLvfYfxGwScdJGcSchohiq9LZIlQYrFd
# /XcfPfBXday9ikJNQFHRD5wGPmd/9WbAA5ZEfu/QS/1u5ZrKsajyeioKMfDaTgaR
# togINeh4HLDpmc085y9Euqf03GS9pAHBIAmTeM38vMDJRF1eFpwBBU8iTQIDAQAB
# o4IB5jCCAeIwEAYJKwYBBAGCNxUBBAMCAQAwHQYDVR0OBBYEFNVjOlyKMZDzQ3t8
# RhvFM2hahW1VMBkGCSsGAQQBgjcUAgQMHgoAUwB1AGIAQwBBMAsGA1UdDwQEAwIB
# hjAPBgNVHRMBAf8EBTADAQH/MB8GA1UdIwQYMBaAFNX2VsuP6KJcYmjRPZSQW9fO
# mhjEMFYGA1UdHwRPME0wS6BJoEeGRWh0dHA6Ly9jcmwubWljcm9zb2Z0LmNvbS9w
# a2kvY3JsL3Byb2R1Y3RzL01pY1Jvb0NlckF1dF8yMDEwLTA2LTIzLmNybDBaBggr
# BgEFBQcBAQROMEwwSgYIKwYBBQUHMAKGPmh0dHA6Ly93d3cubWljcm9zb2Z0LmNv
# bS9wa2kvY2VydHMvTWljUm9vQ2VyQXV0XzIwMTAtMDYtMjMuY3J0MIGgBgNVHSAB
# Af8EgZUwgZIwgY8GCSsGAQQBgjcuAzCBgTA9BggrBgEFBQcCARYxaHR0cDovL3d3
# dy5taWNyb3NvZnQuY29tL1BLSS9kb2NzL0NQUy9kZWZhdWx0Lmh0bTBABggrBgEF
# BQcCAjA0HjIgHQBMAGUAZwBhAGwAXwBQAG8AbABpAGMAeQBfAFMAdABhAHQAZQBt
# AGUAbgB0AC4gHTANBgkqhkiG9w0BAQsFAAOCAgEAB+aIUQ3ixuCYP4FxAz2do6Eh
# b7Prpsz1Mb7PBeKp/vpXbRkws8LFZslq3/Xn8Hi9x6ieJeP5vO1rVFcIK1GCRBL7
# uVOMzPRgEop2zEBAQZvcXBf/XPleFzWYJFZLdO9CEMivv3/Gf/I3fVo/HPKZeUqR
# UgCvOA8X9S95gWXZqbVr5MfO9sp6AG9LMEQkIjzP7QOllo9ZKby2/QThcJ8ySif9
# Va8v/rbljjO7Yl+a21dA6fHOmWaQjP9qYn/dxUoLkSbiOewZSnFjnXshbcOco6I8
# +n99lmqQeKZt0uGc+R38ONiU9MalCpaGpL2eGq4EQoO4tYCbIjggtSXlZOz39L9+
# Y1klD3ouOVd2onGqBooPiRa6YacRy5rYDkeagMXQzafQ732D8OE7cQnfXXSYIghh
# 2rBQHm+98eEA3+cxB6STOvdlR3jo+KhIq/fecn5ha293qYHLpwmsObvsxsvYgrRy
# zR30uIUBHoD7G4kqVDmyW9rIDVWZeodzOwjmmC3qjeAzLhIp9cAvVCch98isTtoo
# uLGp25ayp0Kiyc8ZQU3ghvkqmqMRZjDTu3QyS99je/WZii8bxyGvWbWu3EQ8l1Bx
# 16HSxVXjad5XwdHeMMD9zOZN+w2/XU/pnR4ZOC+8z1gFLu8NoFA12u8JJxzVs341
# Hgi62jbb01+P3nSISRKhggLSMIICOwIBATCB/KGB1KSB0TCBzjELMAkGA1UEBhMC
# VVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNV
# BAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEpMCcGA1UECxMgTWljcm9zb2Z0IE9w
# ZXJhdGlvbnMgUHVlcnRvIFJpY28xJjAkBgNVBAsTHVRoYWxlcyBUU1MgRVNOOjYw
# QkMtRTM4My0yNjM1MSUwIwYDVQQDExxNaWNyb3NvZnQgVGltZS1TdGFtcCBTZXJ2
# aWNloiMKAQEwBwYFKw4DAhoDFQAKZzI5aZnESumrToHx3Lqgxnr//KCBgzCBgKR+
# MHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdS
# ZWRtb25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xJjAkBgNVBAMT
# HU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEwMA0GCSqGSIb3DQEBBQUAAgUA
# 4nuQTDAiGA8yMDIwMDUyOTE3NDQ0NFoYDzIwMjAwNTMwMTc0NDQ0WjB3MD0GCisG
# AQQBhFkKBAExLzAtMAoCBQDie5BMAgEAMAoCAQACAiZJAgH/MAcCAQACAhEjMAoC
# BQDifOHMAgEAMDYGCisGAQQBhFkKBAIxKDAmMAwGCisGAQQBhFkKAwKgCjAIAgEA
# AgMHoSChCjAIAgEAAgMBhqAwDQYJKoZIhvcNAQEFBQADgYEAprmyJTXdH9FmQZ0I
# mRSJdjc/RrSqDm8DUEq/h3FL73G/xvg9MbQj1J/h3hdlSIPcQXjrhL8hud/vyF0j
# IFaTK5YOcixkX++9t7Vz3Mn0KkQo8F4DNSyZEPpz682AyKKwLMJDy52pFFFKNP5l
# NpOz6YY1Od1xvk4nyN1WwfLnGswxggMNMIIDCQIBATCBkzB8MQswCQYDVQQGEwJV
# UzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UE
# ChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMSYwJAYDVQQDEx1NaWNyb3NvZnQgVGlt
# ZS1TdGFtcCBQQ0EgMjAxMAITMwAAASbfuksiuYKCBwAAAAABJjANBglghkgBZQME
# AgEFAKCCAUowGgYJKoZIhvcNAQkDMQ0GCyqGSIb3DQEJEAEEMC8GCSqGSIb3DQEJ
# BDEiBCB0IE0Q6P23RQlh8TFyp57UQQUF/sbui7mOMStRgTFZxTCB+gYLKoZIhvcN
# AQkQAi8xgeowgecwgeQwgb0EIDb9z++evV5wDO9qk5ZnbEZ8CTOuR+kZyu8xbTsJ
# CXUPMIGYMIGApH4wfDELMAkGA1UEBhMCVVMxEzARBgNVBAgTCldhc2hpbmd0b24x
# EDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlv
# bjEmMCQGA1UEAxMdTWljcm9zb2Z0IFRpbWUtU3RhbXAgUENBIDIwMTACEzMAAAEm
# 37pLIrmCggcAAAAAASYwIgQgtwi02bvsGAOdpAxEF607G6g9PlyS8vc2bAUSHovH
# /IIwDQYJKoZIhvcNAQELBQAEggEAEMCfsXNudrjztjI6JNyNDVpdF1axRVcGiNy6
# 67pgb1EePsjA2EaBB+5ZjgO/73JxuiVgsoXgH7em8tKG5RQJtcm5obVDb+jKksK4
# qcFLA1f7seQRGfE06UAPnSFh2GqMtTNJGCXWwqWLH2LduTjOqPt8Nupo16ABFIT2
# akTzBSJ81EHBkEU0Et6CgeaZiBYrCCXUtD+ASvLDkPSrjweQGu3Zk1SSROEzxMY9
# jdlGfMkK2krMd9ub9UZ13RcQDijJqo+h6mz76pAuiFFvuQl6wMoSGFaaUQwfd+WQ
# gXlVVX/A9JFBihrxnDVglEPlsIOxCHkTeIxLfnAkCbax+9pevA==
# SIG # End signature block

View File

@@ -270,7 +270,7 @@ check_pre_reqs() {
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep zlib)" ] && say_warning "Unable to locate zlib. Probable prerequisite missing; install zlib."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep ssl)" ] && say_warning "Unable to locate libssl. Probable prerequisite missing; install libssl."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep libicu)" ] && say_warning "Unable to locate libicu. Probable prerequisite missing; install libicu."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep lttng)" ] && say_warning "Unable to locate liblttng. Probable prerequisite missing; install liblttng."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep lttng)" ] && say_warning "Unable to locate liblttng. Probable prerequisite missing; install libcurl."
[ -z "$($LDCONFIG_COMMAND 2>/dev/null | grep libcurl)" ] && say_warning "Unable to locate libcurl. Probable prerequisite missing; install libcurl."
fi
@@ -373,7 +373,7 @@ get_normalized_architecture_from_architecture() {
;;
esac
say_err "Architecture \`$architecture\` not supported. If you think this is a bug, report it at https://github.com/dotnet/install-scripts/issues"
say_err "Architecture \`$architecture\` not supported. If you think this is a bug, report it at https://github.com/dotnet/sdk/issues"
return 1
}
@@ -545,18 +545,17 @@ construct_download_link() {
local channel="$2"
local normalized_architecture="$3"
local specific_version="${4//[$'\t\r\n']}"
local specific_product_version="$(get_specific_product_version "$1" "$4")"
local osname
osname="$(get_current_os_name)" || return 1
local download_link=null
if [[ "$runtime" == "dotnet" ]]; then
download_link="$azure_feed/Runtime/$specific_version/dotnet-runtime-$specific_product_version-$osname-$normalized_architecture.tar.gz"
download_link="$azure_feed/Runtime/$specific_version/dotnet-runtime-$specific_version-$osname-$normalized_architecture.tar.gz"
elif [[ "$runtime" == "aspnetcore" ]]; then
download_link="$azure_feed/aspnetcore/Runtime/$specific_version/aspnetcore-runtime-$specific_product_version-$osname-$normalized_architecture.tar.gz"
download_link="$azure_feed/aspnetcore/Runtime/$specific_version/aspnetcore-runtime-$specific_version-$osname-$normalized_architecture.tar.gz"
elif [ -z "$runtime" ]; then
download_link="$azure_feed/Sdk/$specific_version/dotnet-sdk-$specific_product_version-$osname-$normalized_architecture.tar.gz"
download_link="$azure_feed/Sdk/$specific_version/dotnet-sdk-$specific_version-$osname-$normalized_architecture.tar.gz"
else
return 1
fi
@@ -565,44 +564,6 @@ construct_download_link() {
return 0
}
# args:
# azure_feed - $1
# specific_version - $2
get_specific_product_version() {
# If we find a 'productVersion.txt' at the root of any folder, we'll use its contents
# to resolve the version of what's in the folder, superseding the specified version.
eval $invocation
local azure_feed="$1"
local specific_version="${2//[$'\t\r\n']}"
local specific_product_version=$specific_version
local download_link=null
if [[ "$runtime" == "dotnet" ]]; then
download_link="$azure_feed/Runtime/$specific_version/productVersion.txt${feed_credential}"
elif [[ "$runtime" == "aspnetcore" ]]; then
download_link="$azure_feed/aspnetcore/Runtime/$specific_version/productVersion.txt${feed_credential}"
elif [ -z "$runtime" ]; then
download_link="$azure_feed/Sdk/$specific_version/productVersion.txt${feed_credential}"
else
return 1
fi
specific_product_version=$(curl -s --fail "$download_link")
if [ $? -ne 0 ]
then
specific_product_version=$(wget -qO- "$download_link")
if [ $? -ne 0 ]
then
specific_product_version=$specific_version
fi
fi
specific_product_version="${specific_product_version//[$'\t\r\n']}"
echo "$specific_product_version"
return 0
}
# args:
# azure_feed - $1
# channel - $2
@@ -810,7 +771,6 @@ calculate_vars() {
say_verbose "normalized_architecture=$normalized_architecture"
specific_version="$(get_specific_version_from_version "$azure_feed" "$channel" "$normalized_architecture" "$version" "$json_file")"
specific_product_version="$(get_specific_product_version "$azure_feed" "$specific_version")"
say_verbose "specific_version=$specific_version"
if [ -z "$specific_version" ]; then
say_err "Could not resolve version information."
@@ -909,12 +869,12 @@ install_dotnet() {
fi
# Check if the standard SDK version is installed.
say_verbose "Checking installation: version = $specific_product_version"
if is_dotnet_package_installed "$install_root" "$asset_relative_path" "$specific_product_version"; then
say_verbose "Checking installation: version = $specific_version"
if is_dotnet_package_installed "$install_root" "$asset_relative_path" "$specific_version"; then
return 0
fi
say_err "\`$asset_name\` with version = $specific_product_version failed to install with an unknown error."
say_err "\`$asset_name\` with version = $specific_version failed to install with an unknown error."
return 1
}

View File

@@ -1683,9 +1683,9 @@
}
},
"lodash": {
"version": "4.17.19",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.19.tgz",
"integrity": "sha512-JNvd8XER9GQX0v2qJgsaN/mzFCNA5BRe/j8JN9d+tWyGLSodKQHKFicdwNYzWwI3wjRnaKPsGj1XkBjx/F96DQ==",
"version": "4.17.15",
"resolved": "https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz",
"integrity": "sha512-8xOcRHvCjnocdS5cpwXQXVzmmh5e5+saE2QGoeQmbKmRS6J3VQppPOIt0MnmE+4xlZoumy0GPG0D0MVIQbNA1A==",
"dev": true
},
"lodash.unescape": {

View File

@@ -23,7 +23,5 @@
<key>ACTIONS_RUNNER_SVC</key>
<string>1</string>
</dict>
<key>ProcessType</key>
<string>Interactive</string>
</dict>
</plist>

View File

@@ -70,8 +70,8 @@ then
exit 1
fi
# libicu version prefer: libicu66 -> libicu63 -> libicu60 -> libicu57 -> libicu55 -> libicu52
apt install -y libicu66 || apt install -y libicu63 || apt install -y libicu60 || apt install -y libicu57 || apt install -y libicu55 || apt install -y libicu52
# libicu version prefer: libicu63 -> libicu60 -> libicu57 -> libicu55 -> libicu52
apt install -y libicu63 || apt install -y libicu60 || apt install -y libicu57 || apt install -y libicu55 || apt install -y libicu52
if [ $? -ne 0 ]
then
echo "'apt' failed with exit code '$?'"
@@ -99,8 +99,8 @@ then
exit 1
fi
# libicu version prefer: libicu66 -> libicu63 -> libicu60 -> libicu57 -> libicu55 -> libicu52
apt-get install -y libicu66 || apt-get install -y libicu63 || apt-get install -y libicu60 || apt install -y libicu57 || apt install -y libicu55 || apt install -y libicu52
# libicu version prefer: libicu63 -> libicu60 -> libicu57 -> libicu55 -> libicu52
apt-get install -y libicu63 || apt-get install -y libicu60 || apt install -y libicu57 || apt install -y libicu55 || apt install -y libicu52
if [ $? -ne 0 ]
then
echo "'apt-get' failed with exit code '$?'"

View File

@@ -63,25 +63,12 @@ function install()
sed "s/{{User}}/${run_as_user}/g; s/{{Description}}/$(echo ${SVC_DESCRIPTION} | sed -e 's/[\/&]/\\&/g')/g; s/{{RunnerRoot}}/$(echo ${RUNNER_ROOT} | sed -e 's/[\/&]/\\&/g')/g;" "${TEMPLATE_PATH}" > "${TEMP_PATH}" || failed "failed to create replacement temp file"
mv "${TEMP_PATH}" "${UNIT_PATH}" || failed "failed to copy unit file"
# Recent Fedora based Linux (CentOS/Redhat) has SELinux enabled by default
# We need to restore security context on the unit file we added otherwise SystemD have no access to it.
command -v getenforce > /dev/null
if [ $? -eq 0 ]
then
selinuxEnabled=$(getenforce)
if [[ $selinuxEnabled == "Enforcing" ]]
then
# SELinux is enabled, we will need to Restore SELinux Context for the service file
restorecon -r -v "${UNIT_PATH}" || failed "failed to restore SELinux context on ${UNIT_PATH}"
fi
fi
# unit file should not be executable and world writable
chmod 664 "${UNIT_PATH}" || failed "failed to set permissions on ${UNIT_PATH}"
chmod 664 ${UNIT_PATH} || failed "failed to set permissions on ${UNIT_PATH}"
systemctl daemon-reload || failed "failed to reload daemons"
# Since we started with sudo, runsvc.sh will be owned by root. Change this to current login user.
# Since we started with sudo, runsvc.sh will be owned by root. Change this to current login user.
cp ./bin/runsvc.sh ./runsvc.sh || failed "failed to copy runsvc.sh"
chown ${run_as_uid}:${run_as_gid} ./runsvc.sh || failed "failed to set owner for runsvc.sh"
chmod 755 ./runsvc.sh || failed "failed to set permission for runsvc.sh"

View File

@@ -67,7 +67,7 @@ while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symli
[[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located
done
DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
cd "$DIR"
cd $DIR
source ./env.sh

View File

@@ -90,7 +90,7 @@ namespace GitHub.Runner.Common
public static readonly string Labels = "labels";
public static readonly string MonitorSocketAddress = "monitorsocketaddress";
public static readonly string Name = "name";
public static readonly string RunnerGroup = "runnergroup";
public static readonly string Pool = "pool";
public static readonly string StartupType = "startuptype";
public static readonly string Url = "url";
public static readonly string UserName = "username";
@@ -140,9 +140,6 @@ namespace GitHub.Runner.Common
public static readonly string InternalTelemetryIssueDataKey = "_internal_telemetry";
public static readonly string WorkerCrash = "WORKER_CRASH";
public static readonly string UnsupportedCommand = "UNSUPPORTED_COMMAND";
public static readonly string UnsupportedCommandMessage = "The `{0}` command is deprecated and will be disabled soon. Please upgrade to using Environment Files. For more information see: https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/";
public static readonly string UnsupportedCommandMessageDisabled = "The `{0}` command is disabled. Please upgrade to using Environment Files or opt into unsecure command execution by setting the `ACTIONS_ALLOW_UNSECURE_COMMANDS` environment variable to `true`. For more information see: https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/";
}
public static class RunnerEvent
@@ -201,7 +198,6 @@ namespace GitHub.Runner.Common
//
// Keep alphabetical
//
public static readonly string AllowUnsupportedCommands = "ACTIONS_ALLOW_UNSECURE_COMMANDS";
public static readonly string RunnerDebug = "ACTIONS_RUNNER_DEBUG";
public static readonly string StepDebug = "ACTIONS_STEP_DEBUG";
}

View File

@@ -56,10 +56,6 @@ namespace GitHub.Runner.Common
Add<T>(extensions, "GitHub.Runner.Worker.EndGroupCommandExtension, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.EchoCommandExtension, Runner.Worker");
break;
case "GitHub.Runner.Worker.IFileCommandExtension":
Add<T>(extensions, "GitHub.Runner.Worker.AddPathFileCommand, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.SetEnvFileCommand, Runner.Worker");
break;
default:
// This should never happen.
throw new NotSupportedException($"Unexpected extension type: '{typeof(T).FullName}'");

View File

@@ -16,14 +16,12 @@ namespace GitHub.Runner.Common
// logging and console
Task<TaskLog> AppendLogContentAsync(Guid scopeIdentifier, string hubName, Guid planId, int logId, Stream uploadStream, CancellationToken cancellationToken);
Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, CancellationToken cancellationToken);
Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long startLine, CancellationToken cancellationToken);
Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, String type, String name, Stream uploadStream, CancellationToken cancellationToken);
Task<TaskLog> CreateLogAsync(Guid scopeIdentifier, string hubName, Guid planId, TaskLog log, CancellationToken cancellationToken);
Task<Timeline> CreateTimelineAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, CancellationToken cancellationToken);
Task<List<TimelineRecord>> UpdateTimelineRecordsAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, IEnumerable<TimelineRecord> records, CancellationToken cancellationToken);
Task RaisePlanEventAsync<T>(Guid scopeIdentifier, string hubName, Guid planId, T eventData, CancellationToken cancellationToken) where T : JobEvent;
Task<Timeline> GetTimelineAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, CancellationToken cancellationToken);
Task<ActionDownloadInfoCollection> ResolveActionDownloadInfoAsync(Guid scopeIdentifier, string hubName, Guid planId, ActionReferenceList actions, CancellationToken cancellationToken);
}
public sealed class JobServer : RunnerService, IJobServer
@@ -80,12 +78,6 @@ namespace GitHub.Runner.Common
return _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, cancellationToken: cancellationToken);
}
public Task AppendTimelineRecordFeedAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, Guid stepId, IList<string> lines, long startLine, CancellationToken cancellationToken)
{
CheckConnection();
return _taskClient.AppendTimelineRecordFeedAsync(scopeIdentifier, hubName, planId, timelineId, timelineRecordId, stepId, lines, startLine, cancellationToken: cancellationToken);
}
public Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, string type, string name, Stream uploadStream, CancellationToken cancellationToken)
{
CheckConnection();
@@ -121,14 +113,5 @@ namespace GitHub.Runner.Common
CheckConnection();
return _taskClient.GetTimelineAsync(scopeIdentifier, hubName, planId, timelineId, includeRecords: true, cancellationToken: cancellationToken);
}
//-----------------------------------------------------------------
// Action download info
//-----------------------------------------------------------------
public Task<ActionDownloadInfoCollection> ResolveActionDownloadInfoAsync(Guid scopeIdentifier, string hubName, Guid planId, ActionReferenceList actions, CancellationToken cancellationToken)
{
CheckConnection();
return _taskClient.ResolveActionDownloadInfoAsync(scopeIdentifier, hubName, planId, actions, cancellationToken: cancellationToken);
}
}
}

View File

@@ -18,7 +18,7 @@ namespace GitHub.Runner.Common
event EventHandler<ThrottlingEventArgs> JobServerQueueThrottling;
Task ShutdownAsync();
void Start(Pipelines.AgentJobRequestMessage jobRequest);
void QueueWebConsoleLine(Guid stepRecordId, string line, long? lineNumber = null);
void QueueWebConsoleLine(Guid stepRecordId, string line);
void QueueFileUpload(Guid timelineId, Guid timelineRecordId, string type, string name, string path, bool deleteSource);
void QueueTimelineRecordUpdate(Guid timelineId, TimelineRecord timelineRecord);
}
@@ -155,10 +155,10 @@ namespace GitHub.Runner.Common
Trace.Info("All queue process tasks have been stopped, and all queues are drained.");
}
public void QueueWebConsoleLine(Guid stepRecordId, string line, long? lineNumber)
public void QueueWebConsoleLine(Guid stepRecordId, string line)
{
Trace.Verbose("Enqueue web console line queue: {0}", line);
_webConsoleLineQueue.Enqueue(new ConsoleLineInfo(stepRecordId, line, lineNumber));
_webConsoleLineQueue.Enqueue(new ConsoleLineInfo(stepRecordId, line));
}
public void QueueFileUpload(Guid timelineId, Guid timelineRecordId, string type, string name, string path, bool deleteSource)
@@ -214,7 +214,7 @@ namespace GitHub.Runner.Common
}
// Group consolelines by timeline record of each step
Dictionary<Guid, List<TimelineRecordLogLine>> stepsConsoleLines = new Dictionary<Guid, List<TimelineRecordLogLine>>();
Dictionary<Guid, List<string>> stepsConsoleLines = new Dictionary<Guid, List<string>>();
List<Guid> stepRecordIds = new List<Guid>(); // We need to keep lines in order
int linesCounter = 0;
ConsoleLineInfo lineInfo;
@@ -222,7 +222,7 @@ namespace GitHub.Runner.Common
{
if (!stepsConsoleLines.ContainsKey(lineInfo.StepRecordId))
{
stepsConsoleLines[lineInfo.StepRecordId] = new List<TimelineRecordLogLine>();
stepsConsoleLines[lineInfo.StepRecordId] = new List<string>();
stepRecordIds.Add(lineInfo.StepRecordId);
}
@@ -232,7 +232,7 @@ namespace GitHub.Runner.Common
lineInfo.Line = $"{lineInfo.Line.Substring(0, 1024)}...";
}
stepsConsoleLines[lineInfo.StepRecordId].Add(new TimelineRecordLogLine(lineInfo.Line, lineInfo.LineNumber));
stepsConsoleLines[lineInfo.StepRecordId].Add(lineInfo.Line);
linesCounter++;
// process at most about 500 web console lines during the regular timer dequeue task.
@@ -247,13 +247,13 @@ namespace GitHub.Runner.Common
{
// Split console lines into batches; each batch will contain at most 100 lines.
int batchCounter = 0;
List<List<TimelineRecordLogLine>> batchedLines = new List<List<TimelineRecordLogLine>>();
List<List<string>> batchedLines = new List<List<string>>();
foreach (var line in stepsConsoleLines[stepRecordId])
{
var currentBatch = batchedLines.ElementAtOrDefault(batchCounter);
if (currentBatch == null)
{
batchedLines.Add(new List<TimelineRecordLogLine>());
batchedLines.Add(new List<string>());
currentBatch = batchedLines.ElementAt(batchCounter);
}
@@ -275,6 +275,7 @@ namespace GitHub.Runner.Common
{
Trace.Info($"Skip {batchedLines.Count - 2} batches web console lines for last run");
batchedLines = batchedLines.TakeLast(2).ToList();
batchedLines[0].Insert(0, "...");
}
int errorCount = 0;
@@ -283,15 +284,7 @@ namespace GitHub.Runner.Common
try
{
// we will not requeue a failed batch, since web console lines are time sensitive.
if (batch[0].LineNumber.HasValue)
{
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch.Select(logLine => logLine.Line).ToList(), batch[0].LineNumber.Value, default(CancellationToken));
}
else
{
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch.Select(logLine => logLine.Line).ToList(), default(CancellationToken));
}
await _jobServer.AppendTimelineRecordFeedAsync(_scopeIdentifier, _hubName, _planId, _jobTimelineId, _jobTimelineRecordId, stepRecordId, batch, default(CancellationToken));
if (_firstConsoleOutputs)
{
HostContext.WritePerfCounter($"WorkerJobServerQueueAppendFirstConsoleOutput_{_planId.ToString()}");
@@ -660,15 +653,13 @@ namespace GitHub.Runner.Common
internal class ConsoleLineInfo
{
public ConsoleLineInfo(Guid recordId, string line, long? lineNumber)
public ConsoleLineInfo(Guid recordId, string line)
{
this.StepRecordId = recordId;
this.Line = line;
this.LineNumber = lineNumber;
}
public Guid StepRecordId { get; set; }
public string Line { get; set; }
public long? LineNumber { get; set; }
}
}
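
For illustration, a standalone sketch of the console-line batching in the hunks above: lines are grouped per step record, split into batches of at most 100 lines, and when more than two batches pile up for the final flush only the last two are kept with a "..." marker. The types and names below are simplified stand-ins, not the runner's own.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class ConsoleLineBatcher
{
    // Split a step's console lines into batches of at most `batchSize` lines,
    // mirroring the loop in JobServerQueue above (simplified: no TimelineRecordLogLine).
    public static List<List<string>> Batch(IReadOnlyList<string> lines, int batchSize = 100)
    {
        var batches = new List<List<string>>();
        foreach (var line in lines)
        {
            if (batches.Count == 0 || batches[^1].Count >= batchSize)
            {
                batches.Add(new List<string>());
            }
            batches[^1].Add(line);
        }
        return batches;
    }

    static void Main()
    {
        var lines = Enumerable.Range(1, 450).Select(i => $"line {i}").ToList();
        var batches = Batch(lines);
        Console.WriteLine(batches.Count);   // 5 batches (100+100+100+100+50)

        // Final-flush trimming: keep only the last two batches and mark the gap.
        if (batches.Count > 2)
        {
            batches = batches.TakeLast(2).ToList();
            batches[0].Insert(0, "...");
        }
        Console.WriteLine(batches.Count);   // 2
        Console.WriteLine(batches[0][0]);   // "..."
    }
}
```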

View File

@@ -1,51 +0,0 @@
using System;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Runner.Sdk;
using GitHub.Runner.Common;
namespace GitHub.Runner.Common.Util
{
public static class EncodingUtil
{
public static async Task SetEncoding(IHostContext hostContext, Tracing trace, CancellationToken cancellationToken)
{
#if OS_WINDOWS
try
{
if (Console.InputEncoding.CodePage != 65001)
{
using (var p = hostContext.CreateService<IProcessInvoker>())
{
// Use UTF8 code page
int exitCode = await p.ExecuteAsync(workingDirectory: hostContext.GetDirectory(WellKnownDirectory.Work),
fileName: WhichUtil.Which("chcp", true, trace),
arguments: "65001",
environment: null,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: false,
redirectStandardIn: null,
inheritConsoleHandler: true,
cancellationToken: cancellationToken);
if (exitCode == 0)
{
trace.Info("Successfully returned to code page 65001 (UTF8)");
}
else
{
trace.Warning($"'chcp 65001' failed with exit code {exitCode}");
}
}
}
}
catch (Exception ex)
{
trace.Warning($"'chcp 65001' failed with exception {ex.Message}");
}
#endif
// Await a completed task to prevent compiler warning CS1998: "This async method lacks 'await' operators and will run synchronously..."
await Task.CompletedTask;
}
}
}
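
The removed EncodingUtil helper above shells out to `chcp 65001` on Windows to switch the console to UTF-8 and is a no-op elsewhere. A rough standalone equivalent, assuming plain `System.Diagnostics.Process` in place of the runner's IProcessInvoker:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class Utf8Console
{
    static void Main()
    {
        // Only Windows consoles need the chcp dance; elsewhere this is a no-op.
        if (!RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
        {
            return;
        }

        if (Console.InputEncoding.CodePage == 65001)
        {
            return; // already UTF-8
        }

        // chcp is a cmd built-in / System32 tool, so run it through cmd.exe.
        var psi = new ProcessStartInfo("cmd.exe", "/c chcp 65001") { UseShellExecute = false };
        using (var p = Process.Start(psi))
        {
            p.WaitForExit();
            Console.WriteLine(p.ExitCode == 0
                ? "Switched console to code page 65001 (UTF-8)"
                : $"'chcp 65001' failed with exit code {p.ExitCode}");
        }
    }
}
```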

View File

@@ -42,7 +42,7 @@ namespace GitHub.Runner.Listener
Constants.Runner.CommandLine.Args.Labels,
Constants.Runner.CommandLine.Args.MonitorSocketAddress,
Constants.Runner.CommandLine.Args.Name,
Constants.Runner.CommandLine.Args.RunnerGroup,
Constants.Runner.CommandLine.Args.Pool,
Constants.Runner.CommandLine.Args.StartupType,
Constants.Runner.CommandLine.Args.Token,
Constants.Runner.CommandLine.Args.Url,
@@ -169,15 +169,6 @@ namespace GitHub.Runner.Listener
validator: Validators.NonEmptyValidator);
}
public string GetRunnerGroupName(string defaultPoolName = null)
{
return GetArgOrPrompt(
name: Constants.Runner.CommandLine.Args.RunnerGroup,
description: "Enter the name of the runner group to add this runner to:",
defaultValue: defaultPoolName ?? "default",
validator: Validators.NonEmptyValidator);
}
public string GetToken()
{
return GetArgOrPrompt(

View File

@@ -159,34 +159,17 @@ namespace GitHub.Runner.Listener.Configuration
_term.WriteSection("Runner Registration");
// If we have more than one runner group available, allow the user to specify which one to be added into
string poolName = null;
TaskAgentPool agentPool = null;
// Get all the agent pools and select the first private pool
List<TaskAgentPool> agentPools = await _runnerServer.GetAgentPoolsAsync();
TaskAgentPool defaultPool = agentPools?.Where(x => x.IsInternal).FirstOrDefault();
TaskAgentPool agentPool = agentPools?.Where(x => x.IsHosted == false).FirstOrDefault();
if (agentPools?.Where(x => !x.IsHosted).Count() > 1)
if (agentPool == null)
{
poolName = command.GetRunnerGroupName(defaultPool?.Name);
_term.WriteLine();
agentPool = agentPools.Where(x => string.Equals(poolName, x.Name, StringComparison.OrdinalIgnoreCase) && !x.IsHosted).FirstOrDefault();
throw new TaskAgentPoolNotFoundException($"Could not find any private pool. Contact support.");
}
else
{
agentPool = defaultPool;
}
if (agentPool == null && poolName == null)
{
throw new TaskAgentPoolNotFoundException($"Could not find any self-hosted runner groups. Contact support.");
}
else if (agentPool == null && poolName != null)
{
throw new TaskAgentPoolNotFoundException($"Could not find any self-hosted runner group named \"{poolName}\".");
}
else
{
Trace.Info("Found a self-hosted runner group with id {1} and name {2}", agentPool.Id, agentPool.Name);
Trace.Info("Found a private pool with id {1} and name {2}", agentPool.Id, agentPool.Name);
runnerSettings.PoolId = agentPool.Id;
runnerSettings.PoolName = agentPool.Name;
}
@@ -227,7 +210,7 @@ namespace GitHub.Runner.Listener.Configuration
else if (command.Unattended)
{
// if not replace and it is unattended config.
throw new TaskAgentExistsException($"A runner exists with the same name {runnerSettings.AgentName}.");
throw new TaskAgentExistsException($"Pool {runnerSettings.PoolId} already contains a runner with name {runnerSettings.AgentName}.");
}
}
else

View File

@@ -462,14 +462,13 @@ Options:
--commit Prints the runner commit
Config Options:
--unattended Disable interactive prompts for missing arguments. Defaults will be used for missing options
--url string Repository to add the runner to. Required if unattended
--token string Registration token. Required if unattended
--name string Name of the runner to configure (default {Environment.MachineName ?? "myrunner"})
--runnergroup string Name of the runner group to add this runner to (defaults to the default runner group)
--labels string Extra labels in addition to the default: 'self-hosted,{Constants.Runner.Platform},{Constants.Runner.PlatformArchitecture}'
--work string Relative runner work directory (default {Constants.Path.WorkDirectory})
--replace Replace any existing runner with the same name (default false)");
--unattended Disable interactive prompts for missing arguments. Defaults will be used for missing options
--url string Repository to add the runner to. Required if unattended
--token string Registration token. Required if unattended
--name string Name of the runner to configure (default {Environment.MachineName ?? "myrunner"})
--labels string Extra labels in addition to the default: 'self-hosted,{Constants.Runner.Platform},{Constants.Runner.PlatformArchitecture}'
--work string Relative runner work directory (default {Constants.Path.WorkDirectory})
--replace Replace any existing runner with the same name (default false)");
#if OS_WINDOWS
_term.WriteLine($@" --runasservice Run the runner as a service");
_term.WriteLine($@" --windowslogonaccount string Account to run the service as. Requires runasservice");

View File

@@ -346,16 +346,16 @@ namespace GitHub.Runner.Sdk
// data buffers one last time before returning
ProcessOutput();
if (cancellationToken.IsCancellationRequested)
{
// Ensure cancellation also finish on the cancellationToken.Register thread.
await cancellationFinished.Task;
Trace.Info($"Process Cancellation finished.");
}
Trace.Info($"Finished process {_proc.Id} with exit code {_proc.ExitCode}, and elapsed time {_stopWatch.Elapsed}.");
}
if (cancellationToken.IsCancellationRequested)
{
// Ensure cancellation also finish on the cancellationToken.Register thread.
await cancellationFinished.Task;
Trace.Info($"Process Cancellation finished.");
}
cancellationToken.ThrowIfCancellationRequested();
// Wait for process to finish.

View File

@@ -30,7 +30,7 @@ namespace GitHub.Runner.Sdk
//
// For example, on an en-US box, this is required for loading the encoding for the
// default console output code page '437'. Without loading the correct encoding for
// code page IBM437, some characters cannot be translated correctly, e.g. write 'ç'
// code page IBM437, some characters cannot be translated correctly, e.g. write 'ç'
// from powershell.exe.
Encoding.RegisterProvider(CodePagesEncodingProvider.Instance);
#endif

View File

@@ -11,11 +11,6 @@ namespace GitHub.Runner.Sdk
{
ArgUtil.NotNullOrEmpty(command, nameof(command));
trace?.Info($"Which: '{command}'");
if (Path.IsPathFullyQualified(command) && File.Exists(command))
{
trace?.Info($"Fully qualified path: '{command}'");
return command;
}
string path = Environment.GetEnvironmentVariable(PathUtil.PathVariable);
if (string.IsNullOrEmpty(path))
{
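
The lines removed above short-circuit the PATH search when the caller already passed a fully qualified path. A minimal sketch of that behavior (the PATH walk itself is elided here):

```csharp
using System;
using System.IO;

class WhichSketch
{
    // Simplified version of the WhichUtil.Which change above: a fully qualified
    // path that exists is returned as-is, otherwise fall back to a PATH search.
    static string Which(string command)
    {
        if (Path.IsPathFullyQualified(command) && File.Exists(command))
        {
            return command; // short-circuit: caller already gave us the full path
        }

        // ... otherwise scan the directories in the PATH environment variable (elided) ...
        return null;
    }

    static void Main()
    {
        Console.WriteLine(Which("/usr/bin/env") ?? "not found"); // full path, returned directly if present
        Console.WriteLine(Which("env") ?? "not found");          // would require the PATH walk (elided)
    }
}
```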

View File

@@ -1,5 +1,4 @@
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Worker.Container;
@@ -184,64 +183,12 @@ namespace GitHub.Runner.Worker
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
var configurationStore = HostContext.GetService<IConfigurationStore>();
var isHostedServer = configurationStore.GetSettings().IsHostedServer;
var allowUnsecureCommands = false;
bool.TryParse(Environment.GetEnvironmentVariable(Constants.Variables.Actions.AllowUnsupportedCommands), out allowUnsecureCommands);
// Apply environment from the env context; the env context contains job-level env and the action's env block
#if OS_WINDOWS
var envContext = context.ExpressionValues["env"] as DictionaryContextData;
#else
var envContext = context.ExpressionValues["env"] as CaseSensitiveDictionaryContextData;
#endif
if (!allowUnsecureCommands && envContext.ContainsKey(Constants.Variables.Actions.AllowUnsupportedCommands))
{
bool.TryParse(envContext[Constants.Variables.Actions.AllowUnsupportedCommands].ToString(), out allowUnsecureCommands);
}
// TODO: Eventually remove isHostedServer and apply this to dotcom customers as well
if (!isHostedServer && !allowUnsecureCommands)
{
throw new Exception(String.Format(Constants.Runner.UnsupportedCommandMessageDisabled, this.Command));
}
else if (!allowUnsecureCommands)
{
// Log Telemetry and let user know they shouldn't do this
var issue = new Issue()
{
Type = IssueType.Error,
Message = String.Format(Constants.Runner.UnsupportedCommandMessage, this.Command)
};
issue.Data[Constants.Runner.InternalTelemetryIssueDataKey] = Constants.Runner.UnsupportedCommand;
context.AddIssue(issue);
}
if (!command.Properties.TryGetValue(SetEnvCommandProperties.Name, out string envName) || string.IsNullOrEmpty(envName))
{
throw new Exception("Required field 'name' is missing in ##[set-env] command.");
}
foreach (var blocked in _setEnvBlockList)
{
if (string.Equals(blocked, envName, StringComparison.OrdinalIgnoreCase))
{
// Log Telemetry and let user know they shouldn't do this
var issue = new Issue()
{
Type = IssueType.Error,
Message = $"Can't update {blocked} environment variable using ::set-env:: command."
};
issue.Data[Constants.Runner.InternalTelemetryIssueDataKey] = $"{Constants.Runner.UnsupportedCommand}_{envName}";
context.AddIssue(issue);
return;
}
}
context.Global.EnvironmentVariables[envName] = command.Data;
context.EnvironmentVariables[envName] = command.Data;
context.SetEnvContext(envName, command.Data);
context.Debug($"{envName}='{command.Data}'");
}
@@ -250,11 +197,6 @@ namespace GitHub.Runner.Worker
{
public const String Name = "name";
}
private string[] _setEnvBlockList =
{
"NODE_OPTIONS"
};
}
public sealed class SetOutputCommandExtension : RunnerService, IActionCommandExtension
@@ -340,43 +282,9 @@ namespace GitHub.Runner.Worker
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
var configurationStore = HostContext.GetService<IConfigurationStore>();
var isHostedServer = configurationStore.GetSettings().IsHostedServer;
var allowUnsecureCommands = false;
bool.TryParse(Environment.GetEnvironmentVariable(Constants.Variables.Actions.AllowUnsupportedCommands), out allowUnsecureCommands);
// Apply environment from the env context; the env context contains job-level env and the action's env block
#if OS_WINDOWS
var envContext = context.ExpressionValues["env"] as DictionaryContextData;
#else
var envContext = context.ExpressionValues["env"] as CaseSensitiveDictionaryContextData;
#endif
if (!allowUnsecureCommands && envContext.ContainsKey(Constants.Variables.Actions.AllowUnsupportedCommands))
{
bool.TryParse(envContext[Constants.Variables.Actions.AllowUnsupportedCommands].ToString(), out allowUnsecureCommands);
}
// TODO: Eventually remove isHostedServer and apply this to dotcom customers as well
if (!isHostedServer && !allowUnsecureCommands)
{
throw new Exception(String.Format(Constants.Runner.UnsupportedCommandMessageDisabled, this.Command));
}
else if (!allowUnsecureCommands)
{
// Log Telemetry and let user know they shouldn't do this
var issue = new Issue()
{
Type = IssueType.Error,
Message = String.Format(Constants.Runner.UnsupportedCommandMessage, this.Command)
};
issue.Data[Constants.Runner.InternalTelemetryIssueDataKey] = Constants.Runner.UnsupportedCommand;
context.AddIssue(issue);
}
ArgUtil.NotNullOrEmpty(command.Data, "path");
context.Global.PrependPath.RemoveAll(x => string.Equals(x, command.Data, StringComparison.CurrentCulture));
context.Global.PrependPath.Add(command.Data);
context.PrependPath.RemoveAll(x => string.Equals(x, command.Data, StringComparison.CurrentCulture));
context.PrependPath.Add(command.Data);
}
}
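
For illustration, a standalone sketch of the ::set-env:: gating shown in the hunks above: the command only proceeds when ACTIONS_ALLOW_UNSUPPORTED_COMMANDS is truthy in the process environment or the job/action env context, and even then names on the block list (NODE_OPTIONS) are refused. The literal env var name is my assumption for what Constants.Variables.Actions.AllowUnsupportedCommands resolves to, and the GHES-throw vs. hosted-warning distinction is collapsed into a simple boolean here.

```csharp
using System;
using System.Collections.Generic;

class SetEnvGateSketch
{
    static readonly string[] SetEnvBlockList = { "NODE_OPTIONS" };

    // Returns true when ::set-env:: may update `envName`, mirroring the checks above
    // (simplified: the real code throws on GHES and only logs an error issue on the hosted service).
    static bool CanSetEnv(string envName, IDictionary<string, string> envContext)
    {
        // Opt-in flag, from the process environment or the job/action env context.
        bool.TryParse(Environment.GetEnvironmentVariable("ACTIONS_ALLOW_UNSUPPORTED_COMMANDS"),
                      out var allowUnsecureCommands);
        if (!allowUnsecureCommands &&
            envContext.TryGetValue("ACTIONS_ALLOW_UNSUPPORTED_COMMANDS", out var fromContext))
        {
            bool.TryParse(fromContext, out allowUnsecureCommands);
        }

        if (!allowUnsecureCommands)
        {
            return false;
        }

        foreach (var blocked in SetEnvBlockList)
        {
            if (string.Equals(blocked, envName, StringComparison.OrdinalIgnoreCase))
            {
                return false; // e.g. NODE_OPTIONS can never be updated through ::set-env::
            }
        }
        return true;
    }

    static void Main()
    {
        var env = new Dictionary<string, string> { ["ACTIONS_ALLOW_UNSUPPORTED_COMMANDS"] = "true" };
        Console.WriteLine(CanSetEnv("MY_VAR", env));        // True
        Console.WriteLine(CanSetEnv("NODE_OPTIONS", env));  // False
    }
}
```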

View File

@@ -14,7 +14,6 @@ using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker.Container;
using GitHub.Services.Common;
using WebApi = GitHub.DistributedTask.WebApi;
using Pipelines = GitHub.DistributedTask.Pipelines;
using PipelineTemplateConstants = GitHub.DistributedTask.Pipelines.ObjectTemplating.PipelineTemplateConstants;
@@ -66,7 +65,7 @@ namespace GitHub.Runner.Worker
// TODO: Deprecate the PREVIEW_ACTION_TOKEN
// Log even if we aren't using it to ensure users know.
if (!string.IsNullOrEmpty(executionContext.Global.Variables.Get("PREVIEW_ACTION_TOKEN")))
if (!string.IsNullOrEmpty(executionContext.Variables.Get("PREVIEW_ACTION_TOKEN")))
{
executionContext.Warning("The 'PREVIEW_ACTION_TOKEN' secret is deprecated. Please remove it from the repository's secrets");
}
@@ -75,7 +74,7 @@ namespace GitHub.Runner.Worker
IOUtil.DeleteDirectory(HostContext.GetDirectory(WellKnownDirectory.Actions), executionContext.CancellationToken);
// todo: Remove when feature flag DistributedTask.NewActionMetadata is removed
var newActionMetadata = executionContext.Global.Variables.GetBoolean("DistributedTask.NewActionMetadata") ?? false;
var newActionMetadata = executionContext.Variables.GetBoolean("DistributedTask.NewActionMetadata") ?? false;
var repositoryActions = new List<Pipelines.ActionStep>();
@@ -395,14 +394,6 @@ namespace GitHub.Runner.Worker
Trace.Info($"Action cleanup plugin: {plugin.PluginTypeName}.");
}
}
else if (definition.Data.Execution.ExecutionType == ActionExecutionType.Composite)
{
var compositeAction = definition.Data.Execution as CompositeActionExecutionData;
Trace.Info($"Load {compositeAction.Steps?.Count ?? 0} action steps.");
Trace.Verbose($"Details: {StringUtil.ConvertToJson(compositeAction?.Steps)}");
Trace.Info($"Load: {compositeAction.Outputs?.Count ?? 0} number of outputs");
Trace.Info($"Details: {StringUtil.ConvertToJson(compositeAction?.Outputs)}");
}
else
{
throw new NotSupportedException(definition.Data.Execution.ExecutionType.ToString());
@@ -468,7 +459,7 @@ namespace GitHub.Runner.Worker
ArgUtil.NotNull(setupInfo, nameof(setupInfo));
ArgUtil.NotNullOrEmpty(setupInfo.Container.Image, nameof(setupInfo.Container.Image));
executionContext.Output($"##[group]Pull down action image '{setupInfo.Container.Image}'");
executionContext.Output($"Pull down action image '{setupInfo.Container.Image}'");
// Pull down docker image with retry up to 3 times
var dockerManger = HostContext.GetService<IDockerCommandManager>();
@@ -492,7 +483,6 @@ namespace GitHub.Runner.Worker
}
}
}
executionContext.Output("##[endgroup]");
if (retryCount == 3 && pullExitCode != 0)
{
@@ -512,7 +502,7 @@ namespace GitHub.Runner.Worker
ArgUtil.NotNull(setupInfo, nameof(setupInfo));
ArgUtil.NotNullOrEmpty(setupInfo.Container.Dockerfile, nameof(setupInfo.Container.Dockerfile));
executionContext.Output($"##[group]Build container for action use: '{setupInfo.Container.Dockerfile}'.");
executionContext.Output($"Build container for action use: '{setupInfo.Container.Dockerfile}'.");
// Build docker image with retry up to 3 times
var dockerManger = HostContext.GetService<IDockerCommandManager>();
@@ -542,7 +532,6 @@ namespace GitHub.Runner.Worker
}
}
}
executionContext.Output("##[endgroup]");
if (retryCount == 3 && buildExitCode != 0)
{
@@ -556,77 +545,124 @@ namespace GitHub.Runner.Worker
}
}
// This implementation is temporary and will be replaced with a REST API call to the service to resolve the download info
private async Task<IDictionary<string, WebApi.ActionDownloadInfo>> GetDownloadInfoAsync(IExecutionContext executionContext, List<Pipelines.ActionStep> actions)
// This implementation is temporary and will be removed when we switch to a REST API call to the service to resolve the download info
private async Task<bool> RepoExistsAsync(IExecutionContext executionContext, Pipelines.RepositoryPathReference repositoryReference, string authorization)
{
executionContext.Output("Getting action download info");
// Convert to action reference
var actionReferences = actions
.GroupBy(x => GetDownloadInfoLookupKey(x))
.Where(x => !string.IsNullOrEmpty(x.Key))
.Select(x =>
{
var action = x.First();
var repositoryReference = action.Reference as Pipelines.RepositoryPathReference;
ArgUtil.NotNull(repositoryReference, nameof(repositoryReference));
return new WebApi.ActionReference
{
NameWithOwner = repositoryReference.Name,
Ref = repositoryReference.Ref,
};
})
.ToList();
// Nothing to resolve?
if (actionReferences.Count == 0)
{
return new Dictionary<string, WebApi.ActionDownloadInfo>();
}
// Resolve download info
var jobServer = HostContext.GetService<IJobServer>();
var actionDownloadInfos = default(WebApi.ActionDownloadInfoCollection);
var apiUrl = GetApiUrl(executionContext);
var repoUrl = $"{apiUrl}/repos/{repositoryReference.Name}";
for (var attempt = 1; attempt <= 3; attempt++)
{
executionContext.Debug($"Checking whether repo exists: {repoUrl}");
try
{
actionDownloadInfos = await jobServer.ResolveActionDownloadInfoAsync(executionContext.Global.Plan.ScopeIdentifier, executionContext.Global.Plan.PlanType, executionContext.Global.Plan.PlanId, new WebApi.ActionReferenceList { Actions = actionReferences }, executionContext.CancellationToken);
break;
}
catch (Exception ex) when (attempt < 3)
{
executionContext.Output($"Failed to resolve action download info. Error: {ex.Message}");
executionContext.Debug(ex.ToString());
if (String.IsNullOrEmpty(Environment.GetEnvironmentVariable("_GITHUB_ACTION_DOWNLOAD_NO_BACKOFF")))
using (var httpClientHandler = HostContext.CreateHttpClientHandler())
using (var httpClient = new HttpClient(httpClientHandler))
{
var backoff = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromSeconds(10), TimeSpan.FromSeconds(30));
executionContext.Output($"Retrying in {backoff.TotalSeconds} seconds");
await Task.Delay(backoff);
httpClient.DefaultRequestHeaders.Authorization = CreateAuthHeader(authorization);
httpClient.DefaultRequestHeaders.UserAgent.AddRange(HostContext.UserAgents);
using (var response = await httpClient.GetAsync(repoUrl))
{
if (response.IsSuccessStatusCode)
{
return true;
}
else if (response.StatusCode == HttpStatusCode.NotFound)
{
return false;
}
else
{
// Throw
response.EnsureSuccessStatusCode();
}
}
}
}
catch (Exception ex)
{
if (attempt < 3)
{
executionContext.Debug($"Failed checking whether repo '{repositoryReference.Name}' exists: {ex.Message}");
}
else
{
executionContext.Error($"Failed checking whether repo '{repositoryReference.Name}' exists: {ex.Message}");
throw;
}
}
}
ArgUtil.NotNull(actionDownloadInfos, nameof(actionDownloadInfos));
ArgUtil.NotNull(actionDownloadInfos.Actions, nameof(actionDownloadInfos.Actions));
var apiUrl = GetApiUrl(executionContext);
var defaultAccessToken = executionContext.GetGitHubContext("token");
return false; // Never reaches here
}
// This implementation is temporary and will be replaced with a REST API call to the service to resolve the download info
private async Task<Dictionary<string, ActionDownloadInfo>> GetDownloadInfoAsync(IExecutionContext executionContext, List<Pipelines.ActionStep> actions)
{
var result = new Dictionary<string, ActionDownloadInfo>(StringComparer.OrdinalIgnoreCase);
var configurationStore = HostContext.GetService<IConfigurationStore>();
var runnerSettings = configurationStore.GetSettings();
var apiUrl = GetApiUrl(executionContext);
var accessToken = executionContext.GetGitHubContext("token");
var base64EncodingToken = Convert.ToBase64String(Encoding.UTF8.GetBytes($"x-access-token:{accessToken}"));
var authorization = $"Basic {base64EncodingToken}";
foreach (var actionDownloadInfo in actionDownloadInfos.Actions.Values)
foreach (var action in actions)
{
// Add secret
HostContext.SecretMasker.AddValue(actionDownloadInfo.Authentication?.Token);
// Default auth token
if (string.IsNullOrEmpty(actionDownloadInfo.Authentication?.Token))
var lookupKey = GetDownloadInfoLookupKey(action);
if (string.IsNullOrEmpty(lookupKey) || result.ContainsKey(lookupKey))
{
actionDownloadInfo.Authentication = new WebApi.ActionDownloadAuthentication { Token = defaultAccessToken };
continue;
}
var repositoryReference = action.Reference as Pipelines.RepositoryPathReference;
ArgUtil.NotNull(repositoryReference, nameof(repositoryReference));
var downloadInfo = default(ActionDownloadInfo);
if (runnerSettings.IsHostedServer)
{
downloadInfo = new ActionDownloadInfo
{
NameWithOwner = repositoryReference.Name,
Ref = repositoryReference.Ref,
ArchiveLink = BuildLinkToActionArchive(apiUrl, repositoryReference.Name, repositoryReference.Ref),
Authorization = authorization,
};
}
// Test whether the repo exists in the instance
else if (await RepoExistsAsync(executionContext, repositoryReference, authorization))
{
downloadInfo = new ActionDownloadInfo
{
NameWithOwner = repositoryReference.Name,
Ref = repositoryReference.Ref,
ArchiveLink = BuildLinkToActionArchive(apiUrl, repositoryReference.Name, repositoryReference.Ref),
Authorization = authorization,
};
}
// Fallback to dotcom
else
{
downloadInfo = new ActionDownloadInfo
{
NameWithOwner = repositoryReference.Name,
Ref = repositoryReference.Ref,
ArchiveLink = BuildLinkToActionArchive(_dotcomApiUrl, repositoryReference.Name, repositoryReference.Ref),
Authorization = null,
};
}
result.Add(lookupKey, downloadInfo);
}
return actionDownloadInfos.Actions;
// Register secrets
foreach (var downloadInfo in result.Values)
{
HostContext.SecretMasker.AddValue(downloadInfo.Authorization);
}
return result;
}
// todo: Remove when feature flag DistributedTask.NewActionMetadata is removed
@@ -673,6 +709,7 @@ namespace GitHub.Runner.Worker
{
string apiUrl = GetApiUrl(executionContext);
string archiveLink = BuildLinkToActionArchive(apiUrl, repositoryReference.Name, repositoryReference.Ref);
Trace.Info($"Download archive '{archiveLink}' to '{destDirectory}'.");
var downloadDetails = new ActionDownloadDetails(archiveLink, ConfigureAuthorizationFromContext);
await DownloadRepositoryActionAsync(executionContext, downloadDetails, null, destDirectory);
return;
@@ -698,6 +735,7 @@ namespace GitHub.Runner.Worker
foreach (var downloadAttempt in downloadAttempts)
{
Trace.Info($"Download archive '{downloadAttempt.ArchiveLink}' to '{destDirectory}'.");
try
{
await DownloadRepositoryActionAsync(executionContext, downloadAttempt, null, destDirectory);
@@ -713,7 +751,7 @@ namespace GitHub.Runner.Worker
}
}
private async Task DownloadRepositoryActionAsync(IExecutionContext executionContext, WebApi.ActionDownloadInfo downloadInfo)
private async Task DownloadRepositoryActionAsync(IExecutionContext executionContext, ActionDownloadInfo downloadInfo)
{
Trace.Entering();
ArgUtil.NotNull(executionContext, nameof(executionContext));
@@ -736,6 +774,7 @@ namespace GitHub.Runner.Worker
executionContext.Output($"Download action repository '{downloadInfo.NameWithOwner}@{downloadInfo.Ref}'");
}
Trace.Info($"Download archive '{downloadInfo.ArchiveLink}' to '{destDirectory}'.");
await DownloadRepositoryActionAsync(executionContext, null, downloadInfo, destDirectory);
}
@@ -760,7 +799,7 @@ namespace GitHub.Runner.Worker
}
// todo: Remove the parameter "actionDownloadDetails" when feature flag DistributedTask.NewActionMetadata is removed
private async Task DownloadRepositoryActionAsync(IExecutionContext executionContext, ActionDownloadDetails actionDownloadDetails, WebApi.ActionDownloadInfo downloadInfo, string destDirectory)
private async Task DownloadRepositoryActionAsync(IExecutionContext executionContext, ActionDownloadDetails actionDownloadDetails, ActionDownloadInfo downloadInfo, string destDirectory)
{
// Download and extract the action into a temp folder, and rename it on success
string tempDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Actions), "_temp_" + Guid.NewGuid());
@@ -768,12 +807,11 @@ namespace GitHub.Runner.Worker
#if OS_WINDOWS
string archiveFile = Path.Combine(tempDirectory, $"{Guid.NewGuid()}.zip");
string link = downloadInfo?.ZipballUrl ?? actionDownloadDetails.ArchiveLink;
#else
string archiveFile = Path.Combine(tempDirectory, $"{Guid.NewGuid()}.tar.gz");
string link = downloadInfo?.TarballUrl ?? actionDownloadDetails.ArchiveLink;
#endif
string link = downloadInfo != null ? downloadInfo.ArchiveLink : actionDownloadDetails.ArchiveLink;
Trace.Info($"Save archive '{link}' into {archiveFile}.");
try
{
@@ -801,7 +839,7 @@ namespace GitHub.Runner.Worker
// FF DistributedTask.NewActionMetadata
else
{
httpClient.DefaultRequestHeaders.Authorization = CreateAuthHeader(downloadInfo.Authentication?.Token);
httpClient.DefaultRequestHeaders.Authorization = CreateAuthHeader(downloadInfo.Authorization);
}
httpClient.DefaultRequestHeaders.UserAgent.AddRange(HostContext.UserAgents);
@@ -949,7 +987,7 @@ namespace GitHub.Runner.Worker
if (string.IsNullOrEmpty(authToken))
{
// TODO: Deprecate the PREVIEW_ACTION_TOKEN
authToken = executionContext.Global.Variables.Get("PREVIEW_ACTION_TOKEN");
authToken = executionContext.Variables.Get("PREVIEW_ACTION_TOKEN");
}
if (!string.IsNullOrEmpty(authToken))
@@ -1048,11 +1086,6 @@ namespace GitHub.Runner.Worker
Trace.Info($"Action plugin: {(actionDefinitionData.Execution as PluginActionExecutionData).Plugin}, no more preparation.");
return null;
}
else if (actionDefinitionData.Execution.ExecutionType == ActionExecutionType.Composite)
{
Trace.Info($"Action composite: {(actionDefinitionData.Execution as CompositeActionExecutionData).Steps}, no more preparation.");
return null;
}
else
{
throw new NotSupportedException(actionDefinitionData.Execution.ExecutionType.ToString());
@@ -1104,23 +1137,20 @@ namespace GitHub.Runner.Worker
return $"{repositoryReference.Name}@{repositoryReference.Ref}";
}
private static string GetDownloadInfoLookupKey(WebApi.ActionDownloadInfo info)
private static AuthenticationHeaderValue CreateAuthHeader(string authorization)
{
ArgUtil.NotNullOrEmpty(info.NameWithOwner, nameof(info.NameWithOwner));
ArgUtil.NotNullOrEmpty(info.Ref, nameof(info.Ref));
return $"{info.NameWithOwner}@{info.Ref}";
}
private AuthenticationHeaderValue CreateAuthHeader(string token)
{
if (string.IsNullOrEmpty(token))
if (string.IsNullOrEmpty(authorization))
{
return null;
}
var base64EncodingToken = Convert.ToBase64String(Encoding.UTF8.GetBytes($"x-access-token:{token}"));
HostContext.SecretMasker.AddValue(base64EncodingToken);
return new AuthenticationHeaderValue("Basic", base64EncodingToken);
var split = authorization.Split(new char[] { ' ' }, 2);
if (split.Length != 2 || string.IsNullOrWhiteSpace(split[0]) || string.IsNullOrWhiteSpace(split[1]))
{
throw new Exception("Unexpected authorization header format");
}
return new AuthenticationHeaderValue(split[0].Trim(), split[1].Trim());
}
// todo: Remove when feature flag DistributedTask.NewActionMetadata is removed
@@ -1136,6 +1166,17 @@ namespace GitHub.Runner.Worker
ConfigureAuthorization = configureAuthorization;
}
}
private class ActionDownloadInfo
{
public string NameWithOwner { get; set; }
public string Ref { get; set; }
public string ArchiveLink { get; set; }
public string Authorization { get; set; }
}
}
public sealed class Definition
@@ -1163,7 +1204,6 @@ namespace GitHub.Runner.Worker
NodeJS,
Plugin,
Script,
Composite,
}
public sealed class ContainerActionExecutionData : ActionExecutionData
@@ -1220,15 +1260,6 @@ namespace GitHub.Runner.Worker
public override bool HasPost => false;
}
public sealed class CompositeActionExecutionData : ActionExecutionData
{
public override ActionExecutionType ExecutionType => ActionExecutionType.Composite;
public override bool HasPre => false;
public override bool HasPost => false;
public List<Pipelines.ActionStep> Steps { get; set; }
public MappingToken Outputs { get; set; }
}
public abstract class ActionExecutionData
{
private string _initCondition = $"{Constants.Expressions.Always}()";
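
For illustration, a standalone sketch of the download-info fallback order in GetDownloadInfoAsync above: the hosted service always uses its own API with the job token as basic auth; a GHES instance first checks whether the repo exists locally and otherwise falls back to the dotcom API with no authorization. The archive URL shape and type names below are simplified stand-ins.

```csharp
using System;

class ActionDownloadSketch
{
    record DownloadInfo(string NameWithOwner, string Ref, string ArchiveLink, string Authorization);

    const string DotcomApiUrl = "https://api.github.com";

    static string ArchiveLink(string apiUrl, string nameWithOwner, string @ref)
        => $"{apiUrl}/repos/{nameWithOwner}/tarball/{@ref}"; // illustrative shape only

    // Mirrors the three branches in GetDownloadInfoAsync above.
    static DownloadInfo Resolve(bool isHostedServer, bool repoExistsOnInstance,
                                string instanceApiUrl, string nameWithOwner, string @ref,
                                string authorization)
    {
        if (isHostedServer || repoExistsOnInstance)
        {
            // Use the instance's own API and pass the job token as basic auth.
            return new DownloadInfo(nameWithOwner, @ref,
                                    ArchiveLink(instanceApiUrl, nameWithOwner, @ref),
                                    authorization);
        }

        // Fallback to dotcom: no authorization header is sent.
        return new DownloadInfo(nameWithOwner, @ref,
                                ArchiveLink(DotcomApiUrl, nameWithOwner, @ref),
                                Authorization: null);
    }

    static void Main()
    {
        var ghes = Resolve(isHostedServer: false, repoExistsOnInstance: false,
                           "https://ghes.example.com/api/v3", "actions/checkout", "v2",
                           "Basic <token>");
        Console.WriteLine(ghes.ArchiveLink);           // falls back to https://api.github.com/...
        Console.WriteLine(ghes.Authorization is null); // True
    }
}
```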

View File

@@ -14,7 +14,6 @@ using YamlDotNet.Core;
using YamlDotNet.Core.Events;
using System.Globalization;
using System.Linq;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker
{
@@ -23,8 +22,6 @@ namespace GitHub.Runner.Worker
{
ActionDefinitionData Load(IExecutionContext executionContext, string manifestFile);
DictionaryContextData EvaluateCompositeOutputs(IExecutionContext executionContext, TemplateToken token, IDictionary<string, PipelineContextData> extraExpressionValues);
List<string> EvaluateContainerArguments(IExecutionContext executionContext, SequenceToken token, IDictionary<string, PipelineContextData> extraExpressionValues);
Dictionary<string, string> EvaluateContainerEnvironment(IExecutionContext executionContext, MappingToken token, IDictionary<string, PipelineContextData> extraExpressionValues);
@@ -35,6 +32,8 @@ namespace GitHub.Runner.Worker
public sealed class ActionManifestManager : RunnerService, IActionManifestManager
{
private TemplateSchema _actionManifestSchema;
private IReadOnlyList<String> _fileTable;
public override void Initialize(IHostContext hostContext)
{
base.Initialize(hostContext);
@@ -55,45 +54,25 @@ namespace GitHub.Runner.Worker
public ActionDefinitionData Load(IExecutionContext executionContext, string manifestFile)
{
var templateContext = CreateTemplateContext(executionContext);
var context = CreateContext(executionContext);
ActionDefinitionData actionDefinition = new ActionDefinitionData();
// Clean up the file name
// Instead of using a Regex, which can be computationally expensive,
// we can simply strip the basePath prefix (plus the trailing separator) from the fileName
string basePath = HostContext.GetDirectory(WellKnownDirectory.Actions);
string fileRelativePath = manifestFile;
if (manifestFile.Contains(basePath))
{
fileRelativePath = manifestFile.Remove(0, basePath.Length + 1);
}
try
{
var token = default(TemplateToken);
// Get the file ID
var fileId = templateContext.GetFileId(fileRelativePath);
// Add this file to the FileTable in executionContext if it hasn't been added already
// we use > since fileID is 1 indexed
if (fileId > executionContext.Global.FileTable.Count)
{
executionContext.Global.FileTable.Add(fileRelativePath);
}
var fileId = context.GetFileId(manifestFile);
_fileTable = context.GetFileTable();
// Read the file
var fileContent = File.ReadAllText(manifestFile);
using (var stringReader = new StringReader(fileContent))
{
var yamlObjectReader = new YamlObjectReader(fileId, stringReader);
token = TemplateReader.Read(templateContext, "action-root", yamlObjectReader, fileId, out _);
var yamlObjectReader = new YamlObjectReader(null, stringReader);
token = TemplateReader.Read(context, "action-root", yamlObjectReader, fileId, out _);
}
var actionMapping = token.AssertMapping("action manifest root");
var actionOutputs = default(MappingToken);
var actionRunValueToken = default(TemplateToken);
foreach (var actionPair in actionMapping)
{
var propertyName = actionPair.Key.AssertString($"action.yml property key");
@@ -104,56 +83,44 @@ namespace GitHub.Runner.Worker
actionDefinition.Name = actionPair.Value.AssertString("name").Value;
break;
case "outputs":
actionOutputs = actionPair.Value.AssertMapping("outputs");
break;
case "description":
actionDefinition.Description = actionPair.Value.AssertString("description").Value;
break;
case "inputs":
ConvertInputs(actionPair.Value, actionDefinition);
ConvertInputs(context, actionPair.Value, actionDefinition);
break;
case "runs":
// Defer evaluation of the runs token until after the loop so that the order of the outputs section in the manifest doesn't matter.
actionRunValueToken = actionPair.Value;
actionDefinition.Execution = ConvertRuns(context, actionPair.Value);
break;
default:
Trace.Info($"Ignore action property {propertyName}.");
break;
}
}
// Evaluate Runs Last
if (actionRunValueToken != null)
{
actionDefinition.Execution = ConvertRuns(executionContext, templateContext, actionRunValueToken, fileRelativePath, actionOutputs);
}
}
catch (Exception ex)
{
Trace.Error(ex);
templateContext.Errors.Add(ex);
context.Errors.Add(ex);
}
if (templateContext.Errors.Count > 0)
if (context.Errors.Count > 0)
{
foreach (var error in templateContext.Errors)
foreach (var error in context.Errors)
{
Trace.Error($"Action.yml load error: {error.Message}");
executionContext.Error(error.Message);
}
throw new ArgumentException($"Fail to load {fileRelativePath}");
throw new ArgumentException($"Fail to load {manifestFile}");
}
if (actionDefinition.Execution == null)
{
executionContext.Debug($"Loaded action.yml file: {StringUtil.ConvertToJson(actionDefinition)}");
throw new ArgumentException($"Top level 'runs:' section is required for {fileRelativePath}");
throw new ArgumentException($"Top level 'runs:' section is required for {manifestFile}");
}
else
{
@@ -163,33 +130,6 @@ namespace GitHub.Runner.Worker
return actionDefinition;
}
public DictionaryContextData EvaluateCompositeOutputs(
IExecutionContext executionContext,
TemplateToken token,
IDictionary<string, PipelineContextData> extraExpressionValues)
{
var result = default(DictionaryContextData);
if (token != null)
{
var templateContext = CreateTemplateContext(executionContext, extraExpressionValues);
try
{
token = TemplateEvaluator.Evaluate(templateContext, "outputs", token, 0, null, omitHeader: true);
templateContext.Errors.Check();
result = token.ToContextData().AssertDictionary("composite outputs");
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
templateContext.Errors.Add(ex);
}
templateContext.Errors.Check();
}
return result ?? new DictionaryContextData();
}
public List<string> EvaluateContainerArguments(
IExecutionContext executionContext,
SequenceToken token,
@@ -199,11 +139,11 @@ namespace GitHub.Runner.Worker
if (token != null)
{
var templateContext = CreateTemplateContext(executionContext, extraExpressionValues);
var context = CreateContext(executionContext, extraExpressionValues);
try
{
var evaluateResult = TemplateEvaluator.Evaluate(templateContext, "container-runs-args", token, 0, null, omitHeader: true);
templateContext.Errors.Check();
var evaluateResult = TemplateEvaluator.Evaluate(context, "container-runs-args", token, 0, null, omitHeader: true);
context.Errors.Check();
Trace.Info($"Arguments evaluate result: {StringUtil.ConvertToJson(evaluateResult)}");
@@ -220,10 +160,10 @@ namespace GitHub.Runner.Worker
catch (Exception ex) when (!(ex is TemplateValidationException))
{
Trace.Error(ex);
templateContext.Errors.Add(ex);
context.Errors.Add(ex);
}
templateContext.Errors.Check();
context.Errors.Check();
}
return result;
@@ -238,11 +178,11 @@ namespace GitHub.Runner.Worker
if (token != null)
{
var templateContext = CreateTemplateContext(executionContext, extraExpressionValues);
var context = CreateContext(executionContext, extraExpressionValues);
try
{
var evaluateResult = TemplateEvaluator.Evaluate(templateContext, "container-runs-env", token, 0, null, omitHeader: true);
templateContext.Errors.Check();
var evaluateResult = TemplateEvaluator.Evaluate(context, "container-runs-env", token, 0, null, omitHeader: true);
context.Errors.Check();
Trace.Info($"Environments evaluate result: {StringUtil.ConvertToJson(evaluateResult)}");
@@ -264,10 +204,10 @@ namespace GitHub.Runner.Worker
catch (Exception ex) when (!(ex is TemplateValidationException))
{
Trace.Error(ex);
templateContext.Errors.Add(ex);
context.Errors.Add(ex);
}
templateContext.Errors.Check();
context.Errors.Check();
}
return result;
@@ -281,11 +221,11 @@ namespace GitHub.Runner.Worker
string result = "";
if (token != null)
{
var templateContext = CreateTemplateContext(executionContext);
var context = CreateContext(executionContext);
try
{
var evaluateResult = TemplateEvaluator.Evaluate(templateContext, "input-default-context", token, 0, null, omitHeader: true);
templateContext.Errors.Check();
var evaluateResult = TemplateEvaluator.Evaluate(context, "input-default-context", token, 0, null, omitHeader: true);
context.Errors.Check();
Trace.Info($"Input '{inputName}': default value evaluate result: {StringUtil.ConvertToJson(evaluateResult)}");
@@ -295,16 +235,16 @@ namespace GitHub.Runner.Worker
catch (Exception ex) when (!(ex is TemplateValidationException))
{
Trace.Error(ex);
templateContext.Errors.Add(ex);
context.Errors.Add(ex);
}
templateContext.Errors.Check();
context.Errors.Check();
}
return result;
}
private TemplateContext CreateTemplateContext(
private TemplateContext CreateContext(
IExecutionContext executionContext,
IDictionary<string, PipelineContextData> extraExpressionValues = null)
{
@@ -341,21 +281,21 @@ namespace GitHub.Runner.Worker
result.ExpressionFunctions.Add(item);
}
// Add the file table from the Execution Context
for (var i = 0; i < executionContext.Global.FileTable.Count; i++)
// Add the file table
if (_fileTable?.Count > 0)
{
result.GetFileId(executionContext.Global.FileTable[i]);
for (var i = 0 ; i < _fileTable.Count ; i++)
{
result.GetFileId(_fileTable[i]);
}
}
return result;
}
private ActionExecutionData ConvertRuns(
IExecutionContext executionContext,
TemplateContext templateContext,
TemplateToken inputsToken,
String fileRelativePath,
MappingToken outputs = null)
TemplateContext context,
TemplateToken inputsToken)
{
var runsMapping = inputsToken.AssertMapping("runs");
var usingToken = default(StringToken);
@@ -371,8 +311,6 @@ namespace GitHub.Runner.Worker
var postToken = default(StringToken);
var postEntrypointToken = default(StringToken);
var postIfToken = default(StringToken);
var steps = default(List<Pipelines.Step>);
foreach (var run in runsMapping)
{
var runsKey = run.Key.AssertString("runs key").Value;
@@ -417,11 +355,6 @@ namespace GitHub.Runner.Worker
case "pre-if":
preIfToken = run.Value.AssertString("pre-if");
break;
case "steps":
var stepsToken = run.Value.AssertSequence("steps");
steps = PipelineTemplateConverter.ConvertToSteps(templateContext, stepsToken);
templateContext.Errors.Check();
break;
default:
Trace.Info($"Ignore run property {runsKey}.");
break;
@@ -434,7 +367,7 @@ namespace GitHub.Runner.Worker
{
if (string.IsNullOrEmpty(imageToken?.Value))
{
throw new ArgumentNullException($"You are using a Container Action but an image is not provided in {fileRelativePath}.");
throw new ArgumentNullException($"Image is not provided.");
}
else
{
@@ -455,7 +388,7 @@ namespace GitHub.Runner.Worker
{
if (string.IsNullOrEmpty(mainToken?.Value))
{
throw new ArgumentNullException($"You are using a JavaScript Action but there is not an entry JavaScript file provided in {fileRelativePath}.");
throw new ArgumentNullException($"Entry javascript file is not provided.");
}
else
{
@@ -469,21 +402,6 @@ namespace GitHub.Runner.Worker
};
}
}
else if (string.Equals(usingToken.Value, "composite", StringComparison.OrdinalIgnoreCase))
{
if (steps == null)
{
throw new ArgumentNullException($"You are using a composite action but there are no steps provided in {fileRelativePath}.");
}
else
{
return new CompositeActionExecutionData()
{
Steps = steps.Cast<Pipelines.ActionStep>().ToList(),
Outputs = outputs
};
}
}
else
{
throw new ArgumentOutOfRangeException($"'using: {usingToken.Value}' is not supported, use 'docker' or 'node12' instead.");
@@ -501,6 +419,7 @@ namespace GitHub.Runner.Worker
}
private void ConvertInputs(
TemplateContext context,
TemplateToken inputsToken,
ActionDefinitionData actionDefinition)
{
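
A small sketch of the manifest bookkeeping at the top of Load above: the manifest path is made relative to the actions directory, and the file is registered in the shared file table only once, exploiting the fact that file IDs are 1-indexed. The helpers below are invented stand-ins for TemplateContext.GetFileId plus Global.FileTable, not the runner's API.

```csharp
using System;
using System.Collections.Generic;

class FileTableSketch
{
    // Trim the actions base path from a manifest path, as Load does above.
    static string RelativePath(string manifestFile, string basePath)
        => manifestFile.StartsWith(basePath, StringComparison.Ordinal)
            ? manifestFile.Substring(basePath.Length + 1)
            : manifestFile;

    // Register the file in the shared table only if its 1-indexed file ID is new.
    static int GetOrAddFileId(List<string> fileTable, string fileRelativePath)
    {
        var fileId = fileTable.IndexOf(fileRelativePath) + 1; // 0 -> not found
        if (fileId == 0)
        {
            fileTable.Add(fileRelativePath);
            fileId = fileTable.Count;
        }
        return fileId;
    }

    static void Main()
    {
        var fileTable = new List<string>();
        var rel = RelativePath("/runner/_work/_actions/user/composite/v1/action.yml",
                               "/runner/_work/_actions");
        Console.WriteLine(rel);                            // user/composite/v1/action.yml
        Console.WriteLine(GetOrAddFileId(fileTable, rel)); // 1
        Console.WriteLine(GetOrAddFileId(fileTable, rel)); // 1 (not added twice)
    }
}
```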

View File

@@ -136,19 +136,15 @@ namespace GitHub.Runner.Worker
}
// Setup container stephost for running inside the container.
if (ExecutionContext.Global.Container != null)
if (ExecutionContext.Container != null)
{
// Make sure required container is already created.
ArgUtil.NotNullOrEmpty(ExecutionContext.Global.Container.ContainerId, nameof(ExecutionContext.Global.Container.ContainerId));
ArgUtil.NotNullOrEmpty(ExecutionContext.Container.ContainerId, nameof(ExecutionContext.Container.ContainerId));
var containerStepHost = HostContext.CreateService<IContainerStepHost>();
containerStepHost.Container = ExecutionContext.Global.Container;
containerStepHost.Container = ExecutionContext.Container;
stepHost = containerStepHost;
}
// Setup File Command Manager
var fileCommandManager = HostContext.CreateService<IFileCommandManager>();
fileCommandManager.InitializeFiles(ExecutionContext, null);
// Load the inputs.
ExecutionContext.Debug("Loading inputs");
var templateEvaluator = ExecutionContext.ToPipelineTemplateEvaluator();
@@ -235,22 +231,14 @@ namespace GitHub.Runner.Worker
handlerData,
inputs,
environment,
ExecutionContext.Global.Variables,
ExecutionContext.Variables,
actionDirectory: definition.Directory);
// Print out action details
handler.PrintActionDetails(Stage);
// Run the task.
try
{
await handler.RunAsync(Stage);
}
finally
{
fileCommandManager.ProcessFiles(ExecutionContext, ExecutionContext.Global.Container);
}
await handler.RunAsync(Stage);
}
public bool TryEvaluateDisplayName(DictionaryContextData contextData, IExecutionContext context)

View File

@@ -34,9 +34,6 @@ namespace GitHub.Runner.Worker.Container
_environmentVariables = container.Environment;
this.IsJobContainer = isJobContainer;
this.ContainerNetworkAlias = networkAlias;
this.RegistryAuthUsername = container.Credentials?.Username;
this.RegistryAuthPassword = container.Credentials?.Password;
this.RegistryServer = DockerUtil.ParseRegistryHostnameFromImageName(this.ContainerImage);
#if OS_WINDOWS
_pathMappings.Add(new PathMapping(hostContext.GetDirectory(WellKnownDirectory.Work), "C:\\__w"));
@@ -82,9 +79,6 @@ namespace GitHub.Runner.Worker.Container
public string ContainerWorkDirectory { get; set; }
public string ContainerCreateOptions { get; private set; }
public string ContainerRuntimePath { get; set; }
public string RegistryServer { get; set; }
public string RegistryAuthUsername { get; set; }
public string RegistryAuthPassword { get; set; }
public bool IsJobContainer { get; set; }
public IDictionary<string, string> ContainerEnvironmentVariables

View File

@@ -4,7 +4,6 @@ using System.IO;
using System.Linq;
using System.Text.RegularExpressions;
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
@@ -18,7 +17,6 @@ namespace GitHub.Runner.Worker.Container
string DockerInstanceLabel { get; }
Task<DockerVersion> DockerVersion(IExecutionContext context);
Task<int> DockerPull(IExecutionContext context, string image);
Task<int> DockerPull(IExecutionContext context, string image, string configFileDirectory);
Task<int> DockerBuild(IExecutionContext context, string workingDirectory, string dockerFile, string dockerContext, string tag);
Task<string> DockerCreate(IExecutionContext context, ContainerInfo container);
Task<int> DockerRun(IExecutionContext context, ContainerInfo container, EventHandler<ProcessDataReceivedEventArgs> stdoutDataReceived, EventHandler<ProcessDataReceivedEventArgs> stderrDataReceived);
@@ -33,7 +31,6 @@ namespace GitHub.Runner.Worker.Container
Task<int> DockerExec(IExecutionContext context, string containerId, string options, string command, List<string> outputs);
Task<List<string>> DockerInspect(IExecutionContext context, string dockerObject, string options);
Task<List<PortMapping>> DockerPort(IExecutionContext context, string containerId);
Task<int> DockerLogin(IExecutionContext context, string configFileDirectory, string registry, string username, string password);
}
public class DockerCommandManager : RunnerService, IDockerCommandManager
@@ -85,18 +82,9 @@ namespace GitHub.Runner.Worker.Container
return new DockerVersion(serverVersion, clientVersion);
}
public Task<int> DockerPull(IExecutionContext context, string image)
public async Task<int> DockerPull(IExecutionContext context, string image)
{
return DockerPull(context, image, null);
}
public async Task<int> DockerPull(IExecutionContext context, string image, string configFileDirectory)
{
if (string.IsNullOrEmpty(configFileDirectory))
{
return await ExecuteDockerCommandAsync(context, $"pull", image, context.CancellationToken);
}
return await ExecuteDockerCommandAsync(context, $"--config {configFileDirectory} pull", image, context.CancellationToken);
return await ExecuteDockerCommandAsync(context, "pull", image, context.CancellationToken);
}
public async Task<int> DockerBuild(IExecutionContext context, string workingDirectory, string dockerFile, string dockerContext, string tag)
@@ -358,28 +346,6 @@ namespace GitHub.Runner.Worker.Container
return DockerUtil.ParseDockerPort(portMappingLines);
}
public Task<int> DockerLogin(IExecutionContext context, string configFileDirectory, string registry, string username, string password)
{
string args = $"--config {configFileDirectory} login {registry} -u {username} --password-stdin";
context.Command($"{DockerPath} {args}");
var input = Channel.CreateBounded<string>(new BoundedChannelOptions(1) { SingleReader = true, SingleWriter = true });
input.Writer.TryWrite(password);
var processInvoker = HostContext.CreateService<IProcessInvoker>();
return processInvoker.ExecuteAsync(
workingDirectory: context.GetGitHubContext("workspace"),
fileName: DockerPath,
arguments: args,
environment: null,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: false,
redirectStandardIn: input,
cancellationToken: context.CancellationToken);
}
private Task<int> ExecuteDockerCommandAsync(IExecutionContext context, string command, string options, CancellationToken cancellationToken = default(CancellationToken))
{
return ExecuteDockerCommandAsync(context, command, options, null, cancellationToken);

View File

@@ -45,21 +45,5 @@ namespace GitHub.Runner.Worker.Container
}
return "";
}
public static string ParseRegistryHostnameFromImageName(string name)
{
var nameSplit = name.Split('/');
// Single slash is implicitly from Dockerhub, unless the first part has a .tld or :port
if (nameSplit.Length == 2 && (nameSplit[0].Contains(":") || nameSplit[0].Contains(".")))
{
return nameSplit[0];
}
// All other non-Dockerhub registries
else if (nameSplit.Length > 2)
{
return nameSplit[0];
}
return "";
}
}
}
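
The removed DockerUtil helper above is easiest to read by example; a self-contained copy with a few sample image names:

```csharp
using System;

class RegistryHostnameSketch
{
    // Copy of the removed ParseRegistryHostnameFromImageName above.
    static string ParseRegistryHostnameFromImageName(string name)
    {
        var nameSplit = name.Split('/');
        // A single slash implicitly means Docker Hub, unless the first part has a '.' (TLD) or ':port'.
        if (nameSplit.Length == 2 && (nameSplit[0].Contains(":") || nameSplit[0].Contains(".")))
        {
            return nameSplit[0];
        }
        // All other non-Docker-Hub registries
        else if (nameSplit.Length > 2)
        {
            return nameSplit[0];
        }
        return "";
    }

    static void Main()
    {
        Console.WriteLine(ParseRegistryHostnameFromImageName("ubuntu"));                          // "" (Docker Hub)
        Console.WriteLine(ParseRegistryHostnameFromImageName("library/ubuntu"));                  // "" (Docker Hub)
        Console.WriteLine(ParseRegistryHostnameFromImageName("myregistry.example.com/app"));      // myregistry.example.com
        Console.WriteLine(ParseRegistryHostnameFromImageName("localhost:5000/team/app"));         // localhost:5000
        Console.WriteLine(ParseRegistryHostnameFromImageName("docker.pkg.github.com/o/r/image")); // docker.pkg.github.com
    }
}
```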

View File

@@ -91,10 +91,7 @@ namespace GitHub.Runner.Worker
#endif
// Check docker client/server version
executionContext.Output("##[group]Checking docker version");
DockerVersion dockerVersion = await _dockerManger.DockerVersion(executionContext);
executionContext.Output("##[endgroup]");
ArgUtil.NotNull(dockerVersion.ServerVersion, nameof(dockerVersion.ServerVersion));
ArgUtil.NotNull(dockerVersion.ClientVersion, nameof(dockerVersion.ClientVersion));
@@ -114,7 +111,7 @@ namespace GitHub.Runner.Worker
}
// Clean up containers left by previous runs
executionContext.Output("##[group]Clean up resources from previous jobs");
executionContext.Debug($"Delete stale containers from previous jobs");
var staleContainers = await _dockerManger.DockerPS(executionContext, $"--all --quiet --no-trunc --filter \"label={_dockerManger.DockerInstanceLabel}\"");
foreach (var staleContainer in staleContainers)
{
@@ -125,20 +122,18 @@ namespace GitHub.Runner.Worker
}
}
executionContext.Debug($"Delete stale container networks from previous jobs");
int networkPruneExitCode = await _dockerManger.DockerNetworkPrune(executionContext);
if (networkPruneExitCode != 0)
{
executionContext.Warning($"Delete stale container networks failed, docker network prune fail with exit code {networkPruneExitCode}");
}
executionContext.Output("##[endgroup]");
// Create a local docker network for this job to avoid port conflicts when multiple runners run on the same machine.
// All containers within a job join the same network
executionContext.Output("##[group]Create local container network");
var containerNetwork = $"github_network_{Guid.NewGuid().ToString("N")}";
await CreateContainerNetworkAsync(executionContext, containerNetwork);
executionContext.JobContext.Container["network"] = new StringContextData(containerNetwork);
executionContext.Output("##[endgroup]");
foreach (var container in containers)
{
@@ -146,12 +141,10 @@ namespace GitHub.Runner.Worker
await StartContainerAsync(executionContext, container);
}
executionContext.Output("##[group]Waiting for all services to be ready");
foreach (var container in containers.Where(c => !c.IsJobContainer))
{
await ContainerHealthcheck(executionContext, container);
}
executionContext.Output("##[endgroup]");
}
public async Task StopContainersAsync(IExecutionContext executionContext, object data)
@@ -180,10 +173,6 @@ namespace GitHub.Runner.Worker
Trace.Info($"Container name: {container.ContainerName}");
Trace.Info($"Container image: {container.ContainerImage}");
Trace.Info($"Container options: {container.ContainerCreateOptions}");
var groupName = container.IsJobContainer ? "Starting job container" : $"Starting {container.ContainerNetworkAlias} service container";
executionContext.Output($"##[group]{groupName}");
foreach (var port in container.UserPortMappings)
{
Trace.Info($"User provided port: {port.Value}");
@@ -198,18 +187,12 @@ namespace GitHub.Runner.Worker
}
}
// TODO: Add at a later date. There is currently no local package registry to test with.
// UpdateRegistryAuthForGitHubToken(executionContext, container);
// Before pulling, generate client authentication if required
var configLocation = await ContainerRegistryLogin(executionContext, container);
// Pull down docker image with retry up to 3 times
int retryCount = 0;
int pullExitCode = 0;
while (retryCount < 3)
{
pullExitCode = await _dockerManger.DockerPull(executionContext, container.ContainerImage, configLocation);
pullExitCode = await _dockerManger.DockerPull(executionContext, container.ContainerImage);
if (pullExitCode == 0)
{
break;
@@ -226,9 +209,6 @@ namespace GitHub.Runner.Worker
}
}
// Remove credentials after pulling
ContainerRegistryLogout(configLocation);
if (retryCount == 3 && pullExitCode != 0)
{
throw new InvalidOperationException($"Docker pull failed with exit code {pullExitCode}");
@@ -324,7 +304,6 @@ namespace GitHub.Runner.Worker
container.ContainerRuntimePath = DockerUtil.ParsePathFromConfigEnv(containerEnv);
executionContext.JobContext.Container["id"] = new StringContextData(container.ContainerId);
}
executionContext.Output("##[endgroup]");
}
private async Task StopContainerAsync(IExecutionContext executionContext, ContainerInfo container)
@@ -446,83 +425,5 @@ namespace GitHub.Runner.Worker
throw new InvalidOperationException($"Failed to initialize, {container.ContainerNetworkAlias} service is {serviceHealth}.");
}
}
private async Task<string> ContainerRegistryLogin(IExecutionContext executionContext, ContainerInfo container)
{
if (string.IsNullOrEmpty(container.RegistryAuthUsername) || string.IsNullOrEmpty(container.RegistryAuthPassword))
{
// No valid client config can be generated
return "";
}
var configLocation = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Temp), $".docker_{Guid.NewGuid()}");
try
{
var dirInfo = Directory.CreateDirectory(configLocation);
}
catch (Exception e)
{
throw new InvalidOperationException($"Failed to create directory to store registry client credentials: {e.Message}");
}
var loginExitCode = await _dockerManger.DockerLogin(
executionContext,
configLocation,
container.RegistryServer,
container.RegistryAuthUsername,
container.RegistryAuthPassword);
if (loginExitCode != 0)
{
throw new InvalidOperationException($"Docker login for '{container.RegistryServer}' failed with exit code {loginExitCode}");
}
return configLocation;
}
private void ContainerRegistryLogout(string configLocation)
{
try
{
if (!string.IsNullOrEmpty(configLocation) && Directory.Exists(configLocation))
{
Directory.Delete(configLocation, recursive: true);
}
}
catch (Exception e)
{
throw new InvalidOperationException($"Failed to remove directory containing Docker client credentials: {e.Message}");
}
}
private void UpdateRegistryAuthForGitHubToken(IExecutionContext executionContext, ContainerInfo container)
{
var registryIsTokenCompatible = container.RegistryServer.Equals("docker.pkg.github.com", StringComparison.OrdinalIgnoreCase);
if (!registryIsTokenCompatible)
{
return;
}
var registryMatchesWorkflow = false;
// REGISTRY/OWNER/REPO/IMAGE[:TAG]
var imageParts = container.ContainerImage.Split('/');
if (imageParts.Length != 4)
{
executionContext.Warning($"Could not identify owner and repo for container image {container.ContainerImage}. Skipping automatic token auth");
return;
}
var owner = imageParts[1];
var repo = imageParts[2];
var nwo = $"{owner}/{repo}";
if (nwo.Equals(executionContext.GetGitHubContext("repository"), StringComparison.OrdinalIgnoreCase))
{
registryMatchesWorkflow = true;
}
var registryCredentialsNotSupplied = string.IsNullOrEmpty(container.RegistryAuthUsername) && string.IsNullOrEmpty(container.RegistryAuthPassword);
if (registryCredentialsNotSupplied && registryMatchesWorkflow)
{
container.RegistryAuthUsername = executionContext.GetGitHubContext("actor");
container.RegistryAuthPassword = executionContext.GetGitHubContext("token");
}
}
}
}
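
For illustration, a standalone sketch of the GITHUB_TOKEN fallback in UpdateRegistryAuthForGitHubToken above: credentials are injected only when the image is hosted on docker.pkg.github.com, its OWNER/REPO segment matches the workflow repository, and no explicit credentials were supplied. The actor and token values below are placeholders.

```csharp
using System;

class RegistryTokenAuthSketch
{
    // Mirrors the checks in UpdateRegistryAuthForGitHubToken above.
    // Returns the (username, password) to use, or null when no auth should be injected.
    static (string User, string Password)? ResolveTokenAuth(
        string registryServer, string containerImage, string workflowRepository,
        string existingUser, string existingPassword,
        string actor, string token)
    {
        // Only the GitHub Packages docker registry understands the job token.
        if (!registryServer.Equals("docker.pkg.github.com", StringComparison.OrdinalIgnoreCase))
        {
            return null;
        }

        // Expected shape: REGISTRY/OWNER/REPO/IMAGE[:TAG]
        var imageParts = containerImage.Split('/');
        if (imageParts.Length != 4)
        {
            return null; // can't identify owner/repo; skip automatic token auth
        }

        var nwo = $"{imageParts[1]}/{imageParts[2]}";
        var registryMatchesWorkflow = nwo.Equals(workflowRepository, StringComparison.OrdinalIgnoreCase);
        var credentialsNotSupplied = string.IsNullOrEmpty(existingUser) && string.IsNullOrEmpty(existingPassword);

        if (registryMatchesWorkflow && credentialsNotSupplied)
        {
            return (actor, token);
        }
        return null;
    }

    static void Main()
    {
        var auth = ResolveTokenAuth(
            "docker.pkg.github.com",
            "docker.pkg.github.com/octocat/hello-world/app:latest",
            "octocat/hello-world",
            existingUser: null, existingPassword: null,
            actor: "octocat", token: "<GITHUB_TOKEN>");

        Console.WriteLine(auth.HasValue ? auth.Value.User : "no auth injected"); // octocat
    }
}
```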

View File

@@ -86,9 +86,9 @@ namespace GitHub.Runner.Worker
executionContext.Debug("Zipping diagnostic files.");
string buildNumber = executionContext.Global.Variables.Build_Number ?? "UnknownBuildNumber";
string buildNumber = executionContext.Variables.Build_Number ?? "UnknownBuildNumber";
string buildName = $"Build {buildNumber}";
string phaseName = executionContext.Global.Variables.System_PhaseDisplayName ?? "UnknownPhaseName";
string phaseName = executionContext.Variables.System_PhaseDisplayName ?? "UnknownPhaseName";
// zip the files
string diagnosticsZipFileName = $"{buildName}-{phaseName}.zip";

View File

@@ -6,7 +6,6 @@ using System.Globalization;
using System.IO;
using System.Linq;
using System.Text;
using System.Text.RegularExpressions;
using System.Threading;
using System.Threading.Tasks;
using System.Web;
@@ -44,12 +43,22 @@ namespace GitHub.Runner.Worker
string ResultCode { get; set; }
TaskResult? CommandResult { get; set; }
CancellationToken CancellationToken { get; }
GlobalContext Global { get; }
List<ServiceEndpoint> Endpoints { get; }
PlanFeatures Features { get; }
Variables Variables { get; }
Dictionary<string, string> IntraActionState { get; }
IDictionary<String, IDictionary<String, String>> JobDefaults { get; }
Dictionary<string, VariableValue> JobOutputs { get; }
IDictionary<String, String> EnvironmentVariables { get; }
IDictionary<String, ContextScope> Scopes { get; }
IList<String> FileTable { get; }
StepsContext StepsContext { get; }
DictionaryContextData ExpressionValues { get; }
IList<IFunctionInfo> ExpressionFunctions { get; }
List<string> PrependPath { get; }
ContainerInfo Container { get; set; }
List<ContainerInfo> ServiceContainers { get; }
JobContext JobContext { get; }
// Only job level ExecutionContext has JobSteps
@@ -60,16 +69,13 @@ namespace GitHub.Runner.Worker
bool EchoOnActionCommand { get; set; }
bool InsideComposite { get; }
ExecutionContext Root { get; }
// Initialize
void InitializeJob(Pipelines.AgentJobRequestMessage message, CancellationToken token);
void CancelToken();
IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool insideComposite = false, CancellationTokenSource cancellationTokenSource = null);
IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null);
// logging
bool WriteDebug { get; }
long Write(string tag, string message);
void QueueAttachFile(string type, string name, string filePath);
@@ -98,7 +104,6 @@ namespace GitHub.Runner.Worker
// others
void ForceTaskComplete();
void RegisterPostJobStep(IStep step);
IStep CreateCompositeStep(string scopeName, IActionRunner step, DictionaryContextData inputsData, Dictionary<string, string> envData);
}
public sealed class ExecutionContext : RunnerService, IExecutionContext
@@ -135,13 +140,21 @@ namespace GitHub.Runner.Worker
public string ContextName { get; private set; }
public Task ForceCompleted => _forceCompleted.Task;
public CancellationToken CancellationToken => _cancellationTokenSource.Token;
public List<ServiceEndpoint> Endpoints { get; private set; }
public Variables Variables { get; private set; }
public Dictionary<string, string> IntraActionState { get; private set; }
public IDictionary<String, IDictionary<String, String>> JobDefaults { get; private set; }
public Dictionary<string, VariableValue> JobOutputs { get; private set; }
public IDictionary<String, String> EnvironmentVariables { get; private set; }
public IDictionary<String, ContextScope> Scopes { get; private set; }
public IList<String> FileTable { get; private set; }
public StepsContext StepsContext { get; private set; }
public DictionaryContextData ExpressionValues { get; } = new DictionaryContextData();
public IList<IFunctionInfo> ExpressionFunctions { get; } = new List<IFunctionInfo>();
// Shared pointer across job-level execution context and step-level execution contexts
public GlobalContext Global { get; private set; }
public bool WriteDebug { get; private set; }
public List<string> PrependPath { get; private set; }
public ContainerInfo Container { get; set; }
public List<ContainerInfo> ServiceContainers { get; private set; }
// Only job level ExecutionContext has JobSteps
public Queue<IStep> JobSteps { get; private set; }
@@ -154,7 +167,6 @@ namespace GitHub.Runner.Worker
public bool EchoOnActionCommand { get; set; }
public bool InsideComposite { get; private set; }
public TaskResult? Result
{
@@ -186,7 +198,9 @@ namespace GitHub.Runner.Worker
}
}
public ExecutionContext Root
public PlanFeatures Features { get; private set; }
private ExecutionContext Root
{
get
{
@@ -250,44 +264,17 @@ namespace GitHub.Runner.Worker
Root.PostJobSteps.Push(step);
}
/// <summary>
/// Helper function used in CompositeActionHandler::RunAsync to
/// add a child node (i.e. a step) of the current job to Root.JobSteps at the appropriate location.
/// </summary>
public IStep CreateCompositeStep(
string scopeName,
IActionRunner step,
DictionaryContextData inputsData,
Dictionary<string, string> envData)
{
step.ExecutionContext = Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, step.Action.ContextName, logger: _logger, insideComposite: true, cancellationTokenSource: CancellationTokenSource.CreateLinkedTokenSource(_cancellationTokenSource.Token));
step.ExecutionContext.ExpressionValues["inputs"] = inputsData;
step.ExecutionContext.ExpressionValues["steps"] = Global.StepsContext.GetScope(step.ExecutionContext.GetFullyQualifiedContextName());
// Add the composite action environment variables to each step.
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
foreach (var pair in envData)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
step.ExecutionContext.ExpressionValues["env"] = envContext;
return step;
}
public IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool insideComposite = false, CancellationTokenSource cancellationTokenSource = null)
public IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null)
{
Trace.Entering();
var child = new ExecutionContext();
child.Initialize(HostContext);
child.Global = Global;
child.ScopeName = scopeName;
child.ContextName = contextName;
child.Features = Features;
child.Variables = Variables;
child.Endpoints = Endpoints;
if (intraActionState == null)
{
child.IntraActionState = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
@@ -296,6 +283,11 @@ namespace GitHub.Runner.Worker
{
child.IntraActionState = intraActionState;
}
child.EnvironmentVariables = EnvironmentVariables;
child.JobDefaults = JobDefaults;
child.Scopes = Scopes;
child.FileTable = FileTable;
child.StepsContext = StepsContext;
foreach (var pair in ExpressionValues)
{
child.ExpressionValues[pair.Key] = pair.Value;
@@ -304,8 +296,12 @@ namespace GitHub.Runner.Worker
{
child.ExpressionFunctions.Add(item);
}
child._cancellationTokenSource = cancellationTokenSource ?? new CancellationTokenSource();
child._cancellationTokenSource = new CancellationTokenSource();
child.WriteDebug = WriteDebug;
child._parentExecutionContext = this;
child.PrependPath = PrependPath;
child.Container = Container;
child.ServiceContainers = ServiceContainers;
child.EchoOnActionCommand = EchoOnActionCommand;
if (recordOrder != null)
@@ -316,17 +312,9 @@ namespace GitHub.Runner.Worker
{
child.InitializeTimelineRecord(_mainTimelineId, recordId, _record.Id, ExecutionContextType.Task, displayName, refName, ++_childTimelineRecordOrder);
}
if (logger != null)
{
child._logger = logger;
}
else
{
child._logger = HostContext.CreateService<IPagingLogger>();
child._logger.Setup(_mainTimelineId, recordId);
}
child.InsideComposite = insideComposite;
child._logger = HostContext.CreateService<IPagingLogger>();
child._logger.Setup(_mainTimelineId, recordId);
return child;
}
@@ -384,11 +372,10 @@ namespace GitHub.Runner.Worker
_logger.End();
// Skip if generated context name. Generated context names start with "__". After M271-ish the server will never send an empty context name.
if (!string.IsNullOrEmpty(ContextName) && !ContextName.StartsWith("__", StringComparison.Ordinal))
if (!string.IsNullOrEmpty(ContextName))
{
Global.StepsContext.SetOutcome(ScopeName, ContextName, (Outcome ?? Result ?? TaskResult.Succeeded).ToActionResult());
Global.StepsContext.SetConclusion(ScopeName, ContextName, (Result ?? TaskResult.Succeeded).ToActionResult());
StepsContext.SetOutcome(ScopeName, ContextName, (Outcome ?? Result ?? TaskResult.Succeeded).ToActionResult());
StepsContext.SetConclusion(ScopeName, ContextName, (Result ?? TaskResult.Succeeded).ToActionResult());
}
return Result.Value;
@@ -447,8 +434,7 @@ namespace GitHub.Runner.Worker
{
ArgUtil.NotNullOrEmpty(name, nameof(name));
// Skip if generated context name. Generated context names start with "__". After M271-ish the server will never send an empty context name.
if (string.IsNullOrEmpty(ContextName) || ContextName.StartsWith("__", StringComparison.Ordinal))
if (String.IsNullOrEmpty(ContextName))
{
reference = null;
return;
@@ -456,7 +442,7 @@ namespace GitHub.Runner.Worker
// todo: restrict multiline?
Global.StepsContext.SetOutput(ScopeName, ContextName, name, value, out reference);
StepsContext.SetOutput(ScopeName, ContextName, name, value, out reference);
}
public void SetTimeout(TimeSpan? timeout)
@@ -590,35 +576,42 @@ namespace GitHub.Runner.Worker
_cancellationTokenSource = CancellationTokenSource.CreateLinkedTokenSource(token);
Global = new GlobalContext();
// Plan
Global.Plan = message.Plan;
Global.Features = PlanUtil.GetFeatures(message.Plan);
// Features
Features = PlanUtil.GetFeatures(message.Plan);
// Endpoints
Global.Endpoints = message.Resources.Endpoints;
Endpoints = message.Resources.Endpoints;
// Variables
Global.Variables = new Variables(HostContext, message.Variables);
Variables = new Variables(HostContext, message.Variables);
// Environment variables shared across all actions
Global.EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer);
EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer);
// Job defaults shared across all actions
Global.JobDefaults = new Dictionary<string, IDictionary<string, string>>(StringComparer.OrdinalIgnoreCase);
JobDefaults = new Dictionary<string, IDictionary<string, string>>(StringComparer.OrdinalIgnoreCase);
// Job Outputs
JobOutputs = new Dictionary<string, VariableValue>(StringComparer.OrdinalIgnoreCase);
// Service container info
Global.ServiceContainers = new List<ContainerInfo>();
ServiceContainers = new List<ContainerInfo>();
// Steps context (StepsRunner manages adding the scoped steps context)
Global.StepsContext = new StepsContext();
StepsContext = new StepsContext();
// Scopes
Scopes = new Dictionary<String, ContextScope>(StringComparer.OrdinalIgnoreCase);
if (message.Scopes?.Count > 0)
{
foreach (var scope in message.Scopes)
{
Scopes[scope.Name] = scope;
}
}
// File table
Global.FileTable = new List<String>(message.FileTable ?? new string[0]);
FileTable = new List<String>(message.FileTable ?? new string[0]);
// Expression values
if (message.ContextData?.Count > 0)
@@ -629,15 +622,15 @@ namespace GitHub.Runner.Worker
}
}
ExpressionValues["secrets"] = Global.Variables.ToSecretsContext();
ExpressionValues["secrets"] = Variables.ToSecretsContext();
ExpressionValues["runner"] = new RunnerContext();
ExpressionValues["job"] = new JobContext();
Trace.Info("Initialize GitHub context");
var githubAccessToken = new StringContextData(Global.Variables.Get("system.github.token"));
var githubAccessToken = new StringContextData(Variables.Get("system.github.token"));
var base64EncodedToken = Convert.ToBase64String(Encoding.UTF8.GetBytes($"x-access-token:{githubAccessToken}"));
HostContext.SecretMasker.AddValue(base64EncodedToken);
var githubJob = Global.Variables.Get("system.github.job");
var githubJob = Variables.Get("system.github.job");
var githubContext = new GitHubContext();
githubContext["token"] = githubAccessToken;
if (!string.IsNullOrEmpty(githubJob))
@@ -660,7 +653,7 @@ namespace GitHub.Runner.Worker
#endif
// Prepend Path
Global.PrependPath = new List<string>();
PrependPath = new List<string>();
// JobSteps for job ExecutionContext
JobSteps = new Queue<IStep>();
@@ -686,10 +679,10 @@ namespace GitHub.Runner.Worker
_logger.Setup(_mainTimelineId, _record.Id);
// Initialize 'echo on action command success' property, default to false, unless Step_Debug is set
EchoOnActionCommand = Global.Variables.Step_Debug ?? false;
EchoOnActionCommand = Variables.Step_Debug ?? false;
// Verbosity (from GitHub.Step_Debug).
Global.WriteDebug = Global.Variables.Step_Debug ?? false;
WriteDebug = Variables.Step_Debug ?? false;
// Hook up JobServerQueueThrottling event, we will log warning on server tarpit.
_jobServerQueue.JobServerQueueThrottling += JobServerQueueThrottling_EventReceived;
@@ -717,8 +710,7 @@ namespace GitHub.Runner.Worker
}
}
_jobServerQueue.QueueWebConsoleLine(_record.Id, msg, totalLines);
_jobServerQueue.QueueWebConsoleLine(_record.Id, msg);
return totalLines;
}
@@ -891,16 +883,6 @@ namespace GitHub.Runner.Worker
// Otherwise individual overloads would need to be implemented (depending on the unit test).
public static class ExecutionContextExtension
{
public static string GetFullyQualifiedContextName(this IExecutionContext context)
{
if (!string.IsNullOrEmpty(context.ScopeName))
{
return $"{context.ScopeName}.{context.ContextName}";
}
return context.ContextName;
}
public static void Error(this IExecutionContext context, Exception ex)
{
context.Error(ex.Message);
@@ -939,7 +921,7 @@ namespace GitHub.Runner.Worker
// Do not add a format string overload. See comment on ExecutionContext.Write().
public static void Debug(this IExecutionContext context, string message)
{
if (context.Global.WriteDebug)
if (context.WriteDebug)
{
var multilines = message?.Replace("\r\n", "\n")?.Split("\n");
if (multilines != null)
@@ -964,7 +946,7 @@ namespace GitHub.Runner.Worker
traceWriter = context.ToTemplateTraceWriter();
}
var schema = PipelineTemplateSchemaFactory.GetSchema();
return new PipelineTemplateEvaluator(traceWriter, schema, context.Global.FileTable);
return new PipelineTemplateEvaluator(traceWriter, schema, context.FileTable);
}
public static ObjectTemplating.ITraceWriter ToTemplateTraceWriter(this IExecutionContext context)
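
The GetFullyQualifiedContextName extension removed above addresses a step inside a scope as scope.context; the composite handler relies on this to isolate its nested steps context. A minimal standalone sketch of that naming (illustration only; the scope name follows the generated "__" convention mentioned in the comments):

```csharp
// Minimal sketch (illustration only) of the fully qualified context name helper:
// a step running inside a composite scope is addressed as "scope.context".
using System;

class ContextNameSketch
{
    static string GetFullyQualifiedContextName(string scopeName, string contextName)
        => string.IsNullOrEmpty(scopeName) ? contextName : $"{scopeName}.{contextName}";

    static void Main()
    {
        Console.WriteLine(GetFullyQualifiedContextName(null, "step1"));          // step1
        Console.WriteLine(GetFullyQualifiedContextName("__composite", "step1")); // __composite.step1
    }
}
```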

View File

@@ -1,262 +0,0 @@
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Worker.Container;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Text;
namespace GitHub.Runner.Worker
{
[ServiceLocator(Default = typeof(FileCommandManager))]
public interface IFileCommandManager : IRunnerService
{
void InitializeFiles(IExecutionContext context, ContainerInfo container);
void ProcessFiles(IExecutionContext context, ContainerInfo container);
}
public sealed class FileCommandManager : RunnerService, IFileCommandManager
{
private const string _folderName = "_runner_file_commands";
private List<IFileCommandExtension> _commandExtensions;
private string _fileSuffix = String.Empty;
private string _fileCommandDirectory;
private Tracing _trace;
public override void Initialize(IHostContext hostContext)
{
base.Initialize(hostContext);
_trace = HostContext.GetTrace(nameof(FileCommandManager));
_fileCommandDirectory = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Temp), _folderName);
if (!Directory.Exists(_fileCommandDirectory))
{
Directory.CreateDirectory(_fileCommandDirectory);
}
var extensionManager = hostContext.GetService<IExtensionManager>();
_commandExtensions = extensionManager.GetExtensions<IFileCommandExtension>() ?? new List<IFileCommandExtension>();
}
public void InitializeFiles(IExecutionContext context, ContainerInfo container)
{
var oldSuffix = _fileSuffix;
_fileSuffix = Guid.NewGuid().ToString();
foreach (var fileCommand in _commandExtensions)
{
var oldPath = Path.Combine(_fileCommandDirectory, fileCommand.FilePrefix + oldSuffix);
if (oldSuffix != String.Empty && File.Exists(oldPath))
{
TryDeleteFile(oldPath);
}
var newPath = Path.Combine(_fileCommandDirectory, fileCommand.FilePrefix + _fileSuffix);
TryDeleteFile(newPath);
File.Create(newPath).Dispose();
var pathToSet = container != null ? container.TranslateToContainerPath(newPath) : newPath;
context.SetGitHubContext(fileCommand.ContextName, pathToSet);
}
}
public void ProcessFiles(IExecutionContext context, ContainerInfo container)
{
foreach (var fileCommand in _commandExtensions)
{
try
{
fileCommand.ProcessCommand(context, Path.Combine(_fileCommandDirectory, fileCommand.FilePrefix + _fileSuffix), container);
}
catch (Exception ex)
{
context.Error($"Unable to process file command '{fileCommand.ContextName}' successfully.");
context.Error(ex);
context.CommandResult = TaskResult.Failed;
}
}
}
private bool TryDeleteFile(string path)
{
if (!File.Exists(path))
{
return true;
}
try
{
File.Delete(path);
}
catch (Exception e)
{
_trace.Warning($"Unable to delete file {path} for reason: {e.ToString()}");
return false;
}
return true;
}
}
public interface IFileCommandExtension : IExtension
{
string ContextName { get; }
string FilePrefix { get; }
void ProcessCommand(IExecutionContext context, string filePath, ContainerInfo container);
}
public sealed class AddPathFileCommand : RunnerService, IFileCommandExtension
{
public string ContextName => "path";
public string FilePrefix => "add_path_";
public Type ExtensionType => typeof(IFileCommandExtension);
public void ProcessCommand(IExecutionContext context, string filePath, ContainerInfo container)
{
if (File.Exists(filePath))
{
var lines = File.ReadAllLines(filePath, Encoding.UTF8);
foreach (var line in lines)
{
if (line == string.Empty)
{
continue;
}
context.Global.PrependPath.RemoveAll(x => string.Equals(x, line, StringComparison.CurrentCulture));
context.Global.PrependPath.Add(line);
}
}
}
}
public sealed class SetEnvFileCommand : RunnerService, IFileCommandExtension
{
public string ContextName => "env";
public string FilePrefix => "set_env_";
public Type ExtensionType => typeof(IFileCommandExtension);
public void ProcessCommand(IExecutionContext context, string filePath, ContainerInfo container)
{
try
{
var text = File.ReadAllText(filePath) ?? string.Empty;
var index = 0;
var line = ReadLine(text, ref index);
while (line != null)
{
if (!string.IsNullOrEmpty(line))
{
var equalsIndex = line.IndexOf("=", StringComparison.Ordinal);
var heredocIndex = line.IndexOf("<<", StringComparison.Ordinal);
// Normal style NAME=VALUE
if (equalsIndex >= 0 && (heredocIndex < 0 || equalsIndex < heredocIndex))
{
var split = line.Split(new[] { '=' }, 2, StringSplitOptions.None);
if (string.IsNullOrEmpty(line))
{
throw new Exception($"Invalid environment variable format '{line}'. Environment variable name must not be empty");
}
SetEnvironmentVariable(context, split[0], split[1]);
}
// Heredoc style NAME<<EOF
else if (heredocIndex >= 0 && (equalsIndex < 0 || heredocIndex < equalsIndex))
{
var split = line.Split(new[] { "<<" }, 2, StringSplitOptions.None);
if (string.IsNullOrEmpty(split[0]) || string.IsNullOrEmpty(split[1]))
{
throw new Exception($"Invalid environment variable format '{line}'. Environment variable name must not be empty and delimiter must not be empty");
}
var name = split[0];
var delimiter = split[1];
var startIndex = index; // Start index of the value (inclusive)
var endIndex = index; // End index of the value (exclusive)
var tempLine = ReadLine(text, ref index, out var newline);
while (!string.Equals(tempLine, delimiter, StringComparison.Ordinal))
{
if (tempLine == null)
{
throw new Exception($"Invalid environment variable value. Matching delimiter not found '{delimiter}'");
}
endIndex = index - newline.Length;
tempLine = ReadLine(text, ref index, out newline);
}
var value = endIndex > startIndex ? text.Substring(startIndex, endIndex - startIndex) : string.Empty;
SetEnvironmentVariable(context, name, value);
}
else
{
throw new Exception($"Invalid environment variable format '{line}'");
}
}
line = ReadLine(text, ref index);
}
}
catch (DirectoryNotFoundException)
{
context.Debug($"Environment variables file does not exist '{filePath}'");
}
catch (FileNotFoundException)
{
context.Debug($"Environment variables file does not exist '{filePath}'");
}
}
private static void SetEnvironmentVariable(
IExecutionContext context,
string name,
string value)
{
context.Global.EnvironmentVariables[name] = value;
context.SetEnvContext(name, value);
context.Debug($"{name}='{value}'");
}
private static string ReadLine(
string text,
ref int index)
{
return ReadLine(text, ref index, out _);
}
private static string ReadLine(
string text,
ref int index,
out string newline)
{
if (index >= text.Length)
{
newline = null;
return null;
}
var originalIndex = index;
var lfIndex = text.IndexOf("\n", index, StringComparison.Ordinal);
if (lfIndex < 0)
{
index = text.Length;
newline = null;
return text.Substring(originalIndex);
}
#if OS_WINDOWS
var crLFIndex = text.IndexOf("\r\n", index, StringComparison.Ordinal);
if (crLFIndex >= 0 && crLFIndex < lfIndex)
{
index = crLFIndex + 2; // Skip over CRLF
newline = "\r\n";
return text.Substring(originalIndex, crLFIndex - originalIndex);
}
#endif
index = lfIndex + 1; // Skip over LF
newline = "\n";
return text.Substring(originalIndex, lfIndex - originalIndex);
}
}
}
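
The SetEnvFileCommand parser above accepts two line formats in the set_env_* file: plain NAME=VALUE pairs and heredoc-style NAME<<DELIMITER blocks whose value runs until the matching delimiter line. A minimal standalone sketch of what such a file looks like (illustration only; the variable names are made up):

```csharp
// Minimal sketch (illustration only) of the two line formats the env file parser
// above accepts: NAME=VALUE and the heredoc form NAME<<DELIMITER ... DELIMITER.
using System;
using System.IO;

class EnvFileFormatSketch
{
    static void Main()
    {
        var path = Path.GetTempFileName(); // stand-in for the set_env_<suffix> file
        File.WriteAllText(path,
            "SIMPLE=hello\n" +          // sets SIMPLE=hello
            "MULTILINE<<EOF\n" +        // heredoc: the value is everything up to the EOF line
            "line one\n" +
            "line two\n" +
            "EOF\n");                   // delimiter line closes the value
        Console.WriteLine(File.ReadAllText(path));
        File.Delete(path);
    }
}
```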

View File

@@ -6,24 +6,20 @@ namespace GitHub.Runner.Worker
{
public sealed class GitHubContext : DictionaryContextData, IEnvironmentContextData
{
private readonly HashSet<string> _contextEnvAllowlist = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
private readonly HashSet<string> _contextEnvWhitelist = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
{
"action",
"action_path",
"actor",
"api_url",
"base_ref",
"env",
"event_name",
"event_path",
"graphql_url",
"head_ref",
"job",
"path",
"ref",
"repository",
"repository_owner",
"retention_days",
"run_id",
"run_number",
"server_url",
@@ -36,23 +32,11 @@ namespace GitHub.Runner.Worker
{
foreach (var data in this)
{
if (_contextEnvAllowlist.Contains(data.Key) && data.Value is StringContextData value)
if (_contextEnvWhitelist.Contains(data.Key) && data.Value is StringContextData value)
{
yield return new KeyValuePair<string, string>($"GITHUB_{data.Key.ToUpperInvariant()}", value);
}
}
}
public GitHubContext ShallowCopy()
{
var copy = new GitHubContext();
foreach (var pair in this)
{
copy[pair.Key] = pair.Value;
}
return copy;
}
}
}
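
GitHubContext.ToEnvironmentVariables above only exports the allowed keys, upper-casing each one behind a GITHUB_ prefix. A minimal standalone sketch of that mapping (illustration only, with a made-up context and a trimmed-down allowlist):

```csharp
// Minimal sketch (standalone illustration) of the key-to-variable mapping performed
// by GitHubContext.ToEnvironmentVariables for allowed keys, e.g. "repository" -> GITHUB_REPOSITORY.
using System;
using System.Collections.Generic;

class GitHubEnvMappingSketch
{
    static IEnumerable<KeyValuePair<string, string>> ToEnv(
        IDictionary<string, string> context, ISet<string> allowedKeys)
    {
        foreach (var pair in context)
        {
            if (allowedKeys.Contains(pair.Key))
            {
                yield return new KeyValuePair<string, string>(
                    $"GITHUB_{pair.Key.ToUpperInvariant()}", pair.Value);
            }
        }
    }

    static void Main()
    {
        var allowed = new HashSet<string>(StringComparer.OrdinalIgnoreCase) { "repository", "ref" };
        var ctx = new Dictionary<string, string>
        {
            ["repository"] = "octo-org/octo-repo", // made-up values for illustration
            ["ref"] = "refs/heads/main",
            ["token"] = "***"
        };
        foreach (var pair in ToEnv(ctx, allowed))
        {
            Console.WriteLine($"{pair.Key}={pair.Value}"); // "token" is not allowed, so it is skipped
        }
    }
}
```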

View File

@@ -1,24 +0,0 @@
using System;
using System.Collections.Generic;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Worker.Container;
namespace GitHub.Runner.Worker
{
public sealed class GlobalContext
{
public ContainerInfo Container { get; set; }
public List<ServiceEndpoint> Endpoints { get; set; }
public IDictionary<String, String> EnvironmentVariables { get; set; }
public PlanFeatures Features { get; set; }
public IList<String> FileTable { get; set; }
public IDictionary<String, IDictionary<String, String>> JobDefaults { get; set; }
public TaskOrchestrationPlanReference Plan { get; set; }
public List<string> PrependPath { get; set; }
public List<ContainerInfo> ServiceContainers { get; set; }
public StepsContext StepsContext { get; set; }
public Variables Variables { get; set; }
public bool WriteDebug { get; set; }
}
}
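
The GlobalContext type removed above let the job-level ExecutionContext and every child step context point at the same shared state (see the child.Global = Global assignment in CreateChild). A minimal sketch of that by-reference sharing, using simplified hypothetical types rather than the real runner classes:

```csharp
// Minimal sketch (simplified, hypothetical types) of sharing job-level state by
// reference: every child context points at the same Global object, so a value
// added while one step runs (e.g. a prepended PATH entry) is visible to later steps.
using System;
using System.Collections.Generic;

class Global
{
    public List<string> PrependPath { get; } = new List<string>();
}

class Context
{
    public Global Global { get; private set; }

    public static Context CreateRoot() => new Context { Global = new Global() };

    public Context CreateChild() => new Context { Global = Global }; // shared pointer, not a copy

    static void Main()
    {
        var job = CreateRoot();
        var step1 = job.CreateChild();
        var step2 = job.CreateChild();

        step1.Global.PrependPath.Add("/opt/mytool/bin");    // made-up path for illustration
        Console.WriteLine(step2.Global.PrependPath.Count);  // 1 -- the same list instance
    }
}
```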

View File

@@ -1,282 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker.Handlers
{
[ServiceLocator(Default = typeof(CompositeActionHandler))]
public interface ICompositeActionHandler : IHandler
{
CompositeActionExecutionData Data { get; set; }
}
public sealed class CompositeActionHandler : Handler, ICompositeActionHandler
{
public CompositeActionExecutionData Data { get; set; }
public async Task RunAsync(ActionRunStage stage)
{
// Validate args.
Trace.Entering();
ArgUtil.NotNull(ExecutionContext, nameof(ExecutionContext));
ArgUtil.NotNull(Inputs, nameof(Inputs));
ArgUtil.NotNull(Data.Steps, nameof(Data.Steps));
// Resolve action steps
var actionSteps = Data.Steps;
// Create Context Data to reuse for each composite action step
var inputsData = new DictionaryContextData();
foreach (var i in Inputs)
{
inputsData[i.Key] = new StringContextData(i.Value);
}
// Initialize Composite Steps List of Steps
var compositeSteps = new List<IStep>();
// Temporary hack until after M271-ish. After M271-ish the server will never send an empty
// context name. Generated context names start with "__"
var childScopeName = ExecutionContext.GetFullyQualifiedContextName();
if (string.IsNullOrEmpty(childScopeName))
{
childScopeName = $"__{Guid.NewGuid()}";
}
foreach (Pipelines.ActionStep actionStep in actionSteps)
{
var actionRunner = HostContext.CreateService<IActionRunner>();
actionRunner.Action = actionStep;
actionRunner.Stage = stage;
actionRunner.Condition = actionStep.Condition;
var step = ExecutionContext.CreateCompositeStep(childScopeName, actionRunner, inputsData, Environment);
// Shallow copy github context
var gitHubContext = step.ExecutionContext.ExpressionValues["github"] as GitHubContext;
ArgUtil.NotNull(gitHubContext, nameof(gitHubContext));
gitHubContext = gitHubContext.ShallowCopy();
step.ExecutionContext.ExpressionValues["github"] = gitHubContext;
// Set GITHUB_ACTION_PATH
step.ExecutionContext.SetGitHubContext("action_path", ActionDirectory);
compositeSteps.Add(step);
}
try
{
// This is where we run each step.
await RunStepsAsync(compositeSteps);
// Get the pointer of the correct "steps" object and pass it to the ExecutionContext so that we can process the outputs correctly
ExecutionContext.ExpressionValues["inputs"] = inputsData;
ExecutionContext.ExpressionValues["steps"] = ExecutionContext.Global.StepsContext.GetScope(ExecutionContext.GetFullyQualifiedContextName());
ProcessCompositeActionOutputs();
ExecutionContext.Global.StepsContext.ClearScope(childScopeName);
}
catch (Exception ex)
{
// Composite StepRunner should never throw exception out.
Trace.Error($"Caught exception from composite steps {nameof(CompositeActionHandler)}: {ex}");
ExecutionContext.Error(ex);
ExecutionContext.Result = TaskResult.Failed;
}
}
private void ProcessCompositeActionOutputs()
{
ArgUtil.NotNull(ExecutionContext, nameof(ExecutionContext));
// Evaluate the mapped outputs value
if (Data.Outputs != null)
{
// Evaluate the outputs in the steps context to easily retrieve the values
var actionManifestManager = HostContext.GetService<IActionManifestManager>();
// Format ExpressionValues to Dictionary<string, PipelineContextData>
var evaluateContext = new Dictionary<string, PipelineContextData>(StringComparer.OrdinalIgnoreCase);
foreach (var pair in ExecutionContext.ExpressionValues)
{
evaluateContext[pair.Key] = pair.Value;
}
// Get the evaluated composite outputs' values mapped to the output names
DictionaryContextData actionOutputs = actionManifestManager.EvaluateCompositeOutputs(ExecutionContext, Data.Outputs, evaluateContext);
// Set the outputs for the outputs object in the whole composite action
// Each pair is structured like this
// We ignore "description" for now
// {
// "the-output-name": {
// "description": "",
// "value": "the value"
// },
// ...
// }
foreach (var pair in actionOutputs)
{
var outputsName = pair.Key;
var outputsAttributes = pair.Value as DictionaryContextData;
outputsAttributes.TryGetValue("value", out var val);
if (val != null)
{
var outputsValue = val as StringContextData;
// Set output in the whole composite scope.
if (!String.IsNullOrEmpty(outputsValue))
{
ExecutionContext.SetOutput(outputsName, outputsValue, out _);
}
else
{
ExecutionContext.SetOutput(outputsName, "", out _);
}
}
}
}
}
private async Task RunStepsAsync(List<IStep> compositeSteps)
{
ArgUtil.NotNull(compositeSteps, nameof(compositeSteps));
// The parent StepsRunner of the whole Composite Action Step handles the cancellation stuff already.
foreach (IStep step in compositeSteps)
{
Trace.Info($"Processing composite step: DisplayName='{step.DisplayName}'");
step.ExecutionContext.ExpressionValues["steps"] = ExecutionContext.Global.StepsContext.GetScope(step.ExecutionContext.ScopeName);
// Populate env context for each step
Trace.Info("Initialize Env context for step");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
// Global env
foreach (var pair in ExecutionContext.Global.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
// The outer step's env overrides the global env
if (step.ExecutionContext.ExpressionValues.TryGetValue("env", out var envContextData))
{
#if OS_WINDOWS
var dict = envContextData as DictionaryContextData;
#else
var dict = envContextData as CaseSensitiveDictionaryContextData;
#endif
foreach (var pair in dict)
{
envContext[pair.Key] = pair.Value;
}
}
step.ExecutionContext.ExpressionValues["env"] = envContext;
var actionStep = step as IActionRunner;
try
{
// Evaluate and merge action's env block to env context
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator();
var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(actionStep.Action.Environment, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, Common.Util.VarUtil.EnvironmentVariableKeyComparer);
foreach (var env in actionEnvironment)
{
envContext[env.Key] = new StringContextData(env.Value ?? string.Empty);
}
}
catch (Exception ex)
{
// fail the step since there is an evaluate error.
Trace.Info("Caught exception in Composite Steps Runner from expression for step.env");
// evaluateStepEnvFailed = true;
step.ExecutionContext.Error(ex);
step.ExecutionContext.Complete(TaskResult.Failed);
}
await RunStepAsync(step);
// Directly after the step, check if the step has failed or cancelled
// If so, return that to the output
if (step.ExecutionContext.Result == TaskResult.Failed || step.ExecutionContext.Result == TaskResult.Canceled)
{
ExecutionContext.Result = step.ExecutionContext.Result;
break;
}
// TODO: Add compat for other types of steps.
}
// Completion Status handled by StepsRunner for the whole Composite Action Step
}
private async Task RunStepAsync(IStep step)
{
// Start the step.
Trace.Info("Starting the step.");
step.ExecutionContext.Debug($"Starting: {step.DisplayName}");
// TODO: Fix for Step Level Timeout Attributes for an individual Composite Run Step
// For now, we are not going to support this for an individual composite run step
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator();
await Common.Util.EncodingUtil.SetEncoding(HostContext, Trace, step.ExecutionContext.CancellationToken);
try
{
await step.RunAsync();
}
catch (OperationCanceledException ex)
{
if (step.ExecutionContext.CancellationToken.IsCancellationRequested &&
!ExecutionContext.Root.CancellationToken.IsCancellationRequested)
{
Trace.Error($"Caught timeout exception from step: {ex.Message}");
step.ExecutionContext.Error("The action has timed out.");
step.ExecutionContext.Result = TaskResult.Failed;
}
else
{
Trace.Error($"Caught cancellation exception from step: {ex}");
step.ExecutionContext.Error(ex);
step.ExecutionContext.Result = TaskResult.Canceled;
}
}
catch (Exception ex)
{
// Log the error and fail the step.
Trace.Error($"Caught exception from step: {ex}");
step.ExecutionContext.Error(ex);
step.ExecutionContext.Result = TaskResult.Failed;
}
// Merge execution context result with command result
if (step.ExecutionContext.CommandResult != null)
{
step.ExecutionContext.Result = Common.Util.TaskResultUtil.MergeTaskResults(step.ExecutionContext.Result, step.ExecutionContext.CommandResult.Value);
}
Trace.Info($"Step result: {step.ExecutionContext.Result}");
// Complete the step context.
step.ExecutionContext.Debug($"Finishing: {step.DisplayName}");
}
}
}
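
ProcessCompositeActionOutputs above walks a dictionary whose values carry a "description" and a "value", and only the value is forwarded as a step output (an empty string when no value is present). A minimal standalone sketch of that shape (illustration only; the output name and value are made up):

```csharp
// Minimal sketch (illustration only) of the outputs shape consumed by
// ProcessCompositeActionOutputs: each output name maps to { "description", "value" },
// and only "value" is propagated.
using System;
using System.Collections.Generic;

class CompositeOutputsSketch
{
    static void Main()
    {
        var actionOutputs = new Dictionary<string, Dictionary<string, string>>
        {
            // "random-number" is a made-up output name used only for illustration.
            ["random-number"] = new Dictionary<string, string>
            {
                ["description"] = "",
                ["value"] = "43243"
            }
        };

        foreach (var pair in actionOutputs)
        {
            pair.Value.TryGetValue("value", out var value);
            // The handler falls back to an empty string when no value is present.
            Console.WriteLine($"output {pair.Key}={value ?? string.Empty}");
        }
    }
}
```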

View File

@@ -49,9 +49,8 @@ namespace GitHub.Runner.Worker.Handlers
// ensure docker file exist
var dockerFile = Path.Combine(ActionDirectory, Data.Image);
ArgUtil.File(dockerFile, nameof(Data.Image));
ExecutionContext.Output($"##[group]Building docker image");
ExecutionContext.Output($"Dockerfile for action: '{dockerFile}'.");
var imageName = $"{dockerManger.DockerInstanceLabel}:{ExecutionContext.Id.ToString("N")}";
var buildExitCode = await dockerManger.DockerBuild(
ExecutionContext,
@@ -59,8 +58,6 @@ namespace GitHub.Runner.Worker.Handlers
dockerFile,
Directory.GetParent(dockerFile).FullName,
imageName);
ExecutionContext.Output("##[endgroup]");
if (buildExitCode != 0)
{
throw new InvalidOperationException($"Docker build failed with exit code {buildExitCode}");
@@ -161,21 +158,16 @@ namespace GitHub.Runner.Worker.Handlers
Directory.CreateDirectory(tempHomeDirectory);
this.Environment["HOME"] = tempHomeDirectory;
var tempFileCommandDirectory = Path.Combine(tempDirectory, "_runner_file_commands");
ArgUtil.Directory(tempFileCommandDirectory, nameof(tempFileCommandDirectory));
var tempWorkflowDirectory = Path.Combine(tempDirectory, "_github_workflow");
ArgUtil.Directory(tempWorkflowDirectory, nameof(tempWorkflowDirectory));
container.MountVolumes.Add(new MountVolume("/var/run/docker.sock", "/var/run/docker.sock"));
container.MountVolumes.Add(new MountVolume(tempHomeDirectory, "/github/home"));
container.MountVolumes.Add(new MountVolume(tempWorkflowDirectory, "/github/workflow"));
container.MountVolumes.Add(new MountVolume(tempFileCommandDirectory, "/github/file_commands"));
container.MountVolumes.Add(new MountVolume(defaultWorkingDirectory, "/github/workspace"));
container.AddPathTranslateMapping(tempHomeDirectory, "/github/home");
container.AddPathTranslateMapping(tempWorkflowDirectory, "/github/workflow");
container.AddPathTranslateMapping(tempFileCommandDirectory, "/github/file_commands");
container.AddPathTranslateMapping(defaultWorkingDirectory, "/github/workspace");
container.ContainerWorkDirectory = "/github/workspace";
@@ -193,7 +185,7 @@ namespace GitHub.Runner.Worker.Handlers
}
// Add Actions Runtime server info
var systemConnection = ExecutionContext.Global.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
var systemConnection = ExecutionContext.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
Environment["ACTIONS_RUNTIME_URL"] = systemConnection.Url.AbsoluteUri;
Environment["ACTIONS_RUNTIME_TOKEN"] = systemConnection.Authorization.Parameters[EndpointAuthorizationParameters.AccessToken];
if (systemConnection.Data.TryGetValue("CacheServerUrl", out var cacheUrl) && !string.IsNullOrEmpty(cacheUrl))

View File

@@ -148,14 +148,14 @@ namespace GitHub.Runner.Worker.Handlers
{
// Validate args.
Trace.Entering();
ArgUtil.NotNull(ExecutionContext.Global.PrependPath, nameof(ExecutionContext.Global.PrependPath));
if (ExecutionContext.Global.PrependPath.Count == 0)
ArgUtil.NotNull(ExecutionContext.PrependPath, nameof(ExecutionContext.PrependPath));
if (ExecutionContext.PrependPath.Count == 0)
{
return;
}
// Prepend path.
string prepend = string.Join(Path.PathSeparator.ToString(), ExecutionContext.Global.PrependPath.Reverse<string>());
string prepend = string.Join(Path.PathSeparator.ToString(), ExecutionContext.PrependPath.Reverse<string>());
var containerStepHost = StepHost as ContainerStepHost;
if (containerStepHost != null)
{

View File

@@ -66,11 +66,6 @@ namespace GitHub.Runner.Worker.Handlers
handler = HostContext.CreateService<IRunnerPluginHandler>();
(handler as IRunnerPluginHandler).Data = data as PluginActionExecutionData;
}
else if (data.ExecutionType == ActionExecutionType.Composite)
{
handler = HostContext.CreateService<ICompositeActionHandler>();
(handler as ICompositeActionHandler).Data = data as CompositeActionExecutionData;
}
else
{
// This should never happen.

View File

@@ -46,7 +46,7 @@ namespace GitHub.Runner.Worker.Handlers
}
// Add Actions Runtime server info
var systemConnection = ExecutionContext.Global.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
var systemConnection = ExecutionContext.Endpoints.Single(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
Environment["ACTIONS_RUNTIME_URL"] = systemConnection.Url.AbsoluteUri;
Environment["ACTIONS_RUNTIME_TOKEN"] = systemConnection.Authorization.Parameters[EndpointAuthorizationParameters.AccessToken];
if (systemConnection.Data.TryGetValue("CacheServerUrl", out var cacheUrl) && !string.IsNullOrEmpty(cacheUrl))
@@ -113,7 +113,7 @@ namespace GitHub.Runner.Worker.Handlers
requireExitCodeZero: false,
outputEncoding: outputEncoding,
killProcessOnCancel: false,
inheritConsoleHandler: !ExecutionContext.Global.Variables.Retain_Default_Encoding,
inheritConsoleHandler: !ExecutionContext.Variables.Retain_Default_Encoding,
cancellationToken: ExecutionContext.CancellationToken);
// Wait for either the node exit or force finish through ##vso command

View File

@@ -31,7 +31,7 @@ namespace GitHub.Runner.Worker.Handlers
{
_executionContext = executionContext;
_commandManager = commandManager;
_container = container ?? executionContext.Global.Container;
_container = container ?? executionContext.Container;
// Recursion failsafe (test override)
var failsafeString = Environment.GetEnvironmentVariable("RUNNER_TEST_GET_REPOSITORY_PATH_FAILSAFE");
@@ -41,7 +41,7 @@ namespace GitHub.Runner.Worker.Handlers
}
// Determine the timeout
var timeoutStr = _executionContext.Global.Variables.Get(_timeoutKey);
var timeoutStr = _executionContext.Variables.Get(_timeoutKey);
if (string.IsNullOrEmpty(timeoutStr) ||
!TimeSpan.TryParse(timeoutStr, CultureInfo.InvariantCulture, out _timeout) ||
_timeout <= TimeSpan.Zero)

View File

@@ -23,19 +23,6 @@ namespace GitHub.Runner.Worker.Handlers
public override void PrintActionDetails(ActionRunStage stage)
{
// We don't want to display the internal workings of a composite action (similar/equivalent information is available in the debug logs)
void writeDetails(string message)
{
if (ExecutionContext.InsideComposite)
{
ExecutionContext.Debug(message);
}
else
{
ExecutionContext.Output(message);
}
}
if (stage == ActionRunStage.Post)
{
throw new NotSupportedException("Script action should not have 'Post' job action.");
@@ -52,7 +39,7 @@ namespace GitHub.Runner.Worker.Handlers
firstLine = firstLine.Substring(0, firstNewLine);
}
writeDetails(ExecutionContext.InsideComposite ? $"Run {firstLine}" : $"##[group]Run {firstLine}");
ExecutionContext.Output($"##[group]Run {firstLine}");
}
else
{
@@ -63,20 +50,20 @@ namespace GitHub.Runner.Worker.Handlers
foreach (var line in multiLines)
{
// Bright Cyan color
writeDetails($"\x1b[36;1m{line}\x1b[0m");
ExecutionContext.Output($"\x1b[36;1m{line}\x1b[0m");
}
string argFormat;
string shellCommand;
string shellCommandPath = null;
bool validateShellOnHost = !(StepHost is ContainerStepHost);
string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.Global.PrependPath.Reverse<string>());
string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.PrependPath.Reverse<string>());
string shell = null;
if (!Inputs.TryGetValue("shell", out shell) || string.IsNullOrEmpty(shell))
{
// TODO: figure out how defaults interact with template later
// for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.Global.JobDefaults.TryGetValue("run", out var runDefaults))
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults))
{
runDefaults.TryGetValue("shell", out shell);
}
@@ -122,23 +109,23 @@ namespace GitHub.Runner.Worker.Handlers
if (!string.IsNullOrEmpty(shellCommandPath))
{
writeDetails($"shell: {shellCommandPath} {argFormat}");
ExecutionContext.Output($"shell: {shellCommandPath} {argFormat}");
}
else
{
writeDetails($"shell: {shellCommand} {argFormat}");
ExecutionContext.Output($"shell: {shellCommand} {argFormat}");
}
if (this.Environment?.Count > 0)
{
writeDetails("env:");
ExecutionContext.Output("env:");
foreach (var env in this.Environment)
{
writeDetails($" {env.Key}: {env.Value}");
ExecutionContext.Output($" {env.Key}: {env.Value}");
}
}
writeDetails(ExecutionContext.InsideComposite ? "" : "##[endgroup]");
ExecutionContext.Output("##[endgroup]");
}
public async Task RunAsync(ActionRunStage stage)
@@ -164,7 +151,9 @@ namespace GitHub.Runner.Worker.Handlers
string workingDirectory = null;
if (!Inputs.TryGetValue("workingDirectory", out workingDirectory))
{
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.Global.JobDefaults.TryGetValue("run", out var runDefaults))
// TODO: figure out how defaults interact with template later
// for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults))
{
if (runDefaults.TryGetValue("working-directory", out workingDirectory))
{
@@ -178,7 +167,9 @@ namespace GitHub.Runner.Worker.Handlers
string shell = null;
if (!Inputs.TryGetValue("shell", out shell) || string.IsNullOrEmpty(shell))
{
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.Global.JobDefaults.TryGetValue("run", out var runDefaults))
// TODO: figure out how defaults interact with template later
// for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults))
{
if (runDefaults.TryGetValue("shell", out shell))
{
@@ -189,7 +180,7 @@ namespace GitHub.Runner.Worker.Handlers
var isContainerStepHost = StepHost is ContainerStepHost;
string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.Global.PrependPath.Reverse<string>());
string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.PrependPath.Reverse<string>());
string commandPath, argFormat, shellCommand;
// Set up default command and arguments
if (string.IsNullOrEmpty(shell))
@@ -241,7 +232,7 @@ namespace GitHub.Runner.Worker.Handlers
#if OS_WINDOWS
// Normalize Windows line endings
contents = contents.Replace("\r\n", "\n").Replace("\n", "\r\n");
var encoding = ExecutionContext.Global.Variables.Retain_Default_Encoding && Console.InputEncoding.CodePage != 65001
var encoding = ExecutionContext.Variables.Retain_Default_Encoding && Console.InputEncoding.CodePage != 65001
? Console.InputEncoding
: new UTF8Encoding(false);
#else
@@ -294,7 +285,7 @@ namespace GitHub.Runner.Worker.Handlers
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: false,
inheritConsoleHandler: !ExecutionContext.Global.Variables.Retain_Default_Encoding,
inheritConsoleHandler: !ExecutionContext.Variables.Retain_Default_Encoding,
cancellationToken: ExecutionContext.CancellationToken);
// Error

View File

@@ -64,20 +64,6 @@ namespace GitHub.Runner.Worker
context.Debug($"Starting: Set up job");
context.Output($"Current runner version: '{BuildConstants.RunnerPackage.Version}'");
var setting = HostContext.GetService<IConfigurationStore>().GetSettings();
var credFile = HostContext.GetConfigFile(WellKnownConfigFile.Credentials);
if (File.Exists(credFile))
{
var credData = IOUtil.LoadObject<CredentialData>(credFile);
if (credData != null &&
credData.Data.TryGetValue("clientId", out var clientId))
{
// print out HostName for self-hosted runner
context.Output($"Runner name: '{setting.AgentName}'");
context.Output($"Machine name: '{Environment.MachineName}'");
}
}
var setupInfoFile = HostContext.GetConfigFile(WellKnownConfigFile.SetupInfo);
if (File.Exists(setupInfoFile))
{
@@ -162,7 +148,7 @@ namespace GitHub.Runner.Worker
var environmentVariables = templateEvaluator.EvaluateStepEnvironment(token, jobContext.ExpressionValues, jobContext.ExpressionFunctions, VarUtil.EnvironmentVariableKeyComparer);
foreach (var pair in environmentVariables)
{
context.Global.EnvironmentVariables[pair.Key] = pair.Value ?? string.Empty;
context.EnvironmentVariables[pair.Key] = pair.Value ?? string.Empty;
context.SetEnvContext(pair.Key, pair.Value ?? string.Empty);
}
}
@@ -172,7 +158,7 @@ namespace GitHub.Runner.Worker
var container = templateEvaluator.EvaluateJobContainer(message.JobContainer, jobContext.ExpressionValues, jobContext.ExpressionFunctions);
if (container != null)
{
jobContext.Global.Container = new Container.ContainerInfo(HostContext, container);
jobContext.Container = new Container.ContainerInfo(HostContext, container);
}
// Evaluate the job service containers
@@ -184,7 +170,7 @@ namespace GitHub.Runner.Worker
{
var networkAlias = pair.Key;
var serviceContainer = pair.Value;
jobContext.Global.ServiceContainers.Add(new Container.ContainerInfo(HostContext, serviceContainer, false, networkAlias));
jobContext.ServiceContainers.Add(new Container.ContainerInfo(HostContext, serviceContainer, false, networkAlias));
}
}
@@ -195,14 +181,14 @@ namespace GitHub.Runner.Worker
var defaults = token.AssertMapping("defaults");
if (defaults.Any(x => string.Equals(x.Key.AssertString("defaults key").Value, "run", StringComparison.OrdinalIgnoreCase)))
{
context.Global.JobDefaults["run"] = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
context.JobDefaults["run"] = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
var defaultsRun = defaults.First(x => string.Equals(x.Key.AssertString("defaults key").Value, "run", StringComparison.OrdinalIgnoreCase));
var jobDefaults = templateEvaluator.EvaluateJobDefaultsRun(defaultsRun.Value, jobContext.ExpressionValues, jobContext.ExpressionFunctions);
foreach (var pair in jobDefaults)
{
if (!string.IsNullOrEmpty(pair.Value))
{
context.Global.JobDefaults["run"][pair.Key] = pair.Value;
context.JobDefaults["run"][pair.Key] = pair.Value;
}
}
}
@@ -216,15 +202,15 @@ namespace GitHub.Runner.Worker
preJobSteps.AddRange(prepareResult.ContainerSetupSteps);
// Add start-container steps, record and stop-container steps
if (jobContext.Global.Container != null || jobContext.Global.ServiceContainers.Count > 0)
if (jobContext.Container != null || jobContext.ServiceContainers.Count > 0)
{
var containerProvider = HostContext.GetService<IContainerOperationProvider>();
var containers = new List<Container.ContainerInfo>();
if (jobContext.Global.Container != null)
if (jobContext.Container != null)
{
containers.Add(jobContext.Global.Container);
containers.Add(jobContext.Container);
}
containers.AddRange(jobContext.Global.ServiceContainers);
containers.AddRange(jobContext.ServiceContainers);
preJobSteps.Add(new JobExtensionRunner(runAsync: containerProvider.StartContainersAsync,
condition: $"{PipelineTemplateConstants.Success}()",
@@ -296,7 +282,7 @@ namespace GitHub.Runner.Worker
{
ArgUtil.NotNull(actionStep, step.DisplayName);
intraActionStates.TryGetValue(actionStep.Action.Id, out var intraActionState);
actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, null, actionStep.Action.ContextName, intraActionState);
actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, actionStep.Action.ScopeName, actionStep.Action.ContextName, intraActionState);
}
}
@@ -305,7 +291,7 @@ namespace GitHub.Runner.Worker
steps.AddRange(jobSteps);
// Prepare for orphan process cleanup
_processCleanup = jobContext.Global.Variables.GetBoolean("process.clean") ?? true;
_processCleanup = jobContext.Variables.GetBoolean("process.clean") ?? true;
if (_processCleanup)
{
// Set the RUNNER_TRACKING_ID env variable.
@@ -376,13 +362,13 @@ namespace GitHub.Runner.Worker
var envContext = new CaseSensitiveDictionaryContextData();
#endif
context.ExpressionValues["env"] = envContext;
foreach (var pair in context.Global.EnvironmentVariables)
foreach (var pair in context.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
Trace.Info("Initialize steps context for evaluating job outputs");
context.ExpressionValues["steps"] = context.Global.StepsContext.GetScope(context.ScopeName);
context.ExpressionValues["steps"] = context.StepsContext.GetScope(context.ScopeName);
var templateEvaluator = context.ToPipelineTemplateEvaluator();
var outputs = templateEvaluator.EvaluateJobOutput(message.JobOutputs, context.ExpressionValues, context.ExpressionFunctions);
@@ -413,7 +399,7 @@ namespace GitHub.Runner.Worker
}
}
if (context.Global.Variables.GetBoolean(Constants.Variables.Actions.RunnerDebug) ?? false)
if (context.Variables.GetBoolean(Constants.Variables.Actions.RunnerDebug) ?? false)
{
Trace.Info("Support log upload starting.");
context.Output("Uploading runner diagnostic logs");

View File

@@ -99,7 +99,7 @@ namespace GitHub.Runner.Worker
return await CompleteJobAsync(jobServer, jobContext, message, TaskResult.Failed);
}
if (jobContext.Global.WriteDebug)
if (jobContext.WriteDebug)
{
jobContext.SetRunnerContext("debug", "1");
}
@@ -209,7 +209,7 @@ namespace GitHub.Runner.Worker
// Clean TEMP after finish process jobserverqueue, since there might be a pending fileupload still use the TEMP dir.
_tempDirectoryManager?.CleanupTempDirectory();
if (!jobContext.Global.Features.HasFlag(PlanFeatures.JobCompletedPlanEvent))
if (!jobContext.Features.HasFlag(PlanFeatures.JobCompletedPlanEvent))
{
Trace.Info($"Skip raise job completed event call from worker because Plan version is {message.Plan.Version}");
return result;

View File

@@ -100,12 +100,12 @@ namespace GitHub.Runner.Worker
RunnerActionPluginExecutionContext pluginContext = new RunnerActionPluginExecutionContext
{
Inputs = inputs,
Endpoints = context.Global.Endpoints,
Endpoints = context.Endpoints,
Context = context.ExpressionValues
};
// variables
foreach (var variable in context.Global.Variables.AllVariables)
foreach (var variable in context.Variables.AllVariables)
{
pluginContext.Variables[variable.Name] = new VariableValue(variable.Value, variable.Secret);
}

View File

@@ -15,14 +15,6 @@ namespace GitHub.Runner.Worker
private static readonly Regex _propertyRegex = new Regex("^[a-zA-Z_][a-zA-Z0-9_]*$", RegexOptions.Compiled);
private readonly DictionaryContextData _contextData = new DictionaryContextData();
public void ClearScope(string scopeName)
{
if (_contextData.TryGetValue(scopeName, out _))
{
_contextData[scopeName] = new DictionaryContextData();
}
}
public DictionaryContextData GetScope(string scopeName)
{
if (scopeName == null)

View File

@@ -66,11 +66,11 @@ namespace GitHub.Runner.Worker
}
var step = jobContext.JobSteps.Dequeue();
var nextStep = jobContext.JobSteps.Count > 0 ? jobContext.JobSteps.Peek() : null;
Trace.Info($"Processing step: DisplayName='{step.DisplayName}'");
ArgUtil.NotNull(step.ExecutionContext, nameof(step.ExecutionContext));
ArgUtil.NotNull(step.ExecutionContext.Global, nameof(step.ExecutionContext.Global));
ArgUtil.NotNull(step.ExecutionContext.Global.Variables, nameof(step.ExecutionContext.Global.Variables));
ArgUtil.NotNull(step.ExecutionContext.Variables, nameof(step.ExecutionContext.Variables));
// Start
step.ExecutionContext.Start();
@@ -82,156 +82,155 @@ namespace GitHub.Runner.Worker
step.ExecutionContext.ExpressionFunctions.Add(new FunctionInfo<SuccessFunction>(PipelineTemplateConstants.Success, 0, 0));
step.ExecutionContext.ExpressionFunctions.Add(new FunctionInfo<HashFilesFunction>(PipelineTemplateConstants.HashFiles, 1, byte.MaxValue));
step.ExecutionContext.ExpressionValues["steps"] = step.ExecutionContext.Global.StepsContext.GetScope(step.ExecutionContext.ScopeName);
// Populate env context for each step
Trace.Info("Initialize Env context for step");
// Initialize scope
if (InitializeScope(step, scopeInputs))
{
// Populate env context for each step
Trace.Info("Initialize Env context for step");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
var envContext = new CaseSensitiveDictionaryContextData();
#endif
// Global env
foreach (var pair in step.ExecutionContext.Global.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
step.ExecutionContext.ExpressionValues["env"] = envContext;
bool evaluateStepEnvFailed = false;
if (step is IActionRunner actionStep)
{
// Set GITHUB_ACTION
step.ExecutionContext.SetGitHubContext("action", actionStep.Action.Name);
try
step.ExecutionContext.ExpressionValues["env"] = envContext;
foreach (var pair in step.ExecutionContext.EnvironmentVariables)
{
// Evaluate and merge action's env block to env context
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator();
var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(actionStep.Action.Environment, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, VarUtil.EnvironmentVariableKeyComparer);
foreach (var env in actionEnvironment)
{
envContext[env.Key] = new StringContextData(env.Value ?? string.Empty);
}
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
catch (Exception ex)
bool evaluateStepEnvFailed = false;
if (step is IActionRunner actionStep)
{
// fail the step since there is an evaluate error.
Trace.Info("Caught exception from expression for step.env");
evaluateStepEnvFailed = true;
step.ExecutionContext.Error(ex);
CompleteStep(step, TaskResult.Failed);
}
}
// Set GITHUB_ACTION
step.ExecutionContext.SetGitHubContext("action", actionStep.Action.Name);
if (!evaluateStepEnvFailed)
{
try
{
// Register job cancellation call back only if job cancellation token not been fire before each step run
if (!jobContext.CancellationToken.IsCancellationRequested)
try
{
// Test the condition again. The job was canceled after the condition was originally evaluated.
jobCancelRegister = jobContext.CancellationToken.Register(() =>
// Evaluate and merge action's env block to env context
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator();
var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(actionStep.Action.Environment, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, VarUtil.EnvironmentVariableKeyComparer);
foreach (var env in actionEnvironment)
{
// mark job as cancelled
jobContext.Result = TaskResult.Canceled;
jobContext.JobContext.Status = jobContext.Result?.ToActionResult();
step.ExecutionContext.Debug($"Re-evaluate condition on job cancellation for step: '{step.DisplayName}'.");
var conditionReTestTraceWriter = new ConditionTraceWriter(Trace, null); // host tracing only
var conditionReTestResult = false;
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
step.ExecutionContext.Debug($"Skip Re-evaluate condition on runner shutdown.");
}
else
{
try
{
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator(conditionReTestTraceWriter);
var condition = new BasicExpressionToken(null, null, null, step.Condition);
conditionReTestResult = templateEvaluator.EvaluateStepIf(condition, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, step.ExecutionContext.ToExpressionState());
}
catch (Exception ex)
{
// Cancel the step since we get exception while re-evaluate step condition.
Trace.Info("Caught exception from expression when re-test condition on job cancellation.");
step.ExecutionContext.Error(ex);
}
}
if (!conditionReTestResult)
{
// Cancel the step.
Trace.Info("Cancel current running step.");
step.ExecutionContext.CancelToken();
}
});
}
else
{
if (jobContext.Result != TaskResult.Canceled)
{
// mark job as cancelled
jobContext.Result = TaskResult.Canceled;
jobContext.JobContext.Status = jobContext.Result?.ToActionResult();
envContext[env.Key] = new StringContextData(env.Value ?? string.Empty);
}
}
// Evaluate condition.
step.ExecutionContext.Debug($"Evaluating condition for step: '{step.DisplayName}'");
var conditionTraceWriter = new ConditionTraceWriter(Trace, step.ExecutionContext);
var conditionResult = false;
var conditionEvaluateError = default(Exception);
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
step.ExecutionContext.Debug($"Skip evaluate condition on runner shutdown.");
}
else
{
try
{
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator(conditionTraceWriter);
var condition = new BasicExpressionToken(null, null, null, step.Condition);
conditionResult = templateEvaluator.EvaluateStepIf(condition, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, step.ExecutionContext.ToExpressionState());
}
catch (Exception ex)
{
Trace.Info("Caught exception from expression.");
Trace.Error(ex);
conditionEvaluateError = ex;
}
}
// no evaluate error but condition is false
if (!conditionResult && conditionEvaluateError == null)
{
// Condition == false
Trace.Info("Skipping step due to condition evaluation.");
CompleteStep(step, TaskResult.Skipped, resultCode: conditionTraceWriter.Trace);
}
else if (conditionEvaluateError != null)
{
// fail the step since there is an evaluate error.
step.ExecutionContext.Error(conditionEvaluateError);
CompleteStep(step, TaskResult.Failed);
}
else
{
// Run the step.
await RunStepAsync(step, jobContext.CancellationToken);
CompleteStep(step);
}
}
finally
{
if (jobCancelRegister != null)
{
jobCancelRegister?.Dispose();
jobCancelRegister = null;
}
}
catch (Exception ex)
{
Trace.Info("Caught exception from expression for step.env");
evaluateStepEnvFailed = true;
step.ExecutionContext.Error(ex);
CompleteStep(step, nextStep, TaskResult.Failed);
}
if (!evaluateStepEnvFailed)
{
try
{
// Register job cancellation call back only if job cancellation token not been fire before each step run
if (!jobContext.CancellationToken.IsCancellationRequested)
{
// Test the condition again. The job was canceled after the condition was originally evaluated.
jobCancelRegister = jobContext.CancellationToken.Register(() =>
{
// mark job as cancelled
jobContext.Result = TaskResult.Canceled;
jobContext.JobContext.Status = jobContext.Result?.ToActionResult();
step.ExecutionContext.Debug($"Re-evaluate condition on job cancellation for step: '{step.DisplayName}'.");
var conditionReTestTraceWriter = new ConditionTraceWriter(Trace, null); // host tracing only
var conditionReTestResult = false;
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
step.ExecutionContext.Debug($"Skip Re-evaluate condition on runner shutdown.");
}
else
{
try
{
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator(conditionReTestTraceWriter);
var condition = new BasicExpressionToken(null, null, null, step.Condition);
conditionReTestResult = templateEvaluator.EvaluateStepIf(condition, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, step.ExecutionContext.ToExpressionState());
}
catch (Exception ex)
{
// Cancel the step since we get exception while re-evaluate step condition.
Trace.Info("Caught exception from expression when re-test condition on job cancellation.");
step.ExecutionContext.Error(ex);
}
}
if (!conditionReTestResult)
{
// Cancel the step.
Trace.Info("Cancel current running step.");
step.ExecutionContext.CancelToken();
}
});
}
else
{
if (jobContext.Result != TaskResult.Canceled)
{
// mark job as cancelled
jobContext.Result = TaskResult.Canceled;
jobContext.JobContext.Status = jobContext.Result?.ToActionResult();
}
}
// Evaluate condition.
step.ExecutionContext.Debug($"Evaluating condition for step: '{step.DisplayName}'");
var conditionTraceWriter = new ConditionTraceWriter(Trace, step.ExecutionContext);
var conditionResult = false;
var conditionEvaluateError = default(Exception);
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
step.ExecutionContext.Debug($"Skip evaluate condition on runner shutdown.");
}
else
{
try
{
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator(conditionTraceWriter);
var condition = new BasicExpressionToken(null, null, null, step.Condition);
conditionResult = templateEvaluator.EvaluateStepIf(condition, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, step.ExecutionContext.ToExpressionState());
}
catch (Exception ex)
{
Trace.Info("Caught exception from expression.");
Trace.Error(ex);
conditionEvaluateError = ex;
}
}
// no evaluate error but condition is false
if (!conditionResult && conditionEvaluateError == null)
{
// Condition == false
Trace.Info("Skipping step due to condition evaluation.");
CompleteStep(step, nextStep, TaskResult.Skipped, resultCode: conditionTraceWriter.Trace);
}
else if (conditionEvaluateError != null)
{
// fail the step since there is an evaluate error.
step.ExecutionContext.Error(conditionEvaluateError);
CompleteStep(step, nextStep, TaskResult.Failed);
}
else
{
// Run the step.
await RunStepAsync(step, jobContext.CancellationToken);
CompleteStep(step, nextStep);
}
}
finally
{
if (jobCancelRegister != null)
{
jobCancelRegister?.Dispose();
jobCancelRegister = null;
}
}
}
}
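
A note for readers of the hunk above: the jobCancelRegister pattern relies on CancellationToken.Register, whose callback fires when the job token is cancelled mid-step and fires synchronously if the token is already cancelled at registration time, which is why the code only registers while the token has not yet fired. A minimal standalone demonstration of that behavior (illustration only, not part of this diff):

```csharp
// Illustration only: the CancellationToken.Register semantics the step loop above relies on.
using System;
using System.Threading;

class CancelRegisterDemo
{
    static void Main()
    {
        using var cts = new CancellationTokenSource();

        // Registered before cancellation: the callback runs when Cancel() is called.
        using var registration = cts.Token.Register(
            () => Console.WriteLine("token cancelled: re-evaluate the step condition"));
        cts.Cancel();

        // Registered after cancellation: the callback runs synchronously, right here.
        cts.Token.Register(
            () => Console.WriteLine("already cancelled: callback fires immediately"));
    }
}
```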
@@ -286,7 +285,40 @@ namespace GitHub.Runner.Worker
step.ExecutionContext.SetTimeout(timeout);
}
await EncodingUtil.SetEncoding(HostContext, Trace, step.ExecutionContext.CancellationToken);
#if OS_WINDOWS
try
{
if (Console.InputEncoding.CodePage != 65001)
{
using (var p = HostContext.CreateService<IProcessInvoker>())
{
// Use UTF8 code page
int exitCode = await p.ExecuteAsync(workingDirectory: HostContext.GetDirectory(WellKnownDirectory.Work),
fileName: WhichUtil.Which("chcp", true, Trace),
arguments: "65001",
environment: null,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: false,
redirectStandardIn: null,
inheritConsoleHandler: true,
cancellationToken: step.ExecutionContext.CancellationToken);
if (exitCode == 0)
{
Trace.Info("Successfully returned to code page 65001 (UTF8)");
}
else
{
Trace.Warning($"'chcp 65001' failed with exit code {exitCode}");
}
}
}
}
catch (Exception ex)
{
Trace.Warning($"'chcp 65001' failed with exception {ex.Message}");
}
#endif
try
{
@@ -352,9 +384,117 @@ namespace GitHub.Runner.Worker
step.ExecutionContext.Debug($"Finishing: {step.DisplayName}");
}
private void CompleteStep(IStep step, TaskResult? result = null, string resultCode = null)
private bool InitializeScope(IStep step, Dictionary<string, PipelineContextData> scopeInputs)
{
var executionContext = step.ExecutionContext;
var stepsContext = executionContext.StepsContext;
if (!string.IsNullOrEmpty(executionContext.ScopeName))
{
// Gather uninitialized current and ancestor scopes
var scope = executionContext.Scopes[executionContext.ScopeName];
var scopesToInitialize = default(Stack<ContextScope>);
while (scope != null && !scopeInputs.ContainsKey(scope.Name))
{
if (scopesToInitialize == null)
{
scopesToInitialize = new Stack<ContextScope>();
}
scopesToInitialize.Push(scope);
scope = string.IsNullOrEmpty(scope.ParentName) ? null : executionContext.Scopes[scope.ParentName];
}
// Initialize current and ancestor scopes
while (scopesToInitialize?.Count > 0)
{
scope = scopesToInitialize.Pop();
executionContext.Debug($"Initializing scope '{scope.Name}'");
executionContext.ExpressionValues["steps"] = stepsContext.GetScope(scope.ParentName);
executionContext.ExpressionValues["inputs"] = !String.IsNullOrEmpty(scope.ParentName) ? scopeInputs[scope.ParentName] : null;
var templateEvaluator = executionContext.ToPipelineTemplateEvaluator();
var inputs = default(DictionaryContextData);
try
{
inputs = templateEvaluator.EvaluateStepScopeInputs(scope.Inputs, executionContext.ExpressionValues, executionContext.ExpressionFunctions);
}
catch (Exception ex)
{
Trace.Info($"Caught exception from initialize scope '{scope.Name}'");
Trace.Error(ex);
executionContext.Error(ex);
executionContext.Complete(TaskResult.Failed);
return false;
}
scopeInputs[scope.Name] = inputs;
}
}
// Setup expression values
var scopeName = executionContext.ScopeName;
executionContext.ExpressionValues["steps"] = stepsContext.GetScope(scopeName);
executionContext.ExpressionValues["inputs"] = string.IsNullOrEmpty(scopeName) ? null : scopeInputs[scopeName];
return true;
}
private void CompleteStep(IStep step, IStep nextStep, TaskResult? result = null, string resultCode = null)
{
var executionContext = step.ExecutionContext;
if (!string.IsNullOrEmpty(executionContext.ScopeName))
{
// Gather current and ancestor scopes to finalize
var scope = executionContext.Scopes[executionContext.ScopeName];
var scopesToFinalize = default(Queue<ContextScope>);
var nextStepScopeName = nextStep?.ExecutionContext.ScopeName;
while (scope != null &&
!string.Equals(nextStepScopeName, scope.Name, StringComparison.OrdinalIgnoreCase) &&
!(nextStepScopeName ?? string.Empty).StartsWith($"{scope.Name}.", StringComparison.OrdinalIgnoreCase))
{
if (scopesToFinalize == null)
{
scopesToFinalize = new Queue<ContextScope>();
}
scopesToFinalize.Enqueue(scope);
scope = string.IsNullOrEmpty(scope.ParentName) ? null : executionContext.Scopes[scope.ParentName];
}
// Finalize current and ancestor scopes
var stepsContext = step.ExecutionContext.StepsContext;
while (scopesToFinalize?.Count > 0)
{
scope = scopesToFinalize.Dequeue();
executionContext.Debug($"Finalizing scope '{scope.Name}'");
executionContext.ExpressionValues["steps"] = stepsContext.GetScope(scope.Name);
executionContext.ExpressionValues["inputs"] = null;
var templateEvaluator = executionContext.ToPipelineTemplateEvaluator();
var outputs = default(DictionaryContextData);
try
{
outputs = templateEvaluator.EvaluateStepScopeOutputs(scope.Outputs, executionContext.ExpressionValues, executionContext.ExpressionFunctions);
}
catch (Exception ex)
{
Trace.Info($"Caught exception from finalize scope '{scope.Name}'");
Trace.Error(ex);
executionContext.Error(ex);
executionContext.Complete(TaskResult.Failed);
return;
}
if (outputs?.Count > 0)
{
var parentScopeName = scope.ParentName;
var contextName = scope.ContextName;
foreach (var pair in outputs)
{
var outputName = pair.Key;
var outputValue = pair.Value.ToString();
stepsContext.SetOutput(parentScopeName, contextName, outputName, outputValue, out var reference);
executionContext.Debug($"{reference}='{outputValue}'");
}
}
}
}
executionContext.Complete(result, resultCode: resultCode);
}
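
One detail worth calling out in the CompleteStep changes above: a scope is finalized as soon as the next step is neither in that scope nor nested beneath it, where nesting is expressed with dotted names. The following self-contained sketch (illustrative only, not part of the diff) mirrors that comparison:

```csharp
// Illustration only: mirrors the test CompleteStep uses to decide whether a scope
// stays open for the next step or gets finalized.
using System;

class ScopeFinalizeDemo
{
    // True when nextScope is the same scope or nested inside it (e.g. "wrapper.inner" inside "wrapper").
    static bool KeepsScopeOpen(string scope, string nextScope) =>
        string.Equals(nextScope, scope, StringComparison.OrdinalIgnoreCase) ||
        (nextScope ?? string.Empty).StartsWith($"{scope}.", StringComparison.OrdinalIgnoreCase);

    static void Main()
    {
        Console.WriteLine(KeepsScopeOpen("wrapper", "wrapper.inner")); // True  -> keep "wrapper" open
        Console.WriteLine(KeepsScopeOpen("wrapper.inner", "wrapper")); // False -> finalize "wrapper.inner"
        Console.WriteLine(KeepsScopeOpen("wrapper", null));            // False -> finalize "wrapper"
    }
}
```

In the diff, the loop walks this check up the ParentName chain and queues every scope that fails it so its outputs can be written into the parent scope.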

View File

@@ -7,8 +7,7 @@
"name": "string",
"description": "string",
"inputs": "inputs",
"runs": "runs",
"outputs": "outputs"
"runs": "runs"
},
"loose-key-type": "non-empty-string",
"loose-value-type": "any"
@@ -29,26 +28,11 @@
"loose-value-type": "any"
}
},
"outputs": {
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "outputs-attributes"
}
},
"outputs-attributes": {
"mapping": {
"properties": {
"description": "string",
"value": "output-value"
}
}
},
"runs": {
"one-of": [
"container-runs",
"node12-runs",
"plugin-runs",
"composite-runs"
"plugin-runs"
]
},
"container-runs": {
@@ -99,56 +83,12 @@
}
}
},
"composite-runs": {
"mapping": {
"properties": {
"using": "non-empty-string",
"steps": "composite-steps"
}
}
},
"composite-steps": {
"sequence": {
"item-type": "composite-step"
}
},
"composite-step": {
"mapping": {
"properties": {
"name": "string-steps-context",
"id": "non-empty-string",
"run": {
"type": "string-steps-context",
"required": true
},
"env": "step-env",
"working-directory": "string-steps-context",
"shell": {
"type": "non-empty-string",
"required": true
}
}
}
},
"container-runs-context": {
"context": [
"inputs"
],
"string": {}
},
"output-value": {
"context": [
"github",
"strategy",
"matrix",
"steps",
"inputs",
"job",
"runner",
"env"
],
"string": {}
},
"input-default-context": {
"context": [
"github",
@@ -164,37 +104,6 @@
"string": {
"require-non-empty": true
}
},
"string-steps-context": {
"context": [
"github",
"inputs",
"strategy",
"matrix",
"steps",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"string": {}
},
"step-env": {
"context": [
"github",
"inputs",
"strategy",
"matrix",
"steps",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
}
}
}
}

View File

@@ -317,37 +317,5 @@ namespace GitHub.DistributedTask.WebApi
userState: userState,
cancellationToken: cancellationToken);
}
/// <summary>
/// [Preview API] Resolves information required to download actions (URL, token) defined in an orchestration.
/// </summary>
/// <param name="scopeIdentifier">The project GUID to scope the request</param>
/// <param name="hubName">The name of the server hub: "build" for the Build server or "rm" for the Release Management server</param>
/// <param name="planId"></param>
/// <param name="actionReferenceList"></param>
/// <param name="userState"></param>
/// <param name="cancellationToken">The cancellation token to cancel operation.</param>
public virtual Task<ActionDownloadInfoCollection> ResolveActionDownloadInfoAsync(
Guid scopeIdentifier,
string hubName,
Guid planId,
ActionReferenceList actionReferenceList,
object userState = null,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("POST");
Guid locationId = new Guid("27d7f831-88c1-4719-8ca1-6a061dad90eb");
object routeValues = new { scopeIdentifier = scopeIdentifier, hubName = hubName, planId = planId };
HttpContent content = new ObjectContent<ActionReferenceList>(actionReferenceList, new VssJsonMediaTypeFormatter(true));
return SendAsync<ActionDownloadInfoCollection>(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(6.0, 1),
userState: userState,
cancellationToken: cancellationToken,
content: content);
}
}
}

View File

@@ -24,6 +24,7 @@ namespace GitHub.DistributedTask.Pipelines
Environment = actionToClone.Environment?.Clone();
Inputs = actionToClone.Inputs?.Clone();
ContextName = actionToClone?.ContextName;
ScopeName = actionToClone?.ScopeName;
DisplayNameToken = actionToClone.DisplayNameToken?.Clone();
}
@@ -40,6 +41,9 @@ namespace GitHub.DistributedTask.Pipelines
[DataMember(EmitDefaultValue = false)]
public TemplateToken DisplayNameToken { get; set; }
[DataMember(EmitDefaultValue = false)]
public String ScopeName { get; set; }
[DataMember(EmitDefaultValue = false)]
public String ContextName { get; set; }

View File

@@ -39,6 +39,7 @@ namespace GitHub.DistributedTask.Pipelines
DictionaryContextData contextData,
WorkspaceOptions workspaceOptions,
IEnumerable<JobStep> steps,
IEnumerable<ContextScope> scopes,
IList<String> fileTable,
TemplateToken jobOutputs,
IList<TemplateToken> defaults)
@@ -59,6 +60,11 @@ namespace GitHub.DistributedTask.Pipelines
m_maskHints = new List<MaskHint>(maskHints);
m_steps = new List<JobStep>(steps);
if (scopes != null)
{
m_scopes = new List<ContextScope>(scopes);
}
if (environmentVariables?.Count > 0)
{
m_environmentVariables = new List<TemplateToken>(environmentVariables);
@@ -255,6 +261,18 @@ namespace GitHub.DistributedTask.Pipelines
}
}
public IList<ContextScope> Scopes
{
get
{
if (m_scopes == null)
{
m_scopes = new List<ContextScope>();
}
return m_scopes;
}
}
/// <summary>
/// Gets the table of files used when parsing the pipeline (e.g. yaml files)
/// </summary>
@@ -397,6 +415,11 @@ namespace GitHub.DistributedTask.Pipelines
m_maskHints = new List<MaskHint>(this.m_maskHints.Distinct());
}
if (m_scopes?.Count == 0)
{
m_scopes = null;
}
if (m_variables?.Count == 0)
{
m_variables = null;
@@ -424,6 +447,9 @@ namespace GitHub.DistributedTask.Pipelines
[DataMember(Name = "Steps", EmitDefaultValue = false)]
private List<JobStep> m_steps;
[DataMember(Name = "Scopes", EmitDefaultValue = false)]
private List<ContextScope> m_scopes;
[DataMember(Name = "Variables", EmitDefaultValue = false)]
private IDictionary<String, VariableValue> m_variables;

View File

@@ -42,12 +42,7 @@ namespace GitHub.DistributedTask.Pipelines.ContextData
var floored = Math.Floor(m_value);
if (m_value == floored && m_value <= (Double)Int32.MaxValue && m_value >= (Double)Int32.MinValue)
{
var flooredInt = (Int32)floored;
return (JToken)flooredInt;
}
else if (m_value == floored && m_value <= (Double)Int64.MaxValue && m_value >= (Double)Int64.MinValue)
{
var flooredInt = (Int64)floored;
Int32 flooredInt = (Int32)floored;
return (JToken)flooredInt;
}
else
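
For context on the NumberContextData hunk above: the two-branch form also covers whole-number doubles outside Int32 range, which a single Int32 cast silently mangles. A small standalone check (illustration only, not part of the diff):

```csharp
// Illustration only: why the extra Int64 branch matters for whole-number doubles
// larger than Int32.MaxValue (for example, millisecond timestamps).
using System;

class FlooredNumberDemo
{
    static void Main()
    {
        double value = 5_000_000_000d;       // a whole number, but > Int32.MaxValue
        double floored = Math.Floor(value);

        Console.WriteLine(floored <= int.MaxValue);        // False -> the Int32 branch is skipped
        Console.WriteLine((long)floored);                  // 5000000000 fits in an Int64 JToken
        Console.WriteLine(unchecked((int)(long)floored));  // 705032704 -- what a bare Int32 cast would keep
    }
}
```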

View File

@@ -0,0 +1,53 @@
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Runtime.Serialization;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using Newtonsoft.Json;
namespace GitHub.DistributedTask.Pipelines
{
[DataContract]
[EditorBrowsable(EditorBrowsableState.Never)]
public sealed class ContextScope
{
[DataMember(EmitDefaultValue = false)]
public String Name { get; set; }
[IgnoreDataMember]
public String ContextName
{
get
{
var index = Name.LastIndexOf('.');
if (index >= 0)
{
return Name.Substring(index + 1);
}
return Name;
}
}
[IgnoreDataMember]
public String ParentName
{
get
{
var index = Name.LastIndexOf('.');
if (index >= 0)
{
return Name.Substring(0, index);
}
return String.Empty;
}
}
[DataMember(EmitDefaultValue = false)]
public TemplateToken Inputs { get; set; }
[DataMember(EmitDefaultValue = false)]
public TemplateToken Outputs { get; set; }
}
}
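
The new ContextScope type above derives ContextName and ParentName purely from the dotted scope name: the last segment is the context name and everything before it is the parent. A minimal standalone sketch of that split (illustration only, not part of the diff):

```csharp
// Illustration only: reproduces the ContextName/ParentName split defined by ContextScope.
using System;

class ContextScopeNameDemo
{
    static (string Parent, string Context) Split(string name)
    {
        var index = name.LastIndexOf('.');
        return index >= 0
            ? (name.Substring(0, index), name.Substring(index + 1))
            : (string.Empty, name);
    }

    static void Main()
    {
        foreach (var name in new[] { "outer", "outer.inner", "outer.inner.leaf" })
        {
            var (parent, context) = Split(name);
            Console.WriteLine($"{name,-18} parent='{parent}'  context='{context}'");
        }
    }
}
```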

View File

@@ -56,36 +56,5 @@ namespace GitHub.DistributedTask.Pipelines
get;
set;
}
/// <summary>
/// Gets or sets the credentials used for pulling the container image.
/// </summary>
public ContainerRegistryCredentials Credentials
{
get;
set;
}
}
[EditorBrowsable(EditorBrowsableState.Never)]
public sealed class ContainerRegistryCredentials
{
/// <summary>
/// Gets or sets the user to authenticate to a registry with
/// </summary>
public String Username
{
get;
set;
}
/// <summary>
/// Gets or sets the password to authenticate to a registry with
/// </summary>
public String Password
{
get;
set;
}
}
}

View File

@@ -11,10 +11,10 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String BooleanStrategyContext = "boolean-strategy-context";
public const String CancelTimeoutMinutes = "cancel-timeout-minutes";
public const String Cancelled = "cancelled";
public const String Clean = "clean";
public const String Checkout = "checkout";
public const String Clean = "clean";
public const String Container = "container";
public const String ContinueOnError = "continue-on-error";
public const String Credentials = "credentials";
public const String Defaults = "defaults";
public const String Env = "env";
public const String Event = "event";
@@ -23,6 +23,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String FailFast = "fail-fast";
public const String Failure = "failure";
public const String FetchDepth = "fetch-depth";
public const String GeneratedId = "generated-id";
public const String GitHub = "github";
public const String HashFiles = "hashFiles";
public const String Id = "id";
@@ -35,6 +36,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String JobIfResult = "job-if-result";
public const String JobOutputs = "job-outputs";
public const String Jobs = "jobs";
public const String Labels = "labels";
public const String Lfs = "lfs";
public const String Matrix = "matrix";
public const String MaxParallel = "max-parallel";
@@ -46,23 +48,27 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String Options = "options";
public const String Outputs = "outputs";
public const String OutputsPattern = "needs.*.outputs";
public const String Password = "password";
public const String Path = "path";
public const String Pool = "pool";
public const String Ports = "ports";
public const String Result = "result";
public const String Run = "run";
public const String RunDisplayPrefix = "Run ";
public const String Run = "run";
public const String Runner = "runner";
public const String RunsOn = "runs-on";
public const String Scope = "scope";
public const String Scopes = "scopes";
public const String Secrets = "secrets";
public const String Services = "services";
public const String Shell = "shell";
public const String Skipped = "skipped";
public const String StepEnv = "step-env";
public const String StepIfResult = "step-if-result";
public const String StepWith = "step-with";
public const String Steps = "steps";
public const String StepsScopeInputs = "steps-scope-inputs";
public const String StepsScopeOutputs = "steps-scope-outputs";
public const String StepsTemplateRoot = "steps-template-root";
public const String StepWith = "step-with";
public const String Strategy = "strategy";
public const String StringStepsContext = "string-steps-context";
public const String StringStrategyContext = "string-strategy-context";
@@ -70,7 +76,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String Success = "success";
public const String Template = "template";
public const String TimeoutMinutes = "timeout-minutes";
public const String Username = "username";
public const String Token = "token";
public const String Uses = "uses";
public const String VmImage = "vmImage";
public const String Volumes = "volumes";

View File

@@ -1,6 +1,6 @@
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Globalization;
using System.Linq;
using GitHub.DistributedTask.Expressions2;
using GitHub.DistributedTask.Expressions2.Sdk;
@@ -14,62 +14,8 @@ using Newtonsoft.Json.Linq;
namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
{
[EditorBrowsable(EditorBrowsableState.Never)]
public static class PipelineTemplateConverter
internal static class PipelineTemplateConverter
{
public static List<Step> ConvertToSteps(
TemplateContext context,
TemplateToken steps)
{
var stepsSequence = steps.AssertSequence($"job {PipelineTemplateConstants.Steps}");
var result = new List<Step>();
var nameBuilder = new ReferenceNameBuilder();
foreach (var stepsItem in stepsSequence)
{
var step = ConvertToStep(context, stepsItem, nameBuilder);
if (step != null) // step = null means we are hitting error during step conversion, there should be an error in context.errors
{
if (step.Enabled)
{
result.Add(step);
}
}
}
// Generate context name if empty
foreach (ActionStep step in result)
{
if (!String.IsNullOrEmpty(step.ContextName))
{
continue;
}
var name = default(string);
switch (step.Reference.Type)
{
case ActionSourceType.ContainerRegistry:
var containerReference = step.Reference as ContainerRegistryReference;
name = containerReference.Image;
break;
case ActionSourceType.Repository:
var repositoryReference = step.Reference as RepositoryPathReference;
name = !String.IsNullOrEmpty(repositoryReference.Name) ? repositoryReference.Name : PipelineConstants.SelfAlias;
break;
}
if (String.IsNullOrEmpty(name))
{
name = "run";
}
nameBuilder.AppendSegment($"__{name}");
step.ContextName = nameBuilder.Build();
}
return result;
}
internal static Boolean ConvertToIfResult(
TemplateContext context,
TemplateToken ifResult)
@@ -209,30 +155,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return (Int32)numberToken.Value;
}
internal static ContainerRegistryCredentials ConvertToContainerCredentials(TemplateToken token)
{
var credentials = token.AssertMapping(PipelineTemplateConstants.Credentials);
var result = new ContainerRegistryCredentials();
foreach (var credentialProperty in credentials)
{
var propertyName = credentialProperty.Key.AssertString($"{PipelineTemplateConstants.Credentials} key");
switch (propertyName.Value)
{
case PipelineTemplateConstants.Username:
result.Username = credentialProperty.Value.AssertString(PipelineTemplateConstants.Username).Value;
break;
case PipelineTemplateConstants.Password:
result.Password = credentialProperty.Value.AssertString(PipelineTemplateConstants.Password).Value;
break;
default:
propertyName.AssertUnexpectedValue($"{PipelineTemplateConstants.Credentials} key {propertyName}");
break;
}
}
return result;
}
internal static JobContainer ConvertToJobContainer(
TemplateContext context,
TemplateToken value,
@@ -299,9 +221,6 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
}
result.Volumes = volumeList;
break;
case PipelineTemplateConstants.Credentials:
result.Credentials = ConvertToContainerCredentials(containerPropertyPair.Value);
break;
default:
propertyName.AssertUnexpectedValue($"{PipelineTemplateConstants.Container} key");
break;
@@ -345,311 +264,5 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return result;
}
private static ActionStep ConvertToStep(
TemplateContext context,
TemplateToken stepsItem,
ReferenceNameBuilder nameBuilder)
{
var step = stepsItem.AssertMapping($"{PipelineTemplateConstants.Steps} item");
var continueOnError = default(ScalarToken);
var env = default(TemplateToken);
var id = default(StringToken);
var ifCondition = default(String);
var ifToken = default(ScalarToken);
var name = default(ScalarToken);
var run = default(ScalarToken);
var timeoutMinutes = default(ScalarToken);
var uses = default(StringToken);
var with = default(TemplateToken);
var workingDir = default(ScalarToken);
var path = default(ScalarToken);
var clean = default(ScalarToken);
var fetchDepth = default(ScalarToken);
var lfs = default(ScalarToken);
var submodules = default(ScalarToken);
var shell = default(ScalarToken);
foreach (var stepProperty in step)
{
var propertyName = stepProperty.Key.AssertString($"{PipelineTemplateConstants.Steps} item key");
switch (propertyName.Value)
{
case PipelineTemplateConstants.Clean:
clean = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Clean}");
break;
case PipelineTemplateConstants.ContinueOnError:
ConvertToStepContinueOnError(context, stepProperty.Value, allowExpressions: true); // Validate early if possible
continueOnError = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} {PipelineTemplateConstants.ContinueOnError}");
break;
case PipelineTemplateConstants.Env:
ConvertToStepEnvironment(context, stepProperty.Value, StringComparer.Ordinal, allowExpressions: true); // Validate early if possible
env = stepProperty.Value;
break;
case PipelineTemplateConstants.FetchDepth:
fetchDepth = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.FetchDepth}");
break;
case PipelineTemplateConstants.Id:
id = stepProperty.Value.AssertString($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Id}");
if (!String.IsNullOrEmpty(id.Value))
{
if (!nameBuilder.TryAddKnownName(id.Value, out var error))
{
context.Error(id, error);
}
}
break;
case PipelineTemplateConstants.If:
ifToken = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.If}");
break;
case PipelineTemplateConstants.Lfs:
lfs = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Lfs}");
break;
case PipelineTemplateConstants.Name:
name = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Name}");
break;
case PipelineTemplateConstants.Path:
path = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Path}");
break;
case PipelineTemplateConstants.Run:
run = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Run}");
break;
case PipelineTemplateConstants.Shell:
shell = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Shell}");
break;
case PipelineTemplateConstants.Submodules:
submodules = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Submodules}");
break;
case PipelineTemplateConstants.TimeoutMinutes:
ConvertToStepTimeout(context, stepProperty.Value, allowExpressions: true); // Validate early if possible
timeoutMinutes = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.TimeoutMinutes}");
break;
case PipelineTemplateConstants.Uses:
uses = stepProperty.Value.AssertString($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Uses}");
break;
case PipelineTemplateConstants.With:
ConvertToStepInputs(context, stepProperty.Value, allowExpressions: true); // Validate early if possible
with = stepProperty.Value;
break;
case PipelineTemplateConstants.WorkingDirectory:
workingDir = stepProperty.Value.AssertScalar($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.WorkingDirectory}");
break;
default:
propertyName.AssertUnexpectedValue($"{PipelineTemplateConstants.Steps} item key"); // throws
break;
}
}
// Fixup the if-condition
ifCondition = ConvertToIfCondition(context, ifToken, false);
if (run != null)
{
var result = new ActionStep
{
ContextName = id?.Value,
ContinueOnError = continueOnError,
DisplayNameToken = name,
Condition = ifCondition,
TimeoutInMinutes = timeoutMinutes,
Environment = env,
Reference = new ScriptReference(),
};
var inputs = new MappingToken(null, null, null);
inputs.Add(new StringToken(null, null, null, PipelineConstants.ScriptStepInputs.Script), run);
if (workingDir != null)
{
inputs.Add(new StringToken(null, null, null, PipelineConstants.ScriptStepInputs.WorkingDirectory), workingDir);
}
if (shell != null)
{
inputs.Add(new StringToken(null, null, null, PipelineConstants.ScriptStepInputs.Shell), shell);
}
result.Inputs = inputs;
return result;
}
else
{
uses.AssertString($"{PipelineTemplateConstants.Steps} item {PipelineTemplateConstants.Uses}");
var result = new ActionStep
{
ContextName = id?.Value,
ContinueOnError = continueOnError,
DisplayNameToken = name,
Condition = ifCondition,
TimeoutInMinutes = timeoutMinutes,
Inputs = with,
Environment = env,
};
if (uses.Value.StartsWith("docker://", StringComparison.Ordinal))
{
var image = uses.Value.Substring("docker://".Length);
result.Reference = new ContainerRegistryReference { Image = image };
}
else if (uses.Value.StartsWith("./") || uses.Value.StartsWith(".\\"))
{
result.Reference = new RepositoryPathReference
{
RepositoryType = PipelineConstants.SelfAlias,
Path = uses.Value
};
}
else
{
var usesSegments = uses.Value.Split('@');
var pathSegments = usesSegments[0].Split(new[] { '/', '\\' }, StringSplitOptions.RemoveEmptyEntries);
var gitRef = usesSegments.Length == 2 ? usesSegments[1] : String.Empty;
if (usesSegments.Length != 2 ||
pathSegments.Length < 2 ||
String.IsNullOrEmpty(pathSegments[0]) ||
String.IsNullOrEmpty(pathSegments[1]) ||
String.IsNullOrEmpty(gitRef))
{
// todo: loc
context.Error(uses, $"Expected format {{org}}/{{repo}}[/path]@ref. Actual '{uses.Value}'");
}
else
{
var repositoryName = $"{pathSegments[0]}/{pathSegments[1]}";
var directoryPath = pathSegments.Length > 2 ? String.Join("/", pathSegments.Skip(2)) : String.Empty;
result.Reference = new RepositoryPathReference
{
RepositoryType = RepositoryTypes.GitHub,
Name = repositoryName,
Ref = gitRef,
Path = directoryPath,
};
}
}
return result;
}
}
/// <summary>
/// When empty, default to "success()".
/// When a status function is not referenced, format as "success() &amp;&amp; &lt;CONDITION&gt;".
/// </summary>
private static String ConvertToIfCondition(
TemplateContext context,
TemplateToken token,
Boolean isJob)
{
String condition;
if (token is null)
{
condition = null;
}
else if (token is BasicExpressionToken expressionToken)
{
condition = expressionToken.Expression;
}
else
{
var stringToken = token.AssertString($"{(isJob ? "job" : "step")} {PipelineTemplateConstants.If}");
condition = stringToken.Value;
}
if (String.IsNullOrWhiteSpace(condition))
{
return $"{PipelineTemplateConstants.Success}()";
}
var expressionParser = new ExpressionParser();
var functions = default(IFunctionInfo[]);
var namedValues = default(INamedValueInfo[]);
if (isJob)
{
namedValues = s_jobIfNamedValues;
// TODO: refactor into separate functions
// functions = PhaseCondition.FunctionInfo;
}
else
{
namedValues = s_stepNamedValues;
functions = s_stepConditionFunctions;
}
var node = default(ExpressionNode);
try
{
node = expressionParser.CreateTree(condition, null, namedValues, functions) as ExpressionNode;
}
catch (Exception ex)
{
context.Error(token, ex);
return null;
}
if (node == null)
{
return $"{PipelineTemplateConstants.Success}()";
}
var hasStatusFunction = node.Traverse().Any(x =>
{
if (x is Function function)
{
return String.Equals(function.Name, PipelineTemplateConstants.Always, StringComparison.OrdinalIgnoreCase) ||
String.Equals(function.Name, PipelineTemplateConstants.Cancelled, StringComparison.OrdinalIgnoreCase) ||
String.Equals(function.Name, PipelineTemplateConstants.Failure, StringComparison.OrdinalIgnoreCase) ||
String.Equals(function.Name, PipelineTemplateConstants.Success, StringComparison.OrdinalIgnoreCase);
}
return false;
});
return hasStatusFunction ? condition : $"{PipelineTemplateConstants.Success}() && ({condition})";
}
private static readonly INamedValueInfo[] s_jobIfNamedValues = new INamedValueInfo[]
{
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.GitHub),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Needs),
};
private static readonly INamedValueInfo[] s_stepNamedValues = new INamedValueInfo[]
{
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Strategy),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Matrix),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Steps),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.GitHub),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Job),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Runner),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Env),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Needs),
};
private static readonly IFunctionInfo[] s_stepConditionFunctions = new IFunctionInfo[]
{
new FunctionInfo<NoOperation>(PipelineTemplateConstants.Always, 0, 0),
new FunctionInfo<NoOperation>(PipelineTemplateConstants.Cancelled, 0, 0),
new FunctionInfo<NoOperation>(PipelineTemplateConstants.Failure, 0, 0),
new FunctionInfo<NoOperation>(PipelineTemplateConstants.Success, 0, 0),
new FunctionInfo<NoOperation>(PipelineTemplateConstants.HashFiles, 1, Byte.MaxValue),
};
}
}
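
The removed ConvertToIfCondition above reduces to three cases: an empty condition becomes "success()", a condition that already calls a status function (always, cancelled, failure, success) is left untouched, and anything else is wrapped as "success() && (condition)". A deliberately naive sketch of that rule (illustration only; the real code walks the parsed expression tree instead of matching function names textually):

```csharp
// Illustration only: naive approximation of the condition-wrapping rule above.
using System;
using System.Text.RegularExpressions;

class IfConditionDemo
{
    static string Wrap(string condition)
    {
        if (string.IsNullOrWhiteSpace(condition))
        {
            return "success()";
        }
        var hasStatusFunction = Regex.IsMatch(
            condition,
            @"\b(always|cancelled|failure|success)\s*\(",
            RegexOptions.IgnoreCase);
        return hasStatusFunction ? condition : $"success() && ({condition})";
    }

    static void Main()
    {
        Console.WriteLine(Wrap(""));                                  // success()
        Console.WriteLine(Wrap("github.ref == 'refs/heads/main'"));  // success() && (github.ref == 'refs/heads/main')
        Console.WriteLine(Wrap("always()"));                          // always()
    }
}
```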

View File

@@ -51,6 +51,60 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public Int32 MaxResultSize { get; set; } = 10 * 1024 * 1024; // 10 mb
public DictionaryContextData EvaluateStepScopeInputs(
TemplateToken token,
DictionaryContextData contextData,
IList<IFunctionInfo> expressionFunctions)
{
var result = default(DictionaryContextData);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(contextData, expressionFunctions);
try
{
token = TemplateEvaluator.Evaluate(context, PipelineTemplateConstants.StepsScopeInputs, token, 0, null, omitHeader: true);
context.Errors.Check();
result = token.ToContextData().AssertDictionary("steps scope inputs");
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result ?? new DictionaryContextData();
}
public DictionaryContextData EvaluateStepScopeOutputs(
TemplateToken token,
DictionaryContextData contextData,
IList<IFunctionInfo> expressionFunctions)
{
var result = default(DictionaryContextData);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(contextData, expressionFunctions);
try
{
token = TemplateEvaluator.Evaluate(context, PipelineTemplateConstants.StepsScopeOutputs, token, 0, null, omitHeader: true);
context.Errors.Check();
result = token.ToContextData().AssertDictionary("steps scope outputs");
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result ?? new DictionaryContextData();
}
public Boolean EvaluateStepContinueOnError(
TemplateToken token,
DictionaryContextData contextData,

View File

@@ -1,121 +0,0 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Text;
using GitHub.DistributedTask.Pipelines.Validation;
namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
{
internal sealed class ReferenceNameBuilder
{
internal void AppendSegment(String value)
{
if (String.IsNullOrEmpty(value))
{
return;
}
if (m_name.Length == 0)
{
var first = value[0];
if ((first >= 'a' && first <= 'z') ||
(first >= 'A' && first <= 'Z') ||
first == '_')
{
// Legal first char
}
else if ((first >= '0' && first <= '9') || first == '-')
{
// Illegal first char, but legal char.
// Prepend "_".
m_name.Append("_");
}
else
{
// Illegal char
}
}
else
{
// Separator
m_name.Append(c_separator);
}
foreach (var c in value)
{
if ((c >= 'a' && c <= 'z') ||
(c >= 'A' && c <= 'Z') ||
(c >= '0' && c <= '9') ||
c == '_' ||
c == '-')
{
// Legal
m_name.Append(c);
}
else
{
// Illegal
m_name.Append("_");
}
}
}
internal String Build()
{
var original = m_name.Length > 0 ? m_name.ToString() : "job";
var attempt = 1;
var suffix = default(String);
while (true)
{
if (attempt == 1)
{
suffix = String.Empty;
}
else if (attempt < 1000)
{
suffix = String.Format(CultureInfo.InvariantCulture, "_{0}", attempt);
}
else
{
throw new InvalidOperationException("Unable to create a unique name");
}
var candidate = original.Substring(0, Math.Min(original.Length, PipelineConstants.MaxNodeNameLength - suffix.Length)) + suffix;
if (m_distinctNames.Add(candidate))
{
m_name.Clear();
return candidate;
}
attempt++;
}
}
internal Boolean TryAddKnownName(
String value,
out String error)
{
if (!NameValidation.IsValid(value, allowHyphens: true) || value.Length >= PipelineConstants.MaxNodeNameLength)
{
error = $"The identifier '{value}' is invalid. IDs may only contain alphanumeric characters, '_', and '-'. IDs must start with a letter or '_' and and must be less than {PipelineConstants.MaxNodeNameLength} characters.";
return false;
}
else if (!m_distinctNames.Add(value))
{
error = $"The identifier '{value}' may not be used more than once within the same scope.";
return false;
}
else
{
error = null;
return true;
}
}
private const String c_separator = "_";
private readonly HashSet<String> m_distinctNames = new HashSet<String>(StringComparer.OrdinalIgnoreCase);
private readonly StringBuilder m_name = new StringBuilder();
}
}
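
The deleted ReferenceNameBuilder above generates step context names by replacing anything outside [A-Za-z0-9_-] with '_', prefixing '_' when the first character is a digit or '-', and appending '_2', '_3', and so on when a generated name collides. A compact sketch of that behavior (illustration only; it skips the max-length truncation and the 1000-attempt cap the real class enforces):

```csharp
// Illustration only: simplified sanitize-and-dedupe behavior of ReferenceNameBuilder.
using System;
using System.Collections.Generic;
using System.Text;

class ReferenceNameDemo
{
    static readonly HashSet<string> s_names = new HashSet<string>(StringComparer.OrdinalIgnoreCase);

    static string Build(string segment)
    {
        var sb = new StringBuilder();
        var first = segment[0];
        if ((first >= '0' && first <= '9') || first == '-')
        {
            sb.Append('_'); // legal character, but not a legal first character
        }
        foreach (var c in segment)
        {
            var legal = (c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z') ||
                        (c >= '0' && c <= '9') || c == '_' || c == '-';
            sb.Append(legal ? c : '_');
        }
        var baseName = sb.ToString();
        for (var attempt = 1; ; attempt++)
        {
            var candidate = attempt == 1 ? baseName : $"{baseName}_{attempt}";
            if (s_names.Add(candidate))
            {
                return candidate;
            }
        }
    }

    static void Main()
    {
        Console.WriteLine(Build("__actions/checkout")); // __actions_checkout
        Console.WriteLine(Build("__actions/checkout")); // __actions_checkout_2
        Console.WriteLine(Build("9lives"));             // _9lives
    }
}
```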

View File

@@ -94,12 +94,5 @@ namespace GitHub.DistributedTask.Pipelines
public static readonly String Resources = "resources";
public static readonly String All = "all";
}
public static class ScriptStepInputs
{
public static readonly String Script = "script";
public static readonly String WorkingDirectory = "workingDirectory";
public static readonly String Shell = "shell";
}
}
}

View File

@@ -16,6 +16,116 @@
}
},
"steps-template-root": {
"description": "Steps template file",
"mapping": {
"properties": {
"inputs": "steps-template-inputs",
"outputs": "steps-template-outputs",
"steps": "steps-in-template"
}
}
},
"steps-scope-inputs": {
"description": "Used when evaluating steps scope inputs",
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "steps-scope-input-value"
}
},
"steps-scope-input-value": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env"
],
"one-of": [
"string",
"sequence",
"mapping"
]
},
"steps-scope-outputs": {
"description": "Used when evaluating steps scope outputs",
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "steps-scope-output-value"
}
},
"steps-scope-output-value": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env"
],
"string": {}
},
"steps-template-inputs": {
"description": "Allowed inputs in a steps template",
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "steps-template-input-value"
}
},
"steps-template-input-value": {
"description": "Default input values for a steps template",
"context": [
"github",
"needs",
"strategy",
"matrix"
],
"one-of": [
"string",
"sequence",
"mapping"
]
},
"steps-template-outputs": {
"description": "Output mapping for a steps template",
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "steps-template-output-value"
}
},
"steps-template-output-value": {
"description": "Output values for a steps template",
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"job",
"runner",
"env"
],
"string": {}
},
"workflow-defaults": {
"mapping": {
"properties": {
@@ -130,27 +240,54 @@
"matrix": {
"mapping": {
"properties": {
"include": "matrix-filter",
"exclude": "matrix-filter"
"include": "matrix-include",
"exclude": "matrix-exclude"
},
"loose-key-type": "non-empty-string",
"loose-value-type": "sequence"
}
},
"matrix-filter": {
"matrix-include": {
"sequence": {
"item-type": "matrix-filter-item"
"item-type": "matrix-include-item"
}
},
"matrix-filter-item": {
"matrix-include-item": {
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "any"
}
},
"matrix-exclude": {
"sequence": {
"item-type": "matrix-exclude-item"
}
},
"matrix-exclude-item": {
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "matrix-exclude-filter-item"
}
},
"matrix-exclude-filter-item": {
"one-of": [
"string",
"matrix-exclude-mapping-filter"
]
},
"matrix-exclude-mapping-filter": {
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "matrix-exclude-filter-item"
}
},
"runs-on": {
"context": [
"github",
@@ -227,10 +364,25 @@
}
},
"steps-in-template": {
"sequence": {
"item-type": "steps-item-in-template"
}
},
"steps-item": {
"one-of": [
"run-step",
"regular-step"
"regular-step",
"steps-template-reference"
]
},
"steps-item-in-template": {
"one-of": [
"run-step-in-template",
"regular-step-in-template",
"steps-template-reference-in-template"
]
},
@@ -253,6 +405,25 @@
}
},
"run-step-in-template": {
"mapping": {
"properties": {
"name": "string-steps-context-in-template",
"id": "non-empty-string",
"if": "step-if-in-template",
"timeout-minutes": "number-steps-context-in-template",
"run": {
"type": "string-steps-context-in-template",
"required": true
},
"continue-on-error": "boolean-steps-context-in-template",
"env": "step-env-in-template",
"working-directory": "string-steps-context-in-template",
"shell": "non-empty-string"
}
}
},
"regular-step": {
"mapping": {
"properties": {
@@ -271,6 +442,24 @@
}
},
"regular-step-in-template": {
"mapping": {
"properties": {
"name": "string-steps-context-in-template",
"id": "non-empty-string",
"if": "step-if-in-template",
"continue-on-error": "boolean-steps-context-in-template",
"timeout-minutes": "number-steps-context-in-template",
"uses": {
"type": "non-empty-string",
"required": true
},
"with": "step-with-in-template",
"env": "step-env-in-template"
}
}
},
"step-if": {
"context": [
"github",
@@ -290,6 +479,26 @@
"string": {}
},
"step-if-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"steps",
"inputs",
"job",
"runner",
"env",
"always(0,0)",
"failure(0,0)",
"cancelled(0,0)",
"success(0,0)",
"hashFiles(1,255)"
],
"string": {}
},
"step-if-result": {
"context": [
"github",
@@ -315,6 +524,89 @@
]
},
"step-if-result-in-template": {
"context": [
"github",
"strategy",
"matrix",
"steps",
"inputs",
"job",
"runner",
"env",
"always(0,0)",
"failure(0,0)",
"cancelled(0,0)",
"success(0,0)",
"hashFiles(1,255)"
],
"one-of": [
"null",
"boolean",
"number",
"string",
"sequence",
"mapping"
]
},
"steps-template-reference": {
"mapping": {
"properties": {
"template": "non-empty-string",
"id": "non-empty-string",
"inputs": "steps-template-reference-inputs"
}
}
},
"steps-template-reference-in-template": {
"mapping": {
"properties": {
"template": "non-empty-string",
"id": "non-empty-string",
"inputs": "steps-template-reference-inputs-in-template"
}
}
},
"steps-template-reference-inputs": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"job",
"runner",
"env"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"steps-template-reference-inputs-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"step-env": {
"context": [
"github",
@@ -334,6 +626,26 @@
}
},
"step-env-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"step-with": {
"context": [
"github",
@@ -353,6 +665,26 @@
}
},
"step-with-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"container": {
"context": [
"github",
@@ -373,8 +705,7 @@
"options": "non-empty-string",
"env": "container-env",
"ports": "sequence-of-non-empty-string",
"volumes": "sequence-of-non-empty-string",
"credentials": "container-registry-credentials"
"volumes": "sequence-of-non-empty-string"
}
}
},
@@ -405,20 +736,6 @@
]
},
"container-registry-credentials": {
"context": [
"secrets",
"env",
"github"
],
"mapping": {
"properties": {
"username": "non-empty-string",
"password": "non-empty-string"
}
}
},
"container-env": {
"mapping": {
"loose-key-type": "non-empty-string",
@@ -484,6 +801,23 @@
"boolean": {}
},
"boolean-steps-context-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"boolean": {}
},
"number-steps-context": {
"context": [
"github",
@@ -500,6 +834,23 @@
"number": {}
},
"number-steps-context-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"number": {}
},
"string-runner-context": {
"context": [
"github",
@@ -529,6 +880,23 @@
"hashFiles(1,255)"
],
"string": {}
},
"string-steps-context-in-template": {
"context": [
"github",
"needs",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env",
"hashFiles(1,255)"
],
"string": {}
}
}
}

View File

@@ -1,40 +0,0 @@
using System;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
[DataContract]
public class ActionDownloadInfo
{
[DataMember(EmitDefaultValue = false)]
public ActionDownloadAuthentication Authentication { get; set; }
[DataMember(EmitDefaultValue = false)]
public string NameWithOwner { get; set; }
[DataMember(EmitDefaultValue = false)]
public string ResolvedNameWithOwner { get; set; }
[DataMember(EmitDefaultValue = false)]
public string ResolvedSha { get; set; }
[DataMember(EmitDefaultValue = false)]
public string TarballUrl { get; set; }
[DataMember(EmitDefaultValue = false)]
public string Ref { get; set; }
[DataMember(EmitDefaultValue = false)]
public string ZipballUrl { get; set; }
}
[DataContract]
public class ActionDownloadAuthentication
{
[DataMember(EmitDefaultValue = false)]
public DateTime ExpiresAt { get; set; }
[DataMember(EmitDefaultValue = false)]
public string Token { get; set; }
}
}

View File

@@ -1,16 +0,0 @@
using System.Collections.Generic;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
[DataContract]
public class ActionDownloadInfoCollection
{
[DataMember]
public IDictionary<string, ActionDownloadInfo> Actions
{
get;
set;
}
}
}

View File

@@ -1,22 +0,0 @@
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
[DataContract]
public class ActionReference
{
[DataMember]
public string NameWithOwner
{
get;
set;
}
[DataMember]
public string Ref
{
get;
set;
}
}
}

View File

@@ -1,16 +0,0 @@
using System.Collections.Generic;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
[DataContract]
public class ActionReferenceList
{
[DataMember]
public IList<ActionReference> Actions
{
get;
set;
}
}
}

View File

@@ -29,7 +29,6 @@ namespace GitHub.DistributedTask.WebApi
this.PoolType = referenceToBeCloned.PoolType;
this.Size = referenceToBeCloned.Size;
this.IsLegacy = referenceToBeCloned.IsLegacy;
this.IsInternal = referenceToBeCloned.IsInternal;
}
public TaskAgentPoolReference Clone()
@@ -68,16 +67,6 @@ namespace GitHub.DistributedTask.WebApi
set;
}
/// <summary>
/// Gets or sets a value indicating whether or not this pool is internal and can't be modified by users
/// </summary>
[DataMember]
public bool IsInternal
{
get;
set;
}
/// <summary>
/// Gets or sets the type of the pool
/// </summary>

View File

@@ -50,7 +50,7 @@ namespace GitHub.DistributedTask.WebApi
: base(baseUrl, pipeline, disposeHandler)
{
}
public Task AppendTimelineRecordFeedAsync(
Guid scopeIdentifier,
String planType,
@@ -91,28 +91,6 @@ namespace GitHub.DistributedTask.WebApi
userState,
cancellationToken);
}
public Task AppendTimelineRecordFeedAsync(
Guid scopeIdentifier,
String planType,
Guid planId,
Guid timelineId,
Guid recordId,
Guid stepId,
IList<String> lines,
long startLine,
CancellationToken cancellationToken = default(CancellationToken),
Object userState = null)
{
return AppendTimelineRecordFeedAsync(scopeIdentifier,
planType,
planId,
timelineId,
recordId,
new TimelineRecordFeedLinesWrapper(stepId, lines, startLine),
userState,
cancellationToken);
}
public async Task RaisePlanEventAsync<T>(
Guid scopeIdentifier,

View File

@@ -20,12 +20,6 @@ namespace GitHub.DistributedTask.WebApi
this.Count = lines.Count;
}
public TimelineRecordFeedLinesWrapper(Guid stepId, IList<string> lines, Int64 startLine)
: this(stepId, lines)
{
this.StartLine = startLine;
}
[DataMember(Order = 0)]
public Int32 Count { get; private set; }
@@ -37,8 +31,5 @@ namespace GitHub.DistributedTask.WebApi
[DataMember(EmitDefaultValue = false)]
public Guid StepId { get; set; }
[DataMember (EmitDefaultValue = false)]
public Int64? StartLine { get; private set; }
}
}

View File

@@ -1,29 +0,0 @@
using System;
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
[DataContract]
public sealed class TimelineRecordLogLine
{
public TimelineRecordLogLine(String line, long? lineNumber)
{
this.Line = line;
this.LineNumber = lineNumber;
}
[DataMember]
public String Line
{
get;
set;
}
[DataMember (EmitDefaultValue = false)]
public long? LineNumber
{
get;
set;
}
}
}

View File

@@ -39,7 +39,7 @@ namespace GitHub.Services.OAuth
/// <summary>
/// Gets or sets the error description for the response.
/// </summary>
[DataMember(Name = "error_description", EmitDefaultValue = false)]
[DataMember(Name = "errordescription", EmitDefaultValue = false)]
public String ErrorDescription
{
get;

View File

@@ -126,23 +126,5 @@ namespace GitHub.Runner.Common.Tests.Worker.Container
Assert.NotNull(result5);
Assert.Equal("/foo/bar:/baz", result5);
}
[Theory]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
[InlineData("dockerhub/repo", "")]
[InlineData("localhost/doesnt_work", "")]
[InlineData("localhost:port/works", "localhost:port")]
[InlineData("host.tld/works", "host.tld")]
[InlineData("ghcr.io/owner/image", "ghcr.io")]
[InlineData("gcr.io/project/image", "gcr.io")]
[InlineData("myregistry.azurecr.io/namespace/image", "myregistry.azurecr.io")]
[InlineData("account.dkr.ecr.region.amazonaws.com/image", "account.dkr.ecr.region.amazonaws.com")]
[InlineData("docker.pkg.github.com/owner/repo/image", "docker.pkg.github.com")]
public void ParseRegistryHostnameFromImageName(string input, string expected)
{
var actual = DockerUtil.ParseRegistryHostnameFromImageName(input);
Assert.Equal(expected, actual);
}
}
}

View File

@@ -39,12 +39,10 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
private string _expectedToken = "expectedToken";
private string _expectedServerUrl = "https://codedev.ms";
private string _expectedAgentName = "expectedAgentName";
private string _defaultRunnerGroupName = "defaultRunnerGroup";
private string _secondRunnerGroupName = "secondRunnerGroup";
private string _expectedPoolName = "poolName";
private string _expectedAuthType = "pat";
private string _expectedWorkFolder = "_work";
private int _defaultRunnerGroupId = 1;
private int _secondRunnerGroupId = 2;
private int _expectedPoolId = 1;
private RSACryptoServiceProvider rsa = null;
private RunnerSettings _configMgrAgentSettings = new RunnerSettings();
@@ -99,7 +97,7 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
_serviceControlManager.Setup(x => x.GenerateScripts(It.IsAny<RunnerSettings>()));
#endif
var expectedPools = new List<TaskAgentPool>() { new TaskAgentPool(_defaultRunnerGroupName) { Id = _defaultRunnerGroupId, IsInternal = true }, new TaskAgentPool(_secondRunnerGroupName) { Id = _secondRunnerGroupId } };
var expectedPools = new List<TaskAgentPool>() { new TaskAgentPool(_expectedPoolName) { Id = _expectedPoolId } };
_runnerServer.Setup(x => x.GetAgentPoolsAsync(It.IsAny<string>(), It.IsAny<TaskAgentPoolType>())).Returns(Task.FromResult(expectedPools));
var expectedAgents = new List<TaskAgent>();
@@ -157,7 +155,7 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
"configure",
"--url", _expectedServerUrl,
"--name", _expectedAgentName,
"--runnergroup", _secondRunnerGroupName,
"--pool", _expectedPoolName,
"--work", _expectedWorkFolder,
"--auth", _expectedAuthType,
"--token", _expectedToken,
@@ -177,7 +175,7 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
Assert.NotNull(s);
Assert.True(s.ServerUrl.Equals(_expectedServerUrl));
Assert.True(s.AgentName.Equals(_expectedAgentName));
Assert.True(s.PoolId.Equals(_secondRunnerGroupId));
Assert.True(s.PoolId.Equals(_expectedPoolId));
Assert.True(s.WorkFolder.Equals(_expectedWorkFolder));
// validate GetAgentPoolsAsync gets called twice with automation pool type

View File

@@ -33,7 +33,7 @@ namespace GitHub.Runner.Common.Tests.Listener
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = null;
Guid jobId = Guid.NewGuid();
var result = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "someJob", "someJob", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
var result = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "someJob", "someJob", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
result.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
return result;
}

View File

@@ -43,7 +43,7 @@ namespace GitHub.Runner.Common.Tests.Listener
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = null;
Guid jobId = Guid.NewGuid();
return new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
return new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
}
private JobCancelMessage CreateJobCancelMessage()

View File

@@ -60,7 +60,6 @@ namespace GitHub.Runner.Common.Tests
{
typeof(IActionCommandExtension),
typeof(IExecutionContext),
typeof(IFileCommandExtension),
typeof(IHandler),
typeof(IJobExtension),
typeof(IStep),

View File

@@ -70,24 +70,5 @@ namespace GitHub.Runner.Common.Tests.Util
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public void WhichHandleFullyQualifiedPath()
{
using (TestHostContext hc = new TestHostContext(this))
{
//Arrange
Tracing trace = hc.GetTrace();
// Act.
var gitPath = WhichUtil.Which("git", require: true, trace: trace);
var gitPath2 = WhichUtil.Which(gitPath, require: true, trace: trace);
// Assert.
Assert.Equal(gitPath, gitPath2);
}
}
}
}

View File

@@ -96,7 +96,7 @@ namespace GitHub.Runner.Common.Tests.Worker
hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
});
_ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
_ec.Setup(x => x.EnvironmentVariables).Returns(new Dictionary<string, string>());
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[stop-commands]stopToken", null));
Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
@@ -119,6 +119,8 @@ namespace GitHub.Runner.Common.Tests.Worker
return 1;
});
_ec.SetupAllProperties();
Assert.False(_ec.Object.EchoOnActionCommand);
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::on", null));
@@ -148,7 +150,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -202,6 +204,8 @@ namespace GitHub.Runner.Common.Tests.Worker
return 1;
});
_ec.SetupAllProperties();
// Echo commands below are considered "processed", but are invalid
// 1. Invalid echo value
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::invalid", null));
@@ -283,8 +287,6 @@ namespace GitHub.Runner.Common.Tests.Worker
// Execution context
_ec = new Mock<IExecutionContext>();
_ec.SetupAllProperties();
_ec.Setup(x => x.Global).Returns(new GlobalContext());
// Command manager
_commandManager = new ActionCommandManager();

View File

@@ -28,7 +28,6 @@ namespace GitHub.Runner.Common.Tests.Worker
private Mock<IConfigurationStore> _configurationStore;
private Mock<IDockerCommandManager> _dockerManager;
private Mock<IExecutionContext> _ec;
private Mock<IJobServer> _jobServer;
private Mock<IRunnerPluginManager> _pluginManager;
private TestHostContext _hc;
private ActionManager _actionManager;
@@ -133,7 +132,7 @@ namespace GitHub.Runner.Common.Tests.Worker
Reference = new Pipelines.RepositoryPathReference()
{
Name = ActionName,
Ref = "main",
Ref = "master",
RepositoryType = "GitHub"
}
}
@@ -141,7 +140,7 @@ namespace GitHub.Runner.Common.Tests.Worker
// Return a valid action from GHES via mock
const string ApiUrl = "https://ghes.example.com/api/v3";
string expectedArchiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "main");
string expectedArchiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "master");
string archiveFile = await CreateRepoArchive();
using var stream = File.OpenRead(archiveFile);
var mockClientHandler = new Mock<HttpClientHandler>();
@@ -159,10 +158,10 @@ namespace GitHub.Runner.Common.Tests.Worker
await _actionManager.PrepareActionsAsync(_ec.Object, actions);
//Assert
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main.completed");
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master.completed");
Assert.True(File.Exists(watermarkFile));
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main", "action.yml");
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master", "action.yml");
Assert.True(File.Exists(actionYamlFile));
_hc.GetTrace().Info(File.ReadAllText(actionYamlFile));
}
@@ -191,7 +190,7 @@ namespace GitHub.Runner.Common.Tests.Worker
Reference = new Pipelines.RepositoryPathReference()
{
Name = ActionName,
Ref = "main",
Ref = "master",
RepositoryType = "GitHub"
}
}
@@ -199,8 +198,8 @@ namespace GitHub.Runner.Common.Tests.Worker
// Return a valid action from GHES via mock
const string ApiUrl = "https://ghes.example.com/api/v3";
string builtInArchiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "main");
string dotcomArchiveLink = GetLinkToActionArchive("https://api.github.com", ActionName, "main");
string builtInArchiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "master");
string dotcomArchiveLink = GetLinkToActionArchive("https://api.github.com", ActionName, "master");
string archiveFile = await CreateRepoArchive();
using var stream = File.OpenRead(archiveFile);
var mockClientHandler = new Mock<HttpClientHandler>();
@@ -220,10 +219,10 @@ namespace GitHub.Runner.Common.Tests.Worker
await _actionManager.PrepareActionsAsync(_ec.Object, actions);
//Assert
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main.completed");
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master.completed");
Assert.True(File.Exists(watermarkFile));
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main", "action.yml");
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master", "action.yml");
Assert.True(File.Exists(actionYamlFile));
_hc.GetTrace().Info(File.ReadAllText(actionYamlFile));
}
@@ -252,7 +251,7 @@ namespace GitHub.Runner.Common.Tests.Worker
Reference = new Pipelines.RepositoryPathReference()
{
Name = ActionName,
Ref = "main",
Ref = "master",
RepositoryType = "GitHub"
}
}
@@ -260,7 +259,7 @@ namespace GitHub.Runner.Common.Tests.Worker
// Return a valid action from GHES via mock
const string ApiUrl = "https://ghes.example.com/api/v3";
string archiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "main");
string archiveLink = GetLinkToActionArchive(ApiUrl, ActionName, "master");
string archiveFile = await CreateRepoArchive();
using var stream = File.OpenRead(archiveFile);
var mockClientHandler = new Mock<HttpClientHandler>();
@@ -280,10 +279,10 @@ namespace GitHub.Runner.Common.Tests.Worker
//Assert
await Assert.ThrowsAsync<ActionNotFoundException>(action);
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main.completed");
var watermarkFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master.completed");
Assert.False(File.Exists(watermarkFile));
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "main", "action.yml");
var actionYamlFile = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), ActionName, "master", "action.yml");
Assert.False(File.Exists(actionYamlFile));
}
finally
@@ -846,7 +845,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
var traceFile = Path.GetTempFileName();
File.Copy(_hc.TraceFileName, traceFile, true);
Assert.Contains("You are using a JavaScript Action but there is not an entry JavaScript file provided in", File.ReadAllText(traceFile));
Assert.Contains("Entry javascript file is not provided.", File.ReadAllText(traceFile));
}
}
finally
@@ -1278,7 +1277,7 @@ runs:
";
Pipelines.ActionStep instance;
string directory;
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "main");
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master");
string file = Path.Combine(directory, Constants.Path.ActionManifestYamlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, Content);
@@ -1288,7 +1287,7 @@ runs:
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "main",
Ref = "master",
RepositoryType = Pipelines.RepositoryTypes.GitHub
}
};
@@ -2466,7 +2465,7 @@ runs:
{
var traceFile = Path.GetTempFileName();
File.Copy(_hc.TraceFileName, traceFile, true);
Assert.Contains("You are using a JavaScript Action but there is not an entry JavaScript file provided in", File.ReadAllText(traceFile));
Assert.Contains("Entry javascript file is not provided.", File.ReadAllText(traceFile));
}
}
finally
@@ -2898,7 +2897,7 @@ runs:
";
Pipelines.ActionStep instance;
string directory;
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "main");
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master");
string file = Path.Combine(directory, Constants.Path.ActionManifestYamlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, Content);
@@ -2908,7 +2907,7 @@ runs:
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "main",
Ref = "master",
RepositoryType = Pipelines.RepositoryTypes.GitHub
}
};
@@ -3453,7 +3452,7 @@ runs:
private void CreateAction(string yamlContent, out Pipelines.ActionStep instance, out string directory)
{
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "main");
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master");
string file = Path.Combine(directory, Constants.Path.ActionManifestYmlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, yamlContent);
@@ -3463,7 +3462,7 @@ runs:
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "main",
Ref = "master",
RepositoryType = Pipelines.RepositoryTypes.GitHub
}
};
@@ -3481,7 +3480,7 @@ runs:
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "main",
Ref = "master",
RepositoryType = Pipelines.PipelineConstants.SelfAlias
}
};
@@ -3575,18 +3574,15 @@ runs:
_workFolder = _hc.GetDirectory(WellKnownDirectory.Work);
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Global).Returns(new GlobalContext());
_ec.Setup(x => x.CancellationToken).Returns(_ecTokenSource.Token);
var variables = new Dictionary<string, VariableValue>();
if (newActionMetadata)
{
variables["DistributedTask.NewActionMetadata"] = "true";
}
_ec.Object.Global.Variables = new Variables(_hc, variables);
_ec.Setup(x => x.Variables).Returns(new Variables(_hc, variables));
_ec.Setup(x => x.ExpressionValues).Returns(new DictionaryContextData());
_ec.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());
_ec.Object.Global.FileTable = new List<String>();
_ec.Object.Global.Plan = new TaskOrchestrationPlanReference();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { _hc.GetTrace().Info($"[{tag}]{message}"); });
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>())).Callback((Issue issue, string message) => { _hc.GetTrace().Info($"[{issue.Type}]{issue.Message ?? message}"); });
_ec.Setup(x => x.GetGitHubContext("workspace")).Returns(Path.Combine(_workFolder, "actions", "actions"));
@@ -3597,25 +3593,6 @@ runs:
_dockerManager.Setup(x => x.DockerBuild(_ec.Object, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>())).Returns(Task.FromResult(0));
_jobServer = new Mock<IJobServer>();
_jobServer.Setup(x => x.ResolveActionDownloadInfoAsync(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<Guid>(), It.IsAny<ActionReferenceList>(), It.IsAny<CancellationToken>()))
.Returns((Guid scopeIdentifier, string hubName, Guid planId, ActionReferenceList actions, CancellationToken cancellationToken) =>
{
var result = new ActionDownloadInfoCollection { Actions = new Dictionary<string, ActionDownloadInfo>() };
foreach (var action in actions.Actions)
{
var key = $"{action.NameWithOwner}@{action.Ref}";
result.Actions[key] = new ActionDownloadInfo
{
NameWithOwner = action.NameWithOwner,
Ref = action.Ref,
TarballUrl = $"https://api.github.com/repos/{action.NameWithOwner}/tarball/{action.Ref}",
ZipballUrl = $"https://api.github.com/repos/{action.NameWithOwner}/zipball/{action.Ref}",
};
}
return Task.FromResult(result);
});
_pluginManager = new Mock<IRunnerPluginManager>();
_pluginManager.Setup(x => x.GetPluginAction(It.IsAny<string>())).Returns(new RunnerPluginActionInfo() { PluginTypeName = "plugin.class, plugin", PostPluginTypeName = "plugin.cleanup, plugin" });
@@ -3623,7 +3600,6 @@ runs:
actionManifest.Initialize(_hc);
_hc.SetSingleton<IDockerCommandManager>(_dockerManager.Object);
_hc.SetSingleton<IJobServer>(_jobServer.Object);
_hc.SetSingleton<IRunnerPluginManager>(_pluginManager.Object);
_hc.SetSingleton<IActionManifestManager>(actionManifest);
_hc.SetSingleton<IHttpClientHandlerFactory>(new HttpClientHandlerFactory());
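
The ResolveActionDownloadInfoAsync mock shown above builds tarball and zipball URLs of the form `{api}/repos/{owner}/{name}/tarball/{ref}`, keyed by `owner/name@ref`. A tiny helper sketch of that URL shape — assumed names for illustration, not a runner API:

```csharp
static class ActionArchiveLinkSketch
{
    // e.g. Tarball("https://api.github.com", "actions/checkout", "v2")
    //   -> "https://api.github.com/repos/actions/checkout/tarball/v2"
    public static string Tarball(string apiUrl, string nameWithOwner, string gitRef)
        => $"{apiUrl}/repos/{nameWithOwner}/tarball/{gitRef}";

    public static string Zipball(string apiUrl, string nameWithOwner, string gitRef)
        => $"{apiUrl}/repos/{nameWithOwner}/zipball/{gitRef}";
}
```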

View File

@@ -717,7 +717,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Object.ExpressionValues["github"] = new DictionaryContextData
{
{ "ref", new StringContextData("refs/heads/main") },
{ "ref", new StringContextData("refs/heads/master") },
};
_ec.Object.ExpressionValues["strategy"] = new DictionaryContextData();
_ec.Object.ExpressionValues["matrix"] = new DictionaryContextData();
@@ -737,7 +737,7 @@ namespace GitHub.Runner.Common.Tests.Worker
result = actionManifest.EvaluateDefaultInput(_ec.Object, "testInput", new BasicExpressionToken(null, null, null, "github.ref"));
//Assert
Assert.Equal("refs/heads/main", result);
Assert.Equal("refs/heads/master", result);
}
finally
{
@@ -754,14 +754,9 @@ namespace GitHub.Runner.Common.Tests.Worker
_hc = new TestHostContext(this, name);
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Global)
.Returns(new GlobalContext
{
FileTable = new List<String>(),
Variables = new Variables(_hc, new Dictionary<string, VariableValue>()),
WriteDebug = true,
});
_ec.Setup(x => x.WriteDebug).Returns(true);
_ec.Setup(x => x.CancellationToken).Returns(_ecTokenSource.Token);
_ec.Setup(x => x.Variables).Returns(new Variables(_hc, new Dictionary<string, VariableValue>()));
_ec.Setup(x => x.ExpressionValues).Returns(new DictionaryContextData());
_ec.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { _hc.GetTrace().Info($"{tag}{message}"); });

View File

@@ -32,8 +32,6 @@ namespace GitHub.Runner.Common.Tests.Worker
private TestHostContext _hc;
private ActionRunner _actionRunner;
private IActionManifestManager _actionManifestManager;
private Mock<IFileCommandManager> _fileCommandManager;
private DictionaryContextData _context = new DictionaryContextData();
[Fact]
@@ -364,7 +362,6 @@ namespace GitHub.Runner.Common.Tests.Worker
_handlerFactory = new Mock<IHandlerFactory>();
_defaultStepHost = new Mock<IDefaultStepHost>();
_actionManifestManager = new ActionManifestManager();
_fileCommandManager = new Mock<IFileCommandManager>();
_actionManifestManager.Initialize(_hc);
var githubContext = new GitHubContext();
@@ -378,16 +375,14 @@ namespace GitHub.Runner.Common.Tests.Worker
#endif
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Global).Returns(new GlobalContext());
_ec.Setup(x => x.ExpressionValues).Returns(_context);
_ec.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());
_ec.Setup(x => x.IntraActionState).Returns(new Dictionary<string, string>());
_ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
_ec.Object.Global.FileTable = new List<String>();
_ec.Setup(x => x.EnvironmentVariables).Returns(new Dictionary<string, string>());
_ec.Setup(x => x.SetGitHubContext(It.IsAny<string>(), It.IsAny<string>()));
_ec.Setup(x => x.GetGitHubContext(It.IsAny<string>())).Returns("{\"foo\":\"bar\"}");
_ec.Setup(x => x.CancellationToken).Returns(_ecTokenSource.Token);
_ec.Object.Global.Variables = new Variables(_hc, new Dictionary<string, VariableValue>());
_ec.Setup(x => x.Variables).Returns(new Variables(_hc, new Dictionary<string, VariableValue>()));
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { _hc.GetTrace().Info($"[{tag}]{message}"); });
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>())).Callback((Issue issue, string message) => { _hc.GetTrace().Info($"[{issue.Type}]{issue.Message ?? message}"); });
@@ -397,8 +392,6 @@ namespace GitHub.Runner.Common.Tests.Worker
_hc.EnqueueInstance<IDefaultStepHost>(_defaultStepHost.Object);
_hc.EnqueueInstance(_fileCommandManager.Object);
// Instance to test.
_actionRunner = new ActionRunner();
_actionRunner.Initialize(_hc);

View File

@@ -26,7 +26,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -102,7 +102,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -116,7 +116,7 @@ namespace GitHub.Runner.Common.Tests.Worker
var pagingLogger = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(),It.IsAny<long>())).Callback((Guid id, string msg, long? lineNumber) => { hc.GetTrace().Info(msg); });
jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>())).Callback((Guid id, string msg) => { hc.GetTrace().Info(msg); });
hc.EnqueueInstance(pagingLogger.Object);
hc.SetSingleton(jobServerQueue.Object);
@@ -137,7 +137,7 @@ namespace GitHub.Runner.Common.Tests.Worker
ec.Complete();
jobServerQueue.Verify(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<long?>()), Times.Exactly(10));
jobServerQueue.Verify(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>()), Times.Exactly(10));
}
}
@@ -153,7 +153,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -171,7 +171,7 @@ namespace GitHub.Runner.Common.Tests.Worker
var pagingLogger5 = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<long?>())).Callback((Guid id, string msg, long? lineNumber) => { hc.GetTrace().Info(msg); });
jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>())).Callback((Guid id, string msg) => { hc.GetTrace().Info(msg); });
var actionRunner1 = new ActionRunner();
actionRunner1.Initialize(hc);
@@ -251,7 +251,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -269,7 +269,7 @@ namespace GitHub.Runner.Common.Tests.Worker
var pagingLogger5 = new Mock<IPagingLogger>();
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<long?>())).Callback((Guid id, string msg, long? lineNumber) => { hc.GetTrace().Info(msg); });
jobServerQueue.Setup(x => x.QueueWebConsoleLine(It.IsAny<Guid>(), It.IsAny<string>())).Callback((Guid id, string msg) => { hc.GetTrace().Info(msg); });
var actionRunner1 = new ActionRunner();
actionRunner1.Initialize(hc);
@@ -335,7 +335,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -357,20 +357,20 @@ namespace GitHub.Runner.Common.Tests.Worker
// Act.
jobContext.InitializeJob(jobRequest, CancellationToken.None);
jobContext.Global.StepsContext.SetConclusion(null, "step1", ActionResult.Success);
var conclusion1 = (jobContext.Global.StepsContext.GetScope(null)["step1"] as DictionaryContextData)["conclusion"].ToString();
jobContext.StepsContext.SetConclusion(null, "step1", ActionResult.Success);
var conclusion1 = (jobContext.StepsContext.GetScope(null)["step1"] as DictionaryContextData)["conclusion"].ToString();
Assert.Equal(conclusion1, conclusion1.ToLowerInvariant());
jobContext.Global.StepsContext.SetOutcome(null, "step2", ActionResult.Cancelled);
var outcome1 = (jobContext.Global.StepsContext.GetScope(null)["step2"] as DictionaryContextData)["outcome"].ToString();
jobContext.StepsContext.SetOutcome(null, "step2", ActionResult.Cancelled);
var outcome1 = (jobContext.StepsContext.GetScope(null)["step2"] as DictionaryContextData)["outcome"].ToString();
Assert.Equal(outcome1, outcome1.ToLowerInvariant());
jobContext.Global.StepsContext.SetConclusion(null, "step3", ActionResult.Failure);
var conclusion2 = (jobContext.Global.StepsContext.GetScope(null)["step3"] as DictionaryContextData)["conclusion"].ToString();
jobContext.StepsContext.SetConclusion(null, "step3", ActionResult.Failure);
var conclusion2 = (jobContext.StepsContext.GetScope(null)["step3"] as DictionaryContextData)["conclusion"].ToString();
Assert.Equal(conclusion2, conclusion2.ToLowerInvariant());
jobContext.Global.StepsContext.SetOutcome(null, "step4", ActionResult.Skipped);
var outcome2 = (jobContext.Global.StepsContext.GetScope(null)["step4"] as DictionaryContextData)["outcome"].ToString();
jobContext.StepsContext.SetOutcome(null, "step4", ActionResult.Skipped);
var outcome2 = (jobContext.StepsContext.GetScope(null)["step4"] as DictionaryContextData)["outcome"].ToString();
Assert.Equal(outcome2, outcome2.ToLowerInvariant());
jobContext.JobContext.Status = ActionResult.Success;
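
The assertions above check that conclusion and outcome values written into the steps context are already lowercase, so workflow expressions can compare them against literals such as 'success' or 'skipped'. A one-line sketch of that normalization, using a stand-in enum rather than the runner's ActionResult type:

```csharp
enum ActionResultSketch { Success, Failure, Cancelled, Skipped }

static class StepsContextSketch
{
    // ActionResultSketch.Cancelled -> "cancelled"
    public static string ToContextValue(ActionResultSketch result)
        => result.ToString().ToLowerInvariant();
}
```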

View File

@@ -98,7 +98,7 @@ namespace GitHub.Runner.Common.Tests.Worker
};
Guid jobId = Guid.NewGuid();
_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), steps, null, null, null);
_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), steps, null, null, null, null);
GitHubContext github = new GitHubContext();
github["repository"] = new Pipelines.ContextData.StringContextData("actions/runner");
_message.ContextData.Add("github", github);

View File

@@ -60,7 +60,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new Timeline(Guid.NewGuid());
Guid jobId = Guid.NewGuid();
_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, testName, testName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null);
_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, testName, testName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
_message.Variables[Constants.Variables.System.Culture] = "en-US";
_message.Resources.Endpoints.Add(new ServiceEndpoint()
{

View File

@@ -955,13 +955,12 @@ namespace GitHub.Runner.Common.Tests.Worker
_variables = new Variables(hostContext, new Dictionary<string, DTWebApi.VariableValue>());
_executionContext = new Mock<IExecutionContext>();
_executionContext.Setup(x => x.Global)
.Returns(new GlobalContext
{
Container = jobContainer,
Variables = _variables,
WriteDebug = true,
});
_executionContext.Setup(x => x.WriteDebug)
.Returns(true);
_executionContext.Setup(x => x.Variables)
.Returns(_variables);
_executionContext.Setup(x => x.Container)
.Returns(jobContainer);
_executionContext.Setup(x => x.GetMatchers())
.Returns(matchers?.Matchers ?? new List<IssueMatcherConfig>());
_executionContext.Setup(x => x.Add(It.IsAny<OnMatcherChanged>()))

View File

@@ -203,7 +203,6 @@ namespace GitHub.Runner.Common.Tests.Worker
// Setup the execution context.
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Global).Returns(new GlobalContext());
GitHubContext githubContext = new GitHubContext();
_ec.Setup(x => x.GetGitHubContext("repository")).Returns("actions/runner");

View File

@@ -1,390 +0,0 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Runtime.CompilerServices;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Container;
using GitHub.Runner.Worker.Handlers;
using Moq;
using Xunit;
using DTWebApi = GitHub.DistributedTask.WebApi;
namespace GitHub.Runner.Common.Tests.Worker
{
public sealed class SetEnvFileCommandL0
{
private Mock<IExecutionContext> _executionContext;
private List<Tuple<DTWebApi.Issue, string>> _issues;
private string _rootDirectory;
private SetEnvFileCommand _setEnvFileCommand;
private ITraceWriter _trace;
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_DirectoryNotFound()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "directory-not-found", "env");
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(0, _executionContext.Object.Global.EnvironmentVariables.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_NotFound()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "file-not-found");
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(0, _executionContext.Object.Global.EnvironmentVariables.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_EmptyFile()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "empty-file");
var content = new List<string>();
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(0, _executionContext.Object.Global.EnvironmentVariables.Count);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
"MY_ENV=MY VALUE",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal("MY VALUE", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple_SkipEmptyLines()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
string.Empty,
"MY_ENV=my value",
string.Empty,
"MY_ENV_2=my second value",
string.Empty,
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(2, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal("my value", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal("my second value", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple_EmptyValue()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple-empty-value");
var content = new List<string>
{
"MY_ENV=",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal(string.Empty, _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple_MultipleValues()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
"MY_ENV=my value",
"MY_ENV_2=",
"MY_ENV_3=my third value",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(3, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal("my value", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal(string.Empty, _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
Assert.Equal("my third value", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_3"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Simple_SpecialCharacters()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "simple");
var content = new List<string>
{
"MY_ENV==abc",
"MY_ENV_2=def=ghi",
"MY_ENV_3=jkl=",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(3, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal("=abc", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal("def=ghi", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
Assert.Equal("jkl=", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_3"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<EOF",
"line one",
"line two",
"line three",
"EOF",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal($"line one{Environment.NewLine}line two{Environment.NewLine}line three", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_EmptyValue()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<EOF",
"EOF",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal(string.Empty, _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_SkipEmptyLines()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
string.Empty,
"MY_ENV<<EOF",
"hello",
"world",
"EOF",
string.Empty,
"MY_ENV_2<<EOF",
"HELLO",
"AGAIN",
"EOF",
string.Empty,
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(2, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal($"hello{Environment.NewLine}world", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal($"HELLO{Environment.NewLine}AGAIN", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_SpecialCharacters()
{
using (var hostContext = Setup())
{
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<=EOF",
"hello",
"one",
"=EOF",
"MY_ENV_2<<<EOF",
"hello",
"two",
"<EOF",
"MY_ENV_3<<EOF",
"hello",
string.Empty,
"three",
string.Empty,
"EOF",
"MY_ENV_4<<EOF",
"hello=four",
"EOF",
"MY_ENV_5<<EOF",
" EOF",
"EOF",
};
WriteContent(envFile, content);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(5, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal($"hello{Environment.NewLine}one", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
Assert.Equal($"hello{Environment.NewLine}two", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_2"]);
Assert.Equal($"hello{Environment.NewLine}{Environment.NewLine}three{Environment.NewLine}", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_3"]);
Assert.Equal($"hello=four", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_4"]);
Assert.Equal($" EOF", _executionContext.Object.Global.EnvironmentVariables["MY_ENV_5"]);
}
}
#if OS_WINDOWS
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void SetEnvFileCommand_Heredoc_PreservesNewline()
{
using (var hostContext = Setup())
{
var newline = "\n";
var envFile = Path.Combine(_rootDirectory, "heredoc");
var content = new List<string>
{
"MY_ENV<<EOF",
"hello",
"world",
"EOF",
};
WriteContent(envFile, content, newline: newline);
_setEnvFileCommand.ProcessCommand(_executionContext.Object, envFile, null);
Assert.Equal(0, _issues.Count);
Assert.Equal(1, _executionContext.Object.Global.EnvironmentVariables.Count);
Assert.Equal($"hello{newline}world", _executionContext.Object.Global.EnvironmentVariables["MY_ENV"]);
}
}
#endif
private void WriteContent(
string path,
List<string> content,
string newline = null)
{
if (string.IsNullOrEmpty(newline))
{
newline = Environment.NewLine;
}
var encoding = new UTF8Encoding(true); // Emit BOM
var contentStr = string.Join(newline, content);
File.WriteAllText(path, contentStr, encoding);
}
private TestHostContext Setup([CallerMemberName] string name = "")
{
_issues = new List<Tuple<DTWebApi.Issue, string>>();
var hostContext = new TestHostContext(this, name);
// Trace
_trace = hostContext.GetTrace();
// Directory for test data
var workDirectory = hostContext.GetDirectory(WellKnownDirectory.Work);
ArgUtil.NotNullOrEmpty(workDirectory, nameof(workDirectory));
Directory.CreateDirectory(workDirectory);
_rootDirectory = Path.Combine(workDirectory, nameof(SetEnvFileCommandL0));
Directory.CreateDirectory(_rootDirectory);
// Execution context
_executionContext = new Mock<IExecutionContext>();
_executionContext.Setup(x => x.Global)
.Returns(new GlobalContext
{
EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer),
WriteDebug = true,
});
_executionContext.Setup(x => x.AddIssue(It.IsAny<DTWebApi.Issue>(), It.IsAny<string>()))
.Callback((DTWebApi.Issue issue, string logMessage) =>
{
_issues.Add(new Tuple<DTWebApi.Issue, string>(issue, logMessage));
var message = !string.IsNullOrEmpty(logMessage) ? logMessage : issue.Message;
_trace.Info($"Issue '{issue.Type}': {message}");
});
_executionContext.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Callback((string tag, string message) =>
{
_trace.Info($"{tag}{message}");
});
// SetEnvFileCommand
_setEnvFileCommand = new SetEnvFileCommand();
_setEnvFileCommand.Initialize(hostContext);
return hostContext;
}
}
}
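
The SetEnvFileCommandL0 tests above (present on only one side of this comparison) pin down the env-file format the runner consumes: one `NAME=VALUE` entry per line (where `=` may also appear inside the value), blank lines between entries skipped, and `NAME<<DELIMITER` heredoc blocks whose lines — including blank ones — are joined with the file's newline until a line exactly equal to the delimiter. A compact parser sketch of that format, assuming the text is already split into lines and omitting error reporting; this is not the runner's SetEnvFileCommand:

```csharp
using System;
using System.Collections.Generic;

static class EnvFileSketch
{
    public static Dictionary<string, string> Parse(IReadOnlyList<string> lines, string newline)
    {
        var result = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
        for (int i = 0; i < lines.Count; i++)
        {
            var line = lines[i];
            if (string.IsNullOrEmpty(line))
            {
                continue; // blank lines between entries are skipped
            }

            var heredoc = line.IndexOf("<<", StringComparison.Ordinal);
            var equals = line.IndexOf('=');
            if (heredoc > 0 && (equals < 0 || heredoc < equals))
            {
                // NAME<<DELIMITER ... DELIMITER
                var name = line.Substring(0, heredoc);
                var delimiter = line.Substring(heredoc + 2);
                var valueLines = new List<string>();
                while (++i < lines.Count && !string.Equals(lines[i], delimiter, StringComparison.Ordinal))
                {
                    valueLines.Add(lines[i]); // blank lines inside the block are preserved
                }
                result[name] = string.Join(newline, valueLines);
            }
            else if (equals > 0)
            {
                // NAME=VALUE ('=' may appear again inside VALUE)
                result[line.Substring(0, equals)] = line.Substring(equals + 1);
            }
        }
        return result;
    }
}
```

Run against the fixtures in the tests above (for example `MY_ENV==abc` or the `MY_ENV_5<<EOF` block whose body is a padded ` EOF`), this sketch reproduces the expected values.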

View File

@@ -38,13 +38,11 @@ namespace GitHub.Runner.Common.Tests.Worker
};
_ec = new Mock<IExecutionContext>();
_ec.SetupAllProperties();
_ec.Setup(x => x.Global).Returns(new GlobalContext { WriteDebug = true });
_ec.Object.Global.Variables = _variables;
_ec.Object.Global.EnvironmentVariables = _env;
_ec.Setup(x => x.Variables).Returns(_variables);
_contexts = new DictionaryContextData();
_jobContext = new JobContext();
_contexts["github"] = new GitHubContext();
_contexts["github"] = new DictionaryContextData();
_contexts["runner"] = new DictionaryContextData();
_contexts["job"] = _jobContext;
_ec.Setup(x => x.ExpressionValues).Returns(_contexts);
@@ -52,7 +50,7 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Setup(x => x.JobContext).Returns(_jobContext);
_stepContext = new StepsContext();
_ec.Object.Global.StepsContext = _stepContext;
_ec.Setup(x => x.StepsContext).Returns(_stepContext);
_ec.Setup(x => x.PostJobSteps).Returns(new Stack<IStep>());
@@ -428,11 +426,11 @@ namespace GitHub.Runner.Common.Tests.Worker
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
#if OS_WINDOWS
Assert.Equal("100", step1.Object.ExecutionContext.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("100"));
Assert.Equal("github_actions", step1.Object.ExecutionContext.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("github_actions"));
Assert.Equal("100", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("100"));
Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("github_actions"));
#else
Assert.Equal("100", step1.Object.ExecutionContext.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("100"));
Assert.Equal("github_actions", step1.Object.ExecutionContext.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("github_actions"));
Assert.Equal("100", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("100"));
Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("github_actions"));
#endif
}
}
@@ -465,13 +463,13 @@ namespace GitHub.Runner.Common.Tests.Worker
// Assert.
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
#if OS_WINDOWS
Assert.Equal("1000", step2.Object.ExecutionContext.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("github_actions", step2.Object.ExecutionContext.ExpressionValues["env"].AssertDictionary("env")["env3"].AssertString("github_actions"));
Assert.False(step2.Object.ExecutionContext.ExpressionValues["env"].AssertDictionary("env").ContainsKey("env2"));
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env3"].AssertString("github_actions"));
Assert.False(_ec.Object.ExpressionValues["env"].AssertDictionary("env").ContainsKey("env2"));
#else
Assert.Equal("1000", step2.Object.ExecutionContext.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("github_actions", step2.Object.ExecutionContext.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env3"].AssertString("github_actions"));
Assert.False(step2.Object.ExecutionContext.ExpressionValues["env"].AssertCaseSensitiveDictionary("env").ContainsKey("env2"));
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env3"].AssertString("github_actions"));
Assert.False(_ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env").ContainsKey("env2"));
#endif
}
}
@@ -503,11 +501,11 @@ namespace GitHub.Runner.Common.Tests.Worker
// Assert.
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
#if OS_WINDOWS
Assert.Equal("1000", step2.Object.ExecutionContext.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("something", step2.Object.ExecutionContext.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("something"));
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("something", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("something"));
#else
Assert.Equal("1000", step2.Object.ExecutionContext.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("something", step2.Object.ExecutionContext.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("something"));
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("something", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("something"));
#endif
}
}
@@ -601,15 +599,13 @@ namespace GitHub.Runner.Common.Tests.Worker
// Setup the step execution context.
var stepContext = new Mock<IExecutionContext>();
stepContext.SetupAllProperties();
stepContext.Setup(x => x.Global).Returns(() => _ec.Object.Global);
var expressionValues = new DictionaryContextData();
foreach (var pair in _ec.Object.ExpressionValues)
{
expressionValues[pair.Key] = pair.Value;
}
stepContext.Setup(x => x.ExpressionValues).Returns(expressionValues);
stepContext.Setup(x => x.WriteDebug).Returns(true);
stepContext.Setup(x => x.Variables).Returns(_variables);
stepContext.Setup(x => x.EnvironmentVariables).Returns(_env);
stepContext.Setup(x => x.ExpressionValues).Returns(_contexts);
stepContext.Setup(x => x.ExpressionFunctions).Returns(new List<IFunctionInfo>());
stepContext.Setup(x => x.JobContext).Returns(_jobContext);
stepContext.Setup(x => x.StepsContext).Returns(_stepContext);
stepContext.Setup(x => x.ContextName).Returns(step.Object.Action.ContextName);
stepContext.Setup(x => x.Complete(It.IsAny<TaskResult?>(), It.IsAny<string>(), It.IsAny<string>()))
.Callback((TaskResult? r, string currentOperation, string resultCode) =>

View File

@@ -67,7 +67,7 @@ namespace GitHub.Runner.Common.Tests.Worker
new Pipelines.ContextData.DictionaryContextData()
},
};
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, JobId, jobName, jobName, new StringToken(null, null, null, "ubuntu"), sidecarContainers, null, variables, new List<MaskHint>(), resources, context, null, actions, null, null, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, JobId, jobName, jobName, new StringToken(null, null, null, "ubuntu"), sidecarContainers, null, variables, new List<MaskHint>(), resources, context, null, actions, null, null, null, null);
return jobRequest;
}

View File

@@ -173,6 +173,57 @@ function package ()
powershell -NoLogo -Sta -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command "Add-Type -Assembly \"System.IO.Compression.FileSystem\"; [System.IO.Compression.ZipFile]::CreateFromDirectory(\"${window_path}\", \"${zip_name}\")"
fi
runner_patch_pkg_name="actions-runner-${RUNTIME_ID}-${runner_ver}-patch"
heading "Packaging Patch Version ${runner_patch_pkg_name}"
echo "Downloading latest runner ..."
mkdir -p "_temp"
pushd "_temp" > /dev/null
latest_version_label=$(curl -s -X GET 'https://api.github.com/repos/actions/runner/releases/latest' | jq -r '.tag_name')
latest_version=$(echo ${latest_version_label:1})
latest_version_runner_file=""
if [[ ("$CURRENT_PLATFORM" == "linux") || ("$CURRENT_PLATFORM" == "darwin") ]]; then
latest_version_runner_file="actions-runner-${RUNTIME_ID}-${latest_version}.tar.gz"
elif [[ ("$CURRENT_PLATFORM" == "windows") ]]; then
latest_version_runner_file="actions-runner-${RUNTIME_ID}-${latest_version}.zip"
fi
latest_version_download_url="https://github.com/actions/runner/releases/download/${latest_version_label}/${latest_version_runner_file}"
echo "Downloading ${latest_version_label} for ${RUNTIME_ID} ..."
echo $latest_version_download_url
curl -O -L ${latest_version_download_url}
mkdir -p "latest_runner"
if [[ ("$CURRENT_PLATFORM" == "linux") || ("$CURRENT_PLATFORM" == "darwin") ]]; then
tar xzf "./${latest_version_runner_file}" -C latest_runner
elif [[ ("$CURRENT_PLATFORM" == "windows") ]]; then
powershell -NoLogo -Sta -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command "Add-Type -Assembly \"System.IO.Compression.FileSystem\"; [System.IO.Compression.ZipFile]::ExtractToDirectory(\"./${latest_version_runner_file}\", \"latest_runner\")"
fi
mkdir -p "_patch"
diff -qrN "${LAYOUT_DIR}" "latest_runner" | cut -d " " -f 2 | xargs -t -I file cp file "_patch/"$(echo file | cut -c ${#LAYOUT_DIR}- | cut -c 3-)
popd > /dev/null
if [[ ("$CURRENT_PLATFORM" == "linux") || ("$CURRENT_PLATFORM" == "darwin") ]]; then
patch_tar_name="${runner_patch_pkg_name}.tar.gz"
echo "Creating $patch_tar_name in ${PACKAGE_DIR}/_temp/_patch"
tar -czf "${patch_tar_name}" -C "${PACKAGE_DIR}/_temp/_patch" .
elif [[ ("$CURRENT_PLATFORM" == "windows") ]]; then
patch_zip_name="${runner_patch_pkg_name}.zip"
echo "Convert ${PACKAGE_DIR} to Windows style path"
window_path=${PACKAGE_DIR:1}
window_path=${window_path:0:1}:${window_path:1}
echo "Creating $patch_zip_name in ${window_path}/_temp/_patch"
powershell -NoLogo -Sta -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command "Add-Type -Assembly \"System.IO.Compression.FileSystem\"; [System.IO.Compression.ZipFile]::CreateFromDirectory(\"${window_path}\", \"${patch_zip_name}\")"
fi
rm -Rf "${PACKAGE_DIR}/_temp"
popd > /dev/null
}

View File

@@ -1 +1 @@
2.273.6
2.263.0