Compare commits


34 Commits

Author SHA1 Message Date
TingluoHuang
e0dd60e815 . 2021-11-04 01:08:23 -04:00
TingluoHuang
37d1aa10aa . 2021-11-04 00:52:24 -04:00
TingluoHuang
34cdb04de6 . 2021-11-04 00:49:10 -04:00
TingluoHuang
41fcb3f152 . 2021-11-04 00:46:40 -04:00
TingluoHuang
6593716f32 L0 2021-11-02 10:53:50 -04:00
TingluoHuang
3bc49d0567 l0. 2021-11-02 10:53:50 -04:00
TingluoHuang
f8db7860dc Support node.js 16 and bump node.js 12 version. 2021-11-02 10:53:50 -04:00
Julio Barba
85e1927754 Prepare for runner 2.284.0 release (#1448) 2021-11-01 11:16:21 -04:00
Julio Barba
b6dbf42746 Improve retry handling based on feedback (#1447) 2021-10-29 16:04:34 -04:00
Ferenc Hammerl
67ba8a7d42 Support Conditional Steps in Composite Actions (#1438)
* conditional support for composite actions

* Fix Conditional function evaluation

* Push launch.json temporarily

* Revert "Push launch.json temporarily"

* rename context

* Cleanup comments

* fix success/failure functions to run based on pre/main steps

* idea of step_status

* change to use steps context, WIP

* add inputs to possible if condition expressions

* use action_status

* pr cleanup

* Added right stages

* Test on stage in conditional functions

* Fix naming and formatting

* Fix tests

* Add success and failure L0s

* Remove comment

* Remove whitespace

* Undo formatting

* Add L0 for step-if parsing

* Add ADR

Co-authored-by: Thomas Boop <thboop@github.com>
2021-10-29 15:45:42 +02:00
Ferenc Hammerl
e4f9e6ae26 Log current runner version in terminal (#1441) 2021-10-29 14:23:26 +02:00
Thomas Boop
854d5e3bf3 Fix an issue where nested local composite actions did not correctly register post steps (#1433)
* Always register post steps for local actions

* Register post steps along with their conditions

* remove debug code

Co-authored-by: Ferenc Hammerl <fhammerl@github.com>
2021-10-27 15:31:58 +02:00
Thomas Boop
57dec28f68 Cleanup Older versions on MacOS now that we recreate node versions as needed (#1410)
* Cleanup old version update code

* fix template

* fix indents
2021-10-19 10:15:37 -04:00
Tingluo Huang
55a861f089 Expose GITHUB_REF_* as environment variable (#1314)
* Keep env vars alphabetical

* ref_* context.

Co-authored-by: Ferenc Hammerl <31069338+fhammerl@users.noreply.github.com>
2021-10-18 22:22:34 -04:00
jeremyd2019
51b2031cbf Add arch to runner context (#1372)
Fixes #1185
2021-10-13 23:49:26 -04:00
Raphael Cruzeiro
400b2d879c Makes the user keychains available to the service (#847)
Without creating a session, the service is not able to access the keychains for the user specified under `UserName`. This causes any workflow that deals with code signing to fail, as the only keychain loaded will be the system one. This should fix #350
2021-10-06 15:37:45 -04:00
Thomas Boop
c4b6d288d4 fix ephemeral runner upgrade on mac/linux (#1403) 2021-10-05 10:15:19 +02:00
Julio Barba
0699597876 Use Actions Service health and api.github.com endpoints after connection failure on Actions Server and Hosted (#1385) 2021-09-30 13:40:34 -04:00
Thomas Boop
a592b14ae3 Runner 2.283.2 Release (#1389) 2021-09-29 15:49:40 -04:00
Thomas Boop
04269f7b1b Handle keeping previous OSX versions more smoothly on Mac (#1381)
* Handle macOS upgrade smoothly

* cleanup

* misc cleanup

* final updates

* Update src/Misc/layoutbin/update.sh.template

Co-authored-by: Patrick Ellis <319655+pje@users.noreply.github.com>

* Update src/Misc/layoutbin/update.sh.template

Co-authored-by: Patrick Ellis <319655+pje@users.noreply.github.com>

* Upload telemetry and default to old method as needed

* minor fix

* add one more bit of logging

* some more telemetry

* quote variables to handle spaces

* tiny fix for ubuntu

* remove version and move telemetry to diag

* use full path

Co-authored-by: Patrick Ellis <319655+pje@users.noreply.github.com>
2021-09-29 15:49:31 -04:00
Ferenc Hammerl
e89d2e84bd Stop-Commands: stopToken restrictions (#1371)
* Prevent stopTokens that are workflow commands

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

* Check context for env var too

* Accept true, 1 and $true instead of just "true"

* Setup ExpressionValues in tests

* Update src/Runner.Common/Constants.cs

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

* Separate success and fail tests for invalid token

* Fix envcontext for tests

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
2021-09-29 14:44:01 -04:00
Thomas Boop
afe7066e39 only cleanup runner local files on success (#1384) 2021-09-28 18:55:28 -04:00
Ferenc Hammerl
da79ef4acb Fix unconfiguring of runner after group changes (#1359)
* Ignore agentpool when unconfiguring the runner

Runner names and IDs are unique within a ServiceHost
They don't need to be included when unconfiguring the runner.

* Use -1 instead of 0 to highlight how it is ignored

* Use overloads and 0 instead of -1

Using 0 seems to be the convention

* Fix typo calling the wrong method
2021-09-22 15:04:43 +02:00
Tingluo Huang
5afb52b272 Update the comment about the --once in Constants.cs (#1360)
* Update Constants.cs

* feedback.

* Update src/Runner.Listener/Runner.cs

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
2021-09-21 21:31:48 +00:00
Thomas Boop
cf87c55557 Don't retry 422 (#1352) 2021-09-21 09:59:21 -04:00
Ferenc Hammerl
43fa351980 Update telemetry (#1355)
* Track "pause-logging"

* Bump release version
2021-09-20 15:54:20 +02:00
Ferenc Hammerl
ecfc2cc9e9 Prepare 2.283.0 release (#1351)
* Update releaseNote.md

* Update runnerversion

* Update releaseNote.md

Co-authored-by: Patrick Ellis <319655+pje@users.noreply.github.com>

Co-authored-by: Patrick Ellis <319655+pje@users.noreply.github.com>
2021-09-20 14:59:37 +02:00
Ferenc Hammerl
740fb43731 Generic telemetry (#1321)
* Add generateIdTokenUrl as an env var

* Add generateIdTokenUrl to env vars

* Add basic telemetry class and submit it on jobcompleted

* Use constructor overload

* Rename telemetry to jobTelemetry

* Rename telemetry file

* Make JobTelemetryType a string

* Collect telemetry

* Remove debugger

* Update src/Runner.Worker/ActionCommandManager.cs

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>

* Use same JobTelemetry for all contexts

* Mask telemetry data

* Mask in JobRunner instead

* Empty line

* Change method signature
Returning with a List suggests we clone it and that the
original doesn't change..

* Update launch.json

Co-authored-by: Thomas Boop <52323235+thboop@users.noreply.github.com>
2021-09-20 14:44:50 +02:00
Patrick Ellis
f259e5706f Ephemeral runner deletes local .runner,.credentials files after completion (#1344)
Closes #1337
2021-09-16 11:00:27 -04:00
Ferenc Hammerl
5d84918ed5 Add name to runner context (#1312)
* Add generateIdTokenUrl as an env var

* Add generateIdTokenUrl to env vars

* Add name and runner_group to context

* No longer add runner-group

* Update runner name if needed

* Get interface instead of concrete class

* Check for nulls on ReservedAgent

* Avoid loading the settings file unnecessarily

* Only check agentName once

* Use Trace.Error when can't update settings

* Better equals and exception handling

* Update JobDispatcher.cs

* Add tests and null check
2021-09-16 15:25:51 +02:00
Julio Barba
881c521005 Revert "Recreate VssConnection on retry (#1316)" (#1343)
This reverts commit 4359dd605b.
2021-09-15 13:21:50 -04:00
Patrick Ellis
176e7f5208 Trim trailing whitespace in all md and yml files (#1329)
* Trim non-significant trailing whitespace, add final newlines to md,yml files

* Add .editorconfig with basic whitespace conventions
2021-09-15 13:35:25 +02:00
Jacob Wallraff
b6d46c148a Add attempt number to GitHub context (#1302)
* Add attempt number to GitHub context

* Change context name

* Changing order
2021-09-15 11:00:53 +02:00
Thomas Boop
38e33bb8e3 Update network.md 2021-09-14 15:28:30 -04:00
78 changed files with 2364 additions and 570 deletions

.editorconfig (new file)

@@ -0,0 +1,8 @@
# https://editorconfig.org/
[*]
insert_final_newline = true # ensure all files end with a single newline
trim_trailing_whitespace = true # attempt to remove trailing whitespace on save
[*.md]
trim_trailing_whitespace = false # in markdown, "two trailing spaces" is unfortunately meaningful; it means `<br>`


@@ -18,7 +18,7 @@ jobs:
build:
strategy:
matrix:
runtime: [ linux-x64, linux-arm64, linux-arm, win-x64, osx-x64 ]
runtime: [ linux-x64, linux-musl-x64, linux-arm64, linux-arm, win-x64, osx-x64 ]
include:
- runtime: linux-x64
os: ubuntu-latest
@@ -32,6 +32,10 @@ jobs:
os: ubuntu-latest
devScript: ./dev.sh
- runtime: linux-musl-x64
os: ubuntu-latest
devScript: ./dev.sh
- runtime: osx-x64
os: macOS-latest
devScript: ./dev.sh
@@ -55,7 +59,7 @@ jobs:
run: |
${{ matrix.devScript }} test
working-directory: src
if: matrix.runtime != 'linux-arm64' && matrix.runtime != 'linux-arm'
if: matrix.runtime != 'linux-arm64' && matrix.runtime != 'linux-arm' && matrix.runtime != 'linux-musl-x64'
# Create runner package tar.gz/zip
- name: Package Release


@@ -0,0 +1,71 @@
# ADR 1438: Support Conditionals In Composite Actions
**Date**: 2021-10-13
**Status**: Accepted
## Context
We recently shipped composite actions, which allow you to reuse individual steps inside an action.
However, one of the [most requested features](https://github.com/actions/runner/issues/834) has been a way to support the `if` keyword.
### Goals
- We want to stay consistent with current behavior
- We want to support conditionals via the `if` keyword
- Our built-in functions like `success()` should be expressible without calling them; for example, today you can write `job.status == success` rather than `success()`.
### How does composite currently work?
Currently, we have limited conditional support in composite actions for `pre` and `post` steps.
These conditions are based on the job status and support functions like `always()`, `failure()`, `success()` and `cancelled()`.
However, main steps do **not** support conditionals.
By default, in a regular workflow, a step runs on the `success()` condition, which checks whether the **job status** is still successful (see the sketch below).
By default, in a composite action, main steps run until a single step in that composite action fails, at which point the composite action is halted early. The job status is **not** consulted.
Pre and post steps in composite actions use the job status to determine whether they should run.
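Below is a minimal sketch of the default behavior in a regular workflow job described above; the step commands are illustrative and not taken from the ADR:
```yaml
# Regular workflow job (not a composite action); step commands are made up.
steps:
  - run: exit 1              # this step fails, so the job status becomes failure
  - run: echo "skipped"      # implicit `if: success()` checks the job status, so this is skipped
  - if: always()
    run: echo "still runs"   # always() ignores the job status
```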
### How do we go forward?
If we think about what composite actions currently do when invoking main steps, they are effectively checking whether the current composite action is still successful.
Let's formalize that concept into a "real" idea (see the sketch after this list):
- We will add an `action_status` field to the `github` context to mimic the [job context's status](https://docs.github.com/en/actions/learn-github-actions/contexts#job-context).
- There is precedent for this: `action_path`, which is only set on the `github` context for composite actions.
- In a composite action's main steps, the `success()` function will check whether `action_status == success` rather than `job_status == success`. `failure()` will work the same way.
- Pre and post steps in composite actions will not change; they will continue to check the job status.
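As an illustrative sketch of what this enables inside a composite action's `action.yml` (the step names and commands are hypothetical; only the `if` semantics follow the ADR):
```yaml
runs:
  using: composite
  steps:
    - name: Build
      shell: bash
      run: ./build.sh        # if this fails, the composite action's status becomes failure
    - name: Upload logs
      if: failure()          # checks this composite action's status, not the job status
      shell: bash
      run: ./upload-logs.sh
    - name: Report success
      if: success()          # skipped once an earlier main step in this action has failed
      shell: bash
      run: echo "composite action succeeded"
```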
### Nested Scenario
For nested composite actions, we will follow the existing behavior: a step only cares about its own composite action's status, not that of any parent.
For example, let's imagine a scenario with a simple nested composite action:
```
- Job
  - Regular Step
  - Composite Action
    - run: exit 1
    - if: always()
      uses: A child composite action
      - if: success()
        run: echo "this should print"
      - run: echo "this should also print"
    - if: success()
      run: echo "this will not print as the current composite action has failed already"
```
The child composite action's steps should run in this example: the child composite action has not yet failed, so it runs all of its steps until one fails. This is consistent with how a composite action works in production today when the job has failed but the composite action is invoked with `if: always()` or `if: failure()`.
### Other options explored
We could add a `current_step_status` field to the job context rather than `__status` to the steps context; however, this comes with major downsides:
- We would need to support the field for every type of step, because it is non-trivial to remove a field from the job context once it has been added (it is read-only)
- For all actions besides composite actions it would only ever be `success`
- It is awkward to have a `current_step` value on the job context
- We also explored a `__status` field on the steps context:
  - The `__` prefix is required to prevent colliding with a step whose `id` is `status`
  - This felt wrong because the naming was not smooth and did not fit current conventions.
### Consequences
- The `github` context has a new field for the status of the current composite action.
- We support conditionals in composite actions (see the sketch below).
- We keep the existing behavior for all users, while allowing them to expand on that functionality.
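As a minimal sketch of the explicit form implied by the goals above (assuming `github.action_status` takes the same string values as the job context's `status`), a composite main step could also spell the condition out; the step itself is hypothetical:
```yaml
# Hypothetical step inside a composite action; equivalent to `if: success()` per this ADR.
- name: Run only while this composite action is still succeeding
  if: github.action_status == 'success'
  shell: bash
  run: echo "no earlier main step in this composite action has failed"
```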


@@ -55,6 +55,7 @@ If you are having trouble connecting, try these steps:
- You may need to configure your TLS settings to use the correct version
- You should support TLS version 1.2 or later
- You may need to configure your TLS settings to have up to date cipher suites, this may be solved by system updates and patches.
- Most notably, on Windows Server 2012, make sure [the TLS cipher suite update](https://support.microsoft.com/en-us/topic/update-adds-new-tls-cipher-suites-and-changes-cipher-suite-priorities-in-windows-8-1-and-windows-server-2012-r2-8e395e43-c8ef-27d8-b60c-0fc57d526d94) is installed
- Your firewall, proxy or network configuration may be blocking the connection
- You will want to reach out to whoever is in charge of your network with these pcap files to further troubleshoot
- If you see a failure later in the handshake:


@@ -1,14 +1,19 @@
## Features
N/A
- Expose GITHUB_REF_* as environment variable (#1314)
- Add arch to runner context (#1372)
- Support Conditional Steps in Composite Actions (#1438)
- Log current runner version in terminal (#1441)
## Bugs
- Revert "More resilient VssConnection client retries in JobServer" (#1343)
- Makes the user keychains available to the service (#847)
- Use Actions Service health and api.github.com endpoints after connection failure on Actions Server and Hosted (#1385)
- Fix an issue where nested local composite actions did not correctly register post steps (#1433)
## Misc
N/A
- Cleanup Older versions on MacOS now that we recreate node versions as needed (#1410)
## Windows x64
We recommend configuring the runner in a root folder of the Windows drive (e.g. "C:\actions-runner"). This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows.


@@ -1 +1 @@
2.282.1
<Update to ./src/runnerversion when creating release>


@@ -29,7 +29,7 @@
<DefineConstants>$(DefineConstants);X64</DefineConstants>
</PropertyGroup>
<PropertyGroup Condition="'$(BUILD_OS)' == 'Linux' AND ('$(PackageRuntime)' == 'linux-x64' OR '$(PackageRuntime)' == '')">
<PropertyGroup Condition="'$(BUILD_OS)' == 'Linux' AND ('$(PackageRuntime)' == 'linux-x64' OR '$(PackageRuntime)' == 'linux-musl-x64' OR '$(PackageRuntime)' == '')">
<DefineConstants>$(DefineConstants);X64</DefineConstants>
</PropertyGroup>
<PropertyGroup Condition="'$(BUILD_OS)' == 'Linux' AND '$(PackageRuntime)' == 'linux-arm'">


@@ -16,15 +16,27 @@
- LTS - most current supported release
- 2-part version in a format A.B - represents a specific release
examples: 2.0, 1.0
- Branch name
examples: release/2.0.0, Master
Note: The version parameter overrides the channel parameter.
- 3-part version in a format A.B.Cxx - represents a specific SDK release
examples: 5.0.1xx, 5.0.2xx
Supported since 5.0 release
Note: The version parameter overrides the channel parameter when any version other than 'latest' is used.
.PARAMETER Quality
Download the latest build of specified quality in the channel. The possible values are: daily, signed, validated, preview, GA.
Works only in combination with channel. Not applicable for current and LTS channels and will be ignored if those channels are used.
For SDK use channel in A.B.Cxx format: using quality together with channel in A.B format is not supported.
Supported since 5.0 release.
Note: The version parameter overrides the channel parameter when any version other than 'latest' is used, and therefore overrides the quality.
.PARAMETER Version
Default: latest
Represents a build version on specific channel. Possible values:
- latest - most latest build on specific channel
- 3-part version in a format A.B.C - represents specific version of build
examples: 2.0.0-preview2-006120, 1.1.0
.PARAMETER Internal
Download internal builds. Requires providing credentials via -FeedCredential parameter.
.PARAMETER FeedCredential
Token to access Azure feed. Used as a query string to append to the Azure feed.
This parameter typically is not specified.
.PARAMETER InstallDir
Default: %LocalAppData%\Microsoft\dotnet
Path to where to install dotnet. Note that binaries will be placed directly in a given directory.
@@ -59,9 +71,6 @@
.PARAMETER UncachedFeed
This parameter typically is not changed by the user.
It allows changing the URL for the Uncached feed used by this installer.
.PARAMETER FeedCredential
Used as a query string to append to the Azure feed.
It allows changing the URL to use non-public blob storage accounts.
.PARAMETER ProxyAddress
If set, the installer will use the proxy when making web requests
.PARAMETER ProxyUseDefaultCredentials
@@ -77,15 +86,19 @@
.PARAMETER JSonFile
Determines the SDK version from a user specified global.json file
Note: global.json must have a value for 'SDK:Version'
.PARAMETER DownloadTimeout
Determines timeout duration in seconds for dowloading of the SDK file
Default: 1200 seconds (20 minutes)
#>
[cmdletbinding()]
param(
[string]$Channel="LTS",
[string]$Quality,
[string]$Version="Latest",
[switch]$Internal,
[string]$JSonFile,
[string]$InstallDir="<auto>",
[Alias('i')][string]$InstallDir="<auto>",
[string]$Architecture="<auto>",
[ValidateSet("dotnet", "aspnetcore", "windowsdesktop", IgnoreCase = $false)]
[string]$Runtime,
[Obsolete("This parameter may be removed in a future version of this script. The recommended alternative is '-Runtime dotnet'.")]
[switch]$SharedRuntime,
@@ -98,7 +111,8 @@ param(
[switch]$ProxyUseDefaultCredentials,
[string[]]$ProxyBypassList=@(),
[switch]$SkipNonVersionedFiles,
[switch]$NoCdn
[switch]$NoCdn,
[int]$DownloadTimeout=1200
)
Set-StrictMode -Version Latest
@@ -167,7 +181,7 @@ function Say-Invocation($Invocation) {
Say-Verbose "$command $args"
}
function Invoke-With-Retry([ScriptBlock]$ScriptBlock, [int]$MaxAttempts = 3, [int]$SecondsBetweenAttempts = 1) {
function Invoke-With-Retry([ScriptBlock]$ScriptBlock, [System.Threading.CancellationToken]$cancellationToken = [System.Threading.CancellationToken]::None, [int]$MaxAttempts = 3, [int]$SecondsBetweenAttempts = 1) {
$Attempts = 0
while ($true) {
@@ -176,7 +190,7 @@ function Invoke-With-Retry([ScriptBlock]$ScriptBlock, [int]$MaxAttempts = 3, [in
}
catch {
$Attempts++
if ($Attempts -lt $MaxAttempts) {
if (($Attempts -lt $MaxAttempts) -and -not $cancellationToken.IsCancellationRequested) {
Start-Sleep $SecondsBetweenAttempts
}
else {
@@ -205,7 +219,7 @@ function Get-Machine-Architecture() {
function Get-CLIArchitecture-From-Architecture([string]$Architecture) {
Say-Invocation $MyInvocation
switch ($Architecture.ToLower()) {
switch ($Architecture.ToLowerInvariant()) {
{ $_ -eq "<auto>" } { return Get-CLIArchitecture-From-Architecture $(Get-Machine-Architecture) }
{ ($_ -eq "amd64") -or ($_ -eq "x64") } { return "x64" }
{ $_ -eq "x86" } { return "x86" }
@@ -215,6 +229,53 @@ function Get-CLIArchitecture-From-Architecture([string]$Architecture) {
}
}
function Get-NormalizedQuality([string]$Quality) {
Say-Invocation $MyInvocation
if ([string]::IsNullOrEmpty($Quality)) {
return ""
}
switch ($Quality) {
{ @("daily", "signed", "validated", "preview") -contains $_ } { return $Quality.ToLowerInvariant() }
#ga quality is available without specifying quality, so normalizing it to empty
{ $_ -eq "ga" } { return "" }
default { throw "'$Quality' is not a supported value for -Quality option. Supported values are: daily, signed, validated, preview, ga. If you think this is a bug, report it at https://github.com/dotnet/install-scripts/issues." }
}
}
function Get-NormalizedChannel([string]$Channel) {
Say-Invocation $MyInvocation
if ([string]::IsNullOrEmpty($Channel)) {
return ""
}
if ($Channel.StartsWith('release/')) {
Say-Warning 'Using branch name with -Channel option is no longer supported with newer releases. Use -Quality option with a channel in X.Y format instead, such as "-Channel 5.0 -Quality Daily."'
}
switch ($Channel) {
{ $_ -eq "lts" } { return "LTS" }
{ $_ -eq "current" } { return "current" }
default { return $Channel.ToLowerInvariant() }
}
}
function Get-NormalizedProduct([string]$Runtime) {
Say-Invocation $MyInvocation
switch ($Runtime) {
{ $_ -eq "dotnet" } { return "dotnet-runtime" }
{ $_ -eq "aspnetcore" } { return "aspnetcore-runtime" }
{ $_ -eq "windowsdesktop" } { return "windowsdesktop-runtime" }
{ [string]::IsNullOrEmpty($_) } { return "dotnet-sdk" }
default { throw "'$Runtime' is not a supported value for -Runtime option, supported values are: dotnet, aspnetcore, windowsdesktop. If you think this is a bug, report it at https://github.com/dotnet/install-scripts/issues." }
}
}
# The version text returned from the feeds is a 1-line or 2-line string:
# For the SDK and the dotnet runtime (2 lines):
# Line 1: # commit_hash
@@ -243,10 +304,11 @@ function Load-Assembly([string] $Assembly) {
}
}
function GetHTTPResponse([Uri] $Uri)
function GetHTTPResponse([Uri] $Uri, [bool]$HeaderOnly, [bool]$DisableRedirect, [bool]$DisableFeedCredential)
{
Invoke-With-Retry(
{
$cts = New-Object System.Threading.CancellationTokenSource
$downloadScript = {
$HttpClient = $null
@@ -270,32 +332,53 @@ function GetHTTPResponse([Uri] $Uri)
}
}
$HttpClientHandler = New-Object System.Net.Http.HttpClientHandler
if($ProxyAddress) {
$HttpClientHandler = New-Object System.Net.Http.HttpClientHandler
$HttpClientHandler.Proxy = New-Object System.Net.WebProxy -Property @{
Address=$ProxyAddress;
UseDefaultCredentials=$ProxyUseDefaultCredentials;
BypassList = $ProxyBypassList;
}
$HttpClient = New-Object System.Net.Http.HttpClient -ArgumentList $HttpClientHandler
}
if ($DisableRedirect)
{
$HttpClientHandler.AllowAutoRedirect = $false
}
$HttpClient = New-Object System.Net.Http.HttpClient -ArgumentList $HttpClientHandler
# Default timeout for HttpClient is 100s. For a 50 MB download this assumes 500 KB/s average, any less will time out
# Defaulting to 20 minutes allows it to work over much slower connections.
$HttpClient.Timeout = New-TimeSpan -Seconds $DownloadTimeout
if ($HeaderOnly){
$completionOption = [System.Net.Http.HttpCompletionOption]::ResponseHeadersRead
}
else {
$HttpClient = New-Object System.Net.Http.HttpClient
$completionOption = [System.Net.Http.HttpCompletionOption]::ResponseContentRead
}
# Default timeout for HttpClient is 100s. For a 50 MB download this assumes 500 KB/s average, any less will time out
# 20 minutes allows it to work over much slower connections.
$HttpClient.Timeout = New-TimeSpan -Minutes 20
$Task = $HttpClient.GetAsync("${Uri}${FeedCredential}").ConfigureAwait("false");
if ($DisableFeedCredential) {
$UriWithCredential = $Uri
}
else {
$UriWithCredential = "${Uri}${FeedCredential}"
}
$Task = $HttpClient.GetAsync("$UriWithCredential", $completionOption).ConfigureAwait("false");
$Response = $Task.GetAwaiter().GetResult();
if (($null -eq $Response) -or (-not ($Response.IsSuccessStatusCode))) {
if (($null -eq $Response) -or ((-not $HeaderOnly) -and (-not ($Response.IsSuccessStatusCode)))) {
# The feed credential is potentially sensitive info. Do not log FeedCredential to console output.
$DownloadException = [System.Exception] "Unable to download $Uri."
if ($null -ne $Response) {
$DownloadException.Data["StatusCode"] = [int] $Response.StatusCode
$DownloadException.Data["ErrorMessage"] = "Unable to download $Uri. Returned HTTP status code: " + $DownloadException.Data["StatusCode"]
if (404 -eq [int] $Response.StatusCode)
{
$cts.Cancel()
}
}
throw $DownloadException
@@ -323,11 +406,22 @@ function GetHTTPResponse([Uri] $Uri)
throw $DownloadException
}
finally {
if ($HttpClient -ne $null) {
if ($null -ne $HttpClient) {
$HttpClient.Dispose()
}
}
})
}
try {
return Invoke-With-Retry $downloadScript $cts.Token
}
finally
{
if ($null -ne $cts)
{
$cts.Dispose()
}
}
}
function Get-Latest-Version-Info([string]$AzureFeed, [string]$Channel) {
@@ -349,6 +443,9 @@ function Get-Latest-Version-Info([string]$AzureFeed, [string]$Channel) {
else {
throw "Invalid value for `$Runtime"
}
Say-Verbose "Constructed latest.version URL: $VersionFileUrl"
try {
$Response = GetHTTPResponse -Uri $VersionFileUrl
}
@@ -411,7 +508,7 @@ function Get-Specific-Version-From-Version([string]$AzureFeed, [string]$Channel,
Say-Invocation $MyInvocation
if (-not $JSonFile) {
if ($Version.ToLower() -eq "latest") {
if ($Version.ToLowerInvariant() -eq "latest") {
$LatestVersionInfo = Get-Latest-Version-Info -AzureFeed $AzureFeed -Channel $Channel
return $LatestVersionInfo.Version
}
@@ -478,57 +575,115 @@ function Get-LegacyDownload-Link([string]$AzureFeed, [string]$SpecificVersion, [
return $PayloadURL
}
function Get-Product-Version([string]$AzureFeed, [string]$SpecificVersion) {
function Get-Product-Version([string]$AzureFeed, [string]$SpecificVersion, [string]$PackageDownloadLink) {
Say-Invocation $MyInvocation
if ($Runtime -eq "dotnet") {
$ProductVersionTxtURL = "$AzureFeed/Runtime/$SpecificVersion/productVersion.txt"
}
elseif ($Runtime -eq "aspnetcore") {
$ProductVersionTxtURL = "$AzureFeed/aspnetcore/Runtime/$SpecificVersion/productVersion.txt"
}
elseif ($Runtime -eq "windowsdesktop") {
# The windows desktop runtime is part of the core runtime layout prior to 5.0
$ProductVersionTxtURL = "$AzureFeed/Runtime/$SpecificVersion/productVersion.txt"
if ($SpecificVersion -match '^(\d+)\.(.*)')
{
$majorVersion = [int]$Matches[1]
if ($majorVersion -ge 5)
{
$ProductVersionTxtURL = "$AzureFeed/WindowsDesktop/$SpecificVersion/productVersion.txt"
# Try to get the version number, using the productVersion.txt file located next to the installer file.
$ProductVersionTxtURLs = (Get-Product-Version-Url $AzureFeed $SpecificVersion $PackageDownloadLink -Flattened $true),
(Get-Product-Version-Url $AzureFeed $SpecificVersion $PackageDownloadLink -Flattened $false)
Foreach ($ProductVersionTxtURL in $ProductVersionTxtURLs) {
Say-Verbose "Checking for the existence of $ProductVersionTxtURL"
try {
$productVersionResponse = GetHTTPResponse($productVersionTxtUrl)
if ($productVersionResponse.StatusCode -eq 200) {
$productVersion = $productVersionResponse.Content.ReadAsStringAsync().Result.Trim()
if ($productVersion -ne $SpecificVersion)
{
Say "Using alternate version $productVersion found in $ProductVersionTxtURL"
}
return $productVersion
}
else {
Say-Verbose "Got StatusCode $($productVersionResponse.StatusCode) when trying to get productVersion.txt at $productVersionTxtUrl."
}
}
}
elseif (-not $Runtime) {
$ProductVersionTxtURL = "$AzureFeed/Sdk/$SpecificVersion/productVersion.txt"
}
else {
throw "Invalid value '$Runtime' specified for `$Runtime"
catch {
Say-Verbose "Could not read productVersion.txt at $productVersionTxtUrl (Exception: '$($_.Exception.Message)'. )"
}
}
Say-Verbose "Checking for existence of $ProductVersionTxtURL"
# Getting the version number with productVersion.txt has failed. Try parsing the download link for a version number.
if ([string]::IsNullOrEmpty($PackageDownloadLink))
{
Say-Verbose "Using the default value '$SpecificVersion' as the product version."
return $SpecificVersion
}
try {
$productVersionResponse = GetHTTPResponse($productVersionTxtUrl)
$productVersion = Get-ProductVersionFromDownloadLink $PackageDownloadLink $SpecificVersion
return $productVersion
}
if ($productVersionResponse.StatusCode -eq 200) {
$productVersion = $productVersionResponse.Content.ReadAsStringAsync().Result.Trim()
if ($productVersion -ne $SpecificVersion)
{
Say "Using alternate version $productVersion found in $ProductVersionTxtURL"
}
function Get-Product-Version-Url([string]$AzureFeed, [string]$SpecificVersion, [string]$PackageDownloadLink, [bool]$Flattened) {
Say-Invocation $MyInvocation
return $productVersion
$majorVersion=$null
if ($SpecificVersion -match '^(\d+)\.(.*)') {
$majorVersion = $Matches[1] -as[int]
}
$pvFileName='productVersion.txt'
if($Flattened) {
if(-not $Runtime) {
$pvFileName='sdk-productVersion.txt'
}
elseif($Runtime -eq "dotnet") {
$pvFileName='runtime-productVersion.txt'
}
else {
Say-Verbose "Got StatusCode $($productVersionResponse.StatusCode) trying to get productVersion.txt at $productVersionTxtUrl, so using default value of $SpecificVersion"
$productVersion = $SpecificVersion
$pvFileName="$Runtime-productVersion.txt"
}
} catch {
Say-Verbose "Could not read productVersion.txt at $productVersionTxtUrl, so using default value of $SpecificVersion (Exception: '$($_.Exception.Message)' )"
$productVersion = $SpecificVersion
}
if ([string]::IsNullOrEmpty($PackageDownloadLink)) {
if ($Runtime -eq "dotnet") {
$ProductVersionTxtURL = "$AzureFeed/Runtime/$SpecificVersion/$pvFileName"
}
elseif ($Runtime -eq "aspnetcore") {
$ProductVersionTxtURL = "$AzureFeed/aspnetcore/Runtime/$SpecificVersion/$pvFileName"
}
elseif ($Runtime -eq "windowsdesktop") {
# The windows desktop runtime is part of the core runtime layout prior to 5.0
$ProductVersionTxtURL = "$AzureFeed/Runtime/$SpecificVersion/$pvFileName"
if ($majorVersion -ne $null -and $majorVersion -ge 5) {
$ProductVersionTxtURL = "$AzureFeed/WindowsDesktop/$SpecificVersion/$pvFileName"
}
}
elseif (-not $Runtime) {
$ProductVersionTxtURL = "$AzureFeed/Sdk/$SpecificVersion/$pvFileName"
}
else {
throw "Invalid value '$Runtime' specified for `$Runtime"
}
}
else {
$ProductVersionTxtURL = $PackageDownloadLink.Substring(0, $PackageDownloadLink.LastIndexOf("/")) + "/$pvFileName"
}
Say-Verbose "Constructed productVersion link: $ProductVersionTxtURL"
return $ProductVersionTxtURL
}
function Get-ProductVersionFromDownloadLink([string]$PackageDownloadLink, [string]$SpecificVersion)
{
Say-Invocation $MyInvocation
#product specific version follows the product name
#for filename 'dotnet-sdk-3.1.404-win-x64.zip': the product version is 3.1.400
$filename = $PackageDownloadLink.Substring($PackageDownloadLink.LastIndexOf("/") + 1)
$filenameParts = $filename.Split('-')
if ($filenameParts.Length -gt 2)
{
$productVersion = $filenameParts[2]
Say-Verbose "Extracted product version '$productVersion' from download link '$PackageDownloadLink'."
}
else {
Say-Verbose "Using the default value '$SpecificVersion' as the product version."
$productVersion = $SpecificVersion
}
return $productVersion
}
@@ -702,15 +857,150 @@ function Prepend-Sdk-InstallRoot-To-Path([string]$InstallRoot, [string]$BinFolde
}
}
function Get-AkaMSDownloadLink([string]$Channel, [string]$Quality, [bool]$Internal, [string]$Product, [string]$Architecture) {
Say-Invocation $MyInvocation
#quality is not supported for LTS or current channel
if (![string]::IsNullOrEmpty($Quality) -and (@("LTS", "current") -contains $Channel)) {
$Quality = ""
Say-Warning "Specifying quality for current or LTS channel is not supported, the quality will be ignored."
}
Say-Verbose "Retrieving primary payload URL from aka.ms link for channel: '$Channel', quality: '$Quality' product: '$Product', os: 'win', architecture: '$Architecture'."
#construct aka.ms link
$akaMsLink = "https://aka.ms/dotnet"
if ($Internal) {
$akaMsLink += "/internal"
}
$akaMsLink += "/$Channel"
if (-not [string]::IsNullOrEmpty($Quality)) {
$akaMsLink +="/$Quality"
}
$akaMsLink +="/$Product-win-$Architecture.zip"
Say-Verbose "Constructed aka.ms link: '$akaMsLink'."
$akaMsDownloadLink=$null
for ($maxRedirections = 9; $maxRedirections -ge 0; $maxRedirections--)
{
#get HTTP response
#do not pass credentials as a part of the $akaMsLink and do not apply credentials in the GetHTTPResponse function
#otherwise the redirect link would have credentials as well
#it would result in applying credentials twice to the resulting link and thus breaking it, and in echoing credentials to the output as a part of redirect link
$Response= GetHTTPResponse -Uri $akaMsLink -HeaderOnly $true -DisableRedirect $true -DisableFeedCredential $true
Say-Verbose "Received response:`n$Response"
if ([string]::IsNullOrEmpty($Response)) {
Say-Verbose "The link '$akaMsLink' is not valid: failed to get redirect location. The resource is not available."
return $null
}
#if HTTP code is 301 (Moved Permanently), the redirect link exists
if ($Response.StatusCode -eq 301)
{
try {
$akaMsDownloadLink = $Response.Headers.GetValues("Location")[0]
if ([string]::IsNullOrEmpty($akaMsDownloadLink)) {
Say-Verbose "The link '$akaMsLink' is not valid: server returned 301 (Moved Permanently), but the headers do not contain the redirect location."
return $null
}
Say-Verbose "The redirect location retrieved: '$akaMsDownloadLink'."
# This may yet be a link to another redirection. Attempt to retrieve the page again.
$akaMsLink = $akaMsDownloadLink
continue
}
catch {
Say-Verbose "The link '$akaMsLink' is not valid: failed to get redirect location."
return $null
}
}
elseif ((($Response.StatusCode -lt 300) -or ($Response.StatusCode -ge 400)) -and (-not [string]::IsNullOrEmpty($akaMsDownloadLink)))
{
# Redirections have ended.
return $akaMsDownloadLink
}
Say-Verbose "The link '$akaMsLink' is not valid: failed to retrieve the redirection location."
return $null
}
Say-Verbose "Aka.ms links have redirected more than the maximum allowed redirections. This may be caused by a cyclic redirection of aka.ms links."
return $null
}
Say "Note that the intended use of this script is for Continuous Integration (CI) scenarios, where:"
Say "- The SDK needs to be installed without user interaction and without admin rights."
Say "- The SDK installation doesn't need to persist across multiple CI runs."
Say "To set up a development environment or to run apps, use installers rather than this script. Visit https://dotnet.microsoft.com/download to get the installer.`r`n"
if ($Internal -and [string]::IsNullOrWhitespace($FeedCredential)) {
$message = "Provide credentials via -FeedCredential parameter."
if ($DryRun) {
Say-Warning "$message"
} else {
throw "$message"
}
}
#FeedCredential should start with "?", for it to be added to the end of the link.
#adding "?" at the beginning of the FeedCredential if needed.
if ((![string]::IsNullOrWhitespace($FeedCredential)) -and ($FeedCredential[0] -ne '?')) {
$FeedCredential = "?" + $FeedCredential
}
$CLIArchitecture = Get-CLIArchitecture-From-Architecture $Architecture
$SpecificVersion = Get-Specific-Version-From-Version -AzureFeed $AzureFeed -Channel $Channel -Version $Version -JSonFile $JSonFile
$DownloadLink, $EffectiveVersion = Get-Download-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
$LegacyDownloadLink = Get-LegacyDownload-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
$NormalizedQuality = Get-NormalizedQuality $Quality
Say-Verbose "Normalized quality: '$NormalizedQuality'"
$NormalizedChannel = Get-NormalizedChannel $Channel
Say-Verbose "Normalized channel: '$NormalizedChannel'"
$NormalizedProduct = Get-NormalizedProduct $Runtime
Say-Verbose "Normalized product: '$NormalizedProduct'"
$DownloadLink = $null
#try to get download location from aka.ms link
#not applicable when exact version is specified via command or json file
if ([string]::IsNullOrEmpty($JSonFile) -and ($Version -eq "latest")) {
$AkaMsDownloadLink = Get-AkaMSDownloadLink -Channel $NormalizedChannel -Quality $NormalizedQuality -Internal $Internal -Product $NormalizedProduct -Architecture $CLIArchitecture
if ([string]::IsNullOrEmpty($AkaMsDownloadLink)){
if (-not [string]::IsNullOrEmpty($NormalizedQuality)) {
# if quality is specified - exit with error - there is no fallback approach
Say-Error "Failed to locate the latest version in the channel '$NormalizedChannel' with '$NormalizedQuality' quality for '$NormalizedProduct', os: 'win', architecture: '$CLIArchitecture'."
Say-Error "Refer to: https://aka.ms/dotnet-os-lifecycle for information on .NET Core support."
throw "aka.ms link resolution failure"
}
Say-Verbose "Falling back to latest.version file approach."
}
else {
Say-Verbose "Retrieved primary named payload URL from aka.ms link: '$AkaMsDownloadLink'."
$DownloadLink = $AkaMsDownloadLink
Say-Verbose "Downloading using legacy url will not be attempted."
$LegacyDownloadLink = $null
#get version from the path
$pathParts = $DownloadLink.Split('/')
if ($pathParts.Length -ge 2) {
$SpecificVersion = $pathParts[$pathParts.Length - 2]
Say-Verbose "Version: '$SpecificVersion'."
}
else {
Say-Error "Failed to extract the version from download link '$DownloadLink'."
}
#retrieve effective (product) version
$EffectiveVersion = Get-Product-Version -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -PackageDownloadLink $DownloadLink
Say-Verbose "Product version: '$EffectiveVersion'."
}
}
if ([string]::IsNullOrEmpty($DownloadLink)) {
$SpecificVersion = Get-Specific-Version-From-Version -AzureFeed $AzureFeed -Channel $Channel -Version $Version -JSonFile $JSonFile
$DownloadLink, $EffectiveVersion = Get-Download-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
$LegacyDownloadLink = Get-LegacyDownload-Link -AzureFeed $AzureFeed -SpecificVersion $SpecificVersion -CLIArchitecture $CLIArchitecture
}
$InstallRoot = Resolve-Installation-Path $InstallDir
Say-Verbose "InstallRoot: $InstallRoot"
@@ -718,9 +1008,9 @@ $ScriptName = $MyInvocation.MyCommand.Name
if ($DryRun) {
Say "Payload URLs:"
Say "Primary named payload URL: $DownloadLink"
Say "Primary named payload URL: ${DownloadLink}"
if ($LegacyDownloadLink) {
Say "Legacy named payload URL: $LegacyDownloadLink"
Say "Legacy named payload URL: ${LegacyDownloadLink}"
}
$RepeatableCommand = ".\$ScriptName -Version `"$SpecificVersion`" -InstallDir `"$InstallRoot`" -Architecture `"$CLIArchitecture`""
if ($Runtime -eq "dotnet") {
@@ -729,11 +1019,20 @@ if ($DryRun) {
elseif ($Runtime -eq "aspnetcore") {
$RepeatableCommand+=" -Runtime `"aspnetcore`""
}
if (-not [string]::IsNullOrEmpty($NormalizedQuality))
{
$RepeatableCommand+=" -Quality `"$NormalizedQuality`""
}
foreach ($key in $MyInvocation.BoundParameters.Keys) {
if (-not (@("Architecture","Channel","DryRun","InstallDir","Runtime","SharedRuntime","Version") -contains $key)) {
if (-not (@("Architecture","Channel","DryRun","InstallDir","Runtime","SharedRuntime","Version","Quality","FeedCredential") -contains $key)) {
$RepeatableCommand+=" -$key `"$($MyInvocation.BoundParameters[$key])`""
}
}
if ($MyInvocation.BoundParameters.Keys -contains "FeedCredential") {
$RepeatableCommand+=" -FeedCredential `"<feedCredential>`""
}
Say "Repeatable invocation: $RepeatableCommand"
if ($SpecificVersion -ne $EffectiveVersion)
{
@@ -779,9 +1078,16 @@ if ($isAssetInstalled) {
New-Item -ItemType Directory -Force -Path $InstallRoot | Out-Null
$installDrive = $((Get-Item $InstallRoot).PSDrive.Name);
$diskInfo = Get-PSDrive -Name $installDrive
if ($diskInfo.Free / 1MB -le 100) {
$installDrive = $((Get-Item $InstallRoot -Force).PSDrive.Name);
$diskInfo = $null
try{
$diskInfo = Get-PSDrive -Name $installDrive
}
catch{
Say-Warning "Failed to check the disk space. Installation will continue, but it may fail if you do not have enough disk space."
}
if ( ($diskInfo -ne $null) -and ($diskInfo.Free / 1MB -le 100)) {
throw "There is not enough disk space on drive ${installDrive}:"
}
@@ -901,43 +1207,44 @@ Prepend-Sdk-InstallRoot-To-Path -InstallRoot $InstallRoot -BinFolderRelativePath
Say "Note that the script does not resolve dependencies during installation."
Say "To check the list of dependencies, go to https://docs.microsoft.com/dotnet/core/install/windows#dependencies"
Say "Installation finished"
# SIG # Begin signature block
# MIIjjwYJKoZIhvcNAQcCoIIjgDCCI3wCAQExDzANBglghkgBZQMEAgEFADB5Bgor
# MIIjhgYJKoZIhvcNAQcCoIIjdzCCI3MCAQExDzANBglghkgBZQMEAgEFADB5Bgor
# BgEEAYI3AgEEoGswaTA0BgorBgEEAYI3AgEeMCYCAwEAAAQQH8w7YFlLCE63JNLG
# KX7zUQIBAAIBAAIBAAIBAAIBADAxMA0GCWCGSAFlAwQCAQUABCCNsnhcJvx/hXmM
# w8KjuvvIMDBFonhg9XJFc1QwfTyH4aCCDYEwggX/MIID56ADAgECAhMzAAABh3IX
# chVZQMcJAAAAAAGHMA0GCSqGSIb3DQEBCwUAMH4xCzAJBgNVBAYTAlVTMRMwEQYD
# KX7zUQIBAAIBAAIBAAIBAAIBADAxMA0GCWCGSAFlAwQCAQUABCDO8obeUp97UA1H
# qy3NSLDwxxLLlEbGxeCzMOcKVT/3eKCCDYEwggX/MIID56ADAgECAhMzAAAB32vw
# LpKnSrTQAAAAAAHfMA0GCSqGSIb3DQEBCwUAMH4xCzAJBgNVBAYTAlVTMRMwEQYD
# VQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNy
# b3NvZnQgQ29ycG9yYXRpb24xKDAmBgNVBAMTH01pY3Jvc29mdCBDb2RlIFNpZ25p
# bmcgUENBIDIwMTEwHhcNMjAwMzA0MTgzOTQ3WhcNMjEwMzAzMTgzOTQ3WjB0MQsw
# bmcgUENBIDIwMTEwHhcNMjAxMjE1MjEzMTQ1WhcNMjExMjAyMjEzMTQ1WjB0MQsw
# CQYDVQQGEwJVUzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9u
# ZDEeMBwGA1UEChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMR4wHAYDVQQDExVNaWNy
# b3NvZnQgQ29ycG9yYXRpb24wggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIB
# AQDOt8kLc7P3T7MKIhouYHewMFmnq8Ayu7FOhZCQabVwBp2VS4WyB2Qe4TQBT8aB
# znANDEPjHKNdPT8Xz5cNali6XHefS8i/WXtF0vSsP8NEv6mBHuA2p1fw2wB/F0dH
# sJ3GfZ5c0sPJjklsiYqPw59xJ54kM91IOgiO2OUzjNAljPibjCWfH7UzQ1TPHc4d
# weils8GEIrbBRb7IWwiObL12jWT4Yh71NQgvJ9Fn6+UhD9x2uk3dLj84vwt1NuFQ
# itKJxIV0fVsRNR3abQVOLqpDugbr0SzNL6o8xzOHL5OXiGGwg6ekiXA1/2XXY7yV
# Fc39tledDtZjSjNbex1zzwSXAgMBAAGjggF+MIIBejAfBgNVHSUEGDAWBgorBgEE
# AYI3TAgBBggrBgEFBQcDAzAdBgNVHQ4EFgQUhov4ZyO96axkJdMjpzu2zVXOJcsw
# AQC2uxlZEACjqfHkuFyoCwfL25ofI9DZWKt4wEj3JBQ48GPt1UsDv834CcoUUPMn
# s/6CtPoaQ4Thy/kbOOg/zJAnrJeiMQqRe2Lsdb/NSI2gXXX9lad1/yPUDOXo4GNw
# PjXq1JZi+HZV91bUr6ZjzePj1g+bepsqd/HC1XScj0fT3aAxLRykJSzExEBmU9eS
# yuOwUuq+CriudQtWGMdJU650v/KmzfM46Y6lo/MCnnpvz3zEL7PMdUdwqj/nYhGG
# 3UVILxX7tAdMbz7LN+6WOIpT1A41rwaoOVnv+8Ua94HwhjZmu1S73yeV7RZZNxoh
# EegJi9YYssXa7UZUUkCCA+KnAgMBAAGjggF+MIIBejAfBgNVHSUEGDAWBgorBgEE
# AYI3TAgBBggrBgEFBQcDAzAdBgNVHQ4EFgQUOPbML8IdkNGtCfMmVPtvI6VZ8+Mw
# UAYDVR0RBEkwR6RFMEMxKTAnBgNVBAsTIE1pY3Jvc29mdCBPcGVyYXRpb25zIFB1
# ZXJ0byBSaWNvMRYwFAYDVQQFEw0yMzAwMTIrNDU4Mzg1MB8GA1UdIwQYMBaAFEhu
# ZXJ0byBSaWNvMRYwFAYDVQQFEw0yMzAwMTIrNDYzMDA5MB8GA1UdIwQYMBaAFEhu
# ZOVQBdOCqhc3NyK1bajKdQKVMFQGA1UdHwRNMEswSaBHoEWGQ2h0dHA6Ly93d3cu
# bWljcm9zb2Z0LmNvbS9wa2lvcHMvY3JsL01pY0NvZFNpZ1BDQTIwMTFfMjAxMS0w
# Ny0wOC5jcmwwYQYIKwYBBQUHAQEEVTBTMFEGCCsGAQUFBzAChkVodHRwOi8vd3d3
# Lm1pY3Jvc29mdC5jb20vcGtpb3BzL2NlcnRzL01pY0NvZFNpZ1BDQTIwMTFfMjAx
# MS0wNy0wOC5jcnQwDAYDVR0TAQH/BAIwADANBgkqhkiG9w0BAQsFAAOCAgEAixmy
# S6E6vprWD9KFNIB9G5zyMuIjZAOuUJ1EK/Vlg6Fb3ZHXjjUwATKIcXbFuFC6Wr4K
# NrU4DY/sBVqmab5AC/je3bpUpjtxpEyqUqtPc30wEg/rO9vmKmqKoLPT37svc2NV
# BmGNl+85qO4fV/w7Cx7J0Bbqk19KcRNdjt6eKoTnTPHBHlVHQIHZpMxacbFOAkJr
# qAVkYZdz7ikNXTxV+GRb36tC4ByMNxE2DF7vFdvaiZP0CVZ5ByJ2gAhXMdK9+usx
# zVk913qKde1OAuWdv+rndqkAIm8fUlRnr4saSCg7cIbUwCCf116wUJ7EuJDg0vHe
# yhnCeHnBbyH3RZkHEi2ofmfgnFISJZDdMAeVZGVOh20Jp50XBzqokpPzeZ6zc1/g
# yILNyiVgE+RPkjnUQshd1f1PMgn3tns2Cz7bJiVUaqEO3n9qRFgy5JuLae6UweGf
# AeOo3dgLZxikKzYs3hDMaEtJq8IP71cX7QXe6lnMmXU/Hdfz2p897Zd+kU+vZvKI
# 3cwLfuVQgK2RZ2z+Kc3K3dRPz2rXycK5XCuRZmvGab/WbrZiC7wJQapgBodltMI5
# GMdFrBg9IeF7/rP4EqVQXeKtevTlZXjpuNhhjuR+2DMt/dWufjXpiW91bo3aH6Ea
# jOALXmoxgltCp1K7hrS6gmsvj94cLRf50QQ4U8Qwggd6MIIFYqADAgECAgphDpDS
# MS0wNy0wOC5jcnQwDAYDVR0TAQH/BAIwADANBgkqhkiG9w0BAQsFAAOCAgEAnnqH
# tDyYUFaVAkvAK0eqq6nhoL95SZQu3RnpZ7tdQ89QR3++7A+4hrr7V4xxmkB5BObS
# 0YK+MALE02atjwWgPdpYQ68WdLGroJZHkbZdgERG+7tETFl3aKF4KpoSaGOskZXp
# TPnCaMo2PXoAMVMGpsQEQswimZq3IQ3nRQfBlJ0PoMMcN/+Pks8ZTL1BoPYsJpok
# t6cql59q6CypZYIwgyJ892HpttybHKg1ZtQLUlSXccRMlugPgEcNZJagPEgPYni4
# b11snjRAgf0dyQ0zI9aLXqTxWUU5pCIFiPT0b2wsxzRqCtyGqpkGM8P9GazO8eao
# mVItCYBcJSByBx/pS0cSYwBBHAZxJODUqxSXoSGDvmTfqUJXntnWkL4okok1FiCD
# Z4jpyXOQunb6egIXvkgQ7jb2uO26Ow0m8RwleDvhOMrnHsupiOPbozKroSa6paFt
# VSh89abUSooR8QdZciemmoFhcWkEwFg4spzvYNP4nIs193261WyTaRMZoceGun7G
# CT2Rl653uUj+F+g94c63AhzSq4khdL4HlFIP2ePv29smfUnHtGq6yYFDLnT0q/Y+
# Di3jwloF8EWkkHRtSuXlFUbTmwr/lDDgbpZiKhLS7CBTDj32I0L5i532+uHczw82
# oZDmYmYmIUSMbZOgS65h797rj5JJ6OkeEUJoAVwwggd6MIIFYqADAgECAgphDpDS
# AAAAAAADMA0GCSqGSIb3DQEBCwUAMIGIMQswCQYDVQQGEwJVUzETMBEGA1UECBMK
# V2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0
# IENvcnBvcmF0aW9uMTIwMAYDVQQDEylNaWNyb3NvZnQgUm9vdCBDZXJ0aWZpY2F0
@@ -977,119 +1284,119 @@ Say "Installation finished"
# xw4o7t5lL+yX9qFcltgA1qFGvVnzl6UJS0gQmYAf0AApxbGbpT9Fdx41xtKiop96
# eiL6SJUfq/tHI4D1nvi/a7dLl+LrdXga7Oo3mXkYS//WsyNodeav+vyL6wuA6mk7
# r/ww7QRMjt/fdW1jkT3RnVZOT7+AVyKheBEyIXrvQQqxP/uozKRdwaGIm1dxVk5I
# RcBCyZt2WwqASGv9eZ/BvW1taslScxMNelDNMYIVZDCCFWACAQEwgZUwfjELMAkG
# RcBCyZt2WwqASGv9eZ/BvW1taslScxMNelDNMYIVWzCCFVcCAQEwgZUwfjELMAkG
# A1UEBhMCVVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQx
# HjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEoMCYGA1UEAxMfTWljcm9z
# b2Z0IENvZGUgU2lnbmluZyBQQ0EgMjAxMQITMwAAAYdyF3IVWUDHCQAAAAABhzAN
# b2Z0IENvZGUgU2lnbmluZyBQQ0EgMjAxMQITMwAAAd9r8C6Sp0q00AAAAAAB3zAN
# BglghkgBZQMEAgEFAKCBrjAZBgkqhkiG9w0BCQMxDAYKKwYBBAGCNwIBBDAcBgor
# BgEEAYI3AgELMQ4wDAYKKwYBBAGCNwIBFTAvBgkqhkiG9w0BCQQxIgQgpT/bxWwe
# aW0EinKMWCAzDXUjwXkIHldYzR6lw4/1Pc0wQgYKKwYBBAGCNwIBDDE0MDKgFIAS
# BgEEAYI3AgELMQ4wDAYKKwYBBAGCNwIBFTAvBgkqhkiG9w0BCQQxIgQg7EExOCvj
# ZDkub0Fc7HAVivJIlz+Omt11fROkTFKR+KQwQgYKKwYBBAGCNwIBDDE0MDKgFIAS
# AE0AaQBjAHIAbwBzAG8AZgB0oRqAGGh0dHA6Ly93d3cubWljcm9zb2Z0LmNvbTAN
# BgkqhkiG9w0BAQEFAASCAQCHd7sSQVq0YDg8QDx6/kLWn3s6jtvvIDCCgsO9spHM
# quPd4FPbG67DCsKDClekQs52qrtRO3Zo+JMnCw4j3bS+gZHzeJr2shbftOrpsFoD
# l7OPcUmtrqul9dkQCOp8t0MP3ls0n96/YyNy6lz4BAlTdkdDx957uAxalKaCIBzb
# R9QyppOKIfNFvwD4EI5KI6tpmSy/uH8SrRg7ZExAYZl6J6R18WkL7KHn649lPoAQ
# ujwrIXH10xOJops45ILGzKWQcHmCzLJGYapL4VHUuK+73nT+9ZROGHdk/PyvIcdw
# iERa+C06v305t3DA+CuHFy1tvyw7IFF6RVbLZPwxrJjToYIS7jCCEuoGCisGAQQB
# gjcDAwExghLaMIIS1gYJKoZIhvcNAQcCoIISxzCCEsMCAQMxDzANBglghkgBZQME
# AgEFADCCAVUGCyqGSIb3DQEJEAEEoIIBRASCAUAwggE8AgEBBgorBgEEAYRZCgMB
# MDEwDQYJYIZIAWUDBAIBBQAEIOCaTmvM1AP0WaEVqzKaaCu/R+bTlR4kCrM/ZXsb
# /eNOAgZgGeLsMwsYEzIwMjEwMjAzMjExNzQ5LjU5MVowBIACAfSggdSkgdEwgc4x
# BgkqhkiG9w0BAQEFAASCAQBPvflyU+HQEIM4XvImAK2eQp74ABDaxeWVbl21ZF0L
# cKf5GkIzdvlCoKxforXBkRIBtLxU9Sb4ff+AO3jiw2M36ZLwZQRYcn003FTR3BAY
# 3XVGdxb64vCEMUb6Dzh9Hb1lmJjyn2NHsB514ZuUez4v/UIsV9e3TV731kY6gcQV
# 3ridn3IKj3RMJGtPD5qxVMZohvVIy1Xiz4n/9BpBkk4HHsSuGPmBj08jWnIcI5TF
# eXYC4LonTOFZLcsKiW4eMRaDt0fXxJjhQF8DSJX3PAxCC73YkPrJJyXAkGNHcXCg
# 2Lyi9Yd6oV0nbJlCSLWK7jcn5lxUIgsrb+U6F8yXoNsXoYIS5TCCEuEGCisGAQQB
# gjcDAwExghLRMIISzQYJKoZIhvcNAQcCoIISvjCCEroCAQMxDzANBglghkgBZQME
# AgEFADCCAVEGCyqGSIb3DQEJEAEEoIIBQASCATwwggE4AgEBBgorBgEEAYRZCgMB
# MDEwDQYJYIZIAWUDBAIBBQAEIEwFz8qgqbYKKOXyaZMOTE0/ve/rpdJjnq1eyDVG
# mWB6AgZhHpsDT9wYEzIwMjEwODIzMTUxNzA2LjAyMVowBIACAfSggdCkgc0wgcox
# CzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRt
# b25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKTAnBgNVBAsTIE1p
# Y3Jvc29mdCBPcGVyYXRpb25zIFB1ZXJ0byBSaWNvMSYwJAYDVQQLEx1UaGFsZXMg
# VFNTIEVTTjo4OTdBLUUzNTYtMTcwMTElMCMGA1UEAxMcTWljcm9zb2Z0IFRpbWUt
# U3RhbXAgU2VydmljZaCCDkEwggT1MIID3aADAgECAhMzAAABLCKvRZd1+RvuAAAA
# AAEsMA0GCSqGSIb3DQEBCwUAMHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNo
# aW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29y
# cG9yYXRpb24xJjAkBgNVBAMTHU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEw
# MB4XDTE5MTIxOTAxMTUwM1oXDTIxMDMxNzAxMTUwM1owgc4xCzAJBgNVBAYTAlVT
# MRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQK
# ExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKTAnBgNVBAsTIE1pY3Jvc29mdCBPcGVy
# YXRpb25zIFB1ZXJ0byBSaWNvMSYwJAYDVQQLEx1UaGFsZXMgVFNTIEVTTjo4OTdB
# LUUzNTYtMTcwMTElMCMGA1UEAxMcTWljcm9zb2Z0IFRpbWUtU3RhbXAgU2Vydmlj
# ZTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAPK1zgSSq+MxAYo3qpCt
# QDxSMPPJy6mm/wfEJNjNUnYtLFBwl1BUS5trEk/t41ldxITKehs+ABxYqo4Qxsg3
# Gy1ugKiwHAnYiiekfC+ZhptNFgtnDZIn45zC0AlVr/6UfLtsLcHCh1XElLUHfEC0
# nBuQcM/SpYo9e3l1qY5NdMgDGxCsmCKdiZfYXIu+U0UYIBhdzmSHnB3fxZOBVcr5
# htFHEBBNt/rFJlm/A4yb8oBsp+Uf0p5QwmO/bCcdqB15JpylOhZmWs0sUfJKlK9E
# rAhBwGki2eIRFKsQBdkXS9PWpF1w2gIJRvSkDEaCf+lbGTPdSzHSbfREWOF9wY3i
# Yj8CAwEAAaOCARswggEXMB0GA1UdDgQWBBRRahZSGfrCQhCyIyGH9DkiaW7L0zAf
# BgNVHSMEGDAWgBTVYzpcijGQ80N7fEYbxTNoWoVtVTBWBgNVHR8ETzBNMEugSaBH
# hkVodHRwOi8vY3JsLm1pY3Jvc29mdC5jb20vcGtpL2NybC9wcm9kdWN0cy9NaWNU
# aW1TdGFQQ0FfMjAxMC0wNy0wMS5jcmwwWgYIKwYBBQUHAQEETjBMMEoGCCsGAQUF
# BzAChj5odHRwOi8vd3d3Lm1pY3Jvc29mdC5jb20vcGtpL2NlcnRzL01pY1RpbVN0
# YVBDQV8yMDEwLTA3LTAxLmNydDAMBgNVHRMBAf8EAjAAMBMGA1UdJQQMMAoGCCsG
# AQUFBwMIMA0GCSqGSIb3DQEBCwUAA4IBAQBPFxHIwi4vAH49w9Svmz6K3tM55RlW
# 5pPeULXdut2Rqy6Ys0+VpZsbuaEoxs6Z1C3hMbkiqZFxxyltxJpuHTyGTg61zfNI
# F5n6RsYF3s7IElDXNfZznF1/2iWc6uRPZK8rxxUJ/7emYXZCYwuUY0XjsCpP9pbR
# RKeJi6r5arSyI+NfKxvgoM21JNt1BcdlXuAecdd/k8UjxCscffanoK2n6LFw1PcZ
# lEO7NId7o+soM2C0QY5BYdghpn7uqopB6ixyFIIkDXFub+1E7GmAEwfU6VwEHL7y
# 9rNE8bd+JrQs+yAtkkHy9FmXg/PsGq1daVzX1So7CJ6nyphpuHSN3VfTMIIGcTCC
# BFmgAwIBAgIKYQmBKgAAAAAAAjANBgkqhkiG9w0BAQsFADCBiDELMAkGA1UEBhMC
# VVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNV
# BAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEyMDAGA1UEAxMpTWljcm9zb2Z0IFJv
# b3QgQ2VydGlmaWNhdGUgQXV0aG9yaXR5IDIwMTAwHhcNMTAwNzAxMjEzNjU1WhcN
# MjUwNzAxMjE0NjU1WjB8MQswCQYDVQQGEwJVUzETMBEGA1UECBMKV2FzaGluZ3Rv
# bjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0IENvcnBvcmF0
# aW9uMSYwJAYDVQQDEx1NaWNyb3NvZnQgVGltZS1TdGFtcCBQQ0EgMjAxMDCCASIw
# DQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKkdDbx3EYo6IOz8E5f1+n9plGt0
# VBDVpQoAgoX77XxoSyxfxcPlYcJ2tz5mK1vwFVMnBDEfQRsalR3OCROOfGEwWbEw
# RA/xYIiEVEMM1024OAizQt2TrNZzMFcmgqNFDdDq9UeBzb8kYDJYYEbyWEeGMoQe
# dGFnkV+BVLHPk0ySwcSmXdFhE24oxhr5hoC732H8RsEnHSRnEnIaIYqvS2SJUGKx
# Xf13Hz3wV3WsvYpCTUBR0Q+cBj5nf/VmwAOWRH7v0Ev9buWayrGo8noqCjHw2k4G
# kbaICDXoeByw6ZnNPOcvRLqn9NxkvaQBwSAJk3jN/LzAyURdXhacAQVPIk0CAwEA
# AaOCAeYwggHiMBAGCSsGAQQBgjcVAQQDAgEAMB0GA1UdDgQWBBTVYzpcijGQ80N7
# fEYbxTNoWoVtVTAZBgkrBgEEAYI3FAIEDB4KAFMAdQBiAEMAQTALBgNVHQ8EBAMC
# AYYwDwYDVR0TAQH/BAUwAwEB/zAfBgNVHSMEGDAWgBTV9lbLj+iiXGJo0T2UkFvX
# zpoYxDBWBgNVHR8ETzBNMEugSaBHhkVodHRwOi8vY3JsLm1pY3Jvc29mdC5jb20v
# cGtpL2NybC9wcm9kdWN0cy9NaWNSb29DZXJBdXRfMjAxMC0wNi0yMy5jcmwwWgYI
# KwYBBQUHAQEETjBMMEoGCCsGAQUFBzAChj5odHRwOi8vd3d3Lm1pY3Jvc29mdC5j
# b20vcGtpL2NlcnRzL01pY1Jvb0NlckF1dF8yMDEwLTA2LTIzLmNydDCBoAYDVR0g
# AQH/BIGVMIGSMIGPBgkrBgEEAYI3LgMwgYEwPQYIKwYBBQUHAgEWMWh0dHA6Ly93
# d3cubWljcm9zb2Z0LmNvbS9QS0kvZG9jcy9DUFMvZGVmYXVsdC5odG0wQAYIKwYB
# BQUHAgIwNB4yIB0ATABlAGcAYQBsAF8AUABvAGwAaQBjAHkAXwBTAHQAYQB0AGUA
# bQBlAG4AdAAuIB0wDQYJKoZIhvcNAQELBQADggIBAAfmiFEN4sbgmD+BcQM9naOh
# IW+z66bM9TG+zwXiqf76V20ZMLPCxWbJat/15/B4vceoniXj+bzta1RXCCtRgkQS
# +7lTjMz0YBKKdsxAQEGb3FwX/1z5Xhc1mCRWS3TvQhDIr79/xn/yN31aPxzymXlK
# kVIArzgPF/UveYFl2am1a+THzvbKegBvSzBEJCI8z+0DpZaPWSm8tv0E4XCfMkon
# /VWvL/625Y4zu2JfmttXQOnxzplmkIz/amJ/3cVKC5Em4jnsGUpxY517IW3DnKOi
# PPp/fZZqkHimbdLhnPkd/DjYlPTGpQqWhqS9nhquBEKDuLWAmyI4ILUl5WTs9/S/
# fmNZJQ96LjlXdqJxqgaKD4kWumGnEcua2A5HmoDF0M2n0O99g/DhO3EJ3110mCII
# YdqwUB5vvfHhAN/nMQekkzr3ZUd46PioSKv33nJ+YWtvd6mBy6cJrDm77MbL2IK0
# cs0d9LiFAR6A+xuJKlQ5slvayA1VmXqHczsI5pgt6o3gMy4SKfXAL1QnIffIrE7a
# KLixqduWsqdCosnPGUFN4Ib5KpqjEWYw07t0MkvfY3v1mYovG8chr1m1rtxEPJdQ
# cdeh0sVV42neV8HR3jDA/czmTfsNv11P6Z0eGTgvvM9YBS7vDaBQNdrvCScc1bN+
# NR4Iuto229Nfj950iEkSoYICzzCCAjgCAQEwgfyhgdSkgdEwgc4xCzAJBgNVBAYT
# AlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYD
# VQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xKTAnBgNVBAsTIE1pY3Jvc29mdCBP
# cGVyYXRpb25zIFB1ZXJ0byBSaWNvMSYwJAYDVQQLEx1UaGFsZXMgVFNTIEVTTjo4
# OTdBLUUzNTYtMTcwMTElMCMGA1UEAxMcTWljcm9zb2Z0IFRpbWUtU3RhbXAgU2Vy
# dmljZaIjCgEBMAcGBSsOAwIaAxUADE5OKSMoNx/mYxYWap1RTOohbJ2ggYMwgYCk
# fjB8MQswCQYDVQQGEwJVUzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMH
# UmVkbW9uZDEeMBwGA1UEChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMSYwJAYDVQQD
# Ex1NaWNyb3NvZnQgVGltZS1TdGFtcCBQQ0EgMjAxMDANBgkqhkiG9w0BAQUFAAIF
# AOPFChkwIhgPMjAyMTAyMDMxNTQwMDlaGA8yMDIxMDIwNDE1NDAwOVowdDA6Bgor
# BgEEAYRZCgQBMSwwKjAKAgUA48UKGQIBADAHAgEAAgIXmDAHAgEAAgIRyTAKAgUA
# 48ZbmQIBADA2BgorBgEEAYRZCgQCMSgwJjAMBgorBgEEAYRZCgMCoAowCAIBAAID
# B6EgoQowCAIBAAIDAYagMA0GCSqGSIb3DQEBBQUAA4GBAHeeznL2n6HWCjHH94Fl
# hcdW6TEXzq4XNgp1Gx1W9F8gJ4x+SwoV7elJZkwgGffcpHomLvIY/VSuzsl1NgtJ
# TWM2UxoqSv58BBOrl4eGhH6kkg8Ucy2tdeK5T8cHa8pMkq2j9pFd2mRG/6VMk0dl
# Xz7Uy3Z6bZqkcABMyAfuAaGbMYIDDTCCAwkCAQEwgZMwfDELMAkGA1UEBhMCVVMx
# EzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNVBAoT
# FU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEmMCQGA1UEAxMdTWljcm9zb2Z0IFRpbWUt
# U3RhbXAgUENBIDIwMTACEzMAAAEsIq9Fl3X5G+4AAAAAASwwDQYJYIZIAWUDBAIB
# BQCgggFKMBoGCSqGSIb3DQEJAzENBgsqhkiG9w0BCRABBDAvBgkqhkiG9w0BCQQx
# IgQg/QYv7yp+354WTjWUIsXWndTEzXjaYjqwYjcBxCJKjdUwgfoGCyqGSIb3DQEJ
# EAIvMYHqMIHnMIHkMIG9BCBbn/0uFFh42hTM5XOoKdXevBaiSxmYK9Ilcn9nu5ZH
# 4TCBmDCBgKR+MHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAw
# DgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24x
# JjAkBgNVBAMTHU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEwAhMzAAABLCKv
# RZd1+RvuAAAAAAEsMCIEIIfIM3YbzHswb/Kj/qq1l1cHA6QBl+gEXYanUNJomrpT
# MA0GCSqGSIb3DQEBCwUABIIBAAwdcXssUZGO7ho5+NHLjIxLtQk543aKGo+lrRMY
# Q9abE1h/AaaNJl0iGxX4IihNWyfovSfYL3L4eODUBAu68tWSxeceRfWNsb/ZZfUi
# v89hpLssI/Gf1BEgNMA4zCuIGQiC8okusVumEpAhhvCEbSiTTTtBdolTnU/CAKui
# oxaU3R9XkKh1F4oAM26+dJ1J2BLQXPs5afNvvedDsZWNQUPK1sFF3JRfzxiTrwBW
# EJRyflev9gyDoqCHzippgb+6+eti1WTkcA9Q49GIT11S6LOAVqkSC9N7Nqf8ksh8
# ARdwT8jigpsm+mj7lrVU9upDkhVYhKeO8oiZq95Q53Zkteo=
# b25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xJTAjBgNVBAsTHE1p
# Y3Jvc29mdCBBbWVyaWNhIE9wZXJhdGlvbnMxJjAkBgNVBAsTHVRoYWxlcyBUU1Mg
# RVNOOjhBODItRTM0Ri05RERBMSUwIwYDVQQDExxNaWNyb3NvZnQgVGltZS1TdGFt
# cCBTZXJ2aWNloIIOPDCCBPEwggPZoAMCAQICEzMAAAFLT7KmSNXkwlEAAAAAAUsw
# DQYJKoZIhvcNAQELBQAwfDELMAkGA1UEBhMCVVMxEzARBgNVBAgTCldhc2hpbmd0
# b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3Jh
# dGlvbjEmMCQGA1UEAxMdTWljcm9zb2Z0IFRpbWUtU3RhbXAgUENBIDIwMTAwHhcN
# MjAxMTEyMTgyNTU5WhcNMjIwMjExMTgyNTU5WjCByjELMAkGA1UEBhMCVVMxEzAR
# BgNVBAgTCldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNVBAoTFU1p
# Y3Jvc29mdCBDb3Jwb3JhdGlvbjElMCMGA1UECxMcTWljcm9zb2Z0IEFtZXJpY2Eg
# T3BlcmF0aW9uczEmMCQGA1UECxMdVGhhbGVzIFRTUyBFU046OEE4Mi1FMzRGLTlE
# REExJTAjBgNVBAMTHE1pY3Jvc29mdCBUaW1lLVN0YW1wIFNlcnZpY2UwggEiMA0G
# CSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQChNnpQx3YuJr/ivobPoLtpQ9egUFl8
# THdWZ6SAKIJdtP3L24D3/d63ommmjZjCyrQm+j/1tHDAwjQGuOwYvn79ecPCQfAB
# 91JnEp/wP4BMF2SXyMf8k9R84RthIdfGHPXTWqzpCCfNWolVEcUVm8Ad/r1LrikR
# O+4KKo6slDQJKsgKApfBU/9J7Rudvhw1rEQw0Nk1BRGWjrIp7/uWoUIfR4rcl6U1
# utOiYIonC87PPpAJQXGRsDdKnVFF4NpWvMiyeuksn5t/Otwz82sGlne/HNQpmMzi
# gR8cZ8eXEDJJNIZxov9WAHHj28gUE29D8ivAT706ihxvTv50ZY8W51uxAgMBAAGj
# ggEbMIIBFzAdBgNVHQ4EFgQUUqpqftASlue6K3LePlTTn01K68YwHwYDVR0jBBgw
# FoAU1WM6XIoxkPNDe3xGG8UzaFqFbVUwVgYDVR0fBE8wTTBLoEmgR4ZFaHR0cDov
# L2NybC5taWNyb3NvZnQuY29tL3BraS9jcmwvcHJvZHVjdHMvTWljVGltU3RhUENB
# XzIwMTAtMDctMDEuY3JsMFoGCCsGAQUFBwEBBE4wTDBKBggrBgEFBQcwAoY+aHR0
# cDovL3d3dy5taWNyb3NvZnQuY29tL3BraS9jZXJ0cy9NaWNUaW1TdGFQQ0FfMjAx
# MC0wNy0wMS5jcnQwDAYDVR0TAQH/BAIwADATBgNVHSUEDDAKBggrBgEFBQcDCDAN
# BgkqhkiG9w0BAQsFAAOCAQEAFtq51Zc/O1AfJK4tEB2Nr8bGEVD5qQ8l8gXIQMrM
# ZYtddHH+cGiqgF/4GmvmPfl5FAYh+gf/8Yd3q4/iD2+K4LtJbs/3v6mpyBl1mQ4v
# usK65dAypWmiT1W3FiXjsmCIkjSDDsKLFBYH5yGFnNFOEMgL+O7u4osH42f80nc2
# WdnZV6+OvW035XPV6ZttUBfFWHdIbUkdOG1O2n4yJm10OfacItZ08fzgMMqE+f/S
# TgVWNCHbR2EYqTWayrGP69jMwtVD9BGGTWti1XjpvE6yKdO8H9nuRi3L+C6jYntf
# aEmBTbnTFEV+kRx1CNcpSb9os86CAUehZU1aRzQ6CQ/pjzCCBnEwggRZoAMCAQIC
# CmEJgSoAAAAAAAIwDQYJKoZIhvcNAQELBQAwgYgxCzAJBgNVBAYTAlVTMRMwEQYD
# VQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdSZWRtb25kMR4wHAYDVQQKExVNaWNy
# b3NvZnQgQ29ycG9yYXRpb24xMjAwBgNVBAMTKU1pY3Jvc29mdCBSb290IENlcnRp
# ZmljYXRlIEF1dGhvcml0eSAyMDEwMB4XDTEwMDcwMTIxMzY1NVoXDTI1MDcwMTIx
# NDY1NVowfDELMAkGA1UEBhMCVVMxEzARBgNVBAgTCldhc2hpbmd0b24xEDAOBgNV
# BAcTB1JlZG1vbmQxHjAcBgNVBAoTFU1pY3Jvc29mdCBDb3Jwb3JhdGlvbjEmMCQG
# A1UEAxMdTWljcm9zb2Z0IFRpbWUtU3RhbXAgUENBIDIwMTAwggEiMA0GCSqGSIb3
# DQEBAQUAA4IBDwAwggEKAoIBAQCpHQ28dxGKOiDs/BOX9fp/aZRrdFQQ1aUKAIKF
# ++18aEssX8XD5WHCdrc+Zitb8BVTJwQxH0EbGpUdzgkTjnxhMFmxMEQP8WCIhFRD
# DNdNuDgIs0Ldk6zWczBXJoKjRQ3Q6vVHgc2/JGAyWGBG8lhHhjKEHnRhZ5FfgVSx
# z5NMksHEpl3RYRNuKMYa+YaAu99h/EbBJx0kZxJyGiGKr0tkiVBisV39dx898Fd1
# rL2KQk1AUdEPnAY+Z3/1ZsADlkR+79BL/W7lmsqxqPJ6Kgox8NpOBpG2iAg16Hgc
# sOmZzTznL0S6p/TcZL2kAcEgCZN4zfy8wMlEXV4WnAEFTyJNAgMBAAGjggHmMIIB
# 4jAQBgkrBgEEAYI3FQEEAwIBADAdBgNVHQ4EFgQU1WM6XIoxkPNDe3xGG8UzaFqF
# bVUwGQYJKwYBBAGCNxQCBAweCgBTAHUAYgBDAEEwCwYDVR0PBAQDAgGGMA8GA1Ud
# EwEB/wQFMAMBAf8wHwYDVR0jBBgwFoAU1fZWy4/oolxiaNE9lJBb186aGMQwVgYD
# VR0fBE8wTTBLoEmgR4ZFaHR0cDovL2NybC5taWNyb3NvZnQuY29tL3BraS9jcmwv
# cHJvZHVjdHMvTWljUm9vQ2VyQXV0XzIwMTAtMDYtMjMuY3JsMFoGCCsGAQUFBwEB
# BE4wTDBKBggrBgEFBQcwAoY+aHR0cDovL3d3dy5taWNyb3NvZnQuY29tL3BraS9j
# ZXJ0cy9NaWNSb29DZXJBdXRfMjAxMC0wNi0yMy5jcnQwgaAGA1UdIAEB/wSBlTCB
# kjCBjwYJKwYBBAGCNy4DMIGBMD0GCCsGAQUFBwIBFjFodHRwOi8vd3d3Lm1pY3Jv
# c29mdC5jb20vUEtJL2RvY3MvQ1BTL2RlZmF1bHQuaHRtMEAGCCsGAQUFBwICMDQe
# MiAdAEwAZQBnAGEAbABfAFAAbwBsAGkAYwB5AF8AUwB0AGEAdABlAG0AZQBuAHQA
# LiAdMA0GCSqGSIb3DQEBCwUAA4ICAQAH5ohRDeLG4Jg/gXEDPZ2joSFvs+umzPUx
# vs8F4qn++ldtGTCzwsVmyWrf9efweL3HqJ4l4/m87WtUVwgrUYJEEvu5U4zM9GAS
# inbMQEBBm9xcF/9c+V4XNZgkVkt070IQyK+/f8Z/8jd9Wj8c8pl5SpFSAK84Dxf1
# L3mBZdmptWvkx872ynoAb0swRCQiPM/tA6WWj1kpvLb9BOFwnzJKJ/1Vry/+tuWO
# M7tiX5rbV0Dp8c6ZZpCM/2pif93FSguRJuI57BlKcWOdeyFtw5yjojz6f32WapB4
# pm3S4Zz5Hfw42JT0xqUKloakvZ4argRCg7i1gJsiOCC1JeVk7Pf0v35jWSUPei45
# V3aicaoGig+JFrphpxHLmtgOR5qAxdDNp9DvfYPw4TtxCd9ddJgiCGHasFAeb73x
# 4QDf5zEHpJM692VHeOj4qEir995yfmFrb3epgcunCaw5u+zGy9iCtHLNHfS4hQEe
# gPsbiSpUObJb2sgNVZl6h3M7COaYLeqN4DMuEin1wC9UJyH3yKxO2ii4sanblrKn
# QqLJzxlBTeCG+SqaoxFmMNO7dDJL32N79ZmKLxvHIa9Zta7cRDyXUHHXodLFVeNp
# 3lfB0d4wwP3M5k37Db9dT+mdHhk4L7zPWAUu7w2gUDXa7wknHNWzfjUeCLraNtvT
# X4/edIhJEqGCAs4wggI3AgEBMIH4oYHQpIHNMIHKMQswCQYDVQQGEwJVUzETMBEG
# A1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwGA1UEChMVTWlj
# cm9zb2Z0IENvcnBvcmF0aW9uMSUwIwYDVQQLExxNaWNyb3NvZnQgQW1lcmljYSBP
# cGVyYXRpb25zMSYwJAYDVQQLEx1UaGFsZXMgVFNTIEVTTjo4QTgyLUUzNEYtOURE
# QTElMCMGA1UEAxMcTWljcm9zb2Z0IFRpbWUtU3RhbXAgU2VydmljZaIjCgEBMAcG
# BSsOAwIaAxUAkToz97fseHxNOUSQ5O/bBVSF+e6ggYMwgYCkfjB8MQswCQYDVQQG
# EwJVUzETMBEGA1UECBMKV2FzaGluZ3RvbjEQMA4GA1UEBxMHUmVkbW9uZDEeMBwG
# A1UEChMVTWljcm9zb2Z0IENvcnBvcmF0aW9uMSYwJAYDVQQDEx1NaWNyb3NvZnQg
# VGltZS1TdGFtcCBQQ0EgMjAxMDANBgkqhkiG9w0BAQUFAAIFAOTNtrcwIhgPMjAy
# MTA4MjMxMzU1MDNaGA8yMDIxMDgyNDEzNTUwM1owdzA9BgorBgEEAYRZCgQBMS8w
# LTAKAgUA5M22twIBADAKAgEAAgIcfAIB/zAHAgEAAgIQOTAKAgUA5M8INwIBADA2
# BgorBgEEAYRZCgQCMSgwJjAMBgorBgEEAYRZCgMCoAowCAIBAAIDB6EgoQowCAIB
# AAIDAYagMA0GCSqGSIb3DQEBBQUAA4GBADrUnASEIEovjIFdqxhyYsyiBSfNeD7e
# W4qeRl1B4DTAY8Z/U1SpnK8yy7r6oZn0D8og+QJmnWEnCPqNXAreyg8tlFyTF1TV
# K6uUlslIY4WCmrAECIFua/DuksTerny66SBw4aSEeQHtGiynnR87WcdqItBJxxRA
# pElXzs8QzC0EMYIDDTCCAwkCAQEwgZMwfDELMAkGA1UEBhMCVVMxEzARBgNVBAgT
# Cldhc2hpbmd0b24xEDAOBgNVBAcTB1JlZG1vbmQxHjAcBgNVBAoTFU1pY3Jvc29m
# dCBDb3Jwb3JhdGlvbjEmMCQGA1UEAxMdTWljcm9zb2Z0IFRpbWUtU3RhbXAgUENB
# IDIwMTACEzMAAAFLT7KmSNXkwlEAAAAAAUswDQYJYIZIAWUDBAIBBQCgggFKMBoG
# CSqGSIb3DQEJAzENBgsqhkiG9w0BCRABBDAvBgkqhkiG9w0BCQQxIgQg87M/Jiqh
# MlaXNWAIAn7CM9CZw6QCPgBx3voUSIH6+qUwgfoGCyqGSIb3DQEJEAIvMYHqMIHn
# MIHkMIG9BCBr9u6EInnsZYEts/Fj/rIFv0YZW1ynhXKOP2hVPUU5IzCBmDCBgKR+
# MHwxCzAJBgNVBAYTAlVTMRMwEQYDVQQIEwpXYXNoaW5ndG9uMRAwDgYDVQQHEwdS
# ZWRtb25kMR4wHAYDVQQKExVNaWNyb3NvZnQgQ29ycG9yYXRpb24xJjAkBgNVBAMT
# HU1pY3Jvc29mdCBUaW1lLVN0YW1wIFBDQSAyMDEwAhMzAAABS0+ypkjV5MJRAAAA
# AAFLMCIEIGs9Mn51TUGRleeUO5lQCRHZSu0G/8Ir92l3NgMGq1/GMA0GCSqGSIb3
# DQEBCwUABIIBAA1oBIiM3guh89/Pvwwcn4hdZGGBsMk56MRMvOir8XQM0Je/rSwP
# P11S2VwYUMmBjfFX1QK7r280k3k9xcDWW79ZC4Kn2UsV7jxuUwFHgdVlgDNjg6QL
# unGuwFX5lStgNevwUH9DotoASXxgtwNNmtZNDB4NYVCxQLS7RtVSbxP+xFYXbokG
# G4DFVHBr3CGPga3OduE/YUBpxIRxtiDF/9WHo0tJEdS95tfdnsyaGo8eHNzyfTdS
# kjrNsAlqpJRT0Va/ooBXCV7Uny7PlCkRIQRTSJDy/kSXpsL22CTEHw6mU0jEU99k
# OL4uwu+aVnolWFlpbiJjQxBqfT3plBLIg88=
# SIG # End signature block

View File

@@ -24,7 +24,7 @@ exec 3>&1
# See if stdout is a terminal
if [ -t 1 ] && command -v tput > /dev/null; then
# see if it supports colors
ncolors=$(tput colors)
ncolors=$(tput colors || echo 0)
if [ -n "$ncolors" ] && [ $ncolors -ge 8 ]; then
bold="$(tput bold || echo)"
normal="$(tput sgr0 || echo)"
@@ -224,7 +224,7 @@ get_legacy_os_name() {
machine_has() {
eval $invocation
hash "$1" > /dev/null 2>&1
command -v "$1" > /dev/null 2>&1
return $?
}
@@ -368,6 +368,75 @@ get_normalized_os() {
return 0
}
# args:
# quality - $1
get_normalized_quality() {
eval $invocation
local quality="$(to_lowercase "$1")"
if [ ! -z "$quality" ]; then
case "$quality" in
daily | signed | validated | preview)
echo "$quality"
return 0
;;
ga)
#ga quality is available without specifying quality, so normalizing it to empty
return 0
;;
*)
say_err "'$quality' is not a supported value for --quality option. Supported values are: daily, signed, validated, preview, ga. If you think this is a bug, report it at https://github.com/dotnet/install-scripts/issues."
return 1
;;
esac
fi
return 0
}
# args:
# channel - $1
get_normalized_channel() {
eval $invocation
local channel="$(to_lowercase "$1")"
if [[ $channel == release/* ]]; then
say_warning 'Using branch name with -Channel option is no longer supported with newer releases. Use -Quality option with a channel in X.Y format instead.';
fi
if [ ! -z "$channel" ]; then
case "$channel" in
lts)
echo "LTS"
return 0
;;
*)
echo "$channel"
return 0
;;
esac
fi
return 0
}
# args:
# runtime - $1
get_normalized_product() {
eval $invocation
local runtime="$(to_lowercase "$1")"
if [[ "$runtime" == "dotnet" ]]; then
product="dotnet-runtime"
elif [[ "$runtime" == "aspnetcore" ]]; then
product="aspnetcore-runtime"
elif [ -z "$runtime" ]; then
product="dotnet-sdk"
fi
echo "$product"
return 0
}
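The three normalizers above map user input onto the feed layout: any quality other than ga passes through lowercased while ga collapses to empty, lts becomes the literal LTS channel, and the runtime flag selects dotnet-runtime, aspnetcore-runtime, or dotnet-sdk. A minimal standalone sketch of the same mapping (the script's to_lowercase/say_* helpers are stubbed out; illustrative only, not the shipped code):
# Simplified re-implementation of the normalizers above, for illustration only.
to_lowercase() { printf '%s' "$1" | tr '[:upper:]' '[:lower:]'; }
normalize_quality() {
    case "$(to_lowercase "$1")" in
        daily|signed|validated|preview) echo "$(to_lowercase "$1")" ;;
        ga|"") ;;                                  # ga (or empty) normalizes to empty
        *) echo "unsupported quality: $1" >&2; return 1 ;;
    esac
}
normalize_channel() {
    local channel="$(to_lowercase "$1")"
    if [ "$channel" = "lts" ]; then echo "LTS"; else echo "$channel"; fi
}
normalize_product() {
    case "$(to_lowercase "$1")" in
        dotnet)     echo "dotnet-runtime" ;;
        aspnetcore) echo "aspnetcore-runtime" ;;
        "")         echo "dotnet-sdk" ;;
    esac
}
normalize_quality "GA"          # -> (empty)
normalize_channel "lts"         # -> LTS
normalize_product "aspnetcore"  # -> aspnetcore-runtime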
# The version text returned from the feeds is a 1-line or 2-line string:
# For the SDK and the dotnet runtime (2 lines):
# Line 1: # commit_hash
@@ -541,43 +610,131 @@ construct_download_link() {
# args:
# azure_feed - $1
# specific_version - $2
# download link - $3 (optional)
get_specific_product_version() {
# If we find a 'productVersion.txt' at the root of any folder, we'll use its contents
# to resolve the version of what's in the folder, superseding the specified version.
# if 'productVersion.txt' is missing but download link is already available, product version will be taken from download link
eval $invocation
local azure_feed="$1"
local specific_version="${2//[$'\t\r\n']}"
local specific_product_version=$specific_version
local package_download_link=""
if [ $# -gt 2 ]; then
local package_download_link="$3"
fi
local specific_product_version=null
# Try to get the version number, using the productVersion.txt file located next to the installer file.
local download_links=($(get_specific_product_version_url "$azure_feed" "$specific_version" true "$package_download_link")
$(get_specific_product_version_url "$azure_feed" "$specific_version" false "$package_download_link"))
for download_link in "${download_links[@]}"
do
say_verbose "Checking for the existence of $download_link"
if machine_has "curl"
then
specific_product_version=$(curl -s --fail "${download_link}${feed_credential}")
if [ $? = 0 ]; then
echo "${specific_product_version//[$'\t\r\n']}"
return 0
fi
elif machine_has "wget"
then
specific_product_version=$(wget -qO- "${download_link}${feed_credential}")
if [ $? = 0 ]; then
echo "${specific_product_version//[$'\t\r\n']}"
return 0
fi
fi
done
# Getting the version number with productVersion.txt has failed. Try parsing the download link for a version number.
say_verbose "Failed to get the version using productVersion.txt file. Download link will be parsed instead."
specific_product_version="$(get_product_specific_version_from_download_link "$package_download_link" "$specific_version")"
echo "${specific_product_version//[$'\t\r\n']}"
return 0
}
# args:
# azure_feed - $1
# specific_version - $2
# is_flattened - $3
# download link - $4 (optional)
get_specific_product_version_url() {
eval $invocation
local azure_feed="$1"
local specific_version="$2"
local is_flattened="$3"
local package_download_link=""
if [ $# -gt 3 ]; then
local package_download_link="$4"
fi
local pvFileName="productVersion.txt"
if [ "$is_flattened" = true ]; then
if [ -z "$runtime" ]; then
pvFileName="sdk-productVersion.txt"
elif [[ "$runtime" == "dotnet" ]]; then
pvFileName="runtime-productVersion.txt"
else
pvFileName="$runtime-productVersion.txt"
fi
fi
local download_link=null
if [[ "$runtime" == "dotnet" ]]; then
download_link="$azure_feed/Runtime/$specific_version/productVersion.txt${feed_credential}"
elif [[ "$runtime" == "aspnetcore" ]]; then
download_link="$azure_feed/aspnetcore/Runtime/$specific_version/productVersion.txt${feed_credential}"
elif [ -z "$runtime" ]; then
download_link="$azure_feed/Sdk/$specific_version/productVersion.txt${feed_credential}"
if [ -z "$package_download_link" ]; then
if [[ "$runtime" == "dotnet" ]]; then
download_link="$azure_feed/Runtime/$specific_version/${pvFileName}"
elif [[ "$runtime" == "aspnetcore" ]]; then
download_link="$azure_feed/aspnetcore/Runtime/$specific_version/${pvFileName}"
elif [ -z "$runtime" ]; then
download_link="$azure_feed/Sdk/$specific_version/${pvFileName}"
else
return 1
fi
else
return 1
download_link="${package_download_link%/*}/${pvFileName}"
fi
if machine_has "curl"
then
specific_product_version=$(curl -s --fail "$download_link")
if [ $? -ne 0 ]
then
specific_product_version=$specific_version
fi
elif machine_has "wget"
then
specific_product_version=$(wget -qO- "$download_link")
if [ $? -ne 0 ]
then
specific_product_version=$specific_version
fi
fi
specific_product_version="${specific_product_version//[$'\t\r\n']}"
say_verbose "Constructed productVersion link: $download_link"
echo "$download_link"
return 0
}
# args:
# download link - $1
# specific version - $2
get_product_specific_version_from_download_link()
{
eval $invocation
local download_link="$1"
local specific_version="$2"
local specific_product_version=""
if [ -z "$download_link" ]; then
echo "$specific_version"
return 0
fi
#get filename
filename="${download_link##*/}"
#product specific version follows the product name
#for filename 'dotnet-sdk-3.1.404-linux-x64.tar.gz': the product version is 3.1.404
IFS='-'
read -ra filename_elems <<< "$filename"
count=${#filename_elems[@]}
if [[ "$count" -gt 2 ]]; then
specific_product_version="${filename_elems[2]}"
else
specific_product_version=$specific_version
fi
unset IFS;
echo "$specific_product_version"
return 0
}
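As a quick check of the filename fallback above: splitting a file name such as dotnet-sdk-3.1.404-linux-x64.tar.gz on '-' leaves the product version in the third field. A small illustrative snippet, not part of the script:
# Illustration of the download-link fallback parse above.
filename="dotnet-sdk-3.1.404-linux-x64.tar.gz"
IFS='-' read -ra parts <<< "$filename"
echo "${parts[2]}"    # prints 3.1.404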
@@ -711,19 +868,62 @@ extract_dotnet_package() {
return 0
}
# args:
# remote_path - $1
# disable_feed_credential - $2
get_http_header()
{
eval $invocation
local remote_path="$1"
local disable_feed_credential="$2"
local failed=false
local response
if machine_has "curl"; then
get_http_header_curl $remote_path $disable_feed_credential || failed=true
elif machine_has "wget"; then
get_http_header_wget $remote_path $disable_feed_credential || failed=true
else
failed=true
fi
if [ "$failed" = true ]; then
say_verbose "Failed to get HTTP header: '$remote_path'."
return 1
fi
return 0
}
# args:
# remote_path - $1
# disable_feed_credential - $2
get_http_header_curl() {
eval $invocation
local remote_path="$1"
remote_path_with_credential="${remote_path}${feed_credential}"
local disable_feed_credential="$2"
remote_path_with_credential="$remote_path"
if [ "$disable_feed_credential" = false ]; then
remote_path_with_credential+="$feed_credential"
fi
curl_options="-I -sSL --retry 5 --retry-delay 2 --connect-timeout 15 "
curl $curl_options "$remote_path_with_credential" || return 1
return 0
}
# args:
# remote_path - $1
# disable_feed_credential - $2
get_http_header_wget() {
eval $invocation
local remote_path="$1"
remote_path_with_credential="${remote_path}${feed_credential}"
local disable_feed_credential="$2"
remote_path_with_credential="$remote_path"
if [ "$disable_feed_credential" = false ]; then
remote_path_with_credential+="$feed_credential"
fi
wget_options="-q -S --spider --tries 5 --waitretry 2 --connect-timeout 15 "
wget $wget_options "$remote_path_with_credential" 2>&1 || return 1
return 0
@@ -763,7 +963,7 @@ download() {
say "Download attempt #$attempts has failed: $http_code $download_error_msg"
say "Attempt #$((attempts+1)) will start in $((attempts*10)) seconds."
sleep $((attempts*20))
sleep $((attempts*10))
done
@@ -783,6 +983,7 @@ downloadcurl() {
local remote_path="$1"
local out_path="${2:-}"
# Append feed_credential as late as possible before calling curl to avoid logging feed_credential
# Avoid passing URI with credentials to functions: note, most of them echoing parameters of invocation in verbose output.
local remote_path_with_credential="${remote_path}${feed_credential}"
local curl_options="--retry 20 --retry-delay 2 --connect-timeout 15 -sSL -f --create-dirs "
local failed=false
@@ -792,7 +993,8 @@ downloadcurl() {
curl $curl_options -o "$out_path" "$remote_path_with_credential" || failed=true
fi
if [ "$failed" = true ]; then
local response=$(get_http_header_curl $remote_path_with_credential)
local disable_feed_credential=false
local response=$(get_http_header_curl $remote_path $disable_feed_credential)
http_code=$( echo "$response" | awk '/^HTTP/{print $2}' | tail -1 )
download_error_msg="Unable to download $remote_path."
if [[ $http_code != 2* ]]; then
@@ -822,7 +1024,8 @@ downloadwget() {
wget $wget_options -O "$out_path" "$remote_path_with_credential" || failed=true
fi
if [ "$failed" = true ]; then
local response=$(get_http_header_wget $remote_path_with_credential)
local disable_feed_credential=false
local response=$(get_http_header_wget $remote_path $disable_feed_credential)
http_code=$( echo "$response" | awk '/^ HTTP/{print $2}' | tail -1 )
download_error_msg="Unable to download $remote_path."
if [[ $http_code != 2* ]]; then
@@ -834,15 +1037,115 @@ downloadwget() {
return 0
}
get_download_link_from_aka_ms() {
eval $invocation
#quality is not supported for LTS or current channel
if [[ ! -z "$normalized_quality" && ("$normalized_channel" == "LTS" || "$normalized_channel" == "current") ]]; then
normalized_quality=""
say_warning "Specifying quality for current or LTS channel is not supported, the quality will be ignored."
fi
say_verbose "Retrieving primary payload URL from aka.ms for channel: '$normalized_channel', quality: '$normalized_quality', product: '$normalized_product', os: '$normalized_os', architecture: '$normalized_architecture'."
#construct aka.ms link
aka_ms_link="https://aka.ms/dotnet"
if [ "$internal" = true ]; then
aka_ms_link="$aka_ms_link/internal"
fi
aka_ms_link="$aka_ms_link/$normalized_channel"
if [[ ! -z "$normalized_quality" ]]; then
aka_ms_link="$aka_ms_link/$normalized_quality"
fi
aka_ms_link="$aka_ms_link/$normalized_product-$normalized_os-$normalized_architecture.tar.gz"
say_verbose "Constructed aka.ms link: '$aka_ms_link'."
#get HTTP response
#do not pass credentials as a part of the $aka_ms_link and do not apply credentials in the get_http_header function
#otherwise the redirect link would have credentials as well
#it would result in applying credentials twice to the resulting link and thus breaking it, and in echoing credentials to the output as a part of redirect link
disable_feed_credential=true
response="$(get_http_header $aka_ms_link $disable_feed_credential)"
say_verbose "Received response: $response"
# Get results of all the redirects.
http_codes=$( echo "$response" | awk '$1 ~ /^HTTP/ {print $2}' )
# They all need to be 301, otherwise some links are broken (except for the last, which is not a redirect but 200 or 404).
broken_redirects=$( echo "$http_codes" | sed '$d' | grep -v '301' )
# All HTTP codes are 301 (Moved Permanently), the redirect link exists.
if [[ -z "$broken_redirects" ]]; then
aka_ms_download_link=$( echo "$response" | awk '$1 ~ /^Location/{print $2}' | tail -1 | tr -d '\r')
if [[ -z "$aka_ms_download_link" ]]; then
say_verbose "The aka.ms link '$aka_ms_link' is not valid: failed to get redirect location."
return 1
fi
say_verbose "The redirect location retrieved: '$aka_ms_download_link'."
return 0
else
say_verbose "The aka.ms link '$aka_ms_link' is not valid: received HTTP code: $(echo "$broken_redirects" | paste -sd "," -)."
return 1
fi
}
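The validity check above requires every redirect hop except the final response to be a 301 before trusting the Location header. A hedged way to trace the same chain by hand with curl (the channel and product in the URL are example values):
# Manually tracing an aka.ms link the way get_download_link_from_aka_ms does (illustrative).
aka_ms_link="https://aka.ms/dotnet/6.0/dotnet-sdk-linux-x64.tar.gz"
headers=$(curl -I -sSL --connect-timeout 15 "$aka_ms_link")
codes=$(echo "$headers" | awk '$1 ~ /^HTTP/ {print $2}')
broken=$(echo "$codes" | sed '$d' | grep -v '301')    # every hop but the last must be 301
location=$(echo "$headers" | awk '$1 ~ /^Location/ {print $2}' | tail -1 | tr -d '\r')
if [ -z "$broken" ]; then echo "resolved to: $location"; else echo "broken chain: $codes"; fi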
calculate_vars() {
eval $invocation
valid_legacy_download_link=true
#normalize input variables
normalized_architecture="$(get_normalized_architecture_from_architecture "$architecture")"
say_verbose "normalized_architecture=$normalized_architecture"
say_verbose "Normalized architecture: '$normalized_architecture'."
normalized_os="$(get_normalized_os "$user_defined_os")"
say_verbose "normalized_os=$normalized_os"
say_verbose "Normalized OS: '$normalized_os'."
normalized_quality="$(get_normalized_quality "$quality")"
say_verbose "Normalized quality: '$normalized_quality'."
normalized_channel="$(get_normalized_channel "$channel")"
say_verbose "Normalized channel: '$normalized_channel'."
normalized_product="$(get_normalized_product "$runtime")"
say_verbose "Normalized product: '$normalized_product'."
#try to get download location from aka.ms link
#not applicable when exact version is specified via command or json file
normalized_version="$(to_lowercase "$version")"
if [[ -z "$json_file" && "$normalized_version" == "latest" ]]; then
valid_aka_ms_link=true;
get_download_link_from_aka_ms || valid_aka_ms_link=false
if [ "$valid_aka_ms_link" == false ]; then
# if quality is specified - exit with error - there is no fallback approach
if [ ! -z "$normalized_quality" ]; then
say_err "Failed to locate the latest version in the channel '$normalized_channel' with '$normalized_quality' quality for '$normalized_product', os: '$normalized_os', architecture: '$normalized_architecture'."
say_err "Refer to: https://aka.ms/dotnet-os-lifecycle for information on .NET Core support."
return 1
fi
say_verbose "Falling back to latest.version file approach."
else
say_verbose "Retrieved primary payload URL from aka.ms link: '$aka_ms_download_link'."
download_link=$aka_ms_download_link
say_verbose "Downloading using legacy url will not be attempted."
valid_legacy_download_link=false
#get version from the path
IFS='/'
read -ra pathElems <<< "$download_link"
count=${#pathElems[@]}
specific_version="${pathElems[count-2]}"
unset IFS;
say_verbose "Version: '$specific_version'."
#Retrieve product specific version
specific_product_version="$(get_specific_product_version "$azure_feed" "$specific_version" "$download_link")"
say_verbose "Product specific version: '$specific_product_version'."
install_root="$(resolve_installation_path "$install_dir")"
say_verbose "InstallRoot: '$install_root'."
return
fi
fi
specific_version="$(get_specific_version_from_version "$azure_feed" "$channel" "$normalized_architecture" "$version" "$json_file")"
specific_product_version="$(get_specific_product_version "$azure_feed" "$specific_version")"
@@ -1011,6 +1314,8 @@ feed_credential=""
verbose=false
runtime=""
runtime_id=""
quality=""
internal=false
override_non_versioned_files=true
non_dynamic_parameters=""
user_defined_os=""
@@ -1027,6 +1332,14 @@ do
shift
version="$1"
;;
-q|--quality|-[Qq]uality)
shift
quality="$1"
;;
--internal|-[Ii]nternal)
internal=true
non_dynamic_parameters+=" $name"
;;
-i|--install-dir|-[Ii]nstall[Dd]ir)
shift
install_dir="$1"
@@ -1084,7 +1397,9 @@ do
--feed-credential|-[Ff]eed[Cc]redential)
shift
feed_credential="$1"
non_dynamic_parameters+=" $name "\""$1"\"""
#feed_credential should start with "?", for it to be added to the end of the link.
#adding "?" at the beginning of the feed_credential if needed.
[[ -z "$(echo $feed_credential)" ]] || [[ $feed_credential == \?* ]] || feed_credential="?$feed_credential"
;;
--runtime-id|-[Rr]untime[Ii]d)
shift
@@ -1116,15 +1431,26 @@ do
echo " - LTS - most current supported release"
echo " - 2-part version in a format A.B - represents a specific release"
echo " examples: 2.0; 1.0"
echo " - Branch name"
echo " examples: release/2.0.0; Master"
echo " Note: The version parameter overrides the channel parameter."
echo " - 3-part version in a format A.B.Cxx - represents a specific SDK release"
echo " examples: 5.0.1xx, 5.0.2xx."
echo " Supported since 5.0 release"
echo " Note: The version parameter overrides the channel parameter when any version other than `latest` is used."
echo " -v,--version <VERSION> Use specific VERSION, Defaults to \`$version\`."
echo " -Version"
echo " Possible values:"
echo " - latest - most latest build on specific channel"
echo " - 3-part version in a format A.B.C - represents specific version of build"
echo " examples: 2.0.0-preview2-006120; 1.1.0"
echo " -q,--quality <quality> Download the latest build of specified quality in the channel."
echo " -Quality"
echo " The possible values are: daily, signed, validated, preview, GA."
echo " Works only in combination with channel. Not applicable for current and LTS channels and will be ignored if those channels are used."
echo " For SDK use channel in A.B.Cxx format. Using quality for SDK together with channel in A.B format is not supported."
echo " Supported since 5.0 release."
echo " Note: The version parameter overrides the channel parameter when any version other than `latest` is used, and therefore overrides the quality."
echo " --internal,-Internal Download internal builds. Requires providing credentials via --feed-credential parameter."
echo " --feed-credential <FEEDCREDENTIAL> Token to access Azure feed. Used as a query string to append to the Azure feed."
echo " -FeedCredential This parameter typically is not specified."
echo " -i,--install-dir <DIR> Install under specified location (see Install Location below)"
echo " -InstallDir"
echo " --architecture <ARCHITECTURE> Architecture of dotnet binaries to be installed, Defaults to \`$architecture\`."
@@ -1145,7 +1471,6 @@ do
echo " --verbose,-Verbose Display diagnostics information."
echo " --azure-feed,-AzureFeed Azure feed location. Defaults to $azure_feed, This parameter typically is not changed by the user."
echo " --uncached-feed,-UncachedFeed Uncached feed location. This parameter typically is not changed by the user."
echo " --feed-credential,-FeedCredential Azure feed shared access token. This parameter typically is not specified."
echo " --skip-non-versioned-files Skips non-versioned files if they already exist, such as the dotnet executable."
echo " -SkipNonVersionedFiles"
echo " --no-cdn,-NoCdn Disable downloading from the Azure CDN, and use the uncached feed directly."
@@ -1186,23 +1511,44 @@ say "- The SDK needs to be installed without user interaction and without admin
say "- The SDK installation doesn't need to persist across multiple CI runs."
say "To set up a development environment or to run apps, use installers rather than this script. Visit https://dotnet.microsoft.com/download to get the installer.\n"
if [ "$internal" = true ] && [ -z "$(echo $feed_credential)" ]; then
message="Provide credentials via --feed-credential parameter."
if [ "$dry_run" = true ]; then
say_warning "$message"
else
say_err "$message"
exit 1
fi
fi
check_min_reqs
calculate_vars
script_name=$(basename "$0")
if [ "$dry_run" = true ]; then
say "Payload URLs:"
say "Primary named payload URL: $download_link"
say "Primary named payload URL: ${download_link}"
if [ "$valid_legacy_download_link" = true ]; then
say "Legacy named payload URL: $legacy_download_link"
say "Legacy named payload URL: ${legacy_download_link}"
fi
repeatable_command="./$script_name --version "\""$specific_version"\"" --install-dir "\""$install_root"\"" --architecture "\""$normalized_architecture"\"" --os "\""$normalized_os"\"""
if [ ! -z "$normalized_quality" ]; then
repeatable_command+=" --quality "\""$normalized_quality"\"""
fi
if [[ "$runtime" == "dotnet" ]]; then
repeatable_command+=" --runtime "\""dotnet"\"""
elif [[ "$runtime" == "aspnetcore" ]]; then
repeatable_command+=" --runtime "\""aspnetcore"\"""
fi
repeatable_command+="$non_dynamic_parameters"
if [ -n "$feed_credential" ]; then
repeatable_command+=" --feed-credential "\""<feed_credential>"\"""
fi
say "Repeatable invocation: $repeatable_command"
exit 0
fi
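For reference, the dry-run path above resolves payload URLs and prints a repeatable invocation without installing anything. Assuming the script's standard --dry-run flag, an example invocation that exercises the new channel/quality resolution (values are illustrative):
# Example only: resolve the latest daily 6.0.1xx SDK without installing it.
./dotnet-install.sh --channel 6.0.1xx --quality daily --dry-run
# Internal builds additionally need --internal plus --feed-credential "?<sas-token>" (placeholder).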

View File

@@ -3,7 +3,8 @@ PACKAGERUNTIME=$1
PRECACHE=$2
NODE_URL=https://nodejs.org/dist
NODE12_VERSION="12.13.1"
NODE12_VERSION="12.22.7"
NODE16_VERSION="16.13.0"
get_abs_path() {
# exploits the fact that pwd will print abs path when no args
@@ -126,6 +127,8 @@ function acquireExternalTool() {
if [[ "$PACKAGERUNTIME" == "win-x64" || "$PACKAGERUNTIME" == "win-x86" ]]; then
acquireExternalTool "$NODE_URL/v${NODE12_VERSION}/$PACKAGERUNTIME/node.exe" node12/bin
acquireExternalTool "$NODE_URL/v${NODE12_VERSION}/$PACKAGERUNTIME/node.lib" node12/bin
acquireExternalTool "$NODE_URL/v${NODE16_VERSION}/$PACKAGERUNTIME/node.exe" node16/bin
acquireExternalTool "$NODE_URL/v${NODE16_VERSION}/$PACKAGERUNTIME/node.lib" node16/bin
if [[ "$PRECACHE" != "" ]]; then
acquireExternalTool "https://github.com/microsoft/vswhere/releases/download/2.6.7/vswhere.exe" vswhere
fi
@@ -134,18 +137,30 @@ fi
# Download the external tools only for OSX.
if [[ "$PACKAGERUNTIME" == "osx-x64" ]]; then
acquireExternalTool "$NODE_URL/v${NODE12_VERSION}/node-v${NODE12_VERSION}-darwin-x64.tar.gz" node12 fix_nested_dir
acquireExternalTool "$NODE_URL/v${NODE16_VERSION}/node-v${NODE16_VERSION}-darwin-x64.tar.gz" node16 fix_nested_dir
fi
# Download the external tools for Linux PACKAGERUNTIMEs.
if [[ "$PACKAGERUNTIME" == "linux-x64" ]]; then
acquireExternalTool "$NODE_URL/v${NODE12_VERSION}/node-v${NODE12_VERSION}-linux-x64.tar.gz" node12 fix_nested_dir
acquireExternalTool "https://vstsagenttools.blob.core.windows.net/tools/nodejs/${NODE12_VERSION}/alpine/x64/node-${NODE12_VERSION}-alpine-x64.tar.gz" node12_alpine
acquireExternalTool "https://vstsagenttools.blob.core.windows.net/tools/nodejs/${NODE12_VERSION}/alpine/x64/node-v${NODE12_VERSION}-alpine-x64.tar.gz" node12_alpine
acquireExternalTool "$NODE_URL/v${NODE16_VERSION}/node-v${NODE16_VERSION}-linux-x64.tar.gz" node16 fix_nested_dir
acquireExternalTool "https://vstsagenttools.blob.core.windows.net/tools/nodejs/${NODE16_VERSION}/alpine/x64/node-v${NODE16_VERSION}-alpine-x64.tar.gz" node16_alpine
fi
if [[ "$PACKAGERUNTIME" == "linux-arm64" ]]; then
acquireExternalTool "$NODE_URL/v${NODE12_VERSION}/node-v${NODE12_VERSION}-linux-arm64.tar.gz" node12 fix_nested_dir
acquireExternalTool "$NODE_URL/v${NODE16_VERSION}/node-v${NODE16_VERSION}-linux-arm64.tar.gz" node16 fix_nested_dir
fi
if [[ "$PACKAGERUNTIME" == "linux-arm" ]]; then
acquireExternalTool "$NODE_URL/v${NODE12_VERSION}/node-v${NODE12_VERSION}-linux-armv7l.tar.gz" node12 fix_nested_dir
acquireExternalTool "$NODE_URL/v${NODE16_VERSION}/node-v${NODE16_VERSION}-linux-armv7l.tar.gz" node16 fix_nested_dir
fi
if [[ "$PACKAGERUNTIME" == "linux-musl-x64" ]]; then
acquireExternalTool "$NODE_URL/v${NODE12_VERSION}/node-v${NODE12_VERSION}-linux-x64.tar.gz" node12_glib fix_nested_dir
acquireExternalTool "https://vstsagenttools.blob.core.windows.net/tools/nodejs/${NODE12_VERSION}/alpine/x64/node-v${NODE12_VERSION}-alpine-x64.tar.gz" node12
acquireExternalTool "$NODE_URL/v${NODE16_VERSION}/node-v${NODE16_VERSION}-linux-x64.tar.gz" node16_glib fix_nested_dir
acquireExternalTool "https://vstsagenttools.blob.core.windows.net/tools/nodejs/${NODE16_VERSION}/alpine/x64/node-v${NODE16_VERSION}-alpine-x64.tar.gz" node16
fi
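The node16 entries above follow the same nodejs.org/dist layout as the existing node12 ones. A quick, illustrative sanity check of the constructed URLs (HEAD requests only):
# Illustrative check that the constructed Node.js URLs exist before baking them into the layout.
NODE_URL=https://nodejs.org/dist
NODE16_VERSION="16.13.0"
for suffix in darwin-x64 linux-x64 linux-arm64 linux-armv7l; do
    url="$NODE_URL/v${NODE16_VERSION}/node-v${NODE16_VERSION}-${suffix}.tar.gz"
    echo "$(curl -o /dev/null -sIL -w '%{http_code}' "$url") $url"
done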

View File

@@ -25,5 +25,7 @@
</dict>
<key>ProcessType</key>
<string>Interactive</string>
<key>SessionCreate</key>
<true/>
</dict>
</plist>

View File

@@ -18,6 +18,8 @@ downloadrunnerversion=_DOWNLOAD_RUNNER_VERSION_
logfile="_UPDATE_LOG_"
restartinteractiverunner=_RESTART_INTERACTIVE_RUNNER_
telemetryfile="$rootfolder/_diag/.telemetry"
# log user who run the script
date "+[%F %T-%4N] --------whoami--------" >> "$logfile" 2>&1
whoami >> "$logfile" 2>&1
@@ -118,40 +120,55 @@ then
exit 1
fi
# fix upgrade issue with macOS
# fix upgrade issue with macOS when running as a service
attemptedtargetedfix=0
currentplatform=$(uname | awk '{print tolower($0)}')
if [[ "$currentplatform" == 'darwin' ]]; then
# need a short-term fix for https://github.com/actions/runner/issues/743
# we will recreate all the ./externals/node12/bin/node of the past 5 versions
# v2.280.3 v2.280.2 v2.280.1 v2.279.0 v2.278.0
if [[ ! -e "$rootfolder/externals.2.280.3/node12/bin/node" ]]
then
mkdir -p "$rootfolder/externals.2.280.3/node12/bin"
cp "$rootfolder/externals/node12/bin/node" "$rootfolder/externals.2.280.3/node12/bin/node"
fi
if [[ "$currentplatform" == 'darwin' && restartinteractiverunner -eq 0 ]]; then
# We needed a fix for https://github.com/actions/runner/issues/743
# We will recreate the ./externals/node12/bin/node of the past runner version that launched the runnerlistener service
# Otherwise mac gatekeeper kills the processes we spawn on creation as we are running a process with no backing file
if [[ ! -e "$rootfolder/externals.2.280.2/node12/bin/node" ]]
# We need the pid of the nodejs loop; get it here, since it's the parent of the runner C# pid
# assumption here is only one process is invoking rootfolder/runsvc.sh
procgroup=$(ps x -o pgid,command | grep "$rootfolder/runsvc.sh" | grep -v grep | awk '{print $1}')
if [[ $? -eq 0 && -n "$procgroup" ]]
then
mkdir -p "$rootfolder/externals.2.280.2/node12/bin"
cp "$rootfolder/externals/node12/bin/node" "$rootfolder/externals.2.280.2/node12/bin/node"
fi
if [[ ! -e "$rootfolder/externals.2.280.1/node12/bin/node" ]]
then
mkdir -p "$rootfolder/externals.2.280.1/node12/bin"
cp "$rootfolder/externals/node12/bin/node" "$rootfolder/externals.2.280.1/node12/bin/node"
fi
if [[ ! -e "$rootfolder/externals.2.279.0/node12/bin/node" ]]
then
mkdir -p "$rootfolder/externals.2.279.0/node12/bin"
cp "$rootfolder/externals/node12/bin/node" "$rootfolder/externals.2.279.0/node12/bin/node"
fi
if [[ ! -e "$rootfolder/externals.2.278.0/node12/bin/node" ]]
then
mkdir -p "$rootfolder/externals.2.278.0/node12/bin"
cp "$rootfolder/externals/node12/bin/node" "$rootfolder/externals.2.278.0/node12/bin/node"
# inspect the open file handles to find the node process
# we can't actually inspect the process using ps because it uses relative paths and doesn't follow symlinks
path=$(lsof -a -g "$procgroup" -F n | grep node12/bin/node | grep externals | tail -1 | cut -c2-)
if [[ $? -eq 0 && -n "$path" ]]
then
# trim the last 5 characters of the path '/node'
trimmedpath=$(dirname "$path")
if [[ $? -eq 0 && -n "$trimmedpath" ]]
then
attemptedtargetedfix=1
# Create the path if it does not exist
if [[ ! -e "$path" ]]
then
date "+[%F %T-%4N] Creating fallback node at path $path" >> "$logfile" 2>&1
mkdir -p "$trimmedpath"
cp "$rootfolder/externals/node12/bin/node" "$path"
else
date "+[%F %T-%4N] Path for fallback node exists, skipping creating $path" >> "$logfile" 2>&1
fi
else
date "+[%F %T-%4N] DarwinRunnerUpgrade: Failed to trim runner path. TrimmedPath: $trimmedpath, path: $path, pgid: $procgroup, root: $rootfolder" >> "$logfile" 2>&1
date "+[%F %T-%4N] DarwinRunnerUpgrade: Failed to trim runner path. TrimmedPath: $trimmedpath, path: $path, pgid: $procgroup, root: $rootfolder" >> "$telemetryfile" 2>&1
fi
else
date "+[%F %T-%4N] DarwinRunnerUpgrade: Failed to find runner path. Path: $path, pgid: $procgroup, root: $rootfolder" >> "$logfile" 2>&1
date "+[%F %T-%4N] DarwinRunnerUpgrade: Failed to find runner path. Path: $path, pgid: $procgroup, root: $rootfolder" >> "$telemetryfile" 2>&1
fi
else
runproc=$(ps x -o pgid,command | grep "run.sh" | grep -v grep | awk '{print $1}')
if [[ $? -eq 0 && -n "$runproc" ]]
then
date "+[%F %T-%4N] Running as ephemeral using run.sh, no need to recreate node folder" >> "$logfile" 2>&1
else
date "+[%F %T-%4N] DarwinRunnerUpgrade: Failed to find runner pgid. pgid: $procgroup, root: $rootfolder" >> "$logfile" 2>&1
date "+[%F %T-%4N] DarwinRunnerUpgrade: Failed to find runner pgid. pgid: $procgroup, root: $rootfolder" >> "$telemetryfile" 2>&1
fi
fi
fi
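The targeted fix above hinges on two lookups: the process group of runsvc.sh and, via its open file handles, the externals path of the node binary the service is actually running. A standalone sketch of those lookups (the install folder is an assumption):
# Illustration of the pgid/lsof lookups used above; rootfolder is an assumed install path.
rootfolder="$HOME/actions-runner"
procgroup=$(ps x -o pgid,command | grep "$rootfolder/runsvc.sh" | grep -v grep | awk '{print $1}')
if [ -n "$procgroup" ]; then
    # find which externals*/node12/bin/node the service currently has open
    nodepath=$(lsof -a -g "$procgroup" -F n | grep node12/bin/node | grep externals | tail -1 | cut -c2-)
    echo "service pgid: $procgroup"
    echo "node in use:  $nodepath"
else
    echo "runsvc.sh is not running; nothing to inspect"
fi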

View File

@@ -43,6 +43,21 @@ else
else
sleep 5
fi
elif [[ $returnCode == 4 ]]; then
if [ ! -x "$(command -v sleep)" ]; then
if [ ! -x "$(command -v ping)" ]; then
COUNT="0"
while [[ $COUNT != 5000 ]]; do
echo "SLEEP" > /dev/null
COUNT=$[$COUNT+1]
done
else
ping -c 5 127.0.0.1 > /dev/null
fi
else
sleep 5
fi
"$DIR"/bin/Runner.Listener run $*
else
exit $returnCode
fi
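The new returnCode 4 branch reuses the existing fallback chain: prefer sleep, fall back to pinging loopback for roughly five seconds, and only spin in a counting loop as a last resort. The same pattern as a small reusable helper (a sketch, not the shipped script):
# Sketch of the wait-before-retry fallback used above.
wait_a_bit() {
    if command -v sleep > /dev/null 2>&1; then
        sleep 5
    elif command -v ping > /dev/null 2>&1; then
        ping -c 5 127.0.0.1 > /dev/null     # roughly five seconds on most systems
    else
        local count=0
        while [ "$count" != 5000 ]; do      # crude busy-wait, last resort only
            echo "SLEEP" > /dev/null
            count=$((count + 1))
        done
    fi
}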

View File

@@ -26,6 +26,7 @@ namespace GitHub.Runner.Common
Certificates,
Options,
SetupInfo,
Telemetry
}
public static class Constants
@@ -128,7 +129,7 @@ namespace GitHub.Runner.Common
public static readonly string Ephemeral = "ephemeral";
public static readonly string Help = "help";
public static readonly string Replace = "replace";
public static readonly string Once = "once"; // TODO: Remove in 10/2021
public static readonly string Once = "once"; // Keep this around since customers still rely on it
public static readonly string RunAsService = "runasservice";
public static readonly string Unattended = "unattended";
public static readonly string Version = "version";
@@ -154,6 +155,7 @@ namespace GitHub.Runner.Common
public static readonly string LowDiskSpace = "LOW_DISK_SPACE";
public static readonly string UnsupportedCommand = "UNSUPPORTED_COMMAND";
public static readonly string UnsupportedCommandMessageDisabled = "The `{0}` command is disabled. Please upgrade to using Environment Files or opt into unsecure command execution by setting the `ACTIONS_ALLOW_UNSECURE_COMMANDS` environment variable to `true`. For more information see: https://github.blog/changelog/2020-10-01-github-actions-deprecating-set-env-and-add-path-commands/";
public static readonly string UnsupportedStopCommandTokenDisabled = "You cannot use a endToken that is an empty string, the string 'pause-logging', or another workflow command. For more information see: https://docs.github.com/en/actions/learn-github-actions/workflow-commands-for-github-actions#example-stopping-and-starting-workflow-commands or opt into insecure command execution by setting the `ACTIONS_ALLOW_UNSECURE_STOPCOMMAND_TOKENS` environment variable to `true`.";
}
public static class RunnerEvent
@@ -213,6 +215,7 @@ namespace GitHub.Runner.Common
// Keep alphabetical
//
public static readonly string AllowUnsupportedCommands = "ACTIONS_ALLOW_UNSECURE_COMMANDS";
public static readonly string AllowUnsupportedStopCommandTokens = "ACTIONS_ALLOW_UNSECURE_STOPCOMMAND_TOKENS";
public static readonly string RunnerDebug = "ACTIONS_RUNNER_DEBUG";
public static readonly string StepDebug = "ACTIONS_STEP_DEBUG";
}

View File

@@ -343,6 +343,12 @@ namespace GitHub.Runner.Common
".setup_info");
break;
case WellKnownConfigFile.Telemetry:
path = Path.Combine(
GetDirectory(WellKnownDirectory.Diag),
".telemetry");
break;
default:
throw new NotSupportedException($"Unexpected well known config file: '{configFile}'");
}

View File

@@ -2,8 +2,11 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
namespace GitHub.Runner.Common
@@ -35,7 +38,11 @@ namespace GitHub.Runner.Common
public async Task ConnectAsync(VssConnection jobConnection)
{
_connection = jobConnection;
int attemptCount = 5;
int totalAttempts = 5;
int attemptCount = totalAttempts;
var configurationStore = HostContext.GetService<IConfigurationStore>();
var runnerSettings = configurationStore.GetSettings();
while (!_connection.HasAuthenticated && attemptCount-- > 0)
{
try
@@ -45,17 +52,71 @@ namespace GitHub.Runner.Common
}
catch (Exception ex) when (attemptCount > 0)
{
Trace.Info($"Catch exception during connect. {attemptCount} attemp left.");
Trace.Info($"Catch exception during connect. {attemptCount} attempts left.");
Trace.Error(ex);
if (runnerSettings.IsHostedServer)
{
await CheckNetworkEndpointsAsync(attemptCount);
}
}
await Task.Delay(100);
int attempt = totalAttempts - attemptCount;
TimeSpan backoff = BackoffTimerHelper.GetExponentialBackoff(attempt, TimeSpan.FromMilliseconds(100), TimeSpan.FromSeconds(3.2), TimeSpan.FromMilliseconds(100));
await Task.Delay(backoff);
}
_taskClient = _connection.GetClient<TaskHttpClient>();
_hasConnection = true;
}
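The connect loop above replaces a fixed 100 ms delay with an exponential backoff bounded between 100 ms and 3.2 s across the five attempts. A bash sketch of the same shape (bounds mirror the C# call; try_connect is a stand-in and the exact jitter of BackoffTimerHelper is not reproduced):
# Bash sketch of bounded exponential backoff between connection attempts.
try_connect() { false; }    # stand-in for the real connection attempt
min_ms=100; max_ms=3200; max_attempts=5
attempt=1
until try_connect; do
    if [ "$attempt" -ge "$max_attempts" ]; then
        echo "giving up after $max_attempts attempts" >&2
        break
    fi
    backoff_ms=$(( min_ms * (2 ** (attempt - 1)) ))
    [ "$backoff_ms" -gt "$max_ms" ] && backoff_ms=$max_ms
    echo "attempt $attempt failed, retrying in ${backoff_ms}ms"
    sleep "$(awk "BEGIN { print $backoff_ms / 1000 }")"
    attempt=$((attempt + 1))
done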
private async Task CheckNetworkEndpointsAsync(int attemptsLeft)
{
try
{
Trace.Info("Requesting Actions Service health endpoint status");
using (var httpClientHandler = HostContext.CreateHttpClientHandler())
using (var actionsClient = new HttpClient(httpClientHandler))
{
var baseUri = new Uri(_connection.Uri.GetLeftPart(UriPartial.Authority));
actionsClient.DefaultRequestHeaders.UserAgent.AddRange(HostContext.UserAgents);
// Call the _apis/health endpoint, and include how many attempts are left as a URL query for easy tracking
var response = await actionsClient.GetAsync(new Uri(baseUri, $"_apis/health?_internalRunnerAttemptsLeft={attemptsLeft}"));
Trace.Info($"Actions health status code: {response.StatusCode}");
}
}
catch (Exception ex)
{
// Log error, but continue as this call is best-effort
Trace.Info($"Actions Service health endpoint failed due to {ex.GetType().Name}");
Trace.Error(ex);
}
try
{
Trace.Info("Requesting Github API endpoint status");
// This is a dotcom public API... just call it directly
using (var httpClientHandler = HostContext.CreateHttpClientHandler())
using (var gitHubClient = new HttpClient(httpClientHandler))
{
gitHubClient.DefaultRequestHeaders.UserAgent.AddRange(HostContext.UserAgents);
// Call the api.github.com endpoint, and include how many attempts are left as a URL query for easy tracking
var response = await gitHubClient.GetAsync($"https://api.github.com?_internalRunnerAttemptsLeft={attemptsLeft}");
Trace.Info($"api.github.com status code: {response.StatusCode}");
}
}
catch (Exception ex)
{
// Log error, but continue as this call is best-effort
Trace.Info($"Github API endpoint failed due to {ex.GetType().Name}");
Trace.Error(ex);
}
}
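The two probes above are deliberately best-effort: when a hosted runner cannot authenticate, they only record HTTP status codes for the service health endpoint and api.github.com to help triage connectivity. A rough curl equivalent (the Actions service hostname is an assumption; substitute the URL your runner registered against):
# Best-effort connectivity probes mirroring CheckNetworkEndpointsAsync (illustrative).
attempts_left=3
service_base="https://pipelines.actions.githubusercontent.com"   # assumed service host
for url in "$service_base/_apis/health?_internalRunnerAttemptsLeft=$attempts_left" \
           "https://api.github.com?_internalRunnerAttemptsLeft=$attempts_left"; do
    code=$(curl -o /dev/null -sS -w '%{http_code}' --connect-timeout 15 "$url")
    echo "$url -> $code"    # 000 means the host could not be reached
done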
private void CheckConnection()
{
if (!_hasConnection)

View File

@@ -3,7 +3,7 @@
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<OutputType>Library</OutputType>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;osx-x64</RuntimeIdentifiers>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;linux-musl-x64;osx-x64</RuntimeIdentifiers>
<TargetLatestRuntimePatch>true</TargetLatestRuntimePatch>
<AssetTargetFallback>portable-net45+win8</AssetTargetFallback>
<NoWarn>NU1701;NU1603</NoWarn>

View File

@@ -29,8 +29,10 @@ namespace GitHub.Runner.Common
// Configuration
Task<TaskAgent> AddAgentAsync(Int32 agentPoolId, TaskAgent agent);
Task DeleteAgentAsync(int agentPoolId, int agentId);
Task DeleteAgentAsync(int agentId);
Task<List<TaskAgentPool>> GetAgentPoolsAsync(string agentPoolName = null, TaskAgentPoolType poolType = TaskAgentPoolType.Automation);
Task<List<TaskAgent>> GetAgentsAsync(int agentPoolId, string agentName = null);
Task<List<TaskAgent>> GetAgentsAsync(string agentName);
Task<TaskAgent> ReplaceAgentAsync(int agentPoolId, TaskAgent agent);
// messagequeue
@@ -252,6 +254,11 @@ namespace GitHub.Runner.Common
return _genericTaskAgentClient.GetAgentsAsync(agentPoolId, agentName, false);
}
public Task<List<TaskAgent>> GetAgentsAsync(string agentName)
{
return GetAgentsAsync(0, agentName); // search across all agent pools
}
public Task<TaskAgent> ReplaceAgentAsync(int agentPoolId, TaskAgent agent)
{
CheckConnection(RunnerConnectionType.Generic);
@@ -264,6 +271,11 @@ namespace GitHub.Runner.Common
return _genericTaskAgentClient.DeleteAgentAsync(agentPoolId, agentId);
}
public Task DeleteAgentAsync(int agentId)
{
return DeleteAgentAsync(0, agentId); // agentPool is ignored server side
}
//-----------------------------------------------------------------
// MessageQueue
//-----------------------------------------------------------------

View File

@@ -31,6 +31,7 @@ namespace GitHub.Runner.Listener
Constants.Runner.CommandLine.Flags.Commit,
Constants.Runner.CommandLine.Flags.Ephemeral,
Constants.Runner.CommandLine.Flags.Help,
Constants.Runner.CommandLine.Flags.Once,
Constants.Runner.CommandLine.Flags.Replace,
Constants.Runner.CommandLine.Flags.RunAsService,
Constants.Runner.CommandLine.Flags.Unattended,
@@ -68,7 +69,7 @@ namespace GitHub.Runner.Listener
public bool Version => TestFlag(Constants.Runner.CommandLine.Flags.Version);
public bool Ephemeral => TestFlag(Constants.Runner.CommandLine.Flags.Ephemeral);
// TODO: Remove in 10/2021
// Keep this around since customers still rely on it
public bool RunOnce => TestFlag(Constants.Runner.CommandLine.Flags.Once);
// Constructor.

View File

@@ -22,6 +22,7 @@ namespace GitHub.Runner.Listener.Configuration
bool IsConfigured();
Task ConfigureAsync(CommandSettings command);
Task UnconfigureAsync(CommandSettings command);
void DeleteLocalRunnerConfig();
RunnerSettings LoadSettings();
}
@@ -329,6 +330,38 @@ namespace GitHub.Runner.Listener.Configuration
#endif
}
// Delete .runner and .credentials files
public void DeleteLocalRunnerConfig()
{
bool isConfigured = _store.IsConfigured();
bool hasCredentials = _store.HasCredentials();
//delete credential config files
var currentAction = "Removing .credentials";
if (hasCredentials)
{
_store.DeleteCredential();
var keyManager = HostContext.GetService<IRSAKeyManager>();
keyManager.DeleteKey();
_term.WriteSuccessMessage("Removed .credentials");
}
else
{
_term.WriteLine("Does not exist. Skipping " + currentAction);
}
//delete settings config file
currentAction = "Removing .runner";
if (isConfigured)
{
_store.DeleteSettings();
_term.WriteSuccessMessage("Removed .runner");
}
else
{
_term.WriteLine("Does not exist. Skipping " + currentAction);
}
}
public async Task UnconfigureAsync(CommandSettings command)
{
string currentAction = string.Empty;
@@ -382,7 +415,7 @@ namespace GitHub.Runner.Listener.Configuration
// Determine the service deployment type based on connection data. (Hosted/OnPremises)
await _runnerServer.ConnectAsync(new Uri(settings.ServerUrl), creds);
var agents = await _runnerServer.GetAgentsAsync(settings.PoolId, settings.AgentName);
var agents = await _runnerServer.GetAgentsAsync(settings.AgentName);
Trace.Verbose("Returns {0} agents", agents.Count);
TaskAgent agent = agents.FirstOrDefault();
if (agent == null)
@@ -391,7 +424,7 @@ namespace GitHub.Runner.Listener.Configuration
}
else
{
await _runnerServer.DeleteAgentAsync(settings.PoolId, settings.AgentId);
await _runnerServer.DeleteAgentAsync(settings.AgentId);
_term.WriteLine();
_term.WriteSuccessMessage("Runner removed successfully");
@@ -402,31 +435,7 @@ namespace GitHub.Runner.Listener.Configuration
_term.WriteLine("Cannot connect to server, because config files are missing. Skipping removing runner from the server.");
}
//delete credential config files
currentAction = "Removing .credentials";
if (hasCredentials)
{
_store.DeleteCredential();
var keyManager = HostContext.GetService<IRSAKeyManager>();
keyManager.DeleteKey();
_term.WriteSuccessMessage("Removed .credentials");
}
else
{
_term.WriteLine("Does not exist. Skipping " + currentAction);
}
//delete settings config file
currentAction = "Removing .runner";
if (isConfigured)
{
_store.DeleteSettings();
_term.WriteSuccessMessage("Removed .runner");
}
else
{
_term.WriteLine("Does not exist. Skipping " + currentAction);
}
DeleteLocalRunnerConfig();
}
catch (Exception)
{

View File

@@ -36,7 +36,10 @@ namespace GitHub.Runner.Listener
{
private readonly Lazy<Dictionary<long, TaskResult>> _localRunJobResult = new Lazy<Dictionary<long, TaskResult>>();
private int _poolId;
RunnerSettings _runnerSetting;
IConfigurationStore _configurationStore;
RunnerSettings _runnerSettings;
private static readonly string _workerProcessName = $"Runner.Worker{IOUtil.ExeExtension}";
// this is not thread-safe
@@ -54,9 +57,9 @@ namespace GitHub.Runner.Listener
base.Initialize(hostContext);
// get pool id from config
var configurationStore = hostContext.GetService<IConfigurationStore>();
_runnerSetting = configurationStore.GetSettings();
_poolId = _runnerSetting.PoolId;
_configurationStore = hostContext.GetService<IConfigurationStore>();
_runnerSettings = _configurationStore.GetSettings();
_poolId = _runnerSettings.PoolId;
int channelTimeoutSeconds;
if (!int.TryParse(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_CHANNEL_TIMEOUT") ?? string.Empty, out channelTimeoutSeconds))
@@ -661,13 +664,15 @@ namespace GitHub.Runner.Listener
try
{
request = await runnerServer.RenewAgentRequestAsync(poolId, requestId, lockToken, orchestrationId, token);
Trace.Info($"Successfully renew job request {requestId}, job is valid till {request.LockedUntil.Value}");
if (!firstJobRequestRenewed.Task.IsCompleted)
{
// fire first renew succeed event.
firstJobRequestRenewed.TrySetResult(0);
// Update settings if the runner name has been changed server-side
UpdateAgentNameIfNeeded(request.ReservedAgent?.Name);
}
if (encounteringError > 0)
@@ -767,6 +772,27 @@ namespace GitHub.Runner.Listener
}
}
private void UpdateAgentNameIfNeeded(string agentName)
{
var isNewAgentName = !string.Equals(_runnerSettings.AgentName, agentName, StringComparison.Ordinal);
if (!isNewAgentName || string.IsNullOrEmpty(agentName))
{
return;
}
_runnerSettings.AgentName = agentName;
try
{
_configurationStore.SaveSettings(_runnerSettings);
}
catch (Exception ex)
{
Trace.Error("Cannot update the settings file:");
Trace.Error(ex);
}
}
// Best effort upload any logs for this job.
private async Task TryUploadUnfinishedLogs(Pipelines.AgentJobRequestMessage message)
{

View File

@@ -3,13 +3,12 @@
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<OutputType>Exe</OutputType>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;osx-x64</RuntimeIdentifiers>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;linux-musl-x64;osx-x64</RuntimeIdentifiers>
<TargetLatestRuntimePatch>true</TargetLatestRuntimePatch>
<AssetTargetFallback>portable-net45+win8</AssetTargetFallback>
<NoWarn>NU1701;NU1603</NoWarn>
<Version>$(Version)</Version>
<TieredCompilationQuickJit>true</TieredCompilationQuickJit>
<PublishReadyToRun>true</PublishReadyToRun>
</PropertyGroup>
<ItemGroup>

View File

@@ -233,8 +233,14 @@ namespace GitHub.Runner.Listener
Trace.Info($"Set runner startup type - {startType}");
HostContext.StartupType = startType;
if (command.RunOnce)
{
_term.WriteLine("Warning: '--once' is going to be deprecated in the future, please consider using '--ephemeral' during runner registration.", ConsoleColor.Yellow);
_term.WriteLine("https://docs.github.com/en/actions/hosting-your-own-runners/autoscaling-with-self-hosted-runners#using-ephemeral-runners-for-autoscaling", ConsoleColor.Yellow);
}
// Run the runner interactively or as service
return await RunAsync(settings, command.RunOnce || settings.Ephemeral); // TODO: Remove RunOnce later.
return await RunAsync(settings, command.RunOnce || settings.Ephemeral);
}
else
{
@@ -306,10 +312,15 @@ namespace GitHub.Runner.Listener
}
HostContext.WritePerfCounter("SessionCreated");
_term.WriteLine($"Current runner version: '{BuildConstants.RunnerPackage.Version}'");
_term.WriteLine($"{DateTime.UtcNow:u}: Listening for Jobs");
IJobDispatcher jobDispatcher = null;
CancellationTokenSource messageQueueLoopTokenSource = CancellationTokenSource.CreateLinkedTokenSource(HostContext.RunnerShutdownToken);
// Should we try to cleanup ephemeral runners
bool runOnceJobCompleted = false;
try
{
var notification = HostContext.GetService<IJobNotification>();
@@ -371,6 +382,7 @@ namespace GitHub.Runner.Listener
Task completeTask = await Task.WhenAny(getNextMessage, jobDispatcher.RunOnceJobCompleted.Task);
if (completeTask == jobDispatcher.RunOnceJobCompleted.Task)
{
runOnceJobCompleted = true;
Trace.Info("Job has finished at backend, the runner will exit since it is running under onetime use mode.");
Trace.Info("Stop message queue looping.");
messageQueueLoopTokenSource.Cancel();
@@ -478,6 +490,12 @@ namespace GitHub.Runner.Listener
}
messageQueueLoopTokenSource.Dispose();
if (settings.Ephemeral && runOnceJobCompleted)
{
var configManager = HostContext.GetService<IConfigurationManager>();
configManager.DeleteLocalRunnerConfig();
}
}
}
catch (TaskAgentAccessTokenExpiredException)
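With the change above, an ephemeral runner that finished its single job deletes its own .runner and .credentials files instead of leaving stale configuration behind. For context, a hedged registration example (URL and token are placeholders):
# Example only: register and start an ephemeral runner.
./config.sh --url https://github.com/OWNER/REPO --token <REGISTRATION_TOKEN> --ephemeral
./run.sh    # takes one job; afterwards the runner removes .runner and .credentials itself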

View File

@@ -75,11 +75,9 @@ namespace GitHub.Runner.Listener
Trace.Info($"All running job has exited.");
// We need to keep runner backup around for macOS until we fixed https://github.com/actions/runner/issues/743
#if !OS_OSX
// delete runner backup
DeletePreviousVersionRunnerBackup(token);
Trace.Info($"Delete old version runner backup.");
#endif
// generate update script from template
await UpdateRunnerUpdateStateAsync("Generate and execute update script.");

View File

@@ -3,13 +3,12 @@
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<OutputType>Exe</OutputType>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;osx-x64</RuntimeIdentifiers>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;linux-musl-x64;osx-x64</RuntimeIdentifiers>
<TargetLatestRuntimePatch>true</TargetLatestRuntimePatch>
<AssetTargetFallback>portable-net45+win8</AssetTargetFallback>
<NoWarn>NU1701;NU1603</NoWarn>
<Version>$(Version)</Version>
<TieredCompilationQuickJit>true</TieredCompilationQuickJit>
<PublishReadyToRun>true</PublishReadyToRun>
</PropertyGroup>
<ItemGroup>

View File

@@ -3,7 +3,7 @@
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<OutputType>Library</OutputType>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;osx-x64</RuntimeIdentifiers>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;linux-musl-x64;osx-x64</RuntimeIdentifiers>
<TargetLatestRuntimePatch>true</TargetLatestRuntimePatch>
<AssetTargetFallback>portable-net45+win8</AssetTargetFallback>
<NoWarn>NU1701;NU1603</NoWarn>

View File

@@ -3,7 +3,7 @@
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<OutputType>Library</OutputType>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;osx-x64</RuntimeIdentifiers>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;linux-musl-x64;osx-x64</RuntimeIdentifiers>
<TargetLatestRuntimePatch>true</TargetLatestRuntimePatch>
<AssetTargetFallback>portable-net45+win8</AssetTargetFallback>
<NoWarn>NU1701;NU1603</NoWarn>

View File

@@ -1,7 +1,5 @@
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Worker.Container;
using System;
using System.Collections.Generic;
@@ -110,11 +108,18 @@ namespace GitHub.Runner.Worker
// Stop command
if (string.Equals(actionCommand.Command, _stopCommand, StringComparison.OrdinalIgnoreCase))
{
context.Output(input);
context.Debug("Paused processing commands until '##[{actionCommand.Data}]' is received");
ValidateStopToken(context, actionCommand.Data);
_stopToken = actionCommand.Data;
_stopProcessCommand = true;
_registeredCommands.Add(_stopToken);
if (_stopToken.Length > 6)
{
HostContext.SecretMasker.AddValue(_stopToken);
}
context.Output(input);
context.Debug("Paused processing commands until the token you called ::stopCommands:: with is received");
return true;
}
// Found command
@@ -148,7 +153,42 @@ namespace GitHub.Runner.Worker
return true;
}
internal static bool EnhancedAnnotationsEnabled(IExecutionContext context) {
private void ValidateStopToken(IExecutionContext context, string stopToken)
{
#if OS_WINDOWS
var envContext = context.ExpressionValues["env"] as DictionaryContextData;
#else
var envContext = context.ExpressionValues["env"] as CaseSensitiveDictionaryContextData;
#endif
var allowUnsecureStopCommandTokens = false;
allowUnsecureStopCommandTokens = StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable(Constants.Variables.Actions.AllowUnsupportedStopCommandTokens));
if (!allowUnsecureStopCommandTokens && envContext.ContainsKey(Constants.Variables.Actions.AllowUnsupportedStopCommandTokens))
{
allowUnsecureStopCommandTokens = StringUtil.ConvertToBoolean(envContext[Constants.Variables.Actions.AllowUnsupportedStopCommandTokens].ToString());
}
bool isTokenInvalid = _registeredCommands.Contains(stopToken)
|| string.IsNullOrEmpty(stopToken)
|| string.Equals(stopToken, "pause-logging", StringComparison.OrdinalIgnoreCase);
if (isTokenInvalid)
{
var telemetry = new JobTelemetry
{
Message = $"Invoked ::stopCommand:: with token: [{stopToken}]",
Type = JobTelemetryType.ActionCommand
};
context.JobTelemetry.Add(telemetry);
}
if (isTokenInvalid && !allowUnsecureStopCommandTokens)
{
throw new Exception(Constants.Runner.UnsupportedStopCommandTokenDisabled);
}
}
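The validation above rejects stop tokens that are empty, already registered as commands, or the literal 'pause-logging', and masks longer tokens as secrets; the intended usage is a unique, unguessable token per step. A hedged sketch of that pattern inside a workflow run step (the log file name is illustrative):
# Inside a workflow 'run:' step: suspend command processing around untrusted output.
stop_token=$(openssl rand -hex 16)        # unique, unguessable token
echo "::stop-commands::$stop_token"
cat untrusted-output.log                  # nothing echoed here can issue workflow commands
echo "::$stop_token::"                    # resume command processing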
internal static bool EnhancedAnnotationsEnabled(IExecutionContext context)
{
return context.Global.Variables.GetBoolean("DistributedTask.EnhancedAnnotations") ?? false;
}
}

View File

@@ -267,6 +267,19 @@ namespace GitHub.Runner.Worker
_cachedEmbeddedPostSteps[parentStepId].Push(clonedAction);
}
}
else if (depth > 0)
{
// if we're in a composite action and haven't loaded the local action yet
// we assume it has a post step
if (!_cachedEmbeddedPostSteps.ContainsKey(parentStepId))
{
// If we haven't done so already, add the parent to the post steps
_cachedEmbeddedPostSteps[parentStepId] = new Stack<Pipelines.ActionStep>();
}
// Clone action so we can modify the condition without affecting the original
var clonedAction = action.Clone() as Pipelines.ActionStep;
_cachedEmbeddedPostSteps[parentStepId].Push(clonedAction);
}
}
}
@@ -633,7 +646,12 @@ namespace GitHub.Runner.Worker
}
catch (Exception ex) when (!executionContext.CancellationToken.IsCancellationRequested) // Do not retry if the run is canceled.
{
if (attempt < 3)
// UnresolvableActionDownloadInfoException is a 422 client error, don't retry
// Some possible cases are:
// * Repo is rate limited
// * Repo or tag doesn't exist, or isn't public
// * Policy validation failed
if (attempt < 3 && !(ex is WebApi.UnresolvableActionDownloadInfoException))
{
executionContext.Output($"Failed to resolve action download info. Error: {ex.Message}");
executionContext.Debug(ex.ToString());
@@ -649,6 +667,7 @@ namespace GitHub.Runner.Worker
// Some possible cases are:
// * Repo is rate limited
// * Repo or tag doesn't exist, or isn't public
// * Policy validation failed
if (ex is WebApi.UnresolvableActionDownloadInfoException)
{
throw;
@@ -1028,7 +1047,6 @@ namespace GitHub.Runner.Worker
}
}
// TODO: remove once we remove the DistributedTask.EnableCompositeActions FF
foreach (var step in compositeAction.Steps)
{
if (string.IsNullOrEmpty(executionContext.Global.Variables.Get("DistributedTask.EnableCompositeActions")) && step.Reference.Type != Pipelines.ActionSourceType.Script)
@@ -1171,6 +1189,8 @@ namespace GitHub.Runner.Worker
public string Pre { get; set; }
public string Post { get; set; }
public string NodeVersion { get; set; }
}
public sealed class PluginActionExecutionData : ActionExecutionData

View File

@@ -451,7 +451,8 @@ namespace GitHub.Runner.Worker
};
}
}
else if (string.Equals(usingToken.Value, "node12", StringComparison.OrdinalIgnoreCase))
else if (string.Equals(usingToken.Value, "node12", StringComparison.OrdinalIgnoreCase)||
string.Equals(usingToken.Value, "node16", StringComparison.OrdinalIgnoreCase))
{
if (string.IsNullOrEmpty(mainToken?.Value))
{
@@ -461,6 +462,7 @@ namespace GitHub.Runner.Worker
{
return new NodeJSActionExecutionData()
{
NodeVersion = usingToken.Value,
Script = mainToken.Value,
Pre = preToken?.Value,
InitCondition = preIfToken?.Value ?? "always()",
@@ -490,7 +492,7 @@ namespace GitHub.Runner.Worker
}
else
{
throw new ArgumentOutOfRangeException($"'using: {usingToken.Value}' is not supported, use 'docker' or 'node12' instead.");
throw new ArgumentOutOfRangeException($"'using: {usingToken.Value}' is not supported, use 'docker', 'node12' or 'node16' instead.");
}
}
else if (pluginToken != null)
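With this change an action can declare node16 as its runtime in its metadata file. A minimal hedged example of such a file, written via a shell heredoc to keep the sketch in one language (names are illustrative):
# Minimal action metadata using the new node16 runtime (illustrative names).
cat > action.yml <<'EOF'
name: 'hello-node16'
description: 'Minimal JavaScript action running on Node 16'
runs:
  using: 'node16'
  main: 'index.js'
EOF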

View File

@@ -40,6 +40,7 @@ namespace GitHub.Runner.Worker
string ScopeName { get; }
string SiblingScopeName { get; }
string ContextName { get; }
ActionRunStage Stage { get; }
Task ForceCompleted { get; }
TaskResult? Result { get; set; }
TaskResult? Outcome { get; set; }
@@ -52,6 +53,7 @@ namespace GitHub.Runner.Worker
Dictionary<string, VariableValue> JobOutputs { get; }
ActionsEnvironmentReference ActionsEnvironment { get; }
List<ActionsStepTelemetry> ActionsStepsTelemetry { get; }
List<JobTelemetry> JobTelemetry { get; }
DictionaryContextData ExpressionValues { get; }
IList<IFunctionInfo> ExpressionFunctions { get; }
JobContext JobContext { get; }
@@ -61,7 +63,7 @@ namespace GitHub.Runner.Worker
// Only job level ExecutionContext has PostJobSteps
Stack<IStep> PostJobSteps { get; }
HashSet<Guid> EmbeddedStepsWithPostRegistered{ get; }
Dictionary<Guid, string> EmbeddedStepsWithPostRegistered { get; }
// Keep track of embedded steps states
Dictionary<Guid, Dictionary<string, string>> EmbeddedIntraActionState { get; }
@@ -75,8 +77,8 @@ namespace GitHub.Runner.Worker
// Initialize
void InitializeJob(Pipelines.AgentJobRequestMessage message, CancellationToken token);
void CancelToken();
IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool isEmbedded = false, CancellationTokenSource cancellationTokenSource = null, Guid embeddedId = default(Guid), string siblingScopeName = null);
IExecutionContext CreateEmbeddedChild(string scopeName, string contextName, Guid embeddedId, Dictionary<string, string> intraActionState = null, string siblingScopeName = null);
IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, ActionRunStage stage, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool isEmbedded = false, CancellationTokenSource cancellationTokenSource = null, Guid embeddedId = default(Guid), string siblingScopeName = null);
IExecutionContext CreateEmbeddedChild(string scopeName, string contextName, Guid embeddedId, ActionRunStage stage, Dictionary<string, string> intraActionState = null, string siblingScopeName = null);
// logging
long Write(string tag, string message);
@@ -143,6 +145,7 @@ namespace GitHub.Runner.Worker
public string ScopeName { get; private set; }
public string SiblingScopeName { get; private set; }
public string ContextName { get; private set; }
public ActionRunStage Stage { get; private set; }
public Task ForceCompleted => _forceCompleted.Task;
public CancellationToken CancellationToken => _cancellationTokenSource.Token;
public Dictionary<string, string> IntraActionState { get; private set; }
@@ -150,6 +153,7 @@ namespace GitHub.Runner.Worker
public ActionsEnvironmentReference ActionsEnvironment { get; private set; }
public List<ActionsStepTelemetry> ActionsStepsTelemetry { get; private set; }
public List<JobTelemetry> JobTelemetry { get; private set; }
public DictionaryContextData ExpressionValues { get; } = new DictionaryContextData();
public IList<IFunctionInfo> ExpressionFunctions { get; } = new List<IFunctionInfo>();
@@ -166,7 +170,7 @@ namespace GitHub.Runner.Worker
public HashSet<Guid> StepsWithPostRegistered { get; private set; }
// Only job level ExecutionContext has EmbeddedStepsWithPostRegistered
public HashSet<Guid> EmbeddedStepsWithPostRegistered { get; private set; }
public Dictionary<Guid, string> EmbeddedStepsWithPostRegistered { get; private set; }
public Dictionary<Guid, Dictionary<string, string>> EmbeddedIntraActionState { get; private set; }
@@ -263,11 +267,18 @@ namespace GitHub.Runner.Worker
string siblingScopeName = null;
if (this.IsEmbedded)
{
if (step is IActionRunner actionRunner && !Root.EmbeddedStepsWithPostRegistered.Add(actionRunner.Action.Id))
if (step is IActionRunner actionRunner)
{
Trace.Info($"'post' of '{actionRunner.DisplayName}' already pushed to the child post step stack.");
if (Root.EmbeddedStepsWithPostRegistered.ContainsKey(actionRunner.Action.Id))
{
Trace.Info($"'post' of '{actionRunner.DisplayName}' already pushed to the child post step stack.");
}
else
{
Root.EmbeddedStepsWithPostRegistered[actionRunner.Action.Id] = actionRunner.Condition;
}
return;
}
return;
}
else if (step is IActionRunner actionRunner && !Root.StepsWithPostRegistered.Add(actionRunner.Action.Id))
{
@@ -283,7 +294,7 @@ namespace GitHub.Runner.Worker
Root.PostJobSteps.Push(step);
}
public IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool isEmbedded = false, CancellationTokenSource cancellationTokenSource = null, Guid embeddedId = default(Guid), string siblingScopeName = null)
public IExecutionContext CreateChild(Guid recordId, string displayName, string refName, string scopeName, string contextName, ActionRunStage stage, Dictionary<string, string> intraActionState = null, int? recordOrder = null, IPagingLogger logger = null, bool isEmbedded = false, CancellationTokenSource cancellationTokenSource = null, Guid embeddedId = default(Guid), string siblingScopeName = null)
{
Trace.Entering();
@@ -292,8 +303,10 @@ namespace GitHub.Runner.Worker
child.Global = Global;
child.ScopeName = scopeName;
child.ContextName = contextName;
child.Stage = stage;
child.EmbeddedId = embeddedId;
child.SiblingScopeName = siblingScopeName;
child.JobTelemetry = JobTelemetry;
if (intraActionState == null)
{
child.IntraActionState = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
@@ -341,9 +354,9 @@ namespace GitHub.Runner.Worker
/// An embedded execution context shares the same record ID, record name, logger,
/// and a linked cancellation token.
/// </summary>
public IExecutionContext CreateEmbeddedChild(string scopeName, string contextName, Guid embeddedId, Dictionary<string, string> intraActionState = null, string siblingScopeName = null)
public IExecutionContext CreateEmbeddedChild(string scopeName, string contextName, Guid embeddedId, ActionRunStage stage, Dictionary<string, string> intraActionState = null, string siblingScopeName = null)
{
return Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, contextName, logger: _logger, isEmbedded: true, cancellationTokenSource: CancellationTokenSource.CreateLinkedTokenSource(_cancellationTokenSource.Token), intraActionState: intraActionState, embeddedId: embeddedId, siblingScopeName: siblingScopeName);
return Root.CreateChild(_record.Id, _record.Name, _record.Id.ToString("N"), scopeName, contextName, stage, logger: _logger, isEmbedded: true, cancellationTokenSource: CancellationTokenSource.CreateLinkedTokenSource(_cancellationTokenSource.Token), intraActionState: intraActionState, embeddedId: embeddedId, siblingScopeName: siblingScopeName);
}
public void Start(string currentOperation = null)
@@ -650,6 +663,8 @@ namespace GitHub.Runner.Worker
// ActionsStepTelemetry
ActionsStepsTelemetry = new List<ActionsStepTelemetry>();
JobTelemetry = new List<JobTelemetry>();
// Service container info
Global.ServiceContainers = new List<ContainerInfo>();
@@ -711,10 +726,10 @@ namespace GitHub.Runner.Worker
StepsWithPostRegistered = new HashSet<Guid>();
// EmbeddedStepsWithPostRegistered for job ExecutionContext
EmbeddedStepsWithPostRegistered = new HashSet<Guid>();
EmbeddedStepsWithPostRegistered = new Dictionary<Guid, string>();
// EmbeddedIntraActionState for job ExecutionContext
EmbeddedIntraActionState = new Dictionary<Guid, Dictionary<string,string>>();
EmbeddedIntraActionState = new Dictionary<Guid, Dictionary<string, string>>();
// Job timeline record.
InitializeTimelineRecord(
@@ -932,7 +947,7 @@ namespace GitHub.Runner.Worker
}
var newGuid = Guid.NewGuid();
return CreateChild(newGuid, displayName, newGuid.ToString("N"), null, null, intraActionState, _childTimelineRecordOrder - Root.PostJobSteps.Count, siblingScopeName: siblingScopeName);
return CreateChild(newGuid, displayName, newGuid.ToString("N"), null, null, ActionRunStage.Post, intraActionState, _childTimelineRecordOrder - Root.PostJobSteps.Count, siblingScopeName: siblingScopeName);
}
}
@@ -965,7 +980,7 @@ namespace GitHub.Runner.Worker
// Do not add a format string overload. See comment on ExecutionContext.Write().
public static void InfrastructureError(this IExecutionContext context, string message)
{
context.AddIssue(new Issue() { Type = IssueType.Error, Message = message, IsInfrastructureIssue = true});
context.AddIssue(new Issue() { Type = IssueType.Error, Message = message, IsInfrastructureIssue = true });
}
// Do not add a format string overload. See comment on ExecutionContext.Write().

View File

@@ -24,8 +24,19 @@ namespace GitHub.Runner.Worker.Expressions
ArgUtil.NotNull(templateContext, nameof(templateContext));
var executionContext = templateContext.State[nameof(IExecutionContext)] as IExecutionContext;
ArgUtil.NotNull(executionContext, nameof(executionContext));
ActionResult jobStatus = executionContext.JobContext.Status ?? ActionResult.Success;
return jobStatus == ActionResult.Failure;
// Decide based on 'action_status' for composite MAIN steps and 'job.status' for pre, post and job-level steps
var isCompositeMainStep = executionContext.IsEmbedded && executionContext.Stage == ActionRunStage.Main;
if (isCompositeMainStep)
{
ActionResult actionStatus = EnumUtil.TryParse<ActionResult>(executionContext.GetGitHubContext("action_status")) ?? ActionResult.Success;
return actionStatus == ActionResult.Failure;
}
else
{
ActionResult jobStatus = executionContext.JobContext.Status ?? ActionResult.Success;
return jobStatus == ActionResult.Failure;
}
}
}
}

View File

@@ -24,8 +24,19 @@ namespace GitHub.Runner.Worker.Expressions
ArgUtil.NotNull(templateContext, nameof(templateContext));
var executionContext = templateContext.State[nameof(IExecutionContext)] as IExecutionContext;
ArgUtil.NotNull(executionContext, nameof(executionContext));
ActionResult jobStatus = executionContext.JobContext.Status ?? ActionResult.Success;
return jobStatus == ActionResult.Success;
// Decide based on 'action_status' for composite MAIN steps and 'job.status' for pre, post and job-level steps
var isCompositeMainStep = executionContext.IsEmbedded && executionContext.Stage == ActionRunStage.Main;
if (isCompositeMainStep)
{
ActionResult actionStatus = EnumUtil.TryParse<ActionResult>(executionContext.GetGitHubContext("action_status")) ?? ActionResult.Success;
return actionStatus == ActionResult.Success;
}
else
{
ActionResult jobStatus = executionContext.JobContext.Status ?? ActionResult.Success;
return jobStatus == ActionResult.Success;
}
}
}
}
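Taken together with the failure() change in the previous file, the hunk above makes success() and failure() judge a composite action's MAIN steps by the composite's own action_status, while pre, post, and job-level steps continue to use job.status. A condensed sketch of that rule, assuming the runner's ActionRunStage and ActionResult enums:

// Condensed sketch (assumed simplification) of the decision rule above.
static bool EvaluateSuccess(bool isEmbedded, ActionRunStage stage, ActionResult? actionStatus, ActionResult? jobStatus)
{
    var isCompositeMainStep = isEmbedded && stage == ActionRunStage.Main;
    var status = isCompositeMainStep
        ? (actionStatus ?? ActionResult.Success)  // composite main steps: github.action_status
        : (jobStatus ?? ActionResult.Success);    // pre/post and job-level steps: job.status
    return status == ActionResult.Success;        // failure() is the same check against Failure
}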

View File

@@ -23,9 +23,13 @@ namespace GitHub.Runner.Worker
"job",
"path",
"ref",
"ref_name",
"ref_protected",
"ref_type",
"repository",
"repository_owner",
"retention_days",
"run_attempt",
"run_id",
"run_number",
"server_url",
@@ -38,9 +42,16 @@ namespace GitHub.Runner.Worker
{
foreach (var data in this)
{
if (_contextEnvAllowlist.Contains(data.Key) && data.Value is StringContextData value)
if (_contextEnvAllowlist.Contains(data.Key))
{
yield return new KeyValuePair<string, string>($"GITHUB_{data.Key.ToUpperInvariant()}", value);
if (data.Value is StringContextData value)
{
yield return new KeyValuePair<string, string>($"GITHUB_{data.Key.ToUpperInvariant()}", value);
}
else if (data.Value is BooleanContextData booleanValue)
{
yield return new KeyValuePair<string, string>($"GITHUB_{data.Key.ToUpperInvariant()}", booleanValue.ToString());
}
}
}
}
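The allowlist above gains ref_name, ref_protected, ref_type, and run_attempt, and the export loop now also handles boolean context values so that, for example, ref_protected surfaces as GITHUB_REF_PROTECTED. A standalone sketch of that export using plain CLR types instead of the runner's context classes:

using System;
using System.Collections.Generic;

internal static class GitHubEnvExporter
{
    // Hypothetical stand-in for the allowlisted export above: strings pass through,
    // booleans are stringified (assumed lowercase, matching expression-style booleans),
    // and any other context type is skipped.
    public static IEnumerable<KeyValuePair<string, string>> Export(
        IReadOnlyDictionary<string, object> githubContext, ISet<string> allowlist)
    {
        foreach (var pair in githubContext)
        {
            if (!allowlist.Contains(pair.Key))
            {
                continue;
            }
            string value = pair.Value switch
            {
                string s => s,
                bool b => b ? "true" : "false",
                _ => null,
            };
            if (value != null)
            {
                yield return new KeyValuePair<string, string>($"GITHUB_{pair.Key.ToUpperInvariant()}", value);
            }
        }
    }
}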

View File

@@ -50,8 +50,9 @@ namespace GitHub.Runner.Worker.Handlers
// Only register post steps for steps that actually ran
foreach (var step in Data.PostSteps.ToList())
{
if (ExecutionContext.Root.EmbeddedStepsWithPostRegistered.Contains(step.Id))
if (ExecutionContext.Root.EmbeddedStepsWithPostRegistered.ContainsKey(step.Id))
{
step.Condition = ExecutionContext.Root.EmbeddedStepsWithPostRegistered[step.Id];
steps.Add(step);
}
else
@@ -124,7 +125,7 @@ namespace GitHub.Runner.Worker.Handlers
{
ArgUtil.NotNull(step, step.DisplayName);
var stepId = $"__{Guid.NewGuid()}";
step.ExecutionContext = ExecutionContext.CreateEmbeddedChild(childScopeName, stepId, Guid.NewGuid());
step.ExecutionContext = ExecutionContext.CreateEmbeddedChild(childScopeName, stepId, Guid.NewGuid(), stage);
embeddedSteps.Add(step);
}
}
@@ -143,7 +144,7 @@ namespace GitHub.Runner.Worker.Handlers
step.Stage = stage;
step.Condition = stepData.Condition;
ExecutionContext.Root.EmbeddedIntraActionState.TryGetValue(step.Action.Id, out var intraActionState);
step.ExecutionContext = ExecutionContext.CreateEmbeddedChild(childScopeName, stepData.ContextName, step.Action.Id, intraActionState: intraActionState, siblingScopeName: siblingScopeName);
step.ExecutionContext = ExecutionContext.CreateEmbeddedChild(childScopeName, stepData.ContextName, step.Action.Id, stage, intraActionState: intraActionState, siblingScopeName: siblingScopeName);
step.ExecutionContext.ExpressionValues["inputs"] = inputsData;
if (!String.IsNullOrEmpty(ExecutionContext.SiblingScopeName))
{
@@ -241,6 +242,10 @@ namespace GitHub.Runner.Worker.Handlers
step.ExecutionContext.ExpressionFunctions.Add(new FunctionInfo<FailureFunction>(PipelineTemplateConstants.Failure, 0, 0));
step.ExecutionContext.ExpressionFunctions.Add(new FunctionInfo<SuccessFunction>(PipelineTemplateConstants.Success, 0, 0));
// Set action_status to the result of the current composite action
var actionResult = ExecutionContext.Result?.ToActionResult() ?? ActionResult.Success;
step.ExecutionContext.SetGitHubContext("action_status", actionResult.ToString());
// Initialize env context
Trace.Info("Initialize Env context for embedded step");
#if OS_WINDOWS
@@ -295,108 +300,100 @@ namespace GitHub.Runner.Worker.Handlers
CancellationTokenRegistration? jobCancelRegister = null;
try
{
// For main steps just run the action
if (stage == ActionRunStage.Main)
// Register the job cancellation callback only if the job cancellation token has not been fired before each step runs
if (!ExecutionContext.Root.CancellationToken.IsCancellationRequested)
{
// Test the condition again. The job was canceled after the condition was originally evaluated.
jobCancelRegister = ExecutionContext.Root.CancellationToken.Register(() =>
{
// Mark job as cancelled
ExecutionContext.Root.Result = TaskResult.Canceled;
ExecutionContext.Root.JobContext.Status = ExecutionContext.Root.Result?.ToActionResult();
step.ExecutionContext.Debug($"Re-evaluate condition on job cancellation for step: '{step.DisplayName}'.");
var conditionReTestTraceWriter = new ConditionTraceWriter(Trace, null); // host tracing only
var conditionReTestResult = false;
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
step.ExecutionContext.Debug($"Skip re-evaluating the condition on runner shutdown.");
}
else
{
try
{
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator(conditionReTestTraceWriter);
var condition = new BasicExpressionToken(null, null, null, step.Condition);
conditionReTestResult = templateEvaluator.EvaluateStepIf(condition, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, step.ExecutionContext.ToExpressionState());
}
catch (Exception ex)
{
// Cancel the step since we got an exception while re-evaluating the step condition
Trace.Info("Caught exception from expression when re-testing the condition on job cancellation.");
step.ExecutionContext.Error(ex);
}
}
if (!conditionReTestResult)
{
// Cancel the step
Trace.Info("Cancel current running step.");
step.ExecutionContext.CancelToken();
}
});
}
else
{
if (ExecutionContext.Root.Result != TaskResult.Canceled)
{
// Mark job as cancelled
ExecutionContext.Root.Result = TaskResult.Canceled;
ExecutionContext.Root.JobContext.Status = ExecutionContext.Root.Result?.ToActionResult();
}
}
// Evaluate condition
step.ExecutionContext.Debug($"Evaluating condition for step: '{step.DisplayName}'");
var conditionTraceWriter = new ConditionTraceWriter(Trace, step.ExecutionContext);
var conditionResult = false;
var conditionEvaluateError = default(Exception);
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
step.ExecutionContext.Debug($"Skip evaluating the condition on runner shutdown.");
}
else
{
try
{
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator(conditionTraceWriter);
var condition = new BasicExpressionToken(null, null, null, step.Condition);
conditionResult = templateEvaluator.EvaluateStepIf(condition, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, step.ExecutionContext.ToExpressionState());
}
catch (Exception ex)
{
Trace.Info("Caught exception from expression.");
Trace.Error(ex);
conditionEvaluateError = ex;
}
}
if (!conditionResult && conditionEvaluateError == null)
{
// Condition is false
Trace.Info("Skipping step due to condition evaluation.");
step.ExecutionContext.Result = TaskResult.Skipped;
continue;
}
else if (conditionEvaluateError != null)
{
// Condition error
step.ExecutionContext.Error(conditionEvaluateError);
step.ExecutionContext.Result = TaskResult.Failed;
ExecutionContext.Result = TaskResult.Failed;
break;
}
else
{
await RunStepAsync(step);
}
// We need to evaluate conditions for pre/post steps
else
{
// Register the job cancellation callback only if the job cancellation token has not been fired before each step runs
if (!ExecutionContext.Root.CancellationToken.IsCancellationRequested)
{
// Test the condition again. The job was canceled after the condition was originally evaluated.
jobCancelRegister = ExecutionContext.Root.CancellationToken.Register(() =>
{
// Mark job as cancelled
ExecutionContext.Root.Result = TaskResult.Canceled;
ExecutionContext.Root.JobContext.Status = ExecutionContext.Root.Result?.ToActionResult();
step.ExecutionContext.Debug($"Re-evaluate condition on job cancellation for step: '{step.DisplayName}'.");
var conditionReTestTraceWriter = new ConditionTraceWriter(Trace, null); // host tracing only
var conditionReTestResult = false;
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
step.ExecutionContext.Debug($"Skip re-evaluating the condition on runner shutdown.");
}
else
{
try
{
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator(conditionReTestTraceWriter);
var condition = new BasicExpressionToken(null, null, null, step.Condition);
conditionReTestResult = templateEvaluator.EvaluateStepIf(condition, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, step.ExecutionContext.ToExpressionState());
}
catch (Exception ex)
{
// Cancel the step since we got an exception while re-evaluating the step condition
Trace.Info("Caught exception from expression when re-testing the condition on job cancellation.");
step.ExecutionContext.Error(ex);
}
}
if (!conditionReTestResult)
{
// Cancel the step
Trace.Info("Cancel current running step.");
step.ExecutionContext.CancelToken();
}
});
}
else
{
if (ExecutionContext.Root.Result != TaskResult.Canceled)
{
// Mark job as cancelled
ExecutionContext.Root.Result = TaskResult.Canceled;
ExecutionContext.Root.JobContext.Status = ExecutionContext.Root.Result?.ToActionResult();
}
}
// Evaluate condition
step.ExecutionContext.Debug($"Evaluating condition for step: '{step.DisplayName}'");
var conditionTraceWriter = new ConditionTraceWriter(Trace, step.ExecutionContext);
var conditionResult = false;
var conditionEvaluateError = default(Exception);
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
step.ExecutionContext.Debug($"Skip evaluating the condition on runner shutdown.");
}
else
{
try
{
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator(conditionTraceWriter);
var condition = new BasicExpressionToken(null, null, null, step.Condition);
conditionResult = templateEvaluator.EvaluateStepIf(condition, step.ExecutionContext.ExpressionValues, step.ExecutionContext.ExpressionFunctions, step.ExecutionContext.ToExpressionState());
}
catch (Exception ex)
{
Trace.Info("Caught exception from expression.");
Trace.Error(ex);
conditionEvaluateError = ex;
}
}
if (!conditionResult && conditionEvaluateError == null)
{
// Condition is false
Trace.Info("Skipping step due to condition evaluation.");
step.ExecutionContext.Result = TaskResult.Skipped;
continue;
}
else if (conditionEvaluateError != null)
{
// Condition error
step.ExecutionContext.Error(conditionEvaluateError);
step.ExecutionContext.Result = TaskResult.Failed;
ExecutionContext.Result = TaskResult.Failed;
break;
}
else
{
await RunStepAsync(step);
}
}
}
finally
{
@@ -412,12 +409,6 @@ namespace GitHub.Runner.Worker.Handlers
{
Trace.Info($"Update job result with current composite step result '{step.ExecutionContext.Result}'.");
ExecutionContext.Result = TaskResultUtil.MergeTaskResults(ExecutionContext.Result, step.ExecutionContext.Result.Value);
// We should run cleanup even if one of the cleanup steps fails
if (stage != ActionRunStage.Post)
{
break;
}
}
}
}
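The handler change above pairs with the ExecutionContext change earlier in this diff: EmbeddedStepsWithPostRegistered is now a Dictionary<Guid, string>, so the condition captured when an embedded step registers its post step can be re-applied when the composite action queues its post steps. A small sketch of that register and re-apply flow with hypothetical types:

using System;
using System.Collections.Generic;

// Hypothetical sketch, not the runner's classes: remember each embedded action's
// post condition at registration time and hand it back when post steps are queued.
internal sealed class EmbeddedPostRegistry
{
    private readonly Dictionary<Guid, string> _conditions = new Dictionary<Guid, string>();

    // Called when an embedded (composite) step with a 'post:' finishes registering.
    public void Register(Guid actionId, string condition)
    {
        if (!_conditions.ContainsKey(actionId))
        {
            _conditions[actionId] = condition; // e.g. "always()" or "success()"
        }
    }

    // Called when queueing the composite's post steps: only steps that actually ran
    // are present, and each inherits the condition captured above.
    public bool TryGetCondition(Guid actionId, out string condition)
    {
        return _conditions.TryGetValue(actionId, out condition);
    }
}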

View File

@@ -83,7 +83,7 @@ namespace GitHub.Runner.Worker.Handlers
HasPreStep = Data.HasPre,
HasPostStep = Data.HasPost,
IsEmbedded = ExecutionContext.IsEmbedded,
Type = "node12"
Type = Data.NodeVersion
};
ExecutionContext.Root.ActionsStepsTelemetry.Add(telemetry);
}
@@ -99,7 +99,7 @@ namespace GitHub.Runner.Worker.Handlers
workingDirectory = HostContext.GetDirectory(WellKnownDirectory.Work);
}
var nodeRuntimeVersion = await StepHost.DetermineNodeRuntimeVersion(ExecutionContext);
var nodeRuntimeVersion = await StepHost.DetermineNodeRuntimeVersion(ExecutionContext, Data.NodeVersion);
string file = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Externals), nodeRuntimeVersion, "bin", $"node{IOUtil.ExeExtension}");
// Format the arguments passed to node.

View File

@@ -23,7 +23,7 @@ namespace GitHub.Runner.Worker.Handlers
string ResolvePathForStepHost(string path);
Task<string> DetermineNodeRuntimeVersion(IExecutionContext executionContext);
Task<string> DetermineNodeRuntimeVersion(IExecutionContext executionContext, string preferredVersion);
Task<int> ExecuteAsync(string workingDirectory,
string fileName,
@@ -58,9 +58,9 @@ namespace GitHub.Runner.Worker.Handlers
return path;
}
public Task<string> DetermineNodeRuntimeVersion(IExecutionContext executionContext)
public Task<string> DetermineNodeRuntimeVersion(IExecutionContext executionContext, string preferredVersion)
{
return Task.FromResult<string>("node12");
return Task.FromResult<string>(preferredVersion);
}
public async Task<int> ExecuteAsync(string workingDirectory,
@@ -123,7 +123,7 @@ namespace GitHub.Runner.Worker.Handlers
}
}
public async Task<string> DetermineNodeRuntimeVersion(IExecutionContext executionContext)
public async Task<string> DetermineNodeRuntimeVersion(IExecutionContext executionContext, string preferredVersion)
{
// Best effort to determine a compatible node runtime
// There may be more variation in which libraries are linked than just musl/glibc,
@@ -148,14 +148,14 @@ namespace GitHub.Runner.Worker.Handlers
var msg = $"JavaScript Actions in Alpine containers are only supported on x64 Linux runners. Detected {os} {arch}";
throw new NotSupportedException(msg);
}
nodeExternal = "node12_alpine";
nodeExternal = $"{preferredVersion}_alpine";
executionContext.Debug($"Container distribution is alpine. Running JavaScript Action with external tool: {nodeExternal}");
return nodeExternal;
}
}
}
// Optimistically use the default
nodeExternal = "node12";
nodeExternal = preferredVersion;
executionContext.Debug($"Running JavaScript Action with default external tool: {nodeExternal}");
return nodeExternal;
}
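With the preferredVersion parameter threaded through above, the external node tool name is derived from the requested version instead of being hard-coded to node12. A short sketch of the mapping, as an assumed simplification of the container path:

// Assumed simplification: Alpine/musl containers get the "<version>_alpine" external,
// everything else uses the preferred version directly.
static string SelectNodeExternal(string preferredVersion, bool isAlpineContainer)
{
    return isAlpineContainer ? $"{preferredVersion}_alpine" : preferredVersion;
}

// e.g. SelectNodeExternal("node16", true)  -> "node16_alpine"
//      SelectNodeExternal("node12", false) -> "node12"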

View File

@@ -55,7 +55,7 @@ namespace GitHub.Runner.Worker
ArgUtil.NotNull(message, nameof(message));
// Create a new timeline record for 'Set up job'
IExecutionContext context = jobContext.CreateChild(Guid.NewGuid(), "Set up job", $"{nameof(JobExtension)}_Init", null, null);
IExecutionContext context = jobContext.CreateChild(Guid.NewGuid(), "Set up job", $"{nameof(JobExtension)}_Init", null, null, ActionRunStage.Pre);
List<IStep> preJobSteps = new List<IStep>();
List<IStep> jobSteps = new List<IStep>();
@@ -306,13 +306,13 @@ namespace GitHub.Runner.Worker
JobExtensionRunner extensionStep = step as JobExtensionRunner;
ArgUtil.NotNull(extensionStep, extensionStep.DisplayName);
Guid stepId = Guid.NewGuid();
extensionStep.ExecutionContext = jobContext.CreateChild(stepId, extensionStep.DisplayName, null, null, stepId.ToString("N"));
extensionStep.ExecutionContext = jobContext.CreateChild(stepId, extensionStep.DisplayName, null, null, stepId.ToString("N"), ActionRunStage.Pre);
}
else if (step is IActionRunner actionStep)
{
ArgUtil.NotNull(actionStep, step.DisplayName);
Guid stepId = Guid.NewGuid();
actionStep.ExecutionContext = jobContext.CreateChild(stepId, actionStep.DisplayName, stepId.ToString("N"), null, null, intraActionStates[actionStep.Action.Id]);
actionStep.ExecutionContext = jobContext.CreateChild(stepId, actionStep.DisplayName, stepId.ToString("N"), null, null, ActionRunStage.Pre, intraActionStates[actionStep.Action.Id]);
}
}
@@ -323,7 +323,7 @@ namespace GitHub.Runner.Worker
{
ArgUtil.NotNull(actionStep, step.DisplayName);
intraActionStates.TryGetValue(actionStep.Action.Id, out var intraActionState);
actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, null, actionStep.Action.ContextName, intraActionState);
actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, null, actionStep.Action.ContextName, ActionRunStage.Main, intraActionState);
}
}
@@ -394,7 +394,7 @@ namespace GitHub.Runner.Worker
ArgUtil.NotNull(jobContext, nameof(jobContext));
// create a new timeline record node for 'Finalize job'
IExecutionContext context = jobContext.CreateChild(Guid.NewGuid(), "Complete job", $"{nameof(JobExtension)}_Final", null, null);
IExecutionContext context = jobContext.CreateChild(Guid.NewGuid(), "Complete job", $"{nameof(JobExtension)}_Final", null, null, ActionRunStage.Post);
using (var register = jobContext.CancellationToken.Register(() => { context.CancelToken(); }))
{
try

View File

@@ -7,6 +7,7 @@ using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using System.Net.Http;
@@ -105,6 +106,10 @@ namespace GitHub.Runner.Worker
}
jobContext.SetRunnerContext("os", VarUtil.OS);
jobContext.SetRunnerContext("arch", VarUtil.OSArchitecture);
var runnerSettings = HostContext.GetService<IConfigurationStore>().GetSettings();
jobContext.SetRunnerContext("name", runnerSettings.AgentName);
string toolsDirectory = HostContext.GetDirectory(WellKnownDirectory.Tools);
Directory.CreateDirectory(toolsDirectory);
@@ -225,8 +230,15 @@ namespace GitHub.Runner.Worker
return result;
}
// Load any upgrade telemetry
LoadFromTelemetryFile(jobContext.JobTelemetry);
// Make sure we don't submit secrets as telemetry
MaskTelemetrySecrets(jobContext.JobTelemetry);
Trace.Info("Raising job completed event.");
var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, result, jobContext.JobOutputs, jobContext.ActionsEnvironment, jobContext.ActionsStepsTelemetry);
var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, result, jobContext.JobOutputs, jobContext.ActionsEnvironment, jobContext.ActionsStepsTelemetry, jobContext.JobTelemetry);
var completeJobRetryLimit = 5;
var exceptions = new List<Exception>();
@@ -270,6 +282,38 @@ namespace GitHub.Runner.Worker
throw new AggregateException(exceptions);
}
private void MaskTelemetrySecrets(List<JobTelemetry> jobTelemetry)
{
foreach (var telemetryItem in jobTelemetry)
{
telemetryItem.Message = HostContext.SecretMasker.MaskSecrets(telemetryItem.Message);
}
}
private void LoadFromTelemetryFile(List<JobTelemetry> jobTelemetry)
{
try
{
var telemetryFilePath = HostContext.GetConfigFile(WellKnownConfigFile.Telemetry);
if (File.Exists(telemetryFilePath))
{
var telemetryData = File.ReadAllText(telemetryFilePath, Encoding.UTF8);
var telemetry = new JobTelemetry
{
Message = $"Runner File Telemetry:\n{telemetryData}",
Type = JobTelemetryType.General
};
jobTelemetry.Add(telemetry);
IOUtil.DeleteFile(telemetryFilePath);
}
}
catch (Exception e)
{
Trace.Error("Error when trying to load telemetry from telemetry file");
Trace.Error(e);
}
}
private async Task ShutdownQueue(bool throwOnFailure)
{
if (_jobServerQueue != null)
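The completion path above now loads any telemetry that upgrade scripts wrote to disk, masks secrets, and attaches the result to JobCompletedEvent. A minimal sketch of that load, mask, and consume pattern, assuming a plain UTF-8 telemetry file and a caller-supplied masking function:

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

internal static class FileTelemetryLoader
{
    // Hypothetical helper: read the telemetry file once, mask secrets, and delete the
    // file so the same content is not reported again on the next job.
    public static List<string> LoadAndMask(string telemetryFilePath, Func<string, string> maskSecrets)
    {
        var messages = new List<string>();
        try
        {
            if (File.Exists(telemetryFilePath))
            {
                var telemetryData = File.ReadAllText(telemetryFilePath, Encoding.UTF8);
                messages.Add(maskSecrets($"Runner File Telemetry:\n{telemetryData}"));
                File.Delete(telemetryFilePath);
            }
        }
        catch (Exception)
        {
            // Best effort only: telemetry collection must never fail the job.
        }
        return messages;
    }
}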

View File

@@ -3,13 +3,12 @@
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<OutputType>Exe</OutputType>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;osx-x64</RuntimeIdentifiers>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;linux-musl-x64;osx-x64</RuntimeIdentifiers>
<TargetLatestRuntimePatch>true</TargetLatestRuntimePatch>
<AssetTargetFallback>portable-net45+win8</AssetTargetFallback>
<NoWarn>NU1701;NU1603</NoWarn>
<Version>$(Version)</Version>
<TieredCompilationQuickJit>true</TieredCompilationQuickJit>
<PublishReadyToRun>true</PublishReadyToRun>
</PropertyGroup>
<ItemGroup>

View File

@@ -46,7 +46,7 @@
"runs": {
"one-of": [
"container-runs",
"node12-runs",
"node-runs",
"plugin-runs",
"composite-runs"
]
@@ -80,7 +80,7 @@
"loose-value-type": "string"
}
},
"node12-runs": {
"node-runs": {
"mapping": {
"properties": {
"using": "non-empty-string",
@@ -112,7 +112,7 @@
"item-type": "composite-step"
}
},
"composite-step":{
"composite-step": {
"one-of": [
"run-step",
"uses-step"
@@ -123,6 +123,7 @@
"properties": {
"name": "string-steps-context",
"id": "non-empty-string",
"if": "step-if",
"run": {
"type": "string-steps-context",
"required": true
@@ -141,6 +142,7 @@
"properties": {
"name": "string-steps-context",
"id": "non-empty-string",
"if": "step-if",
"uses": {
"type": "non-empty-string",
"required": true
@@ -216,6 +218,24 @@
"loose-value-type": "string"
}
},
"step-if": {
"context": [
"github",
"inputs",
"strategy",
"matrix",
"steps",
"job",
"runner",
"env",
"always(0,0)",
"failure(0,0)",
"cancelled(0,0)",
"success(0,0)",
"hashFiles(1,255)"
],
"string": {}
},
"step-with": {
"context": [
"github",

View File

@@ -638,6 +638,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Matrix),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Steps),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.GitHub),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Inputs),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Job),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Runner),
new NamedValueInfo<NoOperationNamedValue>(PipelineTemplateConstants.Env),

View File

@@ -155,6 +155,19 @@ namespace GitHub.DistributedTask.WebApi
this.ActionsStepsTelemetry = actionsStepsTelemetry;
}
public JobCompletedEvent(
Int64 requestId,
Guid jobId,
TaskResult result,
Dictionary<String, VariableValue> outputs,
ActionsEnvironmentReference actionsEnvironment,
List<ActionsStepTelemetry> actionsStepsTelemetry,
List<JobTelemetry> jobTelemetry)
: this(requestId, jobId, result, outputs, actionsEnvironment, actionsStepsTelemetry)
{
this.JobTelemetry = jobTelemetry;
}
[DataMember(EmitDefaultValue = false)]
public Int64 RequestId
{
@@ -189,6 +202,13 @@ namespace GitHub.DistributedTask.WebApi
get;
set;
}
[DataMember(EmitDefaultValue = false)]
public List<JobTelemetry> JobTelemetry
{
get;
set;
}
}
[DataContract]

View File

@@ -0,0 +1,17 @@
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
/// <summary>
/// Information about a job run on the runner
/// </summary>
[DataContract]
public class JobTelemetry
{
[DataMember(EmitDefaultValue = false)]
public string Message { get; set; }
[DataMember(EmitDefaultValue = false)]
public JobTelemetryType Type { get; set; }
}
}

View File

@@ -0,0 +1,13 @@
using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
public enum JobTelemetryType
{
[EnumMember]
General = 0,
[EnumMember]
ActionCommand = 1,
}
}

View File

@@ -3,7 +3,7 @@
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<OutputType>Library</OutputType>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;osx-x64</RuntimeIdentifiers>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;linux-musl-x64;osx-x64</RuntimeIdentifiers>
<TargetLatestRuntimePatch>true</TargetLatestRuntimePatch>
<AssetTargetFallback>portable-net45+win8</AssetTargetFallback>
<NoWarn>NU1701;NU1603</NoWarn>

View File

@@ -19,6 +19,7 @@ namespace GitHub.Runner.Common.Tests
"linux-x64",
"linux-arm",
"linux-arm64",
"linux-musl-x64",
"osx-x64"
};

View File

@@ -161,7 +161,8 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
"--work", _expectedWorkFolder,
"--auth", _expectedAuthType,
"--token", _expectedToken,
"--labels", userLabels
"--labels", userLabels,
"--ephemeral",
});
trace.Info("Constructed.");
_store.Setup(x => x.IsConfigured()).Returns(false);
@@ -179,6 +180,7 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
Assert.True(s.AgentName.Equals(_expectedAgentName));
Assert.True(s.PoolId.Equals(_secondRunnerGroupId));
Assert.True(s.WorkFolder.Equals(_expectedWorkFolder));
Assert.True(s.Ephemeral.Equals(true));
// validate GetAgentPoolsAsync gets called twice with automation pool type
_runnerServer.Verify(x => x.GetAgentPoolsAsync(It.IsAny<string>(), It.Is<TaskAgentPoolType>(p => p == TaskAgentPoolType.Automation)), Times.Exactly(2));

View File

@@ -264,6 +264,170 @@ namespace GitHub.Runner.Common.Tests.Listener
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void RenewJobRequestNewAgentNameUpdatesSettings()
{
//Arrange
using (var hc = new TestHostContext(this))
{
var count = 0;
var oldName = "OldName";
var newName = "NewName";
var oldSettings = new RunnerSettings { AgentName = oldName };
var reservedAgent = new TaskAgentReference { Name = newName };
var trace = hc.GetTrace(nameof(DispatcherRenewJobRequestStopOnJobTokenExpiredExceptions));
TaskCompletionSource<int> firstJobRequestRenewed = new TaskCompletionSource<int>();
CancellationTokenSource cancellationTokenSource = new CancellationTokenSource();
var request = new Mock<TaskAgentJobRequest>();
request.Object.ReservedAgent = reservedAgent;
PropertyInfo lockUntilProperty = request.Object.GetType().GetProperty("LockedUntil", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(lockUntilProperty);
lockUntilProperty.SetValue(request.Object, DateTime.UtcNow.AddMinutes(5));
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configurationStore.Object);
_configurationStore.Setup(x => x.GetSettings()).Returns(oldSettings);
_runnerServer.Setup(x => x.RenewAgentRequestAsync(It.IsAny<int>(), It.IsAny<long>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<CancellationToken>()))
.Returns(() =>
{
count++;
if (count < 5)
{
return Task.FromResult<TaskAgentJobRequest>(request.Object);
}
else if (count == 5 || count == 6 || count == 7)
{
throw new TimeoutException("");
}
else
{
cancellationTokenSource.Cancel();
return Task.FromResult<TaskAgentJobRequest>(request.Object);
}
});
var jobDispatcher = new JobDispatcher();
jobDispatcher.Initialize(hc);
// Act
await jobDispatcher.RenewJobRequestAsync(0, 0, Guid.Empty, Guid.NewGuid().ToString(), firstJobRequestRenewed, cancellationTokenSource.Token);
// Assert
_configurationStore.Verify(x => x.SaveSettings(It.Is<RunnerSettings>(settings => settings.AgentName == newName)), Times.Once);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void RenewJobRequestSameAgentNameIgnored()
{
//Arrange
using (var hc = new TestHostContext(this))
{
var count = 0;
var oldName = "OldName";
var newName = "OldName";
var oldSettings = new RunnerSettings { AgentName = oldName };
var reservedAgent = new TaskAgentReference { Name = newName };
var trace = hc.GetTrace(nameof(DispatcherRenewJobRequestStopOnJobTokenExpiredExceptions));
TaskCompletionSource<int> firstJobRequestRenewed = new TaskCompletionSource<int>();
CancellationTokenSource cancellationTokenSource = new CancellationTokenSource();
var request = new Mock<TaskAgentJobRequest>();
request.Object.ReservedAgent = reservedAgent;
PropertyInfo lockUntilProperty = request.Object.GetType().GetProperty("LockedUntil", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(lockUntilProperty);
lockUntilProperty.SetValue(request.Object, DateTime.UtcNow.AddMinutes(5));
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configurationStore.Object);
_configurationStore.Setup(x => x.GetSettings()).Returns(oldSettings);
_runnerServer.Setup(x => x.RenewAgentRequestAsync(It.IsAny<int>(), It.IsAny<long>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<CancellationToken>()))
.Returns(() =>
{
count++;
if (count < 5)
{
return Task.FromResult<TaskAgentJobRequest>(request.Object);
}
else if (count == 5 || count == 6 || count == 7)
{
throw new TimeoutException("");
}
else
{
cancellationTokenSource.Cancel();
return Task.FromResult<TaskAgentJobRequest>(request.Object);
}
});
var jobDispatcher = new JobDispatcher();
jobDispatcher.Initialize(hc);
// Act
await jobDispatcher.RenewJobRequestAsync(0, 0, Guid.Empty, Guid.NewGuid().ToString(), firstJobRequestRenewed, cancellationTokenSource.Token);
// Assert
_configurationStore.Verify(x => x.SaveSettings(It.IsAny<RunnerSettings>()), Times.Never);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]
public async void RenewJobRequestNullAgentNameIgnored()
{
//Arrange
using (var hc = new TestHostContext(this))
{
var count = 0;
var oldName = "OldName";
var oldSettings = new RunnerSettings { AgentName = oldName };
var trace = hc.GetTrace(nameof(DispatcherRenewJobRequestStopOnJobTokenExpiredExceptions));
TaskCompletionSource<int> firstJobRequestRenewed = new TaskCompletionSource<int>();
CancellationTokenSource cancellationTokenSource = new CancellationTokenSource();
var request = new Mock<TaskAgentJobRequest>();
PropertyInfo lockUntilProperty = request.Object.GetType().GetProperty("LockedUntil", BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
Assert.NotNull(lockUntilProperty);
lockUntilProperty.SetValue(request.Object, DateTime.UtcNow.AddMinutes(5));
hc.SetSingleton<IRunnerServer>(_runnerServer.Object);
hc.SetSingleton<IConfigurationStore>(_configurationStore.Object);
_configurationStore.Setup(x => x.GetSettings()).Returns(oldSettings);
_runnerServer.Setup(x => x.RenewAgentRequestAsync(It.IsAny<int>(), It.IsAny<long>(), It.IsAny<Guid>(), It.IsAny<string>(), It.IsAny<CancellationToken>()))
.Returns(() =>
{
count++;
if (count < 5)
{
return Task.FromResult<TaskAgentJobRequest>(request.Object);
}
else if (count == 5 || count == 6 || count == 7)
{
throw new TimeoutException("");
}
else
{
cancellationTokenSource.Cancel();
return Task.FromResult<TaskAgentJobRequest>(request.Object);
}
});
var jobDispatcher = new JobDispatcher();
jobDispatcher.Initialize(hc);
// Act
await jobDispatcher.RenewJobRequestAsync(0, 0, Guid.Empty, Guid.NewGuid().ToString(), firstJobRequestRenewed, cancellationTokenSource.Token);
// Assert
_configurationStore.Verify(x => x.SaveSettings(It.IsAny<RunnerSettings>()), Times.Never);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Runner")]

View File

@@ -149,6 +149,9 @@ namespace GitHub.Runner.Common.Tests.Listener
_messageListener.Verify(x => x.CreateSessionAsync(It.IsAny<CancellationToken>()), Times.Once());
_messageListener.Verify(x => x.DeleteSessionAsync(), Times.Once());
_messageListener.Verify(x => x.DeleteMessageAsync(It.IsAny<TaskAgentMessage>()), Times.AtLeastOnce());
// verify that we didn't try to delete local settings file (since we're not ephemeral)
_configurationManager.Verify(x => x.DeleteLocalRunnerConfig(), Times.Never());
}
}
}
@@ -312,6 +315,9 @@ namespace GitHub.Runner.Common.Tests.Listener
_messageListener.Verify(x => x.CreateSessionAsync(It.IsAny<CancellationToken>()), Times.Once());
_messageListener.Verify(x => x.DeleteSessionAsync(), Times.Once());
_messageListener.Verify(x => x.DeleteMessageAsync(It.IsAny<TaskAgentMessage>()), Times.AtLeastOnce());
// verify that we did try to delete local settings file (since we're ephemeral)
_configurationManager.Verify(x => x.DeleteLocalRunnerConfig(), Times.Once());
}
}

View File

@@ -2,6 +2,7 @@
using System.Collections.Generic;
using System.IO;
using System.Runtime.CompilerServices;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Container;
@@ -83,6 +84,7 @@ namespace GitHub.Runner.Common.Tests.Worker
{
using (TestHostContext hc = CreateTestContext())
{
_ec.Setup(x => x.ExpressionValues).Returns(GetExpressionValues());
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
@@ -105,6 +107,88 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Theory]
[InlineData("stop-commands", "1")]
[InlineData("", "1")]
[InlineData("set-env", "1")]
[InlineData("stop-commands", "true")]
[InlineData("", "true")]
[InlineData("set-env", "true")]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void StopProcessCommand__AllowsInvalidStopTokens__IfEnvVarIsSet(string invalidToken, string allowUnsupportedStopCommandTokens)
{
using (TestHostContext hc = CreateTestContext())
{
_ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
var expressionValues = new DictionaryContextData
{
["env"] =
#if OS_WINDOWS
new DictionaryContextData{ { Constants.Variables.Actions.AllowUnsupportedStopCommandTokens, new StringContextData(allowUnsupportedStopCommandTokens) }}
#else
new CaseSensitiveDictionaryContextData{ { Constants.Variables.Actions.AllowUnsupportedStopCommandTokens, new StringContextData(allowUnsupportedStopCommandTokens) }}
#endif
};
_ec.Setup(x => x.ExpressionValues).Returns(expressionValues);
_ec.Setup(x => x.JobTelemetry).Returns(new List<JobTelemetry>());
Assert.True(_commandManager.TryProcessCommand(_ec.Object, $"::stop-commands::{invalidToken}", null));
}
}
[Theory]
[InlineData("stop-commands")]
[InlineData("")]
[InlineData("set-env")]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void StopProcessCommand__FailOnInvalidStopTokens(string invalidToken)
{
using (TestHostContext hc = CreateTestContext())
{
_ec.Object.Global.EnvironmentVariables = new Dictionary<string, string>();
_ec.Setup(x => x.ExpressionValues).Returns(GetExpressionValues());
_ec.Setup(x => x.JobTelemetry).Returns(new List<JobTelemetry>());
Assert.Throws<Exception>(() => _commandManager.TryProcessCommand(_ec.Object, $"::stop-commands::{invalidToken}", null));
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void StopProcessCommandAcceptsValidToken()
{
var validToken = "randomToken";
using (TestHostContext hc = CreateTestContext())
{
_ec.Setup(x => x.ExpressionValues).Returns(GetExpressionValues());
Assert.True(_commandManager.TryProcessCommand(_ec.Object, $"::stop-commands::{validToken}", null));
Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, $"::{validToken}::", null));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void StopProcessCommandMasksValidTokenForEntireRun()
{
var validToken = "randomToken";
using (TestHostContext hc = CreateTestContext())
{
_ec.Setup(x => x.ExpressionValues).Returns(GetExpressionValues());
Assert.True(_commandManager.TryProcessCommand(_ec.Object, $"::stop-commands::{validToken}", null));
Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
Assert.Equal("***", hc.SecretMasker.MaskSecrets(validToken));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, $"::{validToken}::", null));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
Assert.Equal("***", hc.SecretMasker.MaskSecrets(validToken));
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -202,7 +286,7 @@ namespace GitHub.Runner.Common.Tests.Worker
return 1;
});
var registeredCommands = new HashSet<string>(new string[1]{ "warning" });
var registeredCommands = new HashSet<string>(new string[1] { "warning" });
ActionCommand command;
// Columns when lines are different
@@ -375,5 +459,19 @@ namespace GitHub.Runner.Common.Tests.Worker
return hostContext;
}
private DictionaryContextData GetExpressionValues()
{
return new DictionaryContextData
{
["env"] =
#if OS_WINDOWS
new DictionaryContextData()
#else
new CaseSensitiveDictionaryContextData()
#endif
};
}
}
}

View File

@@ -965,7 +965,7 @@ namespace GitHub.Runner.Common.Tests.Worker
};
//Act
var result = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var result = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
//Assert
Assert.Equal(1, result.PreStepTracker.Count);
@@ -1288,7 +1288,7 @@ runs:
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void LoadsNodeActionDefinition()
public void LoadsNode12ActionDefinition()
{
try
{
@@ -1344,8 +1344,78 @@ runs:
Assert.True(string.IsNullOrEmpty(inputDefaults["entryPoint"]));
Assert.NotNull(definition.Data.Execution); // execution
Assert.NotNull((definition.Data.Execution as NodeJSActionExecutionData));
Assert.NotNull(definition.Data.Execution as NodeJSActionExecutionData);
Assert.Equal("task.js", (definition.Data.Execution as NodeJSActionExecutionData).Script);
Assert.Equal("node12", (definition.Data.Execution as NodeJSActionExecutionData).NodeVersion);
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void LoadsNode16ActionDefinition()
{
try
{
// Arrange.
Setup();
const string Content = @"
# Container action
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'GitHub'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
required: true
default: 'Hello'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'node16'
main: 'task.js'
";
Pipelines.ActionStep instance;
string directory;
CreateAction(yamlContent: Content, instance: out instance, directory: out directory);
// Act.
Definition definition = _actionManager.LoadAction(_ec.Object, instance);
// Assert.
Assert.NotNull(definition);
Assert.Equal(directory, definition.Directory);
Assert.NotNull(definition.Data);
Assert.NotNull(definition.Data.Inputs); // inputs
Dictionary<string, string> inputDefaults = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
foreach (var input in definition.Data.Inputs)
{
var name = input.Key.AssertString("key").Value;
var value = input.Value.AssertScalar("value").ToString();
_hc.GetTrace().Info($"Default: {name} = {value}");
inputDefaults[name] = value;
}
Assert.Equal(2, inputDefaults.Count);
Assert.True(inputDefaults.ContainsKey("greeting"));
Assert.Equal("Hello", inputDefaults["greeting"]);
Assert.True(string.IsNullOrEmpty(inputDefaults["entryPoint"]));
Assert.NotNull(definition.Data.Execution); // execution
Assert.NotNull(definition.Data.Execution as NodeJSActionExecutionData);
Assert.Equal("task.js", (definition.Data.Execution as NodeJSActionExecutionData).Script);
Assert.Equal("node16", (definition.Data.Execution as NodeJSActionExecutionData).NodeVersion);
}
finally
{

View File

@@ -408,6 +408,50 @@ namespace GitHub.Runner.Common.Tests.Worker
var nodeAction = result.Execution as NodeJSActionExecutionData;
Assert.Equal("main.js", nodeAction.Script);
Assert.Equal("node12", nodeAction.NodeVersion);
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void Load_Node16Action()
{
try
{
//Arrange
Setup();
var actionManifest = new ActionManifestManager();
actionManifest.Initialize(_hc);
//Act
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "node16action.yml"));
//Assert
Assert.Equal("Hello World", result.Name);
Assert.Equal("Greet the world and record the time", result.Description);
Assert.Equal(2, result.Inputs.Count);
Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
Assert.Equal(1, result.Deprecated.Count);
Assert.True(result.Deprecated.ContainsKey("greeting"));
result.Deprecated.TryGetValue("greeting", out string value);
Assert.Equal("This property has been deprecated", value);
Assert.Equal(ActionExecutionType.NodeJS, result.Execution.ExecutionType);
var nodeAction = result.Execution as NodeJSActionExecutionData;
Assert.Equal("main.js", nodeAction.Script);
Assert.Equal("node16", nodeAction.NodeVersion);
}
finally
{
@@ -628,6 +672,32 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void Load_ConditionalCompositeAction()
{
try
{
//Arrange
Setup();
var actionManifest = new ActionManifestManager();
actionManifest.Initialize(_hc);
//Act
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "conditional_composite_action.yml"));
//Assert
Assert.Equal("Conditional Composite", result.Name);
Assert.Equal(ActionExecutionType.Composite, result.Execution.ExecutionType);
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]

View File

@@ -193,9 +193,9 @@ namespace GitHub.Runner.Common.Tests.Worker
// Act.
jobContext.InitializeJob(jobRequest, CancellationToken.None);
var action1 = jobContext.CreateChild(Guid.NewGuid(), "action_1", "action_1", null, null);
var action1 = jobContext.CreateChild(Guid.NewGuid(), "action_1", "action_1", null, null, 0);
action1.IntraActionState["state"] = "1";
var action2 = jobContext.CreateChild(Guid.NewGuid(), "action_2", "action_2", null, null);
var action2 = jobContext.CreateChild(Guid.NewGuid(), "action_2", "action_2", null, null, 0);
action2.IntraActionState["state"] = "2";
@@ -291,8 +291,8 @@ namespace GitHub.Runner.Common.Tests.Worker
// Act.
jobContext.InitializeJob(jobRequest, CancellationToken.None);
var action1 = jobContext.CreateChild(Guid.NewGuid(), "action_1_pre", "action_1_pre", null, null);
var action2 = jobContext.CreateChild(Guid.NewGuid(), "action_1_main", "action_1_main", null, null);
var action1 = jobContext.CreateChild(Guid.NewGuid(), "action_1_pre", "action_1_pre", null, null, 0);
var action2 = jobContext.CreateChild(Guid.NewGuid(), "action_1_main", "action_1_main", null, null, 0);
var actionId = Guid.NewGuid();
var postRunner1 = hc.CreateService<IActionRunner>();

View File

@@ -105,6 +105,36 @@ namespace GitHub.Runner.Common.Tests.Worker.Expressions
}
}
[Theory]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
[InlineData(ActionResult.Failure, ActionResult.Failure, true)]
[InlineData(ActionResult.Failure, ActionResult.Success, false)]
[InlineData(ActionResult.Success, ActionResult.Failure, true)]
[InlineData(ActionResult.Success, ActionResult.Success, false)]
[InlineData(ActionResult.Success, null, false)]
public void FailureFunctionComposite(ActionResult jobStatus, ActionResult? actionStatus, bool expected)
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange
var executionContext = InitializeExecutionContext(hc);
executionContext.Setup(x => x.GetGitHubContext("action_status")).Returns(actionStatus.ToString());
executionContext.Setup(x => x.IsEmbedded).Returns(true);
executionContext.Setup(x => x.Stage).Returns(ActionRunStage.Main);
_jobContext.Status = jobStatus;
// Act.
bool actual = Evaluate("failure()");
// Assert.
Assert.Equal(expected, actual);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -134,12 +164,43 @@ namespace GitHub.Runner.Common.Tests.Worker.Expressions
}
}
[Theory]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
[InlineData(ActionResult.Failure, ActionResult.Failure, false)]
[InlineData(ActionResult.Failure, ActionResult.Success, true)]
[InlineData(ActionResult.Success, ActionResult.Failure, false)]
[InlineData(ActionResult.Success, ActionResult.Success, true)]
[InlineData(ActionResult.Success, null, true)]
public void SuccessFunctionComposite(ActionResult jobStatus, ActionResult? actionStatus, bool expected)
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange
var executionContext = InitializeExecutionContext(hc);
executionContext.Setup(x => x.GetGitHubContext("action_status")).Returns(actionStatus.ToString());
executionContext.Setup(x => x.IsEmbedded).Returns(true);
executionContext.Setup(x => x.Stage).Returns(ActionRunStage.Main);
_jobContext.Status = jobStatus;
// Act.
bool actual = Evaluate("success()");
// Assert.
Assert.Equal(expected, actual);
}
}
private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
{
return new TestHostContext(this, testName);
}
private void InitializeExecutionContext(TestHostContext hc)
private Mock<IExecutionContext> InitializeExecutionContext(TestHostContext hc)
{
_jobContext = new JobContext();
@@ -149,6 +210,8 @@ namespace GitHub.Runner.Common.Tests.Worker.Expressions
_templateContext = new TemplateContext();
_templateContext.State[nameof(IExecutionContext)] = executionContext.Object;
return executionContext;
}
private bool Evaluate(string expression)

View File

@@ -0,0 +1,109 @@
using System;
using System.Collections.Generic;
using System.Runtime.CompilerServices;
using System.Threading.Tasks;
using Moq;
using Xunit;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Handlers;
using GitHub.Runner.Worker.Container;
namespace GitHub.Runner.Common.Tests.Worker
{
public sealed class StepHostL0
{
private Mock<IExecutionContext> _ec;
private Mock<IDockerCommandManager> _dc;
private TestHostContext CreateTestContext([CallerMemberName] String testName = "")
{
var hc = new TestHostContext(this, testName);
_ec = new Mock<IExecutionContext>();
_ec.SetupAllProperties();
_ec.Setup(x => x.Global).Returns(new GlobalContext { WriteDebug = true });
var trace = hc.GetTrace();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { trace.Info($"[{tag}]{message}"); });
_dc = new Mock<IDockerCommandManager>();
hc.SetSingleton(_dc.Object);
return hc;
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task DetermineNodeRuntimeVersionInContainerAsync()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var sh = new ContainerStepHost();
sh.Initialize(hc);
sh.Container = new ContainerInfo() { ContainerId = "1234abcd" };
_dc.Setup(d => d.DockerExec(_ec.Object, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<List<string>>()))
.ReturnsAsync(0);
// Act.
var nodeVersion = await sh.DetermineNodeRuntimeVersion(_ec.Object, "node12");
// Assert.
Assert.Equal("node12", nodeVersion);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task DetermineNodeRuntimeVersionInAlpineContainerAsync()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var sh = new ContainerStepHost();
sh.Initialize(hc);
sh.Container = new ContainerInfo() { ContainerId = "1234abcd" };
_dc.Setup(d => d.DockerExec(_ec.Object, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<List<string>>()))
.Callback((IExecutionContext ec, string id, string options, string command, List<string> output) =>
{
output.Add("alpine");
})
.ReturnsAsync(0);
// Act.
var nodeVersion = await sh.DetermineNodeRuntimeVersion(_ec.Object, "node16");
// Assert.
Assert.Equal("node16_alpine", nodeVersion);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task DetermineNodeRuntimeVersionInUnknownContainerAsync()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var sh = new ContainerStepHost();
sh.Initialize(hc);
sh.Container = new ContainerInfo() { ContainerId = "1234abcd" };
_dc.Setup(d => d.DockerExec(_ec.Object, It.IsAny<string>(), It.IsAny<string>(), It.IsAny<string>(), It.IsAny<List<string>>()))
.Callback((IExecutionContext ec, string id, string options, string command, List<string> output) =>
{
output.Add("github");
})
.ReturnsAsync(0);
// Act.
var nodeVersion = await sh.DetermineNodeRuntimeVersion(_ec.Object, "node16");
// Assert.
Assert.Equal("node16", nodeVersion);
}
}
}
}

View File

@@ -1,7 +1,7 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFramework>netcoreapp3.1</TargetFramework>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;osx-x64</RuntimeIdentifiers>
<RuntimeIdentifiers>win-x64;win-x86;linux-x64;linux-arm64;linux-arm;linux-musl-x64;osx-x64</RuntimeIdentifiers>
<TargetLatestRuntimePatch>true</TargetLatestRuntimePatch>
<AssetTargetFallback>portable-net45+win8</AssetTargetFallback>
<NoWarn>NU1701;NU1603;NU1603;xUnit2013;</NoWarn>

View File

@@ -0,0 +1,49 @@
name: 'Conditional Composite'
description: 'Test composite run step conditionals'
inputs:
exit-code:
description: 'Action fails if set to non-zero'
default: '0'
outputs:
default:
description: "Did step run with default?"
value: ${{ steps.default-conditional.outputs.default }}
success:
description: "Did step run with success?"
value: ${{ steps.success-conditional.outputs.success }}
failure:
description: "Did step run with failure?"
value: ${{ steps.failure-conditional.outputs.failure }}
always:
description: "Did step run with always?"
value: ${{ steps.always-conditional.outputs.always }}
runs:
using: "composite"
steps:
- run: exit ${{ inputs.exit-code }}
shell: bash
- run: echo "::set-output name=default::true"
id: default-conditional
shell: bash
- run: echo "::set-output name=success::true"
id: success-conditional
shell: bash
if: success()
- run: echo "::set-output name=failure::true"
id: failure-conditional
shell: bash
if: failure()
- run: echo "::set-output name=always::true"
id: always-conditional
shell: bash
if: always()
- run: echo "failed"
shell: bash
if: ${{ inputs.exit-code == 1 && failure() }}

View File

@@ -0,0 +1,20 @@
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'Test Corporation'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
required: true
default: 'Hello'
deprecationMessage: 'This property has been deprecated'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'node16'
main: 'main.js'

View File

@@ -47,6 +47,10 @@ elif [[ "$CURRENT_PLATFORM" == 'linux' ]]; then
aarch64) RUNTIME_ID="linux-arm64";;
esac
fi
if [ -f "/lib/ld-musl-x86_64.so.1" ]; then
RUNTIME_ID="linux-musl-x64"
fi
elif [[ "$CURRENT_PLATFORM" == 'darwin' ]]; then
RUNTIME_ID='osx-x64'
fi
@@ -57,7 +61,7 @@ fi
# Make sure current platform support publish the dotnet runtime
# Windows can publish win-x86/x64
# Linux can publish linux-x64/arm/arm64
# Linux can publish linux-x64/arm/arm64/musl-x64
# OSX can publish osx-x64
if [[ "$CURRENT_PLATFORM" == 'windows' ]]; then
if [[ ("$RUNTIME_ID" != 'win-x86') && ("$RUNTIME_ID" != 'win-x64') ]]; then
@@ -65,7 +69,7 @@ if [[ "$CURRENT_PLATFORM" == 'windows' ]]; then
exit 1
fi
elif [[ "$CURRENT_PLATFORM" == 'linux' ]]; then
if [[ ("$RUNTIME_ID" != 'linux-x64') && ("$RUNTIME_ID" != 'linux-x86') && ("$RUNTIME_ID" != 'linux-arm64') && ("$RUNTIME_ID" != 'linux-arm') ]]; then
if [[ ("$RUNTIME_ID" != 'linux-x64') && ("$RUNTIME_ID" != 'linux-musl-x64') && ("$RUNTIME_ID" != 'linux-arm64') && ("$RUNTIME_ID" != 'linux-arm') ]]; then
echo "Failed: Can't build $RUNTIME_ID package $CURRENT_PLATFORM" >&2
exit 1
fi

View File

@@ -1 +1 @@
2.282.1
2.284.0