Compare commits

...

23 Commits

Author SHA1 Message Date
Tingluo Huang
2ca5ee0fbd support defaults. 2020-03-17 22:52:48 -04:00
Tingluo Huang
88875ca1b0 set steps.<id>.outcome and steps.<id>.conclusion. (#372) 2020-03-17 21:18:42 -04:00
Tingluo Huang
a5eb8cb5c4 set CI=true when launch process in actions runner. (#374) 2020-03-17 19:58:12 -04:00
Josh Soref
41f4ca3414 grammar (#373) 2020-03-16 22:19:57 -04:00
eric sciple
aa9f5bf070 adr step output and conclusion (#274) 2020-03-16 14:56:07 -04:00
Tingluo Huang
2d6042421f add support for job outputs. (#365)
* add support for job outputs.
2020-03-14 17:54:58 -04:00
Tingluo Huang
c8890d0f3f Expose job name as $GITHUB_JOB (#366) 2020-03-12 20:47:25 -04:00
Konrad Pabjan
53fb6297cb Change problem matchers output to debug (#363) 2020-03-11 21:52:46 -04:00
Tingluo Huang
f9b5d626c5 load and print machine setup info from .setup_info (#364) 2020-03-11 10:36:56 -04:00
Tingluo Huang
d34afb54b1 ADR for expose runner's machine info in log. (#354)
* ADR for expose runner's machine info in log.

* rename

* Update docs/adrs/0354-runner-machine-info.md

Co-Authored-By: Hugo van Kemenade <hugovk@users.noreply.github.com>

* Update docs/adrs/0354-runner-machine-info.md

Co-Authored-By: Hugo van Kemenade <hugovk@users.noreply.github.com>

* Update docs/adrs/0354-runner-machine-info.md

Co-Authored-By: Hugo van Kemenade <hugovk@users.noreply.github.com>

* Update 0354-runner-machine-info.md

Co-authored-by: Hugo van Kemenade <hugovk@users.noreply.github.com>
2020-03-10 15:45:35 -04:00
Julio Barba
e291ebc58a Add runner auth documentation (#357)
Add runner authentication/authorization documentation.

This doc explains how auth is used at all phases of the runner lifetime (i.e. configuration, listener start, and workflow run), for both self-hosted and hosted runners.
2020-03-09 13:01:41 -04:00
Tingluo Huang
6bec1e3bb8 Switch to use token service instead of SPS for exchanging oauth token. (#325)
* Gracefully switch the runner to use Token Service instead of SPS.

* PR feedback.

* feedback2

* report error.
2020-03-04 21:40:58 -05:00
eric sciple
0cba42590f preserve workflow file/line/column for better error messages (#356) 2020-03-03 22:38:19 -05:00
Josh Gross
94e7560ccd Add event type to credential call (#352)
* Add event type to credential call

* Move events to contants
2020-03-02 11:22:45 -05:00
Lokesh Gopu
d80ab095a5 Update runnerversion (#348)
* Update runnerversion

* Update runnerversion
2020-02-28 13:28:33 -05:00
Josh Gross
2efd6f70e2 Use the Uri Scheme in the register runner url (#345) 2020-02-25 18:30:33 -05:00
Lokesh Gopu
a6f144b014 Update Runner Register GitHub API URL to Support Org-level Runner (#339)
* Update GitHub API URL

* pr comments

* Updated GitHub API URL
2020-02-24 09:15:15 -05:00
eric sciple
5294a3ee06 commands translate file path from container action (#331) 2020-02-12 21:07:43 -05:00
Tingluo Huang
745b90a8b2 Revert "Update Base64 Encoders to deal with suffixes (#284)" (#330)
This reverts commit c45aebc9ab.
2020-02-12 14:26:30 -05:00
Tingluo Huang
0db908da8d Use authenticate endpoint for testing runner connection. (#311)
* use authenticate endpoint for testing runner connection.

* PR feedback.
2020-02-05 16:56:38 -05:00
Thomas Boop
68de3a94be Remove Temporary Build Step (#316)
* Remove Temporary Build Step

* Updated dev.sh to set path for find
2020-02-04 12:59:49 -05:00
Tingluo Huang
a0a590fb48 setup/evaluate env context after setup steps context. (#309) 2020-01-30 22:14:14 -05:00
Christopher Johnson
87a232c477 Fix windows directions in release notes (#307) 2020-01-29 12:58:09 -05:00
70 changed files with 4076 additions and 1327 deletions

View File

@@ -43,14 +43,6 @@ jobs:
steps:
- uses: actions/checkout@v1
# Set Path workaround for https://github.com/actions/virtual-environments/issues/263
- run: |
echo "::add-path::C:\Program Files\Git\mingw64\bin"
echo "::add-path::C:\Program Files\Git\usr\bin"
echo "::add-path::C:\Program Files\Git\bin"
if: matrix.os == 'windows-latest'
name: "Temp step to Set Path for Windows"
# Build runner layout
- name: Build & Layout Release
run: |

View File

@@ -0,0 +1,62 @@
# ADR 0274: Step outcome and conclusion
**Date**: 2020-01-13
**Status**: Accepted
## Context
This ADR proposes adding `steps.<id>.outcome` and `steps.<id>.conclusion` to the steps context.
This allows a downstream step to run based on whether a previous step succeeded or failed.
As a reminder, the steps context currently contains `steps.<id>.outputs`.
## Decision
For steps that have completed, populate `steps.<id>.outcome` and `steps.<id>.conclusion` with one of the following values:
- `success`
- `failure`
- `cancelled`
- `skipped`
When a continue-on-error step fails, the outcome will be `failure` even though the final conclusion is `success`.
### Example
```yaml
steps:
- id: experimental
continue-on-error: true
run: ./build.sh experimental
- if: ${{ steps.experimental.outcome == 'success' }}
run: ./publish.sh experimental
```
### Terminology
The runs API uses the term `conclusion`.
Therefore we use a different term, `outcome`, for the value prior to applying `continue-on-error`.
The following is a snippet from the runs API response payload:
```json
"steps": [
{
"name": "Set up job",
"status": "completed",
"conclusion": "success",
"number": 1,
"started_at": "2020-01-09T11:06:16.000-05:00",
"completed_at": "2020-01-09T11:06:18.000-05:00"
},
```
## Consequences
- Update runner
- Update [docs](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/contexts-and-expression-syntax-for-github-actions#steps-context)

View File

@@ -0,0 +1,35 @@
# ADR 354: Expose runner machine info
**Date**: 2020-03-02
**Status**: Pending
## Context
- Provide a mechanism in the runner to include extra information in the `Set up job` step's log.
  For example, include OS/software info from the hosted image.
## Decision
The runner will look for a file named `.setup_info` under the runner's root directory. The file is JSON with a simple schema:
```json
[
{
"group": "OS Detail",
"detail": "........"
},
{
"group": "Software Detail",
"detail": "........"
}
]
```
The runner will use `##[group]` and `##[endgroup]` to fold all detail info into an expandable group.
Both [virtual-environments](https://github.com/actions/virtual-environments) and self-hosted runners can use this mechanism to add extra logging info to the `Set up job` step's log.
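The following is a minimal sketch of how a runner might read and print this file on a best-effort basis. It is illustrative only; the type and method names are assumptions, not the runner's actual implementation.
```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

// Matches the simple schema above: a list of { "group", "detail" } entries.
public sealed class SetupInfoEntry
{
    public string Group { get; set; }
    public string Detail { get; set; }
}

public static class SetupInfoPrinter
{
    // Best-effort: any IO/parse error is swallowed so job setup never fails because of this file.
    public static void PrintSetupInfo(string runnerRoot, Action<string> writeLine)
    {
        var path = Path.Combine(runnerRoot, ".setup_info");
        if (!File.Exists(path))
        {
            return;
        }

        try
        {
            var options = new JsonSerializerOptions { PropertyNameCaseInsensitive = true };
            var entries = JsonSerializer.Deserialize<List<SetupInfoEntry>>(File.ReadAllText(path), options);
            foreach (var entry in entries ?? new List<SetupInfoEntry>())
            {
                if (string.IsNullOrEmpty(entry.Group) || string.IsNullOrEmpty(entry.Detail))
                {
                    continue;
                }

                // Fold each detail block into an expandable group in the job log.
                writeLine($"##[group]{entry.Group}");
                foreach (var line in entry.Detail.Split('\n'))
                {
                    writeLine(line);
                }
                writeLine("##[endgroup]");
            }
        }
        catch (Exception)
        {
            // Ignore malformed files; surfacing machine info is optional.
        }
    }
}
```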
## Consequences
1. Change the runner to read and parse the `.setup_info` file under the runner root directory on a best-effort basis.
2. [virtual-environments](https://github.com/actions/virtual-environments) generates the file during image generation.
3. Change the MMS provisioner to copy the file to the runner root directory at runtime.

View File

@@ -44,7 +44,7 @@ Sample developer flow:
```bash
git clone https://github.com/actions/runner
cd ./src
./dev.(sh/cmd) layout # the runner that build from source is in {root}/_layout
./dev.(sh/cmd) layout # the runner that built from source is in {root}/_layout
<make code changes>
./dev.(sh/cmd) build # {root}/_layout will get updated
./dev.(sh/cmd) test # run all unit tests before git commit/push

docs/design/auth.md (new file, 61 lines)
View File

@@ -0,0 +1,61 @@
# Runner Authentication and Authorization
## Goals
- Support runner installs in untrusted domains.
- The account that configures or runs the runner process is not relevant for accessing GitHub resources.
- Accessing GitHub resources is done with a per-job token which expires when the job completes.
- The token is granted to trusted parts of the system, including the runner, actions, and script steps specified by the workflow author as trusted.
- All OAuth tokens that come from the Token Service that the runner uses to access Actions Service resources are the same. It's just the scope and expiration of the token that may vary.
## Configuration
Configuring a self-hosted runner is [covered here in the documentation](https://help.github.com/en/actions/hosting-your-own-runners/adding-self-hosted-runners).
Configuration is done with the user authenticated via a time-limited GitHub runner registration token.
*Your credentials are never used for registering the runner with the service.*
![Self-hosted runner config](../res/self-hosted-config.png)
During configuration, an RSA public/private key pair is created and the private key is stored in a file on disk. On Windows the content is protected with DPAPI (machine-level encryption, so the runner is only valid on that machine); on Linux/OSX it is protected with `chmod` permissions.
Using the registration token, the runner is registered with the service by sending it the public key; the service adds the runner to the pool and stores the public key, and the Token Service generates a `clientId` associated with the public key.
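As a rough illustration of this step (not the runner's actual `IRSAKeyManager` code; the file name and helper are assumptions, DPAPI requires the `System.Security.Cryptography.ProtectedData` package, and `File.SetUnixFileMode` needs .NET 7+), key creation and protection could look roughly like this:
```csharp
using System;
using System.IO;
using System.Security.Cryptography;

public static class RunnerKeySetup
{
    // Creates the RSA key pair used to sign OAuth token requests and protects
    // the private key at rest. Returns the public key to send during registration.
    public static RSAParameters CreateAndStoreKeyPair(string runnerRoot)
    {
        using var rsa = RSA.Create(2048);
        byte[] privateKey = rsa.ExportRSAPrivateKey();
        string keyPath = Path.Combine(runnerRoot, ".credentials_rsaparams"); // assumed file name

        if (OperatingSystem.IsWindows())
        {
            // DPAPI, machine scope: the key can only be decrypted on this machine.
            byte[] protectedKey = ProtectedData.Protect(
                privateKey, optionalEntropy: null, DataProtectionScope.LocalMachine);
            File.WriteAllBytes(keyPath, protectedKey);
        }
        else
        {
            // Linux/OSX: rely on file permissions (owner read/write only).
            File.WriteAllBytes(keyPath, privateKey);
            File.SetUnixFileMode(keyPath, UnixFileMode.UserRead | UnixFileMode.UserWrite);
        }

        // Only the public parameters leave the machine.
        return rsa.ExportParameters(includePrivateParameters: false);
    }
}
```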
## Start and Listen
After configuring the runner, the runner can be started interactively (`./run.cmd` or `./run.sh`) or as a service.
![Self-hosted runner start](../res/self-hosted-start.png)
On start, the runner listener process loads the RSA private key (on Windows, decrypting it with machine-level DPAPI) and asks the Token Service for an OAuth token, signing the request (a JWT client assertion containing the `clientId`) with the RSA private key.
The server then responds with an OAuth token that grants permission to access the message queue (HTTP long poll), allowing the runner to acquire the messages it will eventually run.
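A minimal sketch of that exchange, assuming a standard OAuth 2.0 JWT-bearer client-credentials flow (the claim names, token endpoint URL, and key loading are assumptions; the runner's real implementation goes through its VssOAuth libraries):
```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class TokenExchange
{
    static string Base64Url(byte[] data) =>
        Convert.ToBase64String(data).TrimEnd('=').Replace('+', '-').Replace('/', '_');

    // Builds an RS256-signed JWT client assertion and trades it for a short-lived OAuth token.
    public static async Task<string> ExchangeAsync(RSA privateKey, string clientId, string tokenEndpoint)
    {
        long now = DateTimeOffset.UtcNow.ToUnixTimeSeconds();
        string header = Base64Url(Encoding.UTF8.GetBytes(
            JsonSerializer.Serialize(new { alg = "RS256", typ = "JWT" })));
        string payload = Base64Url(Encoding.UTF8.GetBytes(JsonSerializer.Serialize(new
        {
            iss = clientId,          // issued by this runner's client id
            sub = clientId,
            aud = tokenEndpoint,     // audience: the token service endpoint
            iat = now,
            exp = now + 300          // short-lived assertion
        })));
        byte[] signature = privateKey.SignData(
            Encoding.UTF8.GetBytes($"{header}.{payload}"),
            HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
        string assertion = $"{header}.{payload}.{Base64Url(signature)}";

        using var http = new HttpClient();
        var response = await http.PostAsync(tokenEndpoint, new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "client_credentials",
            ["client_assertion_type"] = "urn:ietf:params:oauth:client-assertion-type:jwt-bearer",
            ["client_assertion"] = assertion,
        }));
        response.EnsureSuccessStatusCode();

        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return doc.RootElement.GetProperty("access_token").GetString();
    }
}
```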
## Run a workflow
When a workflow is run, its labels are evaluated, it is matched to a runner, and a message is placed in that runner's message queue.
The runner then starts listening for jobs via the message queue HTTP long poll.
The message is encrypted with the runner's public key, stored during runner configuration.
![Runner workflow run](../res/workflow-run.png)
A workflow is queued as a result of a triggered [event](https://help.github.com/en/actions/reference/events-that-trigger-workflows). Workflows can be scheduled to [run at specific UTC times](https://help.github.com/en/actions/reference/events-that-trigger-workflows#scheduled-events-schedule) using POSIX `cron` syntax.
An [OAuth token](http://self-issued.info/docs/draft-ietf-oauth-json-web-token.html) is generated, granting limited access to the host in Actions Service associated with the github.com repository/organization.
The lifetime of the OAuth token is the lifetime of the run or at most the [job timeout (default: 6 hours)](https://help.github.com/en/actions/reference/workflow-syntax-for-github-actions#jobsjob_idtimeout-minutes), plus 10 additional minutes.
## Accessing GitHub resources
The job message sent to the runner contains the OAuth token to talk back to the Actions Service.
The runner listener parent process will spawn a runner worker process for that job and send it the job message over IPC.
The token is never persisted.
Each action is run as a unique subprocess.
The encrypted access token will be provided as an environment variable in each action subprocess.
The token is registered with the runner as a secret and scrubbed from the logs as they are written.
Authentication in a workflow run to github.com can be accomplished by using the [`GITHUB_TOKEN`](https://help.github.com/en/actions/configuring-and-managing-workflows/authenticating-with-the-github_token#about-the-github_token-secret) secret. This token expires after 60 minutes. Note that this token is different from the OAuth token that the runner uses to talk to the Actions Service.
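As a small illustration of the log scrubbing described above (a hypothetical helper, not the runner's actual `SecretMasker`), a masker could work like this:
```csharp
using System.Collections.Generic;

// Registers secret values and replaces them with "***" as log lines are written.
public sealed class SimpleSecretMasker
{
    private readonly List<string> _secrets = new List<string>();

    public void AddSecret(string value)
    {
        if (!string.IsNullOrEmpty(value))
        {
            _secrets.Add(value);
        }
    }

    public string Mask(string logLine)
    {
        foreach (var secret in _secrets)
        {
            logLine = logLine.Replace(secret, "***");
        }
        return logLine;
    }
}

// Usage: masker.AddSecret(jobAccessToken); writer.WriteLine(masker.Mask(line));
```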
## Hosted runner authentication
Hosted runner authentication differs from self-hosted authentication in that hosted runners do not undergo a registration process; instead, they get the OAuth token directly by reading the `.credentials` file. The scope of this token is limited to a given workflow job execution, and the token is revoked as soon as the job finishes.
![Hosted runner config and start](../res/hosted-config-start.png)

(Binary image file added, 31 KiB; not shown.)

View File

@@ -0,0 +1,52 @@
# Markup used to generate the runner auth diagrams: https://websequencediagrams.com
title Runner Configuration (self-hosted only)
note left of Runner: GitHub repo URL as input
Runner->github.com: Retrieve Actions Service access using runner registration token
github.com->Runner: Access token for Actions Service
note left of Runner: Generate RSA key pair
note left of Runner: Store encrypted RSA private key on disk
Runner->Actions Service: Register runner using Actions Service access token
note right of Runner: Runner name, RSA public key sent
note right of Actions Service: Public key stored
Actions Service->Token Service: Register runner as an app along with the RSA public key
note right of Token Service: Public key stored
Token Service->Actions Service: Client Id for the runner application
Actions Service->Runner: Client Id and Token Endpoint URL
note left of Runner: Store runner configuration info into .runner file
note left of Runner: Store Token registration info into .credentials file
title Runner Start and Running (self-hosted only)
Runner.Listener->Runner.Listener: Start
note left of Runner.Listener: Load config info from .runner
note left of Runner.Listener: Load token registration from .credentials
Runner.Listener->Token Service: Exchange OAuth token (happens every 50 mins)
note right of Runner.Listener: Construct JWT token, use Client Id signed by RSA private key
note left of Actions Service: Find corresponding RSA public key, use Client Id\nVerify JWT token's signature
Token Service->Runner.Listener: OAuth token with limited permission and valid for 50 mins
Runner.Listener->Actions Service: Connect to Actions Service with OAuth token
Actions Service->Runner.Listener: Workflow job
title Running workflow
Runner.Listener->Service (Message Queue): Get message
note right of Runner.Listener: Authenticate with exchanged OAuth token
Event->Actions Service: Queue workflow
Actions Service->Actions Service: Generate OAuth token per job
Actions Service->Actions Service: Build job message with the OAuth token
Actions Service->Actions Service: Encrypt job message with the target runner's public key
Actions Service->Service (Message Queue): Send encrypted job message to runner
Service (Message Queue)->Runner.Listener: Send job
note right of Runner.Listener: Decrypt message with runner's private key
Runner.Listener->Runner.Worker: Create worker process per job and run the job
title Runner Configuration, Start and Running (hosted only)
Machine Management Service->Runner.Listener: Construct .runner configuration file, store token in .credentials
Runner.Listener->Runner.Listener: Start
note left of Runner.Listener: Load config info from .runner
note left of Runner.Listener: Load OAuth token from .credentials
Runner.Listener->Actions Service: Connect to Actions Service with OAuth token in .credentials
Actions Service->Runner.Listener: Workflow job

(Binary image file added, 98 KiB; not shown.)

(Binary image file added, 43 KiB; not shown.)

docs/res/workflow-run.png (new binary file, 46 KiB; not shown.)

View File

@@ -24,7 +24,7 @@
- Treat warnings as errors during compile (#249)
## Windows x64
We recommend configuring the runner under "<DRIVE>:\actions-runner". This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows
We recommend configuring the runner in a root folder of the Windows drive (e.g. "C:\actions-runner"). This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows
```
// Create a folder under the drive root
mkdir \actions-runner ; cd \actions-runner
@@ -32,7 +32,7 @@ mkdir \actions-runner ; cd \actions-runner
Invoke-WebRequest -Uri https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-win-x64-<RUNNER_VERSION>.zip -OutFile actions-runner-win-x64-<RUNNER_VERSION>.zip
// Extract the installer
Add-Type -AssemblyName System.IO.Compression.FileSystem ;
[System.IO.Compression.ZipFile]::ExtractToDirectory("$HOME\Downloads\actions-runner-win-x64-<RUNNER_VERSION>.zip", "$PWD")
[System.IO.Compression.ZipFile]::ExtractToDirectory("$PWD\actions-runner-win-x64-<RUNNER_VERSION>.zip", "$PWD")
```
## OSX

View File

@@ -78,8 +78,10 @@ namespace GitHub.Runner.Common
bool IsServiceConfigured();
bool HasCredentials();
CredentialData GetCredentials();
CredentialData GetMigratedCredentials();
RunnerSettings GetSettings();
void SaveCredential(CredentialData credential);
void SaveMigratedCredential(CredentialData credential);
void SaveSettings(RunnerSettings settings);
void DeleteCredential();
void DeleteSettings();
@@ -90,9 +92,11 @@ namespace GitHub.Runner.Common
private string _binPath;
private string _configFilePath;
private string _credFilePath;
private string _migratedCredFilePath;
private string _serviceConfigFilePath;
private CredentialData _creds;
private CredentialData _migratedCreds;
private RunnerSettings _settings;
public override void Initialize(IHostContext hostContext)
@@ -114,6 +118,9 @@ namespace GitHub.Runner.Common
_credFilePath = hostContext.GetConfigFile(WellKnownConfigFile.Credentials);
Trace.Info("CredFilePath: {0}", _credFilePath);
_migratedCredFilePath = hostContext.GetConfigFile(WellKnownConfigFile.MigratedCredentials);
Trace.Info("MigratedCredFilePath: {0}", _migratedCredFilePath);
_serviceConfigFilePath = hostContext.GetConfigFile(WellKnownConfigFile.Service);
Trace.Info("ServiceConfigFilePath: {0}", _serviceConfigFilePath);
}
@@ -123,7 +130,7 @@ namespace GitHub.Runner.Common
public bool HasCredentials()
{
Trace.Info("HasCredentials()");
bool credsStored = (new FileInfo(_credFilePath)).Exists;
bool credsStored = (new FileInfo(_credFilePath)).Exists || (new FileInfo(_migratedCredFilePath)).Exists;
Trace.Info("stored {0}", credsStored);
return credsStored;
}
@@ -154,6 +161,16 @@ namespace GitHub.Runner.Common
return _creds;
}
public CredentialData GetMigratedCredentials()
{
if (_migratedCreds == null && File.Exists(_migratedCredFilePath))
{
_migratedCreds = IOUtil.LoadObject<CredentialData>(_migratedCredFilePath);
}
return _migratedCreds;
}
public RunnerSettings GetSettings()
{
if (_settings == null)
@@ -188,6 +205,21 @@ namespace GitHub.Runner.Common
File.SetAttributes(_credFilePath, File.GetAttributes(_credFilePath) | FileAttributes.Hidden);
}
public void SaveMigratedCredential(CredentialData credential)
{
Trace.Info("Saving {0} migrated credential @ {1}", credential.Scheme, _migratedCredFilePath);
if (File.Exists(_migratedCredFilePath))
{
// Delete existing credential file first, since the file is hidden and not able to overwrite.
Trace.Info("Delete exist runner migrated credential file.");
IOUtil.DeleteFile(_migratedCredFilePath);
}
IOUtil.SaveObject(credential, _migratedCredFilePath);
Trace.Info("Migrated Credentials Saved.");
File.SetAttributes(_migratedCredFilePath, File.GetAttributes(_migratedCredFilePath) | FileAttributes.Hidden);
}
public void SaveSettings(RunnerSettings settings)
{
Trace.Info("Saving runner settings.");
@@ -206,6 +238,7 @@ namespace GitHub.Runner.Common
public void DeleteCredential()
{
IOUtil.Delete(_credFilePath, default(CancellationToken));
IOUtil.Delete(_migratedCredFilePath, default(CancellationToken));
}
public void DeleteSettings()

View File

@@ -19,11 +19,13 @@ namespace GitHub.Runner.Common
{
Runner,
Credentials,
MigratedCredentials,
RSACredentials,
Service,
CredentialStore,
Certificates,
Options,
SetupInfo,
}
public static class Constants
@@ -136,6 +138,12 @@ namespace GitHub.Runner.Common
}
}
public static class RunnerEvent
{
public static readonly string Register = "register";
public static readonly string Remove = "remove";
}
public static class Pipeline
{
public static class Path

View File

@@ -83,7 +83,6 @@ namespace GitHub.Runner.Common
_loadContext.Unloading += LoadContext_Unloading;
this.SecretMasker.AddValueEncoder(ValueEncoders.Base64StringEscape);
this.SecretMasker.AddValueEncoder(ValueEncoders.Base64StringEscapeTrimmed);
this.SecretMasker.AddValueEncoder(ValueEncoders.Base64StringEscapeShift1);
this.SecretMasker.AddValueEncoder(ValueEncoders.Base64StringEscapeShift2);
this.SecretMasker.AddValueEncoder(ValueEncoders.ExpressionStringEscape);
@@ -282,6 +281,12 @@ namespace GitHub.Runner.Common
".credentials");
break;
case WellKnownConfigFile.MigratedCredentials:
path = Path.Combine(
GetDirectory(WellKnownDirectory.Root),
".credentials_migrated");
break;
case WellKnownConfigFile.RSACredentials:
path = Path.Combine(
GetDirectory(WellKnownDirectory.Root),
@@ -317,6 +322,13 @@ namespace GitHub.Runner.Common
GetDirectory(WellKnownDirectory.Root),
".options");
break;
case WellKnownConfigFile.SetupInfo:
path = Path.Combine(
GetDirectory(WellKnownDirectory.Root),
".setup_info");
break;
default:
throw new NotSupportedException($"Unexpected well known config file: '{configFile}'");
}

View File

@@ -50,6 +50,10 @@ namespace GitHub.Runner.Common
// agent update
Task<TaskAgent> UpdateAgentUpdateStateAsync(int agentPoolId, int agentId, string currentState);
// runner authorization url
Task<string> GetRunnerAuthUrlAsync(int runnerPoolId, int runnerId);
Task ReportRunnerAuthUrlErrorAsync(int runnerPoolId, int runnerId, string error);
}
public sealed class RunnerServer : RunnerService, IRunnerServer
@@ -334,5 +338,20 @@ namespace GitHub.Runner.Common
CheckConnection(RunnerConnectionType.Generic);
return _genericTaskAgentClient.UpdateAgentUpdateStateAsync(agentPoolId, agentId, currentState);
}
//-----------------------------------------------------------------
// Runner Auth Url
//-----------------------------------------------------------------
public Task<string> GetRunnerAuthUrlAsync(int runnerPoolId, int runnerId)
{
CheckConnection(RunnerConnectionType.MessageQueue);
return _messageTaskAgentClient.GetAgentAuthUrlAsync(runnerPoolId, runnerId);
}
public Task ReportRunnerAuthUrlErrorAsync(int runnerPoolId, int runnerId, string error)
{
CheckConnection(RunnerConnectionType.MessageQueue);
return _messageTaskAgentClient.ReportAgentAuthUrlMigrationErrorAsync(runnerPoolId, runnerId, error);
}
}
}

View File

@@ -1,19 +1,18 @@
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.OAuth;
using GitHub.Services.WebApi;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Threading.Tasks;
using System.Runtime.InteropServices;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Runtime.InteropServices;
using System.Security.Cryptography;
using System.Threading.Tasks;
namespace GitHub.Runner.Listener.Configuration
{
@@ -109,7 +108,7 @@ namespace GitHub.Runner.Listener.Configuration
{
runnerSettings.GitHubUrl = inputUrl;
var githubToken = command.GetRunnerRegisterToken();
GitHubAuthResult authResult = await GetTenantCredential(inputUrl, githubToken);
GitHubAuthResult authResult = await GetTenantCredential(inputUrl, githubToken, Constants.RunnerEvent.Register);
runnerSettings.ServerUrl = authResult.TenantUrl;
creds = authResult.ToVssCredentials();
Trace.Info("cred retrieved via GitHub auth");
@@ -277,12 +276,15 @@ namespace GitHub.Runner.Listener.Configuration
throw new NotSupportedException("Message queue listen OAuth token.");
}
// Testing agent connection, detect any protential connection issue, like local clock skew that cause OAuth token expired.
// Testing agent connection, detect any potential connection issue, like local clock skew that cause OAuth token expired.
var credMgr = HostContext.GetService<ICredentialManager>();
VssCredentials credential = credMgr.LoadCredentials();
try
{
await _runnerServer.ConnectAsync(new Uri(runnerSettings.ServerUrl), credential);
// ConnectAsync() hits _apis/connectionData which is an anonymous endpoint
// Need to hit an authenticate endpoint to trigger OAuth token exchange.
await _runnerServer.GetAgentPoolsAsync();
_term.WriteSuccessMessage("Runner connection is good");
}
catch (VssOAuthTokenRequestException ex) when (ex.Message.Contains("Current server time is"))
@@ -373,7 +375,7 @@ namespace GitHub.Runner.Listener.Configuration
else
{
var githubToken = command.GetRunnerDeletionToken();
GitHubAuthResult authResult = await GetTenantCredential(settings.GitHubUrl, githubToken);
GitHubAuthResult authResult = await GetTenantCredential(settings.GitHubUrl, githubToken, Constants.RunnerEvent.Remove);
creds = authResult.ToVssCredentials();
Trace.Info("cred retrieved via GitHub auth");
}
@@ -517,17 +519,23 @@ namespace GitHub.Runner.Listener.Configuration
}
}
private async Task<GitHubAuthResult> GetTenantCredential(string githubUrl, string githubToken)
private async Task<GitHubAuthResult> GetTenantCredential(string githubUrl, string githubToken, string runnerEvent)
{
var gitHubUrl = new UriBuilder(githubUrl);
var githubApiUrl = $"https://api.{gitHubUrl.Host}/repos/{gitHubUrl.Path.Trim('/')}/actions-runners/registration";
var gitHubUrlBuilder = new UriBuilder(githubUrl);
var githubApiUrl = $"{gitHubUrlBuilder.Scheme}://api.{gitHubUrlBuilder.Host}/actions/runner-registration";
using (var httpClientHandler = HostContext.CreateHttpClientHandler())
using (var httpClient = new HttpClient(httpClientHandler))
{
httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("RemoteAuth", githubToken);
httpClient.DefaultRequestHeaders.UserAgent.Add(HostContext.UserAgent);
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/vnd.github.shuri-preview+json"));
var response = await httpClient.PostAsync(githubApiUrl, new StringContent("", null, "application/json"));
var bodyObject = new Dictionary<string, string>()
{
{"url", githubUrl},
{"runner_event", runnerEvent}
};
var response = await httpClient.PostAsync(githubApiUrl, new StringContent(StringUtil.ConvertToJson(bodyObject), null, "application/json"));
if (response.IsSuccessStatusCode)
{

View File

@@ -13,7 +13,7 @@ namespace GitHub.Runner.Listener.Configuration
public interface ICredentialManager : IRunnerService
{
ICredentialProvider GetCredentialProvider(string credType);
VssCredentials LoadCredentials();
VssCredentials LoadCredentials(bool preferMigrated = true);
}
public class CredentialManager : RunnerService, ICredentialManager
@@ -40,7 +40,7 @@ namespace GitHub.Runner.Listener.Configuration
return creds;
}
public VssCredentials LoadCredentials()
public VssCredentials LoadCredentials(bool preferMigrated = true)
{
IConfigurationStore store = HostContext.GetService<IConfigurationStore>();
@@ -50,6 +50,16 @@ namespace GitHub.Runner.Listener.Configuration
}
CredentialData credData = store.GetCredentials();
if (preferMigrated)
{
var migratedCred = store.GetMigratedCredentials();
if (migratedCred != null)
{
credData = migratedCred;
}
}
ICredentialProvider credProv = GetCredentialProvider(credData.Scheme);
credProv.CredentialData = credData;

View File

@@ -1,6 +1,5 @@
using System;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.OAuth;
@@ -29,7 +28,7 @@ namespace GitHub.Runner.Listener.Configuration
var authorizationUrl = this.CredentialData.Data.GetValueOrDefault("authorizationUrl", null);
// For back compat with .credential file that doesn't has 'oauthEndpointUrl' section
var oathEndpointUrl = this.CredentialData.Data.GetValueOrDefault("oauthEndpointUrl", authorizationUrl);
var oauthEndpointUrl = this.CredentialData.Data.GetValueOrDefault("oauthEndpointUrl", authorizationUrl);
ArgUtil.NotNullOrEmpty(clientId, nameof(clientId));
ArgUtil.NotNullOrEmpty(authorizationUrl, nameof(authorizationUrl));
@@ -39,7 +38,7 @@ namespace GitHub.Runner.Listener.Configuration
var keyManager = context.GetService<IRSAKeyManager>();
var signingCredentials = VssSigningCredentials.Create(() => keyManager.GetKey());
var clientCredential = new VssOAuthJwtBearerClientCredential(clientId, authorizationUrl, signingCredentials);
var agentCredential = new VssOAuthCredential(new Uri(oathEndpointUrl, UriKind.Absolute), VssOAuthGrant.ClientCredentials, clientCredential);
var agentCredential = new VssOAuthCredential(new Uri(oauthEndpointUrl, UriKind.Absolute), VssOAuthGrant.ClientCredentials, clientCredential);
// Construct a credentials cache with a single OAuth credential for communication. The windows credential
// is explicitly set to null to ensure we never do that negotiation.

View File

@@ -18,6 +18,7 @@ namespace GitHub.Runner.Listener
[ServiceLocator(Default = typeof(JobDispatcher))]
public interface IJobDispatcher : IRunnerService
{
bool Busy { get; }
TaskCompletionSource<bool> RunOnceJobCompleted { get; }
void Run(Pipelines.AgentJobRequestMessage message, bool runOnce = false);
bool Cancel(JobCancelMessage message);
@@ -69,6 +70,8 @@ namespace GitHub.Runner.Listener
public TaskCompletionSource<bool> RunOnceJobCompleted => _runOnceJobCompleted;
public bool Busy { get; private set; }
public void Run(Pipelines.AgentJobRequestMessage jobRequestMessage, bool runOnce = false)
{
Trace.Info($"Job request {jobRequestMessage.RequestId} for plan {jobRequestMessage.Plan.PlanId} job {jobRequestMessage.JobId} received.");
@@ -247,7 +250,7 @@ namespace GitHub.Runner.Listener
Task completedTask = await Task.WhenAny(jobDispatch.WorkerDispatch, Task.Delay(TimeSpan.FromSeconds(45)));
if (completedTask != jobDispatch.WorkerDispatch)
{
// at this point, the job exectuion might encounter some dead lock and even not able to be canclled.
// at this point, the job execution might encounter some dead lock and even not able to be cancelled.
// no need to localize the exception string should never happen.
throw new InvalidOperationException($"Job dispatch process for {jobDispatch.JobId} has encountered unexpected error, the dispatch task is not able to be canceled within 45 seconds.");
}
@@ -296,190 +299,290 @@ namespace GitHub.Runner.Listener
private async Task RunAsync(Pipelines.AgentJobRequestMessage message, WorkerDispatcher previousJobDispatch, CancellationToken jobRequestCancellationToken, CancellationToken workerCancelTimeoutKillToken)
{
if (previousJobDispatch != null)
Busy = true;
try
{
Trace.Verbose($"Make sure the previous job request {previousJobDispatch.JobId} has successfully finished on worker.");
await EnsureDispatchFinished(previousJobDispatch);
}
else
{
Trace.Verbose($"This is the first job request.");
}
var term = HostContext.GetService<ITerminal>();
term.WriteLine($"{DateTime.UtcNow:u}: Running job: {message.JobDisplayName}");
// first job request renew succeed.
TaskCompletionSource<int> firstJobRequestRenewed = new TaskCompletionSource<int>();
var notification = HostContext.GetService<IJobNotification>();
// lock renew cancellation token.
using (var lockRenewalTokenSource = new CancellationTokenSource())
using (var workerProcessCancelTokenSource = new CancellationTokenSource())
{
long requestId = message.RequestId;
Guid lockToken = Guid.Empty; // lockToken has never been used, keep this here of compat
// start renew job request
Trace.Info($"Start renew job request {requestId} for job {message.JobId}.");
Task renewJobRequest = RenewJobRequestAsync(_poolId, requestId, lockToken, firstJobRequestRenewed, lockRenewalTokenSource.Token);
// wait till first renew succeed or job request is canceled
// not even start worker if the first renew fail
await Task.WhenAny(firstJobRequestRenewed.Task, renewJobRequest, Task.Delay(-1, jobRequestCancellationToken));
if (renewJobRequest.IsCompleted)
if (previousJobDispatch != null)
{
// renew job request task complete means we run out of retry for the first job request renew.
Trace.Info($"Unable to renew job request for job {message.JobId} for the first time, stop dispatching job to worker.");
return;
Trace.Verbose($"Make sure the previous job request {previousJobDispatch.JobId} has successfully finished on worker.");
await EnsureDispatchFinished(previousJobDispatch);
}
else
{
Trace.Verbose($"This is the first job request.");
}
if (jobRequestCancellationToken.IsCancellationRequested)
var term = HostContext.GetService<ITerminal>();
term.WriteLine($"{DateTime.UtcNow:u}: Running job: {message.JobDisplayName}");
// first job request renew succeed.
TaskCompletionSource<int> firstJobRequestRenewed = new TaskCompletionSource<int>();
var notification = HostContext.GetService<IJobNotification>();
// lock renew cancellation token.
using (var lockRenewalTokenSource = new CancellationTokenSource())
using (var workerProcessCancelTokenSource = new CancellationTokenSource())
{
Trace.Info($"Stop renew job request for job {message.JobId}.");
// stop renew lock
lockRenewalTokenSource.Cancel();
// renew job request should never blows up.
await renewJobRequest;
long requestId = message.RequestId;
Guid lockToken = Guid.Empty; // lockToken has never been used, keep this here of compat
// complete job request with result Cancelled
await CompleteJobRequestAsync(_poolId, message, lockToken, TaskResult.Canceled);
return;
}
// start renew job request
Trace.Info($"Start renew job request {requestId} for job {message.JobId}.");
Task renewJobRequest = RenewJobRequestAsync(_poolId, requestId, lockToken, firstJobRequestRenewed, lockRenewalTokenSource.Token);
HostContext.WritePerfCounter($"JobRequestRenewed_{requestId.ToString()}");
// wait till first renew succeed or job request is canceled
// not even start worker if the first renew fail
await Task.WhenAny(firstJobRequestRenewed.Task, renewJobRequest, Task.Delay(-1, jobRequestCancellationToken));
Task<int> workerProcessTask = null;
object _outputLock = new object();
List<string> workerOutput = new List<string>();
using (var processChannel = HostContext.CreateService<IProcessChannel>())
using (var processInvoker = HostContext.CreateService<IProcessInvoker>())
{
// Start the process channel.
// It's OK if StartServer bubbles an execption after the worker process has already started.
// The worker will shutdown after 30 seconds if it hasn't received the job message.
processChannel.StartServer(
// Delegate to start the child process.
startProcess: (string pipeHandleOut, string pipeHandleIn) =>
{
// Validate args.
ArgUtil.NotNullOrEmpty(pipeHandleOut, nameof(pipeHandleOut));
ArgUtil.NotNullOrEmpty(pipeHandleIn, nameof(pipeHandleIn));
// Save STDOUT from worker, worker will use STDOUT report unhandle exception.
processInvoker.OutputDataReceived += delegate (object sender, ProcessDataReceivedEventArgs stdout)
{
if (!string.IsNullOrEmpty(stdout.Data))
{
lock (_outputLock)
{
workerOutput.Add(stdout.Data);
}
}
};
// Save STDERR from worker, worker will use STDERR on crash.
processInvoker.ErrorDataReceived += delegate (object sender, ProcessDataReceivedEventArgs stderr)
{
if (!string.IsNullOrEmpty(stderr.Data))
{
lock (_outputLock)
{
workerOutput.Add(stderr.Data);
}
}
};
// Start the child process.
HostContext.WritePerfCounter("StartingWorkerProcess");
var assemblyDirectory = HostContext.GetDirectory(WellKnownDirectory.Bin);
string workerFileName = Path.Combine(assemblyDirectory, _workerProcessName);
workerProcessTask = processInvoker.ExecuteAsync(
workingDirectory: assemblyDirectory,
fileName: workerFileName,
arguments: "spawnclient " + pipeHandleOut + " " + pipeHandleIn,
environment: null,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: true,
redirectStandardIn: null,
inheritConsoleHandler: false,
keepStandardInOpen: false,
highPriorityProcess: true,
cancellationToken: workerProcessCancelTokenSource.Token);
});
// Send the job request message.
// Kill the worker process if sending the job message times out. The worker
// process may have successfully received the job message.
try
if (renewJobRequest.IsCompleted)
{
Trace.Info($"Send job request message to worker for job {message.JobId}.");
HostContext.WritePerfCounter($"RunnerSendingJobToWorker_{message.JobId}");
using (var csSendJobRequest = new CancellationTokenSource(_channelTimeout))
{
await processChannel.SendAsync(
messageType: MessageType.NewJobRequest,
body: JsonUtility.ToString(message),
cancellationToken: csSendJobRequest.Token);
}
// renew job request task complete means we run out of retry for the first job request renew.
Trace.Info($"Unable to renew job request for job {message.JobId} for the first time, stop dispatching job to worker.");
return;
}
catch (OperationCanceledException)
{
// message send been cancelled.
// timeout 30 sec. kill worker.
Trace.Info($"Job request message sending for job {message.JobId} been cancelled, kill running worker.");
workerProcessCancelTokenSource.Cancel();
try
{
await workerProcessTask;
}
catch (OperationCanceledException)
{
Trace.Info("worker process has been killed.");
}
if (jobRequestCancellationToken.IsCancellationRequested)
{
Trace.Info($"Stop renew job request for job {message.JobId}.");
// stop renew lock
lockRenewalTokenSource.Cancel();
// renew job request should never blows up.
await renewJobRequest;
// not finish the job request since the job haven't run on worker at all, we will not going to set a result to server.
// complete job request with result Cancelled
await CompleteJobRequestAsync(_poolId, message, lockToken, TaskResult.Canceled);
return;
}
// we get first jobrequest renew succeed and start the worker process with the job message.
// send notification to machine provisioner.
var systemConnection = message.Resources.Endpoints.SingleOrDefault(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
var accessToken = systemConnection?.Authorization?.Parameters["AccessToken"];
notification.JobStarted(message.JobId, accessToken, systemConnection.Url);
HostContext.WritePerfCounter($"JobRequestRenewed_{requestId.ToString()}");
HostContext.WritePerfCounter($"SentJobToWorker_{requestId.ToString()}");
try
Task<int> workerProcessTask = null;
object _outputLock = new object();
List<string> workerOutput = new List<string>();
using (var processChannel = HostContext.CreateService<IProcessChannel>())
using (var processInvoker = HostContext.CreateService<IProcessInvoker>())
{
TaskResult resultOnAbandonOrCancel = TaskResult.Succeeded;
// wait for renewlock, worker process or cancellation token been fired.
var completedTask = await Task.WhenAny(renewJobRequest, workerProcessTask, Task.Delay(-1, jobRequestCancellationToken));
if (completedTask == workerProcessTask)
{
// worker finished successfully, complete job request with result, attach unhandled exception reported by worker, stop renew lock, job has finished.
int returnCode = await workerProcessTask;
Trace.Info($"Worker finished for job {message.JobId}. Code: " + returnCode);
string detailInfo = null;
if (!TaskResultUtil.IsValidReturnCode(returnCode))
// Start the process channel.
// It's OK if StartServer bubbles an execption after the worker process has already started.
// The worker will shutdown after 30 seconds if it hasn't received the job message.
processChannel.StartServer(
// Delegate to start the child process.
startProcess: (string pipeHandleOut, string pipeHandleIn) =>
{
detailInfo = string.Join(Environment.NewLine, workerOutput);
Trace.Info($"Return code {returnCode} indicate worker encounter an unhandled exception or app crash, attach worker stdout/stderr to JobRequest result.");
await LogWorkerProcessUnhandledException(message, detailInfo);
// Validate args.
ArgUtil.NotNullOrEmpty(pipeHandleOut, nameof(pipeHandleOut));
ArgUtil.NotNullOrEmpty(pipeHandleIn, nameof(pipeHandleIn));
// Save STDOUT from worker, worker will use STDOUT report unhandle exception.
processInvoker.OutputDataReceived += delegate (object sender, ProcessDataReceivedEventArgs stdout)
{
if (!string.IsNullOrEmpty(stdout.Data))
{
lock (_outputLock)
{
workerOutput.Add(stdout.Data);
}
}
};
// Save STDERR from worker, worker will use STDERR on crash.
processInvoker.ErrorDataReceived += delegate (object sender, ProcessDataReceivedEventArgs stderr)
{
if (!string.IsNullOrEmpty(stderr.Data))
{
lock (_outputLock)
{
workerOutput.Add(stderr.Data);
}
}
};
// Start the child process.
HostContext.WritePerfCounter("StartingWorkerProcess");
var assemblyDirectory = HostContext.GetDirectory(WellKnownDirectory.Bin);
string workerFileName = Path.Combine(assemblyDirectory, _workerProcessName);
workerProcessTask = processInvoker.ExecuteAsync(
workingDirectory: assemblyDirectory,
fileName: workerFileName,
arguments: "spawnclient " + pipeHandleOut + " " + pipeHandleIn,
environment: null,
requireExitCodeZero: false,
outputEncoding: null,
killProcessOnCancel: true,
redirectStandardIn: null,
inheritConsoleHandler: false,
keepStandardInOpen: false,
highPriorityProcess: true,
cancellationToken: workerProcessCancelTokenSource.Token);
});
// Send the job request message.
// Kill the worker process if sending the job message times out. The worker
// process may have successfully received the job message.
try
{
Trace.Info($"Send job request message to worker for job {message.JobId}.");
HostContext.WritePerfCounter($"RunnerSendingJobToWorker_{message.JobId}");
using (var csSendJobRequest = new CancellationTokenSource(_channelTimeout))
{
await processChannel.SendAsync(
messageType: MessageType.NewJobRequest,
body: JsonUtility.ToString(message),
cancellationToken: csSendJobRequest.Token);
}
}
catch (OperationCanceledException)
{
// message send been cancelled.
// timeout 30 sec. kill worker.
Trace.Info($"Job request message sending for job {message.JobId} been cancelled, kill running worker.");
workerProcessCancelTokenSource.Cancel();
try
{
await workerProcessTask;
}
catch (OperationCanceledException)
{
Trace.Info("worker process has been killed.");
}
TaskResult result = TaskResultUtil.TranslateFromReturnCode(returnCode);
Trace.Info($"finish job request for job {message.JobId} with result: {result}");
term.WriteLine($"{DateTime.UtcNow:u}: Job {message.JobDisplayName} completed with result: {result}");
Trace.Info($"Stop renew job request for job {message.JobId}.");
// stop renew lock
lockRenewalTokenSource.Cancel();
// renew job request should never blows up.
await renewJobRequest;
// not finish the job request since the job haven't run on worker at all, we will not going to set a result to server.
return;
}
// we get first jobrequest renew succeed and start the worker process with the job message.
// send notification to machine provisioner.
var systemConnection = message.Resources.Endpoints.SingleOrDefault(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection, StringComparison.OrdinalIgnoreCase));
var accessToken = systemConnection?.Authorization?.Parameters["AccessToken"];
notification.JobStarted(message.JobId, accessToken, systemConnection.Url);
HostContext.WritePerfCounter($"SentJobToWorker_{requestId.ToString()}");
try
{
TaskResult resultOnAbandonOrCancel = TaskResult.Succeeded;
// wait for renewlock, worker process or cancellation token been fired.
var completedTask = await Task.WhenAny(renewJobRequest, workerProcessTask, Task.Delay(-1, jobRequestCancellationToken));
if (completedTask == workerProcessTask)
{
// worker finished successfully, complete job request with result, attach unhandled exception reported by worker, stop renew lock, job has finished.
int returnCode = await workerProcessTask;
Trace.Info($"Worker finished for job {message.JobId}. Code: " + returnCode);
string detailInfo = null;
if (!TaskResultUtil.IsValidReturnCode(returnCode))
{
detailInfo = string.Join(Environment.NewLine, workerOutput);
Trace.Info($"Return code {returnCode} indicate worker encounter an unhandled exception or app crash, attach worker stdout/stderr to JobRequest result.");
await LogWorkerProcessUnhandledException(message, detailInfo);
}
TaskResult result = TaskResultUtil.TranslateFromReturnCode(returnCode);
Trace.Info($"finish job request for job {message.JobId} with result: {result}");
term.WriteLine($"{DateTime.UtcNow:u}: Job {message.JobDisplayName} completed with result: {result}");
Trace.Info($"Stop renew job request for job {message.JobId}.");
// stop renew lock
lockRenewalTokenSource.Cancel();
// renew job request should never blows up.
await renewJobRequest;
// complete job request
await CompleteJobRequestAsync(_poolId, message, lockToken, result, detailInfo);
// print out unhandled exception happened in worker after we complete job request.
// when we run out of disk space, report back to server has higher priority.
if (!string.IsNullOrEmpty(detailInfo))
{
Trace.Error("Unhandled exception happened in worker:");
Trace.Error(detailInfo);
}
return;
}
else if (completedTask == renewJobRequest)
{
resultOnAbandonOrCancel = TaskResult.Abandoned;
}
else
{
resultOnAbandonOrCancel = TaskResult.Canceled;
}
// renew job request completed or job request cancellation token been fired for RunAsync(jobrequestmessage)
// cancel worker gracefully first, then kill it after worker cancel timeout
try
{
Trace.Info($"Send job cancellation message to worker for job {message.JobId}.");
using (var csSendCancel = new CancellationTokenSource(_channelTimeout))
{
var messageType = MessageType.CancelRequest;
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
switch (HostContext.RunnerShutdownReason)
{
case ShutdownReason.UserCancelled:
messageType = MessageType.RunnerShutdown;
break;
case ShutdownReason.OperatingSystemShutdown:
messageType = MessageType.OperatingSystemShutdown;
break;
}
}
await processChannel.SendAsync(
messageType: messageType,
body: string.Empty,
cancellationToken: csSendCancel.Token);
}
}
catch (OperationCanceledException)
{
// message send been cancelled.
Trace.Info($"Job cancel message sending for job {message.JobId} been cancelled, kill running worker.");
workerProcessCancelTokenSource.Cancel();
try
{
await workerProcessTask;
}
catch (OperationCanceledException)
{
Trace.Info("worker process has been killed.");
}
}
// wait worker to exit
// if worker doesn't exit within timeout, then kill worker.
completedTask = await Task.WhenAny(workerProcessTask, Task.Delay(-1, workerCancelTimeoutKillToken));
// worker haven't exit within cancellation timeout.
if (completedTask != workerProcessTask)
{
Trace.Info($"worker process for job {message.JobId} haven't exit within cancellation timout, kill running worker.");
workerProcessCancelTokenSource.Cancel();
try
{
await workerProcessTask;
}
catch (OperationCanceledException)
{
Trace.Info("worker process has been killed.");
}
// When worker doesn't exit within cancel timeout, the runner will kill the worker process and worker won't finish upload job logs.
// The runner will try to upload these logs at this time.
await TryUploadUnfinishedLogs(message);
}
Trace.Info($"finish job request for job {message.JobId} with result: {resultOnAbandonOrCancel}");
term.WriteLine($"{DateTime.UtcNow:u}: Job {message.JobDisplayName} completed with result: {resultOnAbandonOrCancel}");
// complete job request with cancel result, stop renew lock, job has finished.
Trace.Info($"Stop renew job request for job {message.JobId}.");
// stop renew lock
@@ -488,112 +591,20 @@ namespace GitHub.Runner.Listener
await renewJobRequest;
// complete job request
await CompleteJobRequestAsync(_poolId, message, lockToken, result, detailInfo);
// print out unhandled exception happened in worker after we complete job request.
// when we run out of disk space, report back to server has higher priority.
if (!string.IsNullOrEmpty(detailInfo))
{
Trace.Error("Unhandled exception happened in worker:");
Trace.Error(detailInfo);
}
return;
await CompleteJobRequestAsync(_poolId, message, lockToken, resultOnAbandonOrCancel);
}
else if (completedTask == renewJobRequest)
finally
{
resultOnAbandonOrCancel = TaskResult.Abandoned;
// This should be the last thing to run so we don't notify external parties until actually finished
await notification.JobCompleted(message.JobId);
}
else
{
resultOnAbandonOrCancel = TaskResult.Canceled;
}
// renew job request completed or job request cancellation token been fired for RunAsync(jobrequestmessage)
// cancel worker gracefully first, then kill it after worker cancel timeout
try
{
Trace.Info($"Send job cancellation message to worker for job {message.JobId}.");
using (var csSendCancel = new CancellationTokenSource(_channelTimeout))
{
var messageType = MessageType.CancelRequest;
if (HostContext.RunnerShutdownToken.IsCancellationRequested)
{
switch (HostContext.RunnerShutdownReason)
{
case ShutdownReason.UserCancelled:
messageType = MessageType.RunnerShutdown;
break;
case ShutdownReason.OperatingSystemShutdown:
messageType = MessageType.OperatingSystemShutdown;
break;
}
}
await processChannel.SendAsync(
messageType: messageType,
body: string.Empty,
cancellationToken: csSendCancel.Token);
}
}
catch (OperationCanceledException)
{
// message send been cancelled.
Trace.Info($"Job cancel message sending for job {message.JobId} been cancelled, kill running worker.");
workerProcessCancelTokenSource.Cancel();
try
{
await workerProcessTask;
}
catch (OperationCanceledException)
{
Trace.Info("worker process has been killed.");
}
}
// wait worker to exit
// if worker doesn't exit within timeout, then kill worker.
completedTask = await Task.WhenAny(workerProcessTask, Task.Delay(-1, workerCancelTimeoutKillToken));
// worker haven't exit within cancellation timeout.
if (completedTask != workerProcessTask)
{
Trace.Info($"worker process for job {message.JobId} haven't exit within cancellation timout, kill running worker.");
workerProcessCancelTokenSource.Cancel();
try
{
await workerProcessTask;
}
catch (OperationCanceledException)
{
Trace.Info("worker process has been killed.");
}
// When worker doesn't exit within cancel timeout, the runner will kill the worker process and worker won't finish upload job logs.
// The runner will try to upload these logs at this time.
await TryUploadUnfinishedLogs(message);
}
Trace.Info($"finish job request for job {message.JobId} with result: {resultOnAbandonOrCancel}");
term.WriteLine($"{DateTime.UtcNow:u}: Job {message.JobDisplayName} completed with result: {resultOnAbandonOrCancel}");
// complete job request with cancel result, stop renew lock, job has finished.
Trace.Info($"Stop renew job request for job {message.JobId}.");
// stop renew lock
lockRenewalTokenSource.Cancel();
// renew job request should never blows up.
await renewJobRequest;
// complete job request
await CompleteJobRequestAsync(_poolId, message, lockToken, resultOnAbandonOrCancel);
}
finally
{
// This should be the last thing to run so we don't notify external parties until actually finished
await notification.JobCompleted(message.JobId);
}
}
}
finally
{
Busy = false;
}
}
public async Task RenewJobRequestAsync(int poolId, long requestId, Guid lockToken, TaskCompletionSource<int> firstJobRequestRenewed, CancellationToken token)

View File

@@ -13,7 +13,10 @@ using System.Diagnostics;
using System.Runtime.InteropServices;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using GitHub.Services.WebApi;
using System.Runtime.CompilerServices;
[assembly: InternalsVisibleTo("Test")]
namespace GitHub.Runner.Listener
{
[ServiceLocator(Default = typeof(MessageListener))]
@@ -32,18 +35,30 @@ namespace GitHub.Runner.Listener
private ITerminal _term;
private IRunnerServer _runnerServer;
private TaskAgentSession _session;
private ICredentialManager _credMgr;
private IConfigurationStore _configStore;
private TimeSpan _getNextMessageRetryInterval;
private readonly TimeSpan _sessionCreationRetryInterval = TimeSpan.FromSeconds(30);
private readonly TimeSpan _sessionConflictRetryLimit = TimeSpan.FromMinutes(4);
private readonly TimeSpan _clockSkewRetryLimit = TimeSpan.FromMinutes(30);
private readonly Dictionary<string, int> _sessionCreationExceptionTracker = new Dictionary<string, int>();
// Whether load credentials from .credentials_migrated file
internal bool _useMigratedCredentials;
// need to check auth url if there is only .credentials and auth schema is OAuth
internal bool _needToCheckAuthorizationUrlUpdate;
internal Task<VssCredentials> _authorizationUrlMigrationBackgroundTask;
internal Task _authorizationUrlRollbackReattemptDelayBackgroundTask;
public override void Initialize(IHostContext hostContext)
{
base.Initialize(hostContext);
_term = HostContext.GetService<ITerminal>();
_runnerServer = HostContext.GetService<IRunnerServer>();
_credMgr = HostContext.GetService<ICredentialManager>();
_configStore = HostContext.GetService<IConfigurationStore>();
}
public async Task<Boolean> CreateSessionAsync(CancellationToken token)
@@ -58,8 +73,8 @@ namespace GitHub.Runner.Listener
// Create connection.
Trace.Info("Loading Credentials");
var credMgr = HostContext.GetService<ICredentialManager>();
VssCredentials creds = credMgr.LoadCredentials();
_useMigratedCredentials = !StringUtil.ConvertToBoolean(Environment.GetEnvironmentVariable("GITHUB_ACTIONS_RUNNER_SPSAUTHURL"));
VssCredentials creds = _credMgr.LoadCredentials(_useMigratedCredentials);
var agent = new TaskAgentReference
{
@@ -74,6 +89,17 @@ namespace GitHub.Runner.Listener
string errorMessage = string.Empty;
bool encounteringError = false;
var originalCreds = _configStore.GetCredentials();
var migratedCreds = _configStore.GetMigratedCredentials();
if (migratedCreds == null)
{
_useMigratedCredentials = false;
if (originalCreds.Scheme == Constants.Configuration.OAuth)
{
_needToCheckAuthorizationUrlUpdate = true;
}
}
while (true)
{
token.ThrowIfCancellationRequested();
@@ -83,7 +109,7 @@ namespace GitHub.Runner.Listener
Trace.Info("Connecting to the Runner Server...");
await _runnerServer.ConnectAsync(new Uri(serverUrl), creds);
Trace.Info("VssConnection created");
_term.WriteLine();
_term.WriteSuccessMessage("Connected to GitHub");
_term.WriteLine();
@@ -101,6 +127,12 @@ namespace GitHub.Runner.Listener
encounteringError = false;
}
if (_needToCheckAuthorizationUrlUpdate)
{
// start background task try to get new authorization url
_authorizationUrlMigrationBackgroundTask = GetNewOAuthAuthorizationSetting(token);
}
return true;
}
catch (OperationCanceledException) when (token.IsCancellationRequested)
@@ -120,8 +152,21 @@ namespace GitHub.Runner.Listener
if (!IsSessionCreationExceptionRetriable(ex))
{
_term.WriteError($"Failed to create session. {ex.Message}");
return false;
if (_useMigratedCredentials)
{
// migrated credentials might cause lose permission during permission check,
// we will force to use original credential and try again
_useMigratedCredentials = false;
var reattemptBackoff = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromHours(24), TimeSpan.FromHours(36));
_authorizationUrlRollbackReattemptDelayBackgroundTask = HostContext.Delay(reattemptBackoff, token); // retry migrated creds in 24-36 hours.
creds = _credMgr.LoadCredentials(false);
Trace.Error("Fallback to original credentials and try again.");
}
else
{
_term.WriteError($"Failed to create session. {ex.Message}");
return false;
}
}
if (!encounteringError) //print the message only on the first error
@@ -182,6 +227,51 @@ namespace GitHub.Runner.Listener
encounteringError = false;
continuousError = 0;
}
if (_needToCheckAuthorizationUrlUpdate &&
_authorizationUrlMigrationBackgroundTask?.IsCompleted == true)
{
if (HostContext.GetService<IJobDispatcher>().Busy ||
HostContext.GetService<ISelfUpdater>().Busy)
{
Trace.Info("Job or runner updates in progress, update credentials next time.");
}
else
{
try
{
var newCred = await _authorizationUrlMigrationBackgroundTask;
await _runnerServer.ConnectAsync(new Uri(_settings.ServerUrl), newCred);
Trace.Info("Updated connection to use migrated credential for next GetMessage call.");
_useMigratedCredentials = true;
_authorizationUrlMigrationBackgroundTask = null;
_needToCheckAuthorizationUrlUpdate = false;
}
catch (Exception ex)
{
Trace.Error("Fail to refresh connection with new authorization url.");
Trace.Error(ex);
}
}
}
if (_authorizationUrlRollbackReattemptDelayBackgroundTask?.IsCompleted == true)
{
try
{
// we rolled back to use original creds about 2 days before, now it's a good time to try migrated creds again.
Trace.Info("Re-attempt to use migrated credential");
var migratedCreds = _credMgr.LoadCredentials();
await _runnerServer.ConnectAsync(new Uri(_settings.ServerUrl), migratedCreds);
_useMigratedCredentials = true;
_authorizationUrlRollbackReattemptDelayBackgroundTask = null;
}
catch (Exception ex)
{
Trace.Error("Fail to refresh connection with new authorization url on rollback reattempt.");
Trace.Error(ex);
}
}
}
catch (OperationCanceledException) when (token.IsCancellationRequested)
{
@@ -205,7 +295,21 @@ namespace GitHub.Runner.Listener
}
else if (!IsGetNextMessageExceptionRetriable(ex))
{
throw;
if (_useMigratedCredentials)
{
// migrated credentials might cause lose permission during permission check,
// we will force to use original credential and try again
_useMigratedCredentials = false;
var reattemptBackoff = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromHours(24), TimeSpan.FromHours(36));
_authorizationUrlRollbackReattemptDelayBackgroundTask = HostContext.Delay(reattemptBackoff, token); // retry migrated creds in 24-36 hours.
var originalCreds = _credMgr.LoadCredentials(false);
await _runnerServer.ConnectAsync(new Uri(_settings.ServerUrl), originalCreds);
Trace.Error("Falling back to original credentials and trying again.");
}
else
{
throw;
}
}
else
{
@@ -397,5 +501,80 @@ namespace GitHub.Runner.Listener
return true;
}
}
private async Task<VssCredentials> GetNewOAuthAuthorizationSetting(CancellationToken token)
{
Trace.Info("Start checking oauth authorization url update.");
while (true)
{
var backoff = BackoffTimerHelper.GetRandomBackoff(TimeSpan.FromMinutes(30), TimeSpan.FromMinutes(45));
await HostContext.Delay(backoff, token);
try
{
var migratedAuthorizationUrl = await _runnerServer.GetRunnerAuthUrlAsync(_settings.PoolId, _settings.AgentId);
if (!string.IsNullOrEmpty(migratedAuthorizationUrl))
{
var credData = _configStore.GetCredentials();
var clientId = credData.Data.GetValueOrDefault("clientId", null);
var currentAuthorizationUrl = credData.Data.GetValueOrDefault("authorizationUrl", null);
Trace.Info($"Current authorization url: {currentAuthorizationUrl}, new authorization url: {migratedAuthorizationUrl}");
if (string.Equals(currentAuthorizationUrl, migratedAuthorizationUrl, StringComparison.OrdinalIgnoreCase))
{
// The authorization url has not changed, so there is nothing to update;
// block here until cancellation.
Trace.Info("No need to update the authorization url");
await Task.Delay(TimeSpan.FromMilliseconds(-1), token);
}
var keyManager = HostContext.GetService<IRSAKeyManager>();
var signingCredentials = VssSigningCredentials.Create(() => keyManager.GetKey());
var migratedClientCredential = new VssOAuthJwtBearerClientCredential(clientId, migratedAuthorizationUrl, signingCredentials);
var migratedRunnerCredential = new VssOAuthCredential(new Uri(migratedAuthorizationUrl, UriKind.Absolute), VssOAuthGrant.ClientCredentials, migratedClientCredential);
Trace.Info("Trying to connect to the service with the Token Service OAuth endpoint.");
var runnerServer = HostContext.CreateService<IRunnerServer>();
await runnerServer.ConnectAsync(new Uri(_settings.ServerUrl), migratedRunnerCredential);
await runnerServer.GetAgentPoolsAsync();
Trace.Info("Successfully connected to the service with the new authorization url.");
var migratedCredData = new CredentialData
{
Scheme = Constants.Configuration.OAuth,
Data =
{
{ "clientId", clientId },
{ "authorizationUrl", migratedAuthorizationUrl },
{ "oauthEndpointUrl", migratedAuthorizationUrl },
},
};
_configStore.SaveMigratedCredential(migratedCredData);
return migratedRunnerCredential;
}
else
{
Trace.Verbose("No authorization url updates");
}
}
catch (Exception ex)
{
Trace.Error("Failed to get or test the new authorization url.");
Trace.Error(ex);
try
{
await _runnerServer.ReportRunnerAuthUrlErrorAsync(_settings.PoolId, _settings.AgentId, ex.ToString());
}
catch (Exception e)
{
// best effort
Trace.Error("Failed to report the migration error");
Trace.Error(e);
}
}
}
}
}
}
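For illustration, a minimal standalone sketch of the pattern used above: the migration check runs as a fire-and-forget background task, and the message loop only awaits it once IsCompleted reports true, so listening is never blocked. All names below are placeholders, not the runner's actual types.

using System;
using System.Threading;
using System.Threading.Tasks;

class MigrationPollingSketch
{
    private Task<string> _migrationTask;

    public async Task RunAsync(CancellationToken token)
    {
        // Kick off the background check once; do not await it here.
        _migrationTask = CheckForNewAuthorizationUrlAsync(token);

        while (!token.IsCancellationRequested)
        {
            // Simulate one iteration of the message loop.
            await Task.Delay(TimeSpan.FromSeconds(1), token);

            // Only await the background task once it has already completed,
            // so the loop never blocks on the migration check.
            if (_migrationTask?.IsCompleted == true)
            {
                var newUrl = await _migrationTask;
                Console.WriteLine($"Switching to new authorization url: {newUrl}");
                _migrationTask = null;
            }
        }
    }

    private static async Task<string> CheckForNewAuthorizationUrlAsync(CancellationToken token)
    {
        // Placeholder for the real service call and validation.
        await Task.Delay(TimeSpan.FromSeconds(5), token);
        return "https://example.com/new-authorization-url";
    }

    static async Task Main()
    {
        using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(8));
        try
        {
            await new MigrationPollingSketch().RunAsync(cts.Token);
        }
        catch (OperationCanceledException)
        {
            // Expected when the sketch shuts down.
        }
    }
}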

View File

@@ -17,6 +17,7 @@ namespace GitHub.Runner.Listener
[ServiceLocator(Default = typeof(SelfUpdater))]
public interface ISelfUpdater : IRunnerService
{
bool Busy { get; }
Task<bool> SelfUpdate(AgentRefreshMessage updateMessage, IJobDispatcher jobDispatcher, bool restartInteractiveRunner, CancellationToken token);
}
@@ -31,6 +32,8 @@ namespace GitHub.Runner.Listener
private int _poolId;
private int _agentId;
public bool Busy { get; private set; }
public override void Initialize(IHostContext hostContext)
{
base.Initialize(hostContext);
@@ -45,52 +48,60 @@ namespace GitHub.Runner.Listener
public async Task<bool> SelfUpdate(AgentRefreshMessage updateMessage, IJobDispatcher jobDispatcher, bool restartInteractiveRunner, CancellationToken token)
{
if (!await UpdateNeeded(updateMessage.TargetVersion, token))
Busy = true;
try
{
Trace.Info($"Can't find available update package.");
return false;
}
if (!await UpdateNeeded(updateMessage.TargetVersion, token))
{
Trace.Info($"Can't find available update package.");
return false;
}
Trace.Info($"An update is available.");
Trace.Info($"An update is available.");
// Print a console line warning the user not to shut down the runner.
await UpdateRunnerUpdateStateAsync("Runner update in progress, do not shut down the runner.");
await UpdateRunnerUpdateStateAsync($"Downloading {_targetPackage.Version} runner");
// Print a console line warning the user not to shut down the runner.
await UpdateRunnerUpdateStateAsync("Runner update in progress, do not shut down the runner.");
await UpdateRunnerUpdateStateAsync($"Downloading {_targetPackage.Version} runner");
await DownloadLatestRunner(token);
Trace.Info($"Download latest runner and unzip into runner root.");
await DownloadLatestRunner(token);
Trace.Info($"Download latest runner and unzip into runner root.");
// wait till all running jobs finish
await UpdateRunnerUpdateStateAsync("Waiting for the current job to finish running.");
// wait till all running jobs finish
await UpdateRunnerUpdateStateAsync("Waiting for the current job to finish running.");
await jobDispatcher.WaitAsync(token);
Trace.Info($"All running jobs have exited.");
await jobDispatcher.WaitAsync(token);
Trace.Info($"All running jobs have exited.");
// delete runner backup
DeletePreviousVersionRunnerBackup(token);
Trace.Info($"Delete old version runner backup.");
// delete runner backup
DeletePreviousVersionRunnerBackup(token);
Trace.Info($"Delete old version runner backup.");
// generate update script from template
await UpdateRunnerUpdateStateAsync("Generate and execute update script.");
// generate update script from template
await UpdateRunnerUpdateStateAsync("Generate and execute update script.");
string updateScript = GenerateUpdateScript(restartInteractiveRunner);
Trace.Info($"Generate update script into: {updateScript}");
string updateScript = GenerateUpdateScript(restartInteractiveRunner);
Trace.Info($"Generate update script into: {updateScript}");
// kick off update script
Process invokeScript = new Process();
// kick off update script
Process invokeScript = new Process();
#if OS_WINDOWS
invokeScript.StartInfo.FileName = WhichUtil.Which("cmd.exe", trace: Trace);
invokeScript.StartInfo.Arguments = $"/c \"{updateScript}\"";
#elif (OS_OSX || OS_LINUX)
invokeScript.StartInfo.FileName = WhichUtil.Which("bash", trace: Trace);
invokeScript.StartInfo.Arguments = $"\"{updateScript}\"";
invokeScript.StartInfo.FileName = WhichUtil.Which("bash", trace: Trace);
invokeScript.StartInfo.Arguments = $"\"{updateScript}\"";
#endif
invokeScript.Start();
Trace.Info($"Update script has started running");
invokeScript.Start();
Trace.Info($"Update script has started running");
await UpdateRunnerUpdateStateAsync("Runner will exit shortly for the update and should be back online within 10 seconds.");
await UpdateRunnerUpdateStateAsync("Runner will exit shortly for the update and should be back online within 10 seconds.");
return true;
return true;
}
finally
{
Busy = false;
}
}
private async Task<bool> UpdateNeeded(string targetVersion, CancellationToken token)
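The Busy flag introduced above follows a small, reusable pattern: set it before the long-running work and clear it in finally, so an exception can never leave the updater reported as busy. A reduced sketch with illustrative names:

using System;
using System.Threading.Tasks;

class UpdaterBusySketch
{
    public bool Busy { get; private set; }

    public async Task<bool> UpdateAsync()
    {
        Busy = true;
        try
        {
            // Long-running update work goes here.
            await Task.Delay(100);
            return true;
        }
        finally
        {
            // Always cleared, even if the update throws.
            Busy = false;
        }
    }

    static async Task Main()
    {
        var updater = new UpdaterBusySketch();
        var update = updater.UpdateAsync();
        Console.WriteLine($"Busy while updating: {updater.Busy}");
        await update;
        Console.WriteLine($"Busy after update:   {updater.Busy}");
    }
}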

View File

@@ -271,6 +271,14 @@ namespace GitHub.Runner.Sdk
// Indicate GitHub Actions process.
_proc.StartInfo.Environment["GITHUB_ACTIONS"] = "true";
// Set CI=true when nothing else has already set it.
// CI=true is commonly set by most CI providers.
if (!_proc.StartInfo.Environment.ContainsKey("CI") &&
Environment.GetEnvironmentVariable("CI") == null)
{
_proc.StartInfo.Environment["CI"] = "true";
}
// Hook up the events.
_proc.EnableRaisingEvents = true;
_proc.Exited += ProcessExitedHandler;
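Because CI is only set when nothing upstream has defined it, a tool launched by the runner can rely on the variable being present while an explicit value from the user or image is preserved. A small, hypothetical detection snippet:

using System;

class CiDetectionSketch
{
    static void Main()
    {
        // The runner sets CI=true only if the variable was not already defined,
        // so an explicit CI value from the user or image is preserved.
        var ci = Environment.GetEnvironmentVariable("CI");
        Console.WriteLine(string.Equals(ci, "true", StringComparison.OrdinalIgnoreCase)
            ? "Running under a CI system"
            : "Not running under CI (or CI explicitly overridden)");
    }
}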

View File

@@ -1,6 +1,7 @@
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Worker.Container;
using System;
using System.Collections.Generic;
using System.IO;
@@ -15,14 +16,14 @@ namespace GitHub.Runner.Worker
{
void EnablePluginInternalCommand();
void DisablePluginInternalCommand();
bool TryProcessCommand(IExecutionContext context, string input);
bool TryProcessCommand(IExecutionContext context, string input, ContainerInfo container);
}
public sealed class ActionCommandManager : RunnerService, IActionCommandManager
{
private const string _stopCommand = "stop-commands";
private readonly Dictionary<string, IActionCommandExtension> _commandExtensions = new Dictionary<string, IActionCommandExtension>(StringComparer.OrdinalIgnoreCase);
private HashSet<string> _registeredCommands = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
private readonly HashSet<string> _registeredCommands = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
private readonly object _commandSerializeLock = new object();
private bool _stopProcessCommand = false;
private string _stopToken = null;
@@ -58,7 +59,7 @@ namespace GitHub.Runner.Worker
_registeredCommands.Remove("internal-set-repo-path");
}
public bool TryProcessCommand(IExecutionContext context, string input)
public bool TryProcessCommand(IExecutionContext context, string input, ContainerInfo container)
{
if (string.IsNullOrEmpty(input))
{
@@ -114,7 +115,7 @@ namespace GitHub.Runner.Worker
try
{
extension.ProcessCommand(context, input, actionCommand);
extension.ProcessCommand(context, input, actionCommand, container);
}
catch (Exception ex)
{
@@ -140,7 +141,7 @@ namespace GitHub.Runner.Worker
string Command { get; }
bool OmitEcho { get; }
void ProcessCommand(IExecutionContext context, string line, ActionCommand command);
void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container);
}
public sealed class InternalPluginSetRepoPathCommandExtension : RunnerService, IActionCommandExtension
@@ -150,7 +151,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (!command.Properties.TryGetValue(SetRepoPathCommandProperties.repoFullName, out string repoFullName) || string.IsNullOrEmpty(repoFullName))
{
@@ -180,7 +181,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (!command.Properties.TryGetValue(SetEnvCommandProperties.Name, out string envName) || string.IsNullOrEmpty(envName))
{
@@ -205,7 +206,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (!command.Properties.TryGetValue(SetOutputCommandProperties.Name, out string outputName) || string.IsNullOrEmpty(outputName))
{
@@ -229,7 +230,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (!command.Properties.TryGetValue(SaveStateCommandProperties.Name, out string stateName) || string.IsNullOrEmpty(stateName))
{
@@ -253,7 +254,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (string.IsNullOrWhiteSpace(command.Data))
{
@@ -279,7 +280,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
ArgUtil.NotNullOrEmpty(command.Data, "path");
context.PrependPath.RemoveAll(x => string.Equals(x, command.Data, StringComparison.CurrentCulture));
@@ -294,7 +295,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
var file = command.Data;
@@ -306,9 +307,9 @@ namespace GitHub.Runner.Worker
}
// Translate file path back from container path
if (context.Container != null)
if (container != null)
{
file = context.Container.TranslateToHostPath(file);
file = container.TranslateToHostPath(file);
}
// Root the path
@@ -341,7 +342,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
command.Properties.TryGetValue(RemoveMatcherCommandProperties.Owner, out string owner);
var file = command.Data;
@@ -369,9 +370,9 @@ namespace GitHub.Runner.Worker
else
{
// Translate file path back from container path
if (context.Container != null)
if (container != null)
{
file = context.Container.TranslateToHostPath(file);
file = container.TranslateToHostPath(file);
}
// Root the path
@@ -409,7 +410,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string inputLine, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string inputLine, ActionCommand command, ContainerInfo container)
{
context.Debug(command.Data);
}
@@ -437,7 +438,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string inputLine, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string inputLine, ActionCommand command, ContainerInfo container)
{
command.Properties.TryGetValue(IssueCommandProperties.File, out string file);
command.Properties.TryGetValue(IssueCommandProperties.Line, out string line);
@@ -454,10 +455,10 @@ namespace GitHub.Runner.Worker
{
issue.Category = "Code";
if (context.Container != null)
if (container != null)
{
// Translate file path back from container path
file = context.Container.TranslateToHostPath(file);
file = container.TranslateToHostPath(file);
command.Properties[IssueCommandProperties.File] = file;
}
@@ -517,7 +518,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
var data = this is GroupCommandExtension ? command.Data : string.Empty;
context.Output($"##[{Command}]{data}");
@@ -531,7 +532,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
ArgUtil.NotNullOrEmpty(command.Data, "value");
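The ContainerInfo parameter threaded through the command extensions above is what makes TranslateToHostPath possible for container steps. The real translation logic is not shown in this diff; the sketch below assumes a simple prefix-mapping table purely to illustrate the idea.

using System;
using System.Collections.Generic;

// Hypothetical stand-in for ContainerInfo.TranslateToHostPath: the real runner
// maps container mount paths back to host paths; this sketch assumes a simple
// prefix-mapping table, which may not match the actual implementation.
class ContainerPathSketch
{
    private readonly List<(string ContainerPath, string HostPath)> _mounts = new()
    {
        ("/github/workspace", "/home/runner/work/my-repo/my-repo"),
    };

    public string TranslateToHostPath(string path)
    {
        foreach (var (containerPath, hostPath) in _mounts)
        {
            if (path.StartsWith(containerPath, StringComparison.Ordinal))
            {
                return hostPath + path.Substring(containerPath.Length);
            }
        }
        return path; // Not under a known mount; leave unchanged.
    }

    static void Main()
    {
        var sketch = new ContainerPathSketch();
        // -> /home/runner/work/my-repo/my-repo/src/Program.cs
        Console.WriteLine(sketch.TranslateToHostPath("/github/workspace/src/Program.cs"));
    }
}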

View File

@@ -32,6 +32,7 @@ namespace GitHub.Runner.Worker
public sealed class ActionManifestManager : RunnerService, IActionManifestManager
{
private TemplateSchema _actionManifestSchema;
private IReadOnlyList<String> _fileTable;
public override void Initialize(IHostContext hostContext)
{
@@ -61,6 +62,9 @@ namespace GitHub.Runner.Worker
// Get the file ID
var fileId = context.GetFileId(manifestFile);
_fileTable = context.GetFileTable();
// Read the file
var fileContent = File.ReadAllText(manifestFile);
using (var stringReader = new StringReader(fileContent))
{
@@ -265,6 +269,15 @@ namespace GitHub.Runner.Worker
}
}
// Add the file table
if (_fileTable?.Count > 0)
{
for (var i = 0; i < _fileTable.Count; i++)
{
result.GetFileId(_fileTable[i]);
}
}
return result;
}
@@ -415,566 +428,5 @@ namespace GitHub.Runner.Worker
}
}
}
/// <summary>
/// Converts a YAML file into a TemplateToken
/// </summary>
internal sealed class YamlObjectReader : IObjectReader
{
internal YamlObjectReader(
Int32? fileId,
TextReader input)
{
m_fileId = fileId;
m_parser = new Parser(input);
}
public Boolean AllowLiteral(out LiteralToken value)
{
if (EvaluateCurrent() is Scalar scalar)
{
// Tag specified
if (!string.IsNullOrEmpty(scalar.Tag))
{
// String tag
if (string.Equals(scalar.Tag, c_stringTag, StringComparison.Ordinal))
{
value = new StringToken(m_fileId, scalar.Start.Line, scalar.Start.Column, scalar.Value);
MoveNext();
return true;
}
// Not plain style
if (scalar.Style != ScalarStyle.Plain)
{
throw new NotSupportedException($"The scalar style '{scalar.Style}' on line {scalar.Start.Line} and column {scalar.Start.Column} is not valid with the tag '{scalar.Tag}'");
}
// Boolean, Float, Integer, or Null
switch (scalar.Tag)
{
case c_booleanTag:
value = ParseBoolean(scalar);
break;
case c_floatTag:
value = ParseFloat(scalar);
break;
case c_integerTag:
value = ParseInteger(scalar);
break;
case c_nullTag:
value = ParseNull(scalar);
break;
default:
throw new NotSupportedException($"Unexpected tag '{scalar.Tag}'");
}
MoveNext();
return true;
}
// Plain style, determine type using YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
if (scalar.Style == ScalarStyle.Plain)
{
if (MatchNull(scalar, out var nullToken))
{
value = nullToken;
}
else if (MatchBoolean(scalar, out var booleanToken))
{
value = booleanToken;
}
else if (MatchInteger(scalar, out var numberToken) ||
MatchFloat(scalar, out numberToken))
{
value = numberToken;
}
else
{
value = new StringToken(m_fileId, scalar.Start.Line, scalar.Start.Column, scalar.Value);
}
MoveNext();
return true;
}
// Otherwise assume string
value = new StringToken(m_fileId, scalar.Start.Line, scalar.Start.Column, scalar.Value);
MoveNext();
return true;
}
value = default;
return false;
}
public Boolean AllowSequenceStart(out SequenceToken value)
{
if (EvaluateCurrent() is SequenceStart sequenceStart)
{
value = new SequenceToken(m_fileId, sequenceStart.Start.Line, sequenceStart.Start.Column);
MoveNext();
return true;
}
value = default;
return false;
}
public Boolean AllowSequenceEnd()
{
if (EvaluateCurrent() is SequenceEnd)
{
MoveNext();
return true;
}
return false;
}
public Boolean AllowMappingStart(out MappingToken value)
{
if (EvaluateCurrent() is MappingStart mappingStart)
{
value = new MappingToken(m_fileId, mappingStart.Start.Line, mappingStart.Start.Column);
MoveNext();
return true;
}
value = default;
return false;
}
public Boolean AllowMappingEnd()
{
if (EvaluateCurrent() is MappingEnd)
{
MoveNext();
return true;
}
return false;
}
/// <summary>
/// Consumes the last parsing events, which are expected to be DocumentEnd and StreamEnd.
/// </summary>
public void ValidateEnd()
{
if (EvaluateCurrent() is DocumentEnd)
{
MoveNext();
}
else
{
throw new InvalidOperationException("Expected document end parse event");
}
if (EvaluateCurrent() is StreamEnd)
{
MoveNext();
}
else
{
throw new InvalidOperationException("Expected stream end parse event");
}
if (MoveNext())
{
throw new InvalidOperationException("Expected end of parse events");
}
}
/// <summary>
/// Consumes the first parsing events, which are expected to be StreamStart and DocumentStart.
/// </summary>
public void ValidateStart()
{
if (EvaluateCurrent() != null)
{
throw new InvalidOperationException("Unexpected parser state");
}
if (!MoveNext())
{
throw new InvalidOperationException("Expected a parse event");
}
if (EvaluateCurrent() is StreamStart)
{
MoveNext();
}
else
{
throw new InvalidOperationException("Expected stream start parse event");
}
if (EvaluateCurrent() is DocumentStart)
{
MoveNext();
}
else
{
throw new InvalidOperationException("Expected document start parse event");
}
}
private ParsingEvent EvaluateCurrent()
{
if (m_current == null)
{
m_current = m_parser.Current;
if (m_current != null)
{
if (m_current is Scalar scalar)
{
// Verify anchors are not used
if (scalar.Anchor != null)
{
throw new InvalidOperationException($"Anchors are not currently supported. Remove the anchor '{scalar.Anchor}'");
}
}
else if (m_current is MappingStart mappingStart)
{
// Verify anchors are not used
if (mappingStart.Anchor != null)
{
throw new InvalidOperationException($"Anchors are not currently supported. Remove the anchor '{mappingStart.Anchor}'");
}
}
else if (m_current is SequenceStart sequenceStart)
{
// Verify anchors are not used
if (sequenceStart.Anchor != null)
{
throw new InvalidOperationException($"Anchors are not currently supported. Remove the anchor '{sequenceStart.Anchor}'");
}
}
else if (!(m_current is MappingEnd) &&
!(m_current is SequenceEnd) &&
!(m_current is DocumentStart) &&
!(m_current is DocumentEnd) &&
!(m_current is StreamStart) &&
!(m_current is StreamEnd))
{
throw new InvalidOperationException($"Unexpected parsing event type: {m_current.GetType().Name}");
}
}
}
return m_current;
}
private Boolean MoveNext()
{
m_current = null;
return m_parser.MoveNext();
}
private BooleanToken ParseBoolean(Scalar scalar)
{
if (MatchBoolean(scalar, out var token))
{
return token;
}
ThrowInvalidValue(scalar, c_booleanTag); // throws
return default;
}
private NumberToken ParseFloat(Scalar scalar)
{
if (MatchFloat(scalar, out var token))
{
return token;
}
ThrowInvalidValue(scalar, c_floatTag); // throws
return default;
}
private NumberToken ParseInteger(Scalar scalar)
{
if (MatchInteger(scalar, out var token))
{
return token;
}
ThrowInvalidValue(scalar, c_integerTag); // throws
return default;
}
private NullToken ParseNull(Scalar scalar)
{
if (MatchNull(scalar, out var token))
{
return token;
}
ThrowInvalidValue(scalar, c_nullTag); // throws
return default;
}
private Boolean MatchBoolean(
Scalar scalar,
out BooleanToken value)
{
// YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
switch (scalar.Value ?? string.Empty)
{
case "true":
case "True":
case "TRUE":
value = new BooleanToken(m_fileId, scalar.Start.Line, scalar.Start.Column, true);
return true;
case "false":
case "False":
case "FALSE":
value = new BooleanToken(m_fileId, scalar.Start.Line, scalar.Start.Column, false);
return true;
}
value = default;
return false;
}
private Boolean MatchFloat(
Scalar scalar,
out NumberToken value)
{
// YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
var str = scalar.Value;
if (!string.IsNullOrEmpty(str))
{
// Check for [-+]?(\.inf|\.Inf|\.INF)|\.nan|\.NaN|\.NAN
switch (str)
{
case ".inf":
case ".Inf":
case ".INF":
case "+.inf":
case "+.Inf":
case "+.INF":
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, Double.PositiveInfinity);
return true;
case "-.inf":
case "-.Inf":
case "-.INF":
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, Double.NegativeInfinity);
return true;
case ".nan":
case ".NaN":
case ".NAN":
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, Double.NaN);
return true;
}
// Otherwise check [-+]?(\.[0-9]+|[0-9]+(\.[0-9]*)?)([eE][-+]?[0-9]+)?
// Skip leading sign
var index = str[0] == '-' || str[0] == '+' ? 1 : 0;
// Check for integer portion
var length = str.Length;
var hasInteger = false;
while (index < length && str[index] >= '0' && str[index] <= '9')
{
hasInteger = true;
index++;
}
// Check for decimal point
var hasDot = false;
if (index < length && str[index] == '.')
{
hasDot = true;
index++;
}
// Check for decimal portion
var hasDecimal = false;
while (index < length && str[index] >= '0' && str[index] <= '9')
{
hasDecimal = true;
index++;
}
// Check [-+]?(\.[0-9]+|[0-9]+(\.[0-9]*)?)
if ((hasDot && hasDecimal) || hasInteger)
{
// Check for end
if (index == length)
{
// Try parse
if (Double.TryParse(str, NumberStyles.AllowLeadingSign | NumberStyles.AllowDecimalPoint, CultureInfo.InvariantCulture, out var doubleValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, doubleValue);
return true;
}
// Otherwise exceeds range
else
{
ThrowInvalidValue(scalar, c_floatTag); // throws
}
}
// Check [eE][-+]?[0-9]
else if (index < length && (str[index] == 'e' || str[index] == 'E'))
{
index++;
// Skip sign
if (index < length && (str[index] == '-' || str[index] == '+'))
{
index++;
}
// Check for exponent
var hasExponent = false;
while (index < length && str[index] >= '0' && str[index] <= '9')
{
hasExponent = true;
index++;
}
// Check for end
if (hasExponent && index == length)
{
// Try parse
if (Double.TryParse(str, NumberStyles.AllowLeadingSign | NumberStyles.AllowDecimalPoint | NumberStyles.AllowExponent, CultureInfo.InvariantCulture, out var doubleValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, (Double)doubleValue);
return true;
}
// Otherwise exceeds range
else
{
ThrowInvalidValue(scalar, c_floatTag); // throws
}
}
}
}
}
value = default;
return false;
}
private Boolean MatchInteger(
Scalar scalar,
out NumberToken value)
{
// YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
var str = scalar.Value;
if (!string.IsNullOrEmpty(str))
{
// Check for [0-9]+
var firstChar = str[0];
if (firstChar >= '0' && firstChar <= '9' &&
str.Skip(1).All(x => x >= '0' && x <= '9'))
{
// Try parse
if (Double.TryParse(str, NumberStyles.None, CultureInfo.InvariantCulture, out var doubleValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, doubleValue);
return true;
}
// Otherwise exceeds range
ThrowInvalidValue(scalar, c_integerTag); // throws
}
// Check for (-|+)[0-9]+
else if ((firstChar == '-' || firstChar == '+') &&
str.Length > 1 &&
str.Skip(1).All(x => x >= '0' && x <= '9'))
{
// Try parse
if (Double.TryParse(str, NumberStyles.AllowLeadingSign, CultureInfo.InvariantCulture, out var doubleValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, doubleValue);
return true;
}
// Otherwise exceeds range
ThrowInvalidValue(scalar, c_integerTag); // throws
}
// Check for 0x[0-9a-fA-F]+
else if (firstChar == '0' &&
str.Length > 2 &&
str[1] == 'x' &&
str.Skip(2).All(x => (x >= '0' && x <= '9') || (x >= 'a' && x <= 'f') || (x >= 'A' && x <= 'F')))
{
// Try parse
if (Int32.TryParse(str.Substring(2), NumberStyles.AllowHexSpecifier, CultureInfo.InvariantCulture, out var integerValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, integerValue);
return true;
}
// Otherwise exceeds range
ThrowInvalidValue(scalar, c_integerTag); // throws
}
// Check for 0o[0-9]+
else if (firstChar == '0' &&
str.Length > 2 &&
str[1] == 'o' &&
str.Skip(2).All(x => x >= '0' && x <= '7'))
{
// Try parse
var integerValue = default(Int32);
try
{
integerValue = Convert.ToInt32(str.Substring(2), 8);
}
// Otherwise exceeds range
catch (Exception)
{
ThrowInvalidValue(scalar, c_integerTag); // throws
}
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, integerValue);
return true;
}
}
value = default;
return false;
}
private Boolean MatchNull(
Scalar scalar,
out NullToken value)
{
// YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
switch (scalar.Value ?? string.Empty)
{
case "":
case "null":
case "Null":
case "NULL":
case "~":
value = new NullToken(m_fileId, scalar.Start.Line, scalar.Start.Column);
return true;
}
value = default;
return false;
}
private void ThrowInvalidValue(
Scalar scalar,
String tag)
{
throw new NotSupportedException($"The value '{scalar.Value}' on line {scalar.Start.Line} and column {scalar.Start.Column} is invalid for the type '{tag}'");
}
private const String c_booleanTag = "tag:yaml.org,2002:bool";
private const String c_floatTag = "tag:yaml.org,2002:float";
private const String c_integerTag = "tag:yaml.org,2002:int";
private const String c_nullTag = "tag:yaml.org,2002:null";
private const String c_stringTag = "tag:yaml.org,2002:string";
private readonly Int32? m_fileId;
private readonly Parser m_parser;
private ParsingEvent m_current;
}
}
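For reference, the plain-scalar rules implemented above classify unquoted values roughly as follows; quoting any of these in YAML would force a string instead. A few illustrative cases (not exhaustive):

using System;
using System.Collections.Generic;

class PlainScalarExamples
{
    static void Main()
    {
        // Expected classification of plain (unquoted) YAML scalars under the
        // reader above; quoting any of these would force a string instead.
        var expected = new Dictionary<string, string>
        {
            ["~"] = "null",
            ["null"] = "null",
            ["true"] = "boolean",
            ["FALSE"] = "boolean",
            ["42"] = "number",
            ["0x1A"] = "number (hex)",
            ["0o17"] = "number (octal)",
            ["-3.5e2"] = "number (float)",
            [".inf"] = "number (+infinity)",
            ["hello"] = "string",
        };

        foreach (var pair in expected)
        {
            Console.WriteLine($"{pair.Key,-8} -> {pair.Value}");
        }
    }
}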

View File

@@ -141,9 +141,7 @@ namespace GitHub.Runner.Worker
// Load the inputs.
ExecutionContext.Debug("Loading inputs");
var templateTrace = ExecutionContext.ToTemplateTraceWriter();
var schema = new PipelineTemplateSchemaFactory().CreateSchema();
var templateEvaluator = new PipelineTemplateEvaluator(templateTrace, schema);
var templateEvaluator = ExecutionContext.ToPipelineTemplateEvaluator();
var inputs = templateEvaluator.EvaluateStepInputs(Action.Inputs, ExecutionContext.ExpressionValues);
foreach (KeyValuePair<string, string> input in inputs)
@@ -295,8 +293,7 @@ namespace GitHub.Runner.Worker
return displayName;
}
// Try evaluating fully
var schema = new PipelineTemplateSchemaFactory().CreateSchema();
var templateEvaluator = new PipelineTemplateEvaluator(context.ToTemplateTraceWriter(), schema);
var templateEvaluator = context.ToPipelineTemplateEvaluator();
try
{
didFullyEvaluate = templateEvaluator.TryEvaluateStepDisplayName(tokenToParse, contextData, out displayName);

View File

@@ -11,6 +11,7 @@ using GitHub.Runner.Worker.Container;
using GitHub.Services.WebApi;
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.Pipelines.ObjectTemplating;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Common;
@@ -38,6 +39,7 @@ namespace GitHub.Runner.Worker
string ContextName { get; }
Task ForceCompleted { get; }
TaskResult? Result { get; set; }
TaskResult? Outcome { get; set; }
string ResultCode { get; set; }
TaskResult? CommandResult { get; set; }
CancellationToken CancellationToken { get; }
@@ -46,9 +48,11 @@ namespace GitHub.Runner.Worker
PlanFeatures Features { get; }
Variables Variables { get; }
Dictionary<string, string> IntraActionState { get; }
HashSet<string> OutputVariables { get; }
IDictionary<String, IDictionary<String, String>> JobDefaults { get; }
Dictionary<string, VariableValue> JobOutputs { get; }
IDictionary<String, String> EnvironmentVariables { get; }
IDictionary<String, ContextScope> Scopes { get; }
IList<String> FileTable { get; }
StepsContext StepsContext { get; }
DictionaryContextData ExpressionValues { get; }
List<string> PrependPath { get; }
@@ -108,7 +112,6 @@ namespace GitHub.Runner.Worker
private readonly TimelineRecord _record = new TimelineRecord();
private readonly Dictionary<Guid, TimelineRecord> _detailRecords = new Dictionary<Guid, TimelineRecord>();
private readonly object _loggerLock = new object();
private readonly HashSet<string> _outputvariables = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
private readonly object _matchersLock = new object();
private event OnMatcherChanged _onMatcherChanged;
@@ -138,9 +141,11 @@ namespace GitHub.Runner.Worker
public List<ServiceEndpoint> Endpoints { get; private set; }
public Variables Variables { get; private set; }
public Dictionary<string, string> IntraActionState { get; private set; }
public HashSet<string> OutputVariables => _outputvariables;
public IDictionary<String, IDictionary<String, String>> JobDefaults { get; private set; }
public Dictionary<string, VariableValue> JobOutputs { get; private set; }
public IDictionary<String, String> EnvironmentVariables { get; private set; }
public IDictionary<String, ContextScope> Scopes { get; private set; }
public IList<String> FileTable { get; private set; }
public StepsContext StepsContext { get; private set; }
public DictionaryContextData ExpressionValues { get; } = new DictionaryContextData();
public bool WriteDebug { get; private set; }
@@ -169,6 +174,8 @@ namespace GitHub.Runner.Worker
}
}
public TaskResult? Outcome { get; set; }
public TaskResult? CommandResult { get; set; }
private string ContextType => _record.RecordType;
@@ -265,7 +272,9 @@ namespace GitHub.Runner.Worker
child.IntraActionState = intraActionState;
}
child.EnvironmentVariables = EnvironmentVariables;
child.JobDefaults = JobDefaults;
child.Scopes = Scopes;
child.FileTable = FileTable;
child.StepsContext = StepsContext;
foreach (var pair in ExpressionValues)
{
@@ -343,6 +352,12 @@ namespace GitHub.Runner.Worker
_logger.End();
if (!string.IsNullOrEmpty(ContextName))
{
StepsContext.SetOutcome(ScopeName, ContextName, (Outcome ?? Result ?? TaskResult.Succeeded).ToActionResult().ToString());
StepsContext.SetConclusion(ScopeName, ContextName, (Result ?? TaskResult.Succeeded).ToActionResult().ToString());
}
return Result.Value;
}
@@ -553,6 +568,12 @@ namespace GitHub.Runner.Worker
// Environment variables shared across all actions
EnvironmentVariables = new Dictionary<string, string>(VarUtil.EnvironmentVariableKeyComparer);
// Job defaults shared across all actions
JobDefaults = new Dictionary<string, IDictionary<string, string>>(StringComparer.OrdinalIgnoreCase);
// Job Outputs
JobOutputs = new Dictionary<string, VariableValue>(StringComparer.OrdinalIgnoreCase);
// Service container info
ServiceContainers = new List<ContainerInfo>();
@@ -569,6 +590,9 @@ namespace GitHub.Runner.Worker
}
}
// File table
FileTable = new List<String>(message.FileTable ?? new string[0]);
// Expression functions
if (Variables.GetBoolean("System.HashFilesV2") == true)
{
@@ -592,8 +616,13 @@ namespace GitHub.Runner.Worker
var githubAccessToken = new StringContextData(Variables.Get("system.github.token"));
var base64EncodedToken = Convert.ToBase64String(Encoding.UTF8.GetBytes($"x-access-token:{githubAccessToken}"));
HostContext.SecretMasker.AddValue(base64EncodedToken);
var githubJob = Variables.Get("system.github.job");
var githubContext = new GitHubContext();
githubContext["token"] = githubAccessToken;
if (!string.IsNullOrEmpty(githubJob))
{
githubContext["job"] = new StringContextData(githubJob);
}
var githubDictionary = ExpressionValues["github"].AssertDictionary("github");
foreach (var pair in githubDictionary)
{
@@ -729,7 +758,7 @@ namespace GitHub.Runner.Worker
var owners = config.Matchers.Select(x => $"'{x.Owner}'");
var joinedOwners = string.Join(", ", owners);
// todo: loc
this.Output($"Added matchers: {joinedOwners}. Problem matchers scan action output for known warning or error strings and report these inline.");
this.Debug($"Added matchers: {joinedOwners}. Problem matchers scan action output for known warning or error strings and report these inline.");
}
}
@@ -771,7 +800,7 @@ namespace GitHub.Runner.Worker
owners = removedMatchers.Select(x => $"'{x.Owner}'");
var joinedOwners = string.Join(", ", owners);
// todo: loc
this.Output($"Removed matchers: {joinedOwners}");
this.Debug($"Removed matchers: {joinedOwners}");
}
}
@@ -886,6 +915,13 @@ namespace GitHub.Runner.Worker
}
}
public static PipelineTemplateEvaluator ToPipelineTemplateEvaluator(this IExecutionContext context)
{
var templateTrace = context.ToTemplateTraceWriter();
var schema = new PipelineTemplateSchemaFactory().CreateSchema();
return new PipelineTemplateEvaluator(templateTrace, schema, context.FileTable);
}
public static ObjectTemplating.ITraceWriter ToTemplateTraceWriter(this IExecutionContext context)
{
return new TemplateTraceWriter(context);
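The new Outcome property and the steps-context changes above mean a step now reports two values: outcome is the raw result of the step, while conclusion is the result after continue-on-error is applied. A condensed sketch of that relationship, using plain strings instead of the runner's TaskResult type:

using System;

class StepResultSketch
{
    // Returns (outcome, conclusion) for a completed step.
    static (string Outcome, string Conclusion) Complete(string result, bool continueOnError)
    {
        var outcome = result;
        var conclusion = result;
        if (continueOnError && result == "failure")
        {
            // continue-on-error keeps the raw result in outcome,
            // but the effective conclusion becomes success.
            conclusion = "success";
        }
        return (outcome, conclusion);
    }

    static void Main()
    {
        Console.WriteLine(Complete("failure", continueOnError: false)); // (failure, failure)
        Console.WriteLine(Complete("failure", continueOnError: true));  // (failure, success)
        Console.WriteLine(Complete("success", continueOnError: true));  // (success, success)
    }
}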

View File

@@ -14,6 +14,7 @@ namespace GitHub.Runner.Worker
"event_name",
"event_path",
"head_ref",
"job",
"ref",
"repository",
"run_id",

View File

@@ -84,7 +84,7 @@ namespace GitHub.Runner.Worker.Handlers
{
// This does not need to be inside of a critical section.
// The logging queues and command handlers are thread-safe.
if (_commandManager.TryProcessCommand(_executionContext, line))
if (_commandManager.TryProcessCommand(_executionContext, line, _container))
{
return;
}

View File

@@ -58,12 +58,21 @@ namespace GitHub.Runner.Worker.Handlers
string shellCommandPath = null;
bool validateShellOnHost = !(StepHost is ContainerStepHost);
string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.PrependPath.Reverse<string>());
Inputs.TryGetValue("shell", out var shell);
string shell = null;
if (!Inputs.TryGetValue("shell", out shell) || string.IsNullOrEmpty(shell))
{
// TODO: figure out how defaults interact with template later
// for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults))
{
runDefaults.TryGetValue("shell", out shell);
}
}
if (string.IsNullOrEmpty(shell))
{
#if OS_WINDOWS
shellCommand = "pwsh";
if(validateShellOnHost)
if (validateShellOnHost)
{
shellCommandPath = WhichUtil.Which(shellCommand, require: false, Trace, prependPath);
if (string.IsNullOrEmpty(shellCommandPath))
@@ -139,11 +148,36 @@ namespace GitHub.Runner.Worker.Handlers
Inputs.TryGetValue("script", out var contents);
contents = contents ?? string.Empty;
Inputs.TryGetValue("workingDirectory", out var workingDirectory);
string workingDirectory = null;
if (!Inputs.TryGetValue("workingDirectory", out workingDirectory))
{
// TODO: figure out how defaults interact with template later
// for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults))
{
if (runDefaults.TryGetValue("working-directory", out workingDirectory))
{
ExecutionContext.Debug("Overwriting 'working-directory' based on job defaults.");
}
}
}
var workspaceDir = githubContext["workspace"] as StringContextData;
workingDirectory = Path.Combine(workspaceDir, workingDirectory ?? string.Empty);
Inputs.TryGetValue("shell", out var shell);
string shell = null;
if (!Inputs.TryGetValue("shell", out shell) || string.IsNullOrEmpty(shell))
{
// TODO: figure out how defaults interact with template later
// for now, we won't check job.defaults if we are inside a template.
if (string.IsNullOrEmpty(ExecutionContext.ScopeName) && ExecutionContext.JobDefaults.TryGetValue("run", out var runDefaults))
{
if (runDefaults.TryGetValue("shell", out shell))
{
ExecutionContext.Debug("Overwriting 'shell' based on job defaults.");
}
}
}
var isContainerStepHost = StepHost is ContainerStepHost;
string prependPath = string.Join(Path.PathSeparator.ToString(), ExecutionContext.PrependPath.Reverse<string>());
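The same fallback appears for both shell and working-directory above. Reduced to its essentials, the resolution order for a run step is: explicit step input, then the job-level defaults.run mapping (skipped inside composite scopes), then the platform default. A simplified sketch, with the platform default abbreviated:

using System;
using System.Collections.Generic;

class RunDefaultsSketch
{
    static string ResolveShell(
        IDictionary<string, string> stepInputs,
        IDictionary<string, IDictionary<string, string>> jobDefaults,
        string scopeName,
        bool isWindows)
    {
        // 1. An explicit `shell:` on the step always wins.
        if (stepInputs.TryGetValue("shell", out var shell) && !string.IsNullOrEmpty(shell))
        {
            return shell;
        }

        // 2. Otherwise use job-level defaults.run.shell (skipped inside composite scopes).
        if (string.IsNullOrEmpty(scopeName) &&
            jobDefaults.TryGetValue("run", out var runDefaults) &&
            runDefaults.TryGetValue("shell", out shell) &&
            !string.IsNullOrEmpty(shell))
        {
            return shell;
        }

        // 3. Fall back to the platform default (simplified; the real handler
        //    also probes the PATH for the chosen shell).
        return isWindows ? "pwsh" : "bash";
    }

    static void Main()
    {
        var noInput = new Dictionary<string, string>();
        var defaults = new Dictionary<string, IDictionary<string, string>>
        {
            ["run"] = new Dictionary<string, string> { ["shell"] = "bash" }
        };

        Console.WriteLine(ResolveShell(noInput, defaults, scopeName: "", isWindows: true));          // bash (from defaults)
        Console.WriteLine(ResolveShell(noInput, defaults, scopeName: "composite", isWindows: true)); // pwsh (defaults skipped)
    }
}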

View File

@@ -3,8 +3,11 @@ using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Runtime.Serialization;
using System.Threading.Tasks;
using GitHub.DistributedTask.Expressions2;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.Pipelines.ObjectTemplating;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
@@ -14,6 +17,16 @@ using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker
{
[DataContract]
public class SetupInfo
{
[DataMember]
public string Group { get; set; }
[DataMember]
public string Detail { get; set; }
}
[ServiceLocator(Default = typeof(JobExtension))]
public interface IJobExtension : IRunnerService
@@ -49,6 +62,44 @@ namespace GitHub.Runner.Worker
context.Start();
context.Debug($"Starting: Set up job");
context.Output($"Current runner version: '{BuildConstants.RunnerPackage.Version}'");
var setupInfoFile = HostContext.GetConfigFile(WellKnownConfigFile.SetupInfo);
if (File.Exists(setupInfoFile))
{
Trace.Info($"Load machine setup info from {setupInfoFile}");
try
{
var setupInfo = IOUtil.LoadObject<List<SetupInfo>>(setupInfoFile);
if (setupInfo?.Count > 0)
{
foreach (var info in setupInfo)
{
if (!string.IsNullOrEmpty(info?.Detail))
{
var groupName = info.Group;
if (string.IsNullOrEmpty(groupName))
{
groupName = "Machine Setup Info";
}
context.Output($"##[group]{groupName}");
var multiLines = info.Detail.Replace("\r\n", "\n").TrimEnd('\n').Split('\n');
foreach (var line in multiLines)
{
context.Output(line);
}
context.Output("##[endgroup]");
}
}
}
}
catch (Exception ex)
{
context.Output($"Failed to load and print machine setup info: {ex.Message}");
Trace.Error(ex);
}
}
var repoFullName = context.GetGitHubContext("repository");
ArgUtil.NotNull(repoFullName, nameof(repoFullName));
context.Debug($"Primary repository: {repoFullName}");
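The .setup_info file read above is a JSON array of group/detail entries; each detail is echoed inside a collapsible ##[group] section of the job log. The sketch below uses Newtonsoft.Json directly and assumes the property names shown; the runner itself goes through its own IOUtil helper, so treat this only as an approximation of the file shape.

using System;
using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;

class SetupInfoSketch
{
    public class SetupInfo
    {
        public string Group { get; set; }
        public string Detail { get; set; }
    }

    static void Main()
    {
        var path = Path.Combine(Path.GetTempPath(), ".setup_info");

        // Example content an image builder might drop next to the runner.
        var entries = new List<SetupInfo>
        {
            new SetupInfo
            {
                Group = "Operating System",
                Detail = "Ubuntu\n20.04.6\nLTS"
            }
        };
        File.WriteAllText(path, JsonConvert.SerializeObject(entries, Formatting.Indented));

        // Read it back and print each entry as a collapsible log group.
        var loaded = JsonConvert.DeserializeObject<List<SetupInfo>>(File.ReadAllText(path));
        foreach (var info in loaded)
        {
            Console.WriteLine($"##[group]{info.Group ?? "Machine Setup Info"}");
            foreach (var line in info.Detail.Replace("\r\n", "\n").TrimEnd('\n').Split('\n'))
            {
                Console.WriteLine(line);
            }
            Console.WriteLine("##[endgroup]");
        }
    }
}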
@@ -78,9 +129,7 @@ namespace GitHub.Runner.Worker
// Evaluate the job-level environment variables
context.Debug("Evaluating job-level environment variables");
var templateTrace = context.ToTemplateTraceWriter();
var schema = new PipelineTemplateSchemaFactory().CreateSchema();
var templateEvaluator = new PipelineTemplateEvaluator(templateTrace, schema);
var templateEvaluator = context.ToPipelineTemplateEvaluator();
foreach (var token in message.EnvironmentVariables)
{
var environmentVariables = templateEvaluator.EvaluateStepEnvironment(token, jobContext.ExpressionValues, VarUtil.EnvironmentVariableKeyComparer);
@@ -112,6 +161,26 @@ namespace GitHub.Runner.Worker
}
}
// Evaluate the job defaults
context.Debug("Evaluating job defaults");
foreach (var token in message.Defaults)
{
var defaults = token.AssertMapping("defaults");
if (defaults.Any(x => string.Equals(x.Key.AssertString("defaults key").Value, "run", StringComparison.OrdinalIgnoreCase)))
{
context.JobDefaults["run"] = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
var defaultsRun = defaults.First(x => string.Equals(x.Key.AssertString("defaults key").Value, "run", StringComparison.OrdinalIgnoreCase));
var jobDefaults = templateEvaluator.EvaluateJobDefaultsRun(defaultsRun.Value, jobContext.ExpressionValues);
foreach (var pair in jobDefaults)
{
if (!string.IsNullOrEmpty(pair.Value))
{
context.JobDefaults["run"][pair.Key] = pair.Value;
}
}
}
}
// Build up 2 lists of steps, pre-job, job
// Download actions not already in the cache
Trace.Info("Downloading actions");
@@ -244,6 +313,58 @@ namespace GitHub.Runner.Worker
context.Start();
context.Debug("Starting: Complete job");
// Evaluate job outputs
if (message.JobOutputs != null && message.JobOutputs.Type != TokenType.Null)
{
try
{
context.Output("Evaluating and setting job outputs");
// Populate env context for each step
Trace.Info("Initialize Env context for evaluating job outputs");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
context.ExpressionValues["env"] = envContext;
foreach (var pair in context.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
Trace.Info("Initialize steps context for evaluating job outputs");
context.ExpressionValues["steps"] = context.StepsContext.GetScope(context.ScopeName);
var templateEvaluator = context.ToPipelineTemplateEvaluator();
var outputs = templateEvaluator.EvaluateJobOutput(message.JobOutputs, context.ExpressionValues);
foreach (var output in outputs)
{
if (string.IsNullOrEmpty(output.Value))
{
context.Debug($"Skip output '{output.Key}' since it's empty");
continue;
}
if (!string.Equals(output.Value, HostContext.SecretMasker.MaskSecrets(output.Value)))
{
context.Warning($"Skip output '{output.Key}' since it may contain a secret.");
continue;
}
context.Output($"Set output '{output.Key}'");
jobContext.JobOutputs[output.Key] = output.Value;
}
}
catch (Exception ex)
{
context.Result = TaskResult.Failed;
context.Error("Failed to evaluate job outputs");
context.Error(ex);
jobContext.Result = TaskResultUtil.MergeTaskResults(jobContext.Result, TaskResult.Failed);
}
}
if (context.Variables.GetBoolean(Constants.Variables.Actions.RunnerDebug) ?? false)
{
Trace.Info("Support log upload starting.");
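The secret check above relies on the masker leaving clean strings untouched: if masking changes an output value at all, the value must contain a registered secret and is dropped instead of being passed to downstream jobs. A reduced sketch of that comparison, with a toy masker standing in for the runner's SecretMasker:

using System;
using System.Collections.Generic;

class OutputSecretCheckSketch
{
    // Very small stand-in for the runner's secret masker.
    static string MaskSecrets(string value, IEnumerable<string> secrets)
    {
        foreach (var secret in secrets)
        {
            value = value.Replace(secret, "***");
        }
        return value;
    }

    static void Main()
    {
        var secrets = new[] { "hunter2" };
        var outputs = new Dictionary<string, string>
        {
            ["artifact-name"] = "build-1234",
            ["leaky"] = "token=hunter2",
        };

        foreach (var output in outputs)
        {
            // If masking changes the value, it contained a secret: skip it.
            if (!string.Equals(output.Value, MaskSecrets(output.Value, secrets)))
            {
                Console.WriteLine($"Skip output '{output.Key}' since it may contain a secret.");
                continue;
            }
            Console.WriteLine($"Set output '{output.Key}'");
        }
    }
}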

View File

@@ -231,7 +231,7 @@ namespace GitHub.Runner.Worker
}
Trace.Info("Raising job completed event.");
var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, result);
var jobCompletedEvent = new JobCompletedEvent(message.RequestId, message.JobId, result, jobContext.JobOutputs);
var completeJobRetryLimit = 5;
var exceptions = new List<Exception>();

View File

@@ -56,13 +56,22 @@ namespace GitHub.Runner.Worker
}
}
public void SetResult(
public void SetConclusion(
string scopeName,
string stepName,
string result)
string conclusion)
{
var step = GetStep(scopeName, stepName);
step["result"] = new StringContextData(result);
step["conclusion"] = new StringContextData(conclusion);
}
public void SetOutcome(
string scopeName,
string stepName,
string outcome)
{
var step = GetStep(scopeName, stepName);
step["outcome"] = new StringContextData(outcome);
}
private DictionaryContextData GetStep(string scopeName, string stepName)

View File

@@ -76,38 +76,36 @@ namespace GitHub.Runner.Worker
// Start
step.ExecutionContext.Start();
// Populate env context for each step
Trace.Info("Initialize Env context for step");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
step.ExecutionContext.ExpressionValues["env"] = envContext;
foreach (var pair in step.ExecutionContext.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
if (step is IActionRunner actionStep)
{
// Set GITHUB_ACTION
step.ExecutionContext.SetGitHubContext("action", actionStep.Action.Name);
// Evaluate and merge action's env block to env context
var templateTrace = step.ExecutionContext.ToTemplateTraceWriter();
var schema = new PipelineTemplateSchemaFactory().CreateSchema();
var templateEvaluator = new PipelineTemplateEvaluator(templateTrace, schema);
var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(actionStep.Action.Environment, step.ExecutionContext.ExpressionValues, VarUtil.EnvironmentVariableKeyComparer);
foreach (var env in actionEnvironment)
{
envContext[env.Key] = new StringContextData(env.Value ?? string.Empty);
}
}
// Initialize scope
if (InitializeScope(step, scopeInputs))
{
// Populate env context for each step
Trace.Info("Initialize Env context for step");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
step.ExecutionContext.ExpressionValues["env"] = envContext;
foreach (var pair in step.ExecutionContext.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
if (step is IActionRunner actionStep)
{
// Set GITHUB_ACTION
step.ExecutionContext.SetGitHubContext("action", actionStep.Action.Name);
// Evaluate and merge action's env block to env context
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator();
var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(actionStep.Action.Environment, step.ExecutionContext.ExpressionValues, VarUtil.EnvironmentVariableKeyComparer);
foreach (var env in actionEnvironment)
{
envContext[env.Key] = new StringContextData(env.Value ?? string.Empty);
}
}
var expressionManager = HostContext.GetService<IExpressionManager>();
try
{
@@ -247,7 +245,7 @@ namespace GitHub.Runner.Worker
// Set the timeout
var timeoutMinutes = 0;
var templateEvaluator = CreateTemplateEvaluator(step.ExecutionContext);
var templateEvaluator = step.ExecutionContext.ToPipelineTemplateEvaluator();
try
{
timeoutMinutes = templateEvaluator.EvaluateStepTimeout(step.Timeout, step.ExecutionContext.ExpressionValues);
@@ -353,6 +351,7 @@ namespace GitHub.Runner.Worker
if (continueOnError)
{
step.ExecutionContext.Outcome = step.ExecutionContext.Result;
step.ExecutionContext.Result = TaskResult.Succeeded;
Trace.Info($"Updated step result (continue on error)");
}
@@ -389,7 +388,7 @@ namespace GitHub.Runner.Worker
executionContext.Debug($"Initializing scope '{scope.Name}'");
executionContext.ExpressionValues["steps"] = stepsContext.GetScope(scope.ParentName);
executionContext.ExpressionValues["inputs"] = !String.IsNullOrEmpty(scope.ParentName) ? scopeInputs[scope.ParentName] : null;
var templateEvaluator = CreateTemplateEvaluator(executionContext);
var templateEvaluator = executionContext.ToPipelineTemplateEvaluator();
var inputs = default(DictionaryContextData);
try
{
@@ -445,7 +444,7 @@ namespace GitHub.Runner.Worker
executionContext.Debug($"Finalizing scope '{scope.Name}'");
executionContext.ExpressionValues["steps"] = stepsContext.GetScope(scope.Name);
executionContext.ExpressionValues["inputs"] = null;
var templateEvaluator = CreateTemplateEvaluator(executionContext);
var templateEvaluator = executionContext.ToPipelineTemplateEvaluator();
var outputs = default(DictionaryContextData);
try
{
@@ -477,12 +476,5 @@ namespace GitHub.Runner.Worker
executionContext.Complete(result, resultCode: resultCode);
}
private PipelineTemplateEvaluator CreateTemplateEvaluator(IExecutionContext executionContext)
{
var templateTrace = executionContext.ToTemplateTraceWriter();
var schema = new PipelineTemplateSchemaFactory().CreateSchema();
return new PipelineTemplateEvaluator(templateTrace, schema);
}
}
}

View File

@@ -90,10 +90,8 @@
"github",
"strategy",
"matrix",
"steps",
"job",
"runner",
"env"
"runner"
],
"string": {}
},

View File

@@ -15,6 +15,7 @@ namespace GitHub.DistributedTask.Expressions2
AddFunction<Join>("join", 1, 2);
AddFunction<StartsWith>("startsWith", 2, 2);
AddFunction<ToJson>("toJson", 1, 1);
AddFunction<FromJson>("fromJson", 1, 1);
AddFunction<HashFiles>("hashFiles", 1, 1);
}

View File

@@ -0,0 +1,24 @@
using System;
using System.IO;
using GitHub.DistributedTask.Pipelines.ContextData;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
namespace GitHub.DistributedTask.Expressions2.Sdk.Functions
{
internal sealed class FromJson : Function
{
protected sealed override Object EvaluateCore(
EvaluationContext context,
out ResultMemory resultMemory)
{
resultMemory = null;
var json = Parameters[0].Evaluate(context).ConvertToString();
using (var stringReader = new StringReader(json))
using (var jsonReader = new JsonTextReader(stringReader) { DateParseHandling = DateParseHandling.None, FloatParseHandling = FloatParseHandling.Double })
{
var token = JToken.ReadFrom(jsonReader);
return token.ToPipelineContextData();
}
}
}
}
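The reader settings matter here: DateParseHandling.None keeps ISO-8601-looking strings as strings, and FloatParseHandling.Double keeps numbers as doubles, which matches how expression values behave elsewhere. A standalone approximation of the same parse (the conversion to pipeline context data is omitted):

using System;
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

class FromJsonSketch
{
    static void Main()
    {
        var json = "{\"include\":[{\"os\":\"ubuntu-latest\",\"node\":12}],\"when\":\"2020-03-17\"}";

        using (var stringReader = new StringReader(json))
        using (var jsonReader = new JsonTextReader(stringReader)
        {
            DateParseHandling = DateParseHandling.None,
            FloatParseHandling = FloatParseHandling.Double
        })
        {
            var token = JToken.ReadFrom(jsonReader);

            // "when" stays a plain string instead of being coerced to a DateTime.
            Console.WriteLine(token["when"].Type);                // String
            Console.WriteLine(token["include"][0]["node"].Type);  // Integer
        }
    }
}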

View File

@@ -779,5 +779,65 @@ namespace GitHub.DistributedTask.WebApi
userState: userState,
cancellationToken: cancellationToken);
}
/// <summary>
/// [Preview API]
/// </summary>
/// <param name="poolId"></param>
/// <param name="agentId"></param>
/// <param name="userState"></param>
/// <param name="cancellationToken">The cancellation token to cancel operation.</param>
public Task<String> GetAgentAuthUrlAsync(
int poolId,
int agentId,
object userState = null,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("GET");
Guid locationId = new Guid("a82a119c-1e46-44b6-8d75-c82a79cf975b");
object routeValues = new { poolId = poolId, agentId = agentId };
return SendAsync<String>(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(6.0, 1),
userState: userState,
cancellationToken: cancellationToken);
}
/// <summary>
/// [Preview API]
/// </summary>
/// <param name="poolId"></param>
/// <param name="agentId"></param>
/// <param name="error"></param>
/// <param name="userState"></param>
/// <param name="cancellationToken">The cancellation token to cancel operation.</param>
[EditorBrowsable(EditorBrowsableState.Never)]
public virtual async Task ReportAgentAuthUrlMigrationErrorAsync(
int poolId,
int agentId,
string error,
object userState = null,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("POST");
Guid locationId = new Guid("a82a119c-1e46-44b6-8d75-c82a79cf975b");
object routeValues = new { poolId = poolId, agentId = agentId };
HttpContent content = new ObjectContent<string>(error, new VssJsonMediaTypeFormatter(true));
using (HttpResponseMessage response = await SendAsync(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(6.0, 1),
userState: userState,
cancellationToken: cancellationToken,
content: content).ConfigureAwait(false))
{
return;
}
}
}
}

View File

@@ -16,11 +16,6 @@ namespace GitHub.DistributedTask.Logging
{
return Convert.ToBase64String(Encoding.UTF8.GetBytes(value));
}
public static String Base64StringEscapeTrimmed(String value)
{
return TrimBase64End(Convert.ToBase64String(Encoding.UTF8.GetBytes(value)));
}
// Base64 is 6 bits -> char
// A byte is 8 bits
@@ -72,15 +67,15 @@ namespace GitHub.DistributedTask.Logging
{
var shiftArray = new byte[bytes.Length - shift];
Array.Copy(bytes, shift, shiftArray, 0, bytes.Length - shift);
return TrimBase64End(Convert.ToBase64String(shiftArray));
return Convert.ToBase64String(shiftArray);
}
else
{
return TrimBase64End(Convert.ToBase64String(bytes));
return Convert.ToBase64String(bytes);
}
}
public static String UriDataEscape(
private static String UriDataEscape(
String value,
Int32 maxSegmentSize)
{
@@ -108,26 +103,5 @@ namespace GitHub.DistributedTask.Logging
return result.ToString();
}
private static String TrimBase64End(String value)
{
if (String.IsNullOrEmpty(value))
{
return String.Empty;
}
if (value.EndsWith('='))
{
var trimmed = value.TrimEnd('=');
if (trimmed.Length > 1)
{
// If a base64 string ends in '=' it indicates that the last base64 character only uses 2 or 4 of its six bits and will change if another character is added
// For example 'ab' is 'YWI=' in base 64
// 'abc' is 'YWJj'
// We need to detect YW, not YWI so we trim the last character ('I')
return trimmed.Substring(0, trimmed.Length - 1);
}
}
return value;
}
}
}
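These encoders exist because a secret embedded in a base64 stream can start at any byte offset within a 3-byte quantum; encoding the raw bytes, and the bytes shifted by one and by two, covers every alignment. A small sketch of the three variants (a prefix of one of them will appear in any base64 text containing the secret; the trailing quantum can differ, which is what the removed trimming helper addressed):

using System;
using System.Text;

class Base64VariantsSketch
{
    static string ShiftedBase64(byte[] bytes, int shift)
    {
        if (shift == 0)
        {
            return Convert.ToBase64String(bytes);
        }
        // Drop the first `shift` bytes so the remaining bytes re-align
        // to a fresh base64 quantum boundary.
        var shifted = new byte[bytes.Length - shift];
        Array.Copy(bytes, shift, shifted, 0, bytes.Length - shift);
        return Convert.ToBase64String(shifted);
    }

    static void Main()
    {
        var secret = Encoding.UTF8.GetBytes("hunter2secret");

        Console.WriteLine(ShiftedBase64(secret, 0));
        Console.WriteLine(ShiftedBase64(secret, 1));
        Console.WriteLine(ShiftedBase64(secret, 2));
    }
}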

View File

@@ -30,8 +30,7 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
foreach (var propertiesPair in properties)
{
var propertyName = propertiesPair.Key.AssertString($"{TemplateConstants.Definition} {TemplateConstants.Mapping} {TemplateConstants.Properties} key");
var propertyValue = propertiesPair.Value.AssertString($"{TemplateConstants.Definition} {TemplateConstants.Mapping} {TemplateConstants.Properties} value");
Properties.Add(propertyName.Value, new PropertyValue(propertyValue.Value));
Properties.Add(propertyName.Value, new PropertyValue(propertiesPair.Value));
}
break;
@@ -85,7 +84,7 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
}
else
{
throw new ArgumentException($"Property '{TemplateConstants.LooseKeyType}' is defined but '{TemplateConstants.LooseValueType}' is not defined");
throw new ArgumentException($"Property '{TemplateConstants.LooseKeyType}' is defined but '{TemplateConstants.LooseValueType}' is not defined on '{name}'");
}
}
// Otherwise validate loose value type not be defined
@@ -95,9 +94,14 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
}
// Lookup each property
foreach (var property in Properties.Values)
foreach (var property in Properties)
{
schema.GetDefinition(property.Type);
if (String.IsNullOrEmpty(property.Value.Type))
{
throw new ArgumentException($"Type not specified for the '{property.Key}' property on the '{name}' type");
}
schema.GetDefinition(property.Value.Type);
}
if (!String.IsNullOrEmpty(Inherits))

View File

@@ -1,18 +1,40 @@
using System;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
namespace GitHub.DistributedTask.ObjectTemplating.Schema
{
internal sealed class PropertyValue
{
internal PropertyValue()
internal PropertyValue(TemplateToken token)
{
}
internal PropertyValue(String type)
{
Type = type;
if (token is StringToken stringToken)
{
Type = stringToken.Value;
}
else
{
var mapping = token.AssertMapping($"{TemplateConstants.MappingPropertyValue}");
foreach (var mappingPair in mapping)
{
var mappingKey = mappingPair.Key.AssertString($"{TemplateConstants.MappingPropertyValue} key");
switch (mappingKey.Value)
{
case TemplateConstants.Type:
Type = mappingPair.Value.AssertString($"{TemplateConstants.MappingPropertyValue} {TemplateConstants.Type}").Value;
break;
case TemplateConstants.Required:
Required = mappingPair.Value.AssertBoolean($"{TemplateConstants.MappingPropertyValue} {TemplateConstants.Required}").Value;
break;
default:
mappingKey.AssertUnexpectedValue($"{TemplateConstants.MappingPropertyValue} key");
break;
}
}
}
}
internal String Type { get; set; }
internal Boolean Required { get; set; }
}
}

View File

@@ -312,8 +312,8 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
// template-schema
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Version, new PropertyValue(TemplateConstants.NonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.Definitions, new PropertyValue(TemplateConstants.Definitions));
mappingDefinition.Properties.Add(TemplateConstants.Version, new PropertyValue(new StringToken(null, null, null, TemplateConstants.NonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.Definitions, new PropertyValue(new StringToken(null, null, null, TemplateConstants.Definitions)));
schema.Definitions.Add(TemplateConstants.TemplateSchema, mappingDefinition);
// definitions
@@ -335,9 +335,9 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
// null-definition
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(TemplateConstants.String));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(TemplateConstants.SequenceOfNonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.Null, new PropertyValue(TemplateConstants.NullDefinitionProperties));
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(new StringToken(null, null, null, TemplateConstants.String)));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceOfNonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.Null, new PropertyValue(new StringToken(null, null, null, TemplateConstants.NullDefinitionProperties)));
schema.Definitions.Add(TemplateConstants.NullDefinition, mappingDefinition);
// null-definition-properties
@@ -346,9 +346,9 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
// boolean-definition
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(TemplateConstants.String));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(TemplateConstants.SequenceOfNonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.Boolean, new PropertyValue(TemplateConstants.BooleanDefinitionProperties));
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(new StringToken(null, null, null, TemplateConstants.String)));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceOfNonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.Boolean, new PropertyValue(new StringToken(null, null, null, TemplateConstants.BooleanDefinitionProperties)));
schema.Definitions.Add(TemplateConstants.BooleanDefinition, mappingDefinition);
// boolean-definition-properties
@@ -357,9 +357,9 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
// number-definition
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(TemplateConstants.String));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(TemplateConstants.SequenceOfNonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.Number, new PropertyValue(TemplateConstants.NumberDefinitionProperties));
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(new StringToken(null, null, null, TemplateConstants.String)));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceOfNonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.Number, new PropertyValue(new StringToken(null, null, null, TemplateConstants.NumberDefinitionProperties)));
schema.Definitions.Add(TemplateConstants.NumberDefinition, mappingDefinition);
// number-definition-properties
@@ -368,55 +368,68 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
// string-definition
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(TemplateConstants.String));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(TemplateConstants.SequenceOfNonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.String, new PropertyValue(TemplateConstants.StringDefinitionProperties));
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(new StringToken(null, null, null, TemplateConstants.String)));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceOfNonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.String, new PropertyValue(new StringToken(null, null, null, TemplateConstants.StringDefinitionProperties)));
schema.Definitions.Add(TemplateConstants.StringDefinition, mappingDefinition);
// string-definition-properties
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Constant, new PropertyValue(TemplateConstants.NonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.IgnoreCase, new PropertyValue(TemplateConstants.Boolean));
mappingDefinition.Properties.Add(TemplateConstants.RequireNonEmpty, new PropertyValue(TemplateConstants.Boolean));
mappingDefinition.Properties.Add(TemplateConstants.Constant, new PropertyValue(new StringToken(null, null, null, TemplateConstants.NonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.IgnoreCase, new PropertyValue(new StringToken(null, null, null, TemplateConstants.Boolean)));
mappingDefinition.Properties.Add(TemplateConstants.RequireNonEmpty, new PropertyValue(new StringToken(null, null, null, TemplateConstants.Boolean)));
schema.Definitions.Add(TemplateConstants.StringDefinitionProperties, mappingDefinition);
// sequence-definition
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(TemplateConstants.String));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(TemplateConstants.SequenceOfNonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.Sequence, new PropertyValue(TemplateConstants.SequenceDefinitionProperties));
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(new StringToken(null, null, null, TemplateConstants.String)));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceOfNonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.Sequence, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceDefinitionProperties)));
schema.Definitions.Add(TemplateConstants.SequenceDefinition, mappingDefinition);
// sequence-definition-properties
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.ItemType, new PropertyValue(TemplateConstants.NonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.ItemType, new PropertyValue(new StringToken(null, null, null, TemplateConstants.NonEmptyString)));
schema.Definitions.Add(TemplateConstants.SequenceDefinitionProperties, mappingDefinition);
// mapping-definition
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(TemplateConstants.String));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(TemplateConstants.SequenceOfNonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.Mapping, new PropertyValue(TemplateConstants.MappingDefinitionProperties));
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(new StringToken(null, null, null, TemplateConstants.String)));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceOfNonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.Mapping, new PropertyValue(new StringToken(null, null, null, TemplateConstants.MappingDefinitionProperties)));
schema.Definitions.Add(TemplateConstants.MappingDefinition, mappingDefinition);
// mapping-definition-properties
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Properties, new PropertyValue(TemplateConstants.Properties));
mappingDefinition.Properties.Add(TemplateConstants.LooseKeyType, new PropertyValue(TemplateConstants.NonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.LooseValueType, new PropertyValue(TemplateConstants.NonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.Properties, new PropertyValue(new StringToken(null, null, null, TemplateConstants.Properties)));
mappingDefinition.Properties.Add(TemplateConstants.LooseKeyType, new PropertyValue(new StringToken(null, null, null, TemplateConstants.NonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.LooseValueType, new PropertyValue(new StringToken(null, null, null, TemplateConstants.NonEmptyString)));
schema.Definitions.Add(TemplateConstants.MappingDefinitionProperties, mappingDefinition);
// properties
mappingDefinition = new MappingDefinition();
mappingDefinition.LooseKeyType = TemplateConstants.NonEmptyString;
mappingDefinition.LooseValueType = TemplateConstants.NonEmptyString;
mappingDefinition.LooseValueType = TemplateConstants.PropertyValue;
schema.Definitions.Add(TemplateConstants.Properties, mappingDefinition);
// property-value
oneOfDefinition = new OneOfDefinition();
oneOfDefinition.OneOf.Add(TemplateConstants.NonEmptyString);
oneOfDefinition.OneOf.Add(TemplateConstants.MappingPropertyValue);
schema.Definitions.Add(TemplateConstants.PropertyValue, oneOfDefinition);
// mapping-property-value
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Type, new PropertyValue(new StringToken(null, null, null, TemplateConstants.NonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.Required, new PropertyValue(new StringToken(null, null, null, TemplateConstants.Boolean)));
schema.Definitions.Add(TemplateConstants.MappingPropertyValue, mappingDefinition);
// one-of-definition
mappingDefinition = new MappingDefinition();
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(TemplateConstants.String));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(TemplateConstants.SequenceOfNonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.OneOf, new PropertyValue(TemplateConstants.SequenceOfNonEmptyString));
mappingDefinition.Properties.Add(TemplateConstants.Description, new PropertyValue(new StringToken(null, null, null, TemplateConstants.String)));
mappingDefinition.Properties.Add(TemplateConstants.Context, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceOfNonEmptyString)));
mappingDefinition.Properties.Add(TemplateConstants.OneOf, new PropertyValue(new StringToken(null, null, null, TemplateConstants.SequenceOfNonEmptyString)));
schema.Definitions.Add(TemplateConstants.OneOfDefinition, mappingDefinition);
// non-empty-string
@@ -477,4 +490,4 @@ namespace GitHub.DistributedTask.ObjectTemplating.Schema
private static readonly Regex s_definitionNameRegex = new Regex("^[a-zA-Z_][a-zA-Z0-9_-]*$", RegexOptions.Compiled);
private static TemplateSchema s_schema;
}
}
}

View File

@@ -22,9 +22,11 @@ namespace GitHub.DistributedTask.ObjectTemplating
internal const String ItemType = "item-type";
internal const String LooseKeyType = "loose-key-type";
internal const String LooseValueType = "loose-value-type";
internal const String MaxConstant = "MAX";
internal const String Mapping = "mapping";
internal const String MappingDefinition = "mapping-definition";
internal const String MappingDefinitionProperties = "mapping-definition-properties";
internal const String MappingPropertyValue = "mapping-property-value";
internal const String NonEmptyString = "non-empty-string";
internal const String Null = "null";
internal const String NullDefinition = "null-definition";
@@ -35,7 +37,9 @@ namespace GitHub.DistributedTask.ObjectTemplating
internal const String OneOf = "one-of";
internal const String OneOfDefinition = "one-of-definition";
internal const String OpenExpression = "${{";
internal const String PropertyValue = "property-value";
internal const String Properties = "properties";
internal const String Required = "required";
internal const String RequireNonEmpty = "require-non-empty";
internal const String Scalar = "scalar";
internal const String ScalarDefinition = "scalar-definition";
@@ -43,6 +47,7 @@ namespace GitHub.DistributedTask.ObjectTemplating
internal const String Sequence = "sequence";
internal const String SequenceDefinition = "sequence-definition";
internal const String SequenceDefinitionProperties = "sequence-definition-properties";
internal const String Type = "type";
internal const String SequenceOfNonEmptyString = "sequence-of-non-empty-string";
internal const String String = "string";
internal const String StringDefinition = "string-definition";

View File

@@ -184,6 +184,7 @@ namespace GitHub.DistributedTask.ObjectTemplating
id = FileIds.Count + 1;
FileIds.Add(file, id);
FileNames.Add(file);
Memory.AddBytes(file);
}
return id;
@@ -191,7 +192,12 @@ namespace GitHub.DistributedTask.ObjectTemplating
internal String GetFileName(Int32 fileId)
{
return FileNames[fileId - 1];
return FileNames.Count >= fileId ? FileNames[fileId - 1] : null;
}
internal IReadOnlyList<String> GetFileTable()
{
return FileNames.AsReadOnly();
}
private String GetErrorPrefix(
@@ -199,9 +205,9 @@ namespace GitHub.DistributedTask.ObjectTemplating
Int32? line,
Int32? column)
{
if (fileId != null)
var fileName = fileId.HasValue ? GetFileName(fileId.Value) : null;
if (!String.IsNullOrEmpty(fileName))
{
var fileName = GetFileName(fileId.Value);
if (line != null && column != null)
{
return $"{fileName} {TemplateStrings.LineColumn(line, column)}:";
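A small sketch of the defensive lookup this enables, as used from inside the assembly on an existing TemplateContext (the workflow path is hypothetical):

    var fileId = context.GetFileId(".github/workflows/ci.yml"); // registers the file and charges its bytes to Memory
    var known = context.GetFileName(fileId);                    // ".github/workflows/ci.yml"
    var unknown = context.GetFileName(fileId + 100);            // now null instead of throwing
    IReadOnlyList<string> table = context.GetFileTable();       // read-only view used to carry the table forward (see the job message changes below)

GetErrorPrefix then simply omits the file prefix whenever the name cannot be resolved.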

View File

@@ -47,7 +47,7 @@ namespace GitHub.DistributedTask.ObjectTemplating
var evaluator = new TemplateEvaluator(context, template, removeBytes);
try
{
var availableContext = new HashSet<String>(context.ExpressionValues.Keys);
var availableContext = new HashSet<String>(context.ExpressionValues.Keys.Concat(context.ExpressionFunctions.Select(x => $"{x.Name}({x.MinParameters},{x.MaxParameters})")));
var definitionInfo = new DefinitionInfo(context.Schema, type, availableContext);
result = evaluator.Evaluate(definitionInfo);
@@ -182,12 +182,14 @@ namespace GitHub.DistributedTask.ObjectTemplating
}
var keys = new HashSet<String>(StringComparer.OrdinalIgnoreCase);
var hasExpressionKey = false;
while (m_unraveler.AllowScalar(definition.Expand, out ScalarToken nextKeyScalar))
{
// Expression
if (nextKeyScalar is ExpressionToken)
{
hasExpressionKey = true;
var anyDefinition = new DefinitionInfo(definition, TemplateConstants.Any);
mapping.Add(nextKeyScalar, Evaluate(anyDefinition));
continue;
@@ -268,6 +270,19 @@ namespace GitHub.DistributedTask.ObjectTemplating
String listToDeDuplicate = String.Join(", ", nonDuplicates);
m_context.Error(mapping, TemplateStrings.UnableToDetermineOneOf(listToDeDuplicate));
}
else if (mappingDefinitions.Count == 1 && !hasExpressionKey)
{
foreach (var property in mappingDefinitions[0].Properties)
{
if (property.Value.Required)
{
if (!keys.Contains(property.Key))
{
m_context.Error(mapping, $"Required property is missing: {property.Key}");
}
}
}
}
m_unraveler.ReadMappingEnd();
}

View File

@@ -178,14 +178,15 @@ namespace GitHub.DistributedTask.ObjectTemplating
}
var keys = new HashSet<String>(StringComparer.OrdinalIgnoreCase);
var hasExpressionKey = false;
while (m_objectReader.AllowLiteral(out LiteralToken rawLiteral))
{
var nextKeyScalar = ParseScalar(rawLiteral, definition.AllowedContext);
// Expression
if (nextKeyScalar is ExpressionToken)
{
hasExpressionKey = true;
// Legal
if (definition.AllowedContext.Length > 0)
{
@@ -280,7 +281,19 @@ namespace GitHub.DistributedTask.ObjectTemplating
String listToDeDuplicate = String.Join(", ", nonDuplicates);
m_context.Error(mapping, TemplateStrings.UnableToDetermineOneOf(listToDeDuplicate));
}
else if (mappingDefinitions.Count == 1 && !hasExpressionKey)
{
foreach (var property in mappingDefinitions[0].Properties)
{
if (property.Value.Required)
{
if (!keys.Contains(property.Key))
{
m_context.Error(mapping, $"Required property is missing: {property.Key}");
}
}
}
}
ExpectMappingEnd();
}

View File

@@ -30,14 +30,14 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
Column = column;
}
[IgnoreDataMember]
internal Int32? FileId { get; set; }
[DataMember(Name = "file", EmitDefaultValue = false)]
internal Int32? FileId { get; private set; }
[DataMember(Name = "line", EmitDefaultValue = false)]
internal Int32? Line { get; }
internal Int32? Line { get; private set; }
[DataMember(Name = "col", EmitDefaultValue = false)]
internal Int32? Column { get; }
internal Int32? Column { get; private set; }
[DataMember(Name = "type", EmitDefaultValue = false)]
internal Int32 Type { get; }

View File

@@ -115,13 +115,12 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
Object value,
JsonSerializer serializer)
{
base.WriteJson(writer, value, serializer);
if (value is TemplateToken token)
{
switch (token.Type)
{
case TokenType.Null:
if (token.Line == null && token.Column == null)
if (token.FileId == null && token.Line == null && token.Column == null)
{
writer.WriteNull();
}
@@ -130,12 +129,17 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
writer.WriteStartObject();
writer.WritePropertyName("type");
writer.WriteValue(token.Type);
if (token.FileId != null)
{
writer.WritePropertyName("file");
writer.WriteValue(token.FileId);
}
if (token.Line != null)
{
writer.WritePropertyName("line");
writer.WriteValue(token.Line);
}
if (token.Line != null)
if (token.Column != null)
{
writer.WritePropertyName("col");
writer.WriteValue(token.Column);
@@ -146,7 +150,7 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
case TokenType.Boolean:
var booleanToken = token as BooleanToken;
if (token.Line == null && token.Column == null)
if (token.FileId == null && token.Line == null && token.Column == null)
{
writer.WriteValue(booleanToken.Value);
}
@@ -155,12 +159,17 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
writer.WriteStartObject();
writer.WritePropertyName("type");
writer.WriteValue(token.Type);
if (token.FileId != null)
{
writer.WritePropertyName("file");
writer.WriteValue(token.FileId);
}
if (token.Line != null)
{
writer.WritePropertyName("line");
writer.WriteValue(token.Line);
}
if (token.Line != null)
if (token.Column != null)
{
writer.WritePropertyName("col");
writer.WriteValue(token.Column);
@@ -173,7 +182,7 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
case TokenType.Number:
var numberToken = token as NumberToken;
if (token.Line == null && token.Column == null)
if (token.FileId == null && token.Line == null && token.Column == null)
{
writer.WriteValue(numberToken.Value);
}
@@ -182,12 +191,17 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
writer.WriteStartObject();
writer.WritePropertyName("type");
writer.WriteValue(token.Type);
if (token.FileId != null)
{
writer.WritePropertyName("file");
writer.WriteValue(token.FileId);
}
if (token.Line != null)
{
writer.WritePropertyName("line");
writer.WriteValue(token.Line);
}
if (token.Line != null)
if (token.Column != null)
{
writer.WritePropertyName("col");
writer.WriteValue(token.Column);
@@ -200,7 +214,7 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
case TokenType.String:
var stringToken = token as StringToken;
if (token.Line == null && token.Column == null)
if (token.FileId == null && token.Line == null && token.Column == null)
{
writer.WriteValue(stringToken.Value);
}
@@ -209,12 +223,17 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
writer.WriteStartObject();
writer.WritePropertyName("type");
writer.WriteValue(token.Type);
if (token.FileId != null)
{
writer.WritePropertyName("file");
writer.WriteValue(token.FileId);
}
if (token.Line != null)
{
writer.WritePropertyName("line");
writer.WriteValue(token.Line);
}
if (token.Line != null)
if (token.Column != null)
{
writer.WritePropertyName("col");
writer.WriteValue(token.Column);
@@ -230,12 +249,17 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
writer.WriteStartObject();
writer.WritePropertyName("type");
writer.WriteValue(token.Type);
if (token.FileId != null)
{
writer.WritePropertyName("file");
writer.WriteValue(token.FileId);
}
if (token.Line != null)
{
writer.WritePropertyName("line");
writer.WriteValue(token.Line);
}
if (token.Line != null)
if (token.Column != null)
{
writer.WritePropertyName("col");
writer.WriteValue(token.Column);
@@ -253,12 +277,17 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
writer.WriteStartObject();
writer.WritePropertyName("type");
writer.WriteValue(token.Type);
if (token.FileId != null)
{
writer.WritePropertyName("file");
writer.WriteValue(token.FileId);
}
if (token.Line != null)
{
writer.WritePropertyName("line");
writer.WriteValue(token.Line);
}
if (token.Line != null)
if (token.Column != null)
{
writer.WritePropertyName("col");
writer.WriteValue(token.Column);
@@ -273,12 +302,17 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
writer.WriteStartObject();
writer.WritePropertyName("type");
writer.WriteValue(token.Type);
if (token.FileId != null)
{
writer.WritePropertyName("file");
writer.WriteValue(token.FileId);
}
if (token.Line != null)
{
writer.WritePropertyName("line");
writer.WriteValue(token.Line);
}
if (token.Line != null)
if (token.Column != null)
{
writer.WritePropertyName("col");
writer.WriteValue(token.Column);
@@ -301,12 +335,17 @@ namespace GitHub.DistributedTask.ObjectTemplating.Tokens
writer.WriteStartObject();
writer.WritePropertyName("type");
writer.WriteValue(token.Type);
if (token.FileId != null)
{
writer.WritePropertyName("file");
writer.WriteValue(token.FileId);
}
if (token.Line != null)
{
writer.WritePropertyName("line");
writer.WriteValue(token.Line);
}
if (token.Line != null)
if (token.Column != null)
{
writer.WritePropertyName("col");
writer.WriteValue(token.Column);
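Because FileId is now a serialized data member, origin information survives the round trip. An illustrative sketch (assumes this converter is applied to TemplateToken, consistent with the rest of the assembly):

    using Newtonsoft.Json;
    using GitHub.DistributedTask.ObjectTemplating.Tokens;

    var plain = JsonConvert.SerializeObject(new StringToken(null, null, null, "ubuntu-latest"));
    // "ubuntu-latest" — no file/line/col, so the token collapses to a bare JSON value

    var located = JsonConvert.SerializeObject(new StringToken(1, 12, 3, "ubuntu-latest"));
    // an object carrying "type", "file": 1, "line": 12, "col": 3 plus the string value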

View File

@@ -39,7 +39,10 @@ namespace GitHub.DistributedTask.Pipelines
DictionaryContextData contextData,
WorkspaceOptions workspaceOptions,
IEnumerable<JobStep> steps,
IEnumerable<ContextScope> scopes)
IEnumerable<ContextScope> scopes,
IList<String> fileTable,
TemplateToken jobOutputs,
IList<TemplateToken> defaults)
{
this.MessageType = JobRequestMessageTypes.PipelineAgentJobRequest;
this.Plan = plan;
@@ -51,6 +54,7 @@ namespace GitHub.DistributedTask.Pipelines
this.Timeline = timeline;
this.Resources = jobResources;
this.Workspace = workspaceOptions;
this.JobOutputs = jobOutputs;
m_variables = new Dictionary<String, VariableValue>(variables, StringComparer.OrdinalIgnoreCase);
m_maskHints = new List<MaskHint>(maskHints);
@@ -66,6 +70,11 @@ namespace GitHub.DistributedTask.Pipelines
m_environmentVariables = new List<TemplateToken>(environmentVariables);
}
if (defaults?.Count > 0)
{
m_defaults = new List<TemplateToken>(defaults);
}
this.ContextData = new Dictionary<String, PipelineContextData>(StringComparer.OrdinalIgnoreCase);
if (contextData?.Count > 0)
{
@@ -74,6 +83,11 @@ namespace GitHub.DistributedTask.Pipelines
this.ContextData[pair.Key] = pair.Value;
}
}
if (fileTable?.Count > 0)
{
m_fileTable = new List<String>(fileTable);
}
}
[DataMember]
@@ -132,6 +146,13 @@ namespace GitHub.DistributedTask.Pipelines
private set;
}
[DataMember(EmitDefaultValue = false)]
public TemplateToken JobOutputs
{
get;
private set;
}
[DataMember]
public Int64 RequestId
{
@@ -198,6 +219,21 @@ namespace GitHub.DistributedTask.Pipelines
}
}
/// <summary>
/// Gets the hierarchy of defaults to overlay, last wins.
/// </summary>
public IList<TemplateToken> Defaults
{
get
{
if (m_defaults == null)
{
m_defaults = new List<TemplateToken>();
}
return m_defaults;
}
}
/// <summary>
/// Gets the collection of variables associated with the current context.
/// </summary>
@@ -237,6 +273,21 @@ namespace GitHub.DistributedTask.Pipelines
}
}
/// <summary>
/// Gets the table of files used when parsing the pipeline (e.g. yaml files)
/// </summary>
public IList<String> FileTable
{
get
{
if (m_fileTable == null)
{
m_fileTable = new List<String>();
}
return m_fileTable;
}
}
// todo: remove after feature-flag DistributedTask.EvaluateContainerOnRunner is enabled everywhere
public void SetJobSidecarContainers(IDictionary<String, String> value)
{
@@ -345,6 +396,16 @@ namespace GitHub.DistributedTask.Pipelines
m_environmentVariables = null;
}
if (m_defaults?.Count == 0)
{
m_defaults = null;
}
if (m_fileTable?.Count == 0)
{
m_fileTable = null;
}
if (m_maskHints?.Count == 0)
{
m_maskHints = null;
@@ -374,6 +435,12 @@ namespace GitHub.DistributedTask.Pipelines
[DataMember(Name = "EnvironmentVariables", EmitDefaultValue = false)]
private List<TemplateToken> m_environmentVariables;
[DataMember(Name = "Defaults", EmitDefaultValue = false)]
private List<TemplateToken> m_defaults;
[DataMember(Name = "FileTable", EmitDefaultValue = false)]
private List<String> m_fileTable;
[DataMember(Name = "Mask", EmitDefaultValue = false)]
private List<MaskHint> m_maskHints;
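The updated test fixtures later in this diff pass the three new trailing arguments as nulls; a sketch with them populated, where jobOutputsToken and workflowDefaults are hypothetical TemplateTokens produced by the workflow parser (usings as in the listener tests):

    var plan = new TaskOrchestrationPlanReference();
    TimelineReference timeline = null;
    var jobId = Guid.NewGuid();
    var message = new Pipelines.AgentJobRequestMessage(
        plan, timeline, jobId, "build", "build", null, null, null,
        new Dictionary<string, VariableValue>(), new List<MaskHint>(),
        new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(),
        new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(),
        null,                                             // scopes
        new List<string> { ".github/workflows/ci.yml" },  // FileTable: file names surfaced in error prefixes
        jobOutputsToken,                                   // JobOutputs: evaluated against the job's contexts
        new List<TemplateToken> { workflowDefaults });     // Defaults: hierarchy to overlay, last wins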

View File

@@ -14,6 +14,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String Clean = "clean";
public const String Container = "container";
public const String ContinueOnError = "continue-on-error";
public const String Defaults = "defaults";
public const String Env = "env";
public const String Event = "event";
public const String EventPattern = "github.event";
@@ -29,7 +30,10 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
public const String Include = "include";
public const String Inputs = "inputs";
public const String Job = "job";
public const String JobDefaultsRun = "job-defaults-run";
public const String JobOutputs = "job-outputs";
public const String Jobs = "jobs";
public const String Labels = "labels";
public const String Lfs = "lfs";
public const String Matrix = "matrix";
public const String MaxParallel = "max-parallel";

View File

@@ -19,7 +19,8 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
{
public PipelineTemplateEvaluator(
ITraceWriter trace,
TemplateSchema schema)
TemplateSchema schema,
IList<String> fileTable)
{
if (!String.Equals(schema.Version, PipelineTemplateConstants.Workflow_1_0, StringComparison.Ordinal))
{
@@ -28,6 +29,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
m_trace = trace;
m_schema = schema;
m_fileTable = fileTable;
}
public Int32 MaxDepth => 50;
@@ -229,6 +231,78 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
return result;
}
public Dictionary<String, String> EvaluateJobOutput(
TemplateToken token,
DictionaryContextData contextData)
{
var result = default(Dictionary<String, String>);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(contextData);
try
{
token = TemplateEvaluator.Evaluate(context, PipelineTemplateConstants.JobOutputs, token, 0, null, omitHeader: true);
context.Errors.Check();
result = new Dictionary<String, String>(StringComparer.OrdinalIgnoreCase);
var mapping = token.AssertMapping("outputs");
foreach (var pair in mapping)
{
// Literal key
var key = pair.Key.AssertString("output key");
// Literal value
var value = pair.Value.AssertString("output value");
result[key.Value] = value.Value;
}
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result;
}
public Dictionary<String, String> EvaluateJobDefaultsRun(
TemplateToken token,
DictionaryContextData contextData)
{
var result = default(Dictionary<String, String>);
if (token != null && token.Type != TokenType.Null)
{
var context = CreateContext(contextData);
try
{
token = TemplateEvaluator.Evaluate(context, PipelineTemplateConstants.JobDefaultsRun, token, 0, null, omitHeader: true);
context.Errors.Check();
result = new Dictionary<String, String>(StringComparer.OrdinalIgnoreCase);
var mapping = token.AssertMapping("defaults run");
foreach (var pair in mapping)
{
// Literal key
var key = pair.Key.AssertString("defaults run key");
// Literal value
var value = pair.Value.AssertString("defaults run value");
result[key.Value] = value.Value;
}
}
catch (Exception ex) when (!(ex is TemplateValidationException))
{
context.Errors.Add(ex);
}
context.Errors.Check();
}
return result;
}
public IList<KeyValuePair<String, JobContainer>> EvaluateJobServiceContainers(
TemplateToken token,
DictionaryContextData contextData)
@@ -324,6 +398,16 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
TraceWriter = m_trace,
};
// Add the file table
if (m_fileTable?.Count > 0)
{
foreach (var file in m_fileTable)
{
result.GetFileId(file);
}
}
// Add named context
if (contextData != null)
{
foreach (var pair in contextData)
@@ -346,11 +430,13 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
private readonly ITraceWriter m_trace;
private readonly TemplateSchema m_schema;
private readonly IList<String> m_fileTable;
private readonly String[] s_contextNames = new[]
{
PipelineTemplateConstants.GitHub,
PipelineTemplateConstants.Strategy,
PipelineTemplateConstants.Matrix,
PipelineTemplateConstants.Needs,
PipelineTemplateConstants.Secrets,
PipelineTemplateConstants.Steps,
PipelineTemplateConstants.Inputs,
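A sketch of how the worker can wire these pieces together (trace, schema and contextData come from the existing evaluation path; jobDefaultsRunToken is hypothetical):

    var evaluator = new PipelineTemplateEvaluator(trace, schema, message.FileTable);

    // Job outputs: literal keys and values after expression expansion, consumed by dependent jobs via the needs context.
    Dictionary<string, string> outputs = evaluator.EvaluateJobOutput(message.JobOutputs, contextData);

    // defaults.run at the workflow or job level: the schema currently allows "shell" and "working-directory".
    Dictionary<string, string> defaultsRun = evaluator.EvaluateJobDefaultsRun(jobDefaultsRunToken, contextData);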

View File

@@ -0,0 +1,572 @@
using System;
using System.Globalization;
using System.IO;
using System.Linq;
using GitHub.DistributedTask.ObjectTemplating;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using YamlDotNet.Core;
using YamlDotNet.Core.Events;
namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
{
/// <summary>
/// Converts a YAML file into a TemplateToken
/// </summary>
public sealed class YamlObjectReader : IObjectReader
{
internal YamlObjectReader(
Int32? fileId,
TextReader input)
{
m_fileId = fileId;
m_parser = new Parser(input);
}
public Boolean AllowLiteral(out LiteralToken value)
{
if (EvaluateCurrent() is Scalar scalar)
{
// Tag specified
if (!String.IsNullOrEmpty(scalar.Tag))
{
// String tag
if (String.Equals(scalar.Tag, c_stringTag, StringComparison.Ordinal))
{
value = new StringToken(m_fileId, scalar.Start.Line, scalar.Start.Column, scalar.Value);
MoveNext();
return true;
}
// Not plain style
if (scalar.Style != ScalarStyle.Plain)
{
throw new NotSupportedException($"The scalar style '{scalar.Style}' on line {scalar.Start.Line} and column {scalar.Start.Column} is not valid with the tag '{scalar.Tag}'");
}
// Boolean, Float, Integer, or Null
switch (scalar.Tag)
{
case c_booleanTag:
value = ParseBoolean(scalar);
break;
case c_floatTag:
value = ParseFloat(scalar);
break;
case c_integerTag:
value = ParseInteger(scalar);
break;
case c_nullTag:
value = ParseNull(scalar);
break;
default:
throw new NotSupportedException($"Unexpected tag '{scalar.Tag}'");
}
MoveNext();
return true;
}
// Plain style, determine type using YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
if (scalar.Style == ScalarStyle.Plain)
{
if (MatchNull(scalar, out var nullToken))
{
value = nullToken;
}
else if (MatchBoolean(scalar, out var booleanToken))
{
value = booleanToken;
}
else if (MatchInteger(scalar, out var numberToken) ||
MatchFloat(scalar, out numberToken))
{
value = numberToken;
}
else
{
value = new StringToken(m_fileId, scalar.Start.Line, scalar.Start.Column, scalar.Value);
}
MoveNext();
return true;
}
// Otherwise assume string
value = new StringToken(m_fileId, scalar.Start.Line, scalar.Start.Column, scalar.Value);
MoveNext();
return true;
}
value = default;
return false;
}
public Boolean AllowSequenceStart(out SequenceToken value)
{
if (EvaluateCurrent() is SequenceStart sequenceStart)
{
value = new SequenceToken(m_fileId, sequenceStart.Start.Line, sequenceStart.Start.Column);
MoveNext();
return true;
}
value = default;
return false;
}
public Boolean AllowSequenceEnd()
{
if (EvaluateCurrent() is SequenceEnd)
{
MoveNext();
return true;
}
return false;
}
public Boolean AllowMappingStart(out MappingToken value)
{
if (EvaluateCurrent() is MappingStart mappingStart)
{
value = new MappingToken(m_fileId, mappingStart.Start.Line, mappingStart.Start.Column);
MoveNext();
return true;
}
value = default;
return false;
}
public Boolean AllowMappingEnd()
{
if (EvaluateCurrent() is MappingEnd)
{
MoveNext();
return true;
}
return false;
}
/// <summary>
/// Consumes the last parsing events, which are expected to be DocumentEnd and StreamEnd.
/// </summary>
public void ValidateEnd()
{
if (EvaluateCurrent() is DocumentEnd)
{
MoveNext();
}
else
{
throw new InvalidOperationException("Expected document end parse event");
}
if (EvaluateCurrent() is StreamEnd)
{
MoveNext();
}
else
{
throw new InvalidOperationException("Expected stream end parse event");
}
if (MoveNext())
{
throw new InvalidOperationException("Expected end of parse events");
}
}
/// <summary>
/// Consumes the first parsing events, which are expected to be StreamStart and DocumentStart.
/// </summary>
public void ValidateStart()
{
if (EvaluateCurrent() != null)
{
throw new InvalidOperationException("Unexpected parser state");
}
if (!MoveNext())
{
throw new InvalidOperationException("Expected a parse event");
}
if (EvaluateCurrent() is StreamStart)
{
MoveNext();
}
else
{
throw new InvalidOperationException("Expected stream start parse event");
}
if (EvaluateCurrent() is DocumentStart)
{
MoveNext();
}
else
{
throw new InvalidOperationException("Expected document start parse event");
}
}
private ParsingEvent EvaluateCurrent()
{
if (m_current == null)
{
m_current = m_parser.Current;
if (m_current != null)
{
if (m_current is Scalar scalar)
{
// Verify not using anchors
if (scalar.Anchor != null)
{
throw new InvalidOperationException($"Anchors are not currently supported. Remove the anchor '{scalar.Anchor}'");
}
}
else if (m_current is MappingStart mappingStart)
{
// Verify not using anchors
if (mappingStart.Anchor != null)
{
throw new InvalidOperationException($"Anchors are not currently supported. Remove the anchor '{mappingStart.Anchor}'");
}
}
else if (m_current is SequenceStart sequenceStart)
{
// Verify not using anchors
if (sequenceStart.Anchor != null)
{
throw new InvalidOperationException($"Anchors are not currently supported. Remove the anchor '{sequenceStart.Anchor}'");
}
}
else if (!(m_current is MappingEnd) &&
!(m_current is SequenceEnd) &&
!(m_current is DocumentStart) &&
!(m_current is DocumentEnd) &&
!(m_current is StreamStart) &&
!(m_current is StreamEnd))
{
throw new InvalidOperationException($"Unexpected parsing event type: {m_current.GetType().Name}");
}
}
}
return m_current;
}
private Boolean MoveNext()
{
m_current = null;
return m_parser.MoveNext();
}
private BooleanToken ParseBoolean(Scalar scalar)
{
if (MatchBoolean(scalar, out var token))
{
return token;
}
ThrowInvalidValue(scalar, c_booleanTag); // throws
return default;
}
private NumberToken ParseFloat(Scalar scalar)
{
if (MatchFloat(scalar, out var token))
{
return token;
}
ThrowInvalidValue(scalar, c_floatTag); // throws
return default;
}
private NumberToken ParseInteger(Scalar scalar)
{
if (MatchInteger(scalar, out var token))
{
return token;
}
ThrowInvalidValue(scalar, c_integerTag); // throws
return default;
}
private NullToken ParseNull(Scalar scalar)
{
if (MatchNull(scalar, out var token))
{
return token;
}
ThrowInvalidValue(scalar, c_nullTag); // throws
return default;
}
private Boolean MatchBoolean(
Scalar scalar,
out BooleanToken value)
{
// YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
switch (scalar.Value ?? String.Empty)
{
case "true":
case "True":
case "TRUE":
value = new BooleanToken(m_fileId, scalar.Start.Line, scalar.Start.Column, true);
return true;
case "false":
case "False":
case "FALSE":
value = new BooleanToken(m_fileId, scalar.Start.Line, scalar.Start.Column, false);
return true;
}
value = default;
return false;
}
private Boolean MatchFloat(
Scalar scalar,
out NumberToken value)
{
// YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
var str = scalar.Value;
if (!String.IsNullOrEmpty(str))
{
// Check for [-+]?(\.inf|\.Inf|\.INF)|\.nan|\.NaN|\.NAN
switch (str)
{
case ".inf":
case ".Inf":
case ".INF":
case "+.inf":
case "+.Inf":
case "+.INF":
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, Double.PositiveInfinity);
return true;
case "-.inf":
case "-.Inf":
case "-.INF":
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, Double.NegativeInfinity);
return true;
case ".nan":
case ".NaN":
case ".NAN":
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, Double.NaN);
return true;
}
// Otherwise check [-+]?(\.[0-9]+|[0-9]+(\.[0-9]*)?)([eE][-+]?[0-9]+)?
// Skip leading sign
var index = str[0] == '-' || str[0] == '+' ? 1 : 0;
// Check for integer portion
var length = str.Length;
var hasInteger = false;
while (index < length && str[index] >= '0' && str[index] <= '9')
{
hasInteger = true;
index++;
}
// Check for decimal point
var hasDot = false;
if (index < length && str[index] == '.')
{
hasDot = true;
index++;
}
// Check for decimal portion
var hasDecimal = false;
while (index < length && str[index] >= '0' && str[index] <= '9')
{
hasDecimal = true;
index++;
}
// Check [-+]?(\.[0-9]+|[0-9]+(\.[0-9]*)?)
if ((hasDot && hasDecimal) || hasInteger)
{
// Check for end
if (index == length)
{
// Try parse
if (Double.TryParse(str, NumberStyles.AllowLeadingSign | NumberStyles.AllowDecimalPoint, CultureInfo.InvariantCulture, out var doubleValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, doubleValue);
return true;
}
// Otherwise exceeds range
else
{
ThrowInvalidValue(scalar, c_floatTag); // throws
}
}
// Check [eE][-+]?[0-9]
else if (index < length && (str[index] == 'e' || str[index] == 'E'))
{
index++;
// Skip sign
if (index < length && (str[index] == '-' || str[index] == '+'))
{
index++;
}
// Check for exponent
var hasExponent = false;
while (index < length && str[index] >= '0' && str[index] <= '9')
{
hasExponent = true;
index++;
}
// Check for end
if (hasExponent && index == length)
{
// Try parse
if (Double.TryParse(str, NumberStyles.AllowLeadingSign | NumberStyles.AllowDecimalPoint | NumberStyles.AllowExponent, CultureInfo.InvariantCulture, out var doubleValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, (Double)doubleValue);
return true;
}
// Otherwise exceeds range
else
{
ThrowInvalidValue(scalar, c_floatTag); // throws
}
}
}
}
}
value = default;
return false;
}
private Boolean MatchInteger(
Scalar scalar,
out NumberToken value)
{
// YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
var str = scalar.Value;
if (!String.IsNullOrEmpty(str))
{
// Check for [0-9]+
var firstChar = str[0];
if (firstChar >= '0' && firstChar <= '9' &&
str.Skip(1).All(x => x >= '0' && x <= '9'))
{
// Try parse
if (Double.TryParse(str, NumberStyles.None, CultureInfo.InvariantCulture, out var doubleValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, doubleValue);
return true;
}
// Otherwise exceeds range
ThrowInvalidValue(scalar, c_integerTag); // throws
}
// Check for (-|+)[0-9]+
else if ((firstChar == '-' || firstChar == '+') &&
str.Length > 1 &&
str.Skip(1).All(x => x >= '0' && x <= '9'))
{
// Try parse
if (Double.TryParse(str, NumberStyles.AllowLeadingSign, CultureInfo.InvariantCulture, out var doubleValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, doubleValue);
return true;
}
// Otherwise exceeds range
ThrowInvalidValue(scalar, c_integerTag); // throws
}
// Check for 0x[0-9a-fA-F]+
else if (firstChar == '0' &&
str.Length > 2 &&
str[1] == 'x' &&
str.Skip(2).All(x => (x >= '0' && x <= '9') || (x >= 'a' && x <= 'f') || (x >= 'A' && x <= 'F')))
{
// Try parse
if (Int32.TryParse(str.Substring(2), NumberStyles.AllowHexSpecifier, CultureInfo.InvariantCulture, out var integerValue))
{
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, integerValue);
return true;
}
// Otherwise exceeds range
ThrowInvalidValue(scalar, c_integerTag); // throws
}
// Check for 0o[0-9]+
else if (firstChar == '0' &&
str.Length > 2 &&
str[1] == 'o' &&
str.Skip(2).All(x => x >= '0' && x <= '7'))
{
// Try parse
var integerValue = default(Int32);
try
{
integerValue = Convert.ToInt32(str.Substring(2), 8);
}
// Otherwise exceeds range
catch (Exception)
{
ThrowInvalidValue(scalar, c_integerTag); // throws
}
value = new NumberToken(m_fileId, scalar.Start.Line, scalar.Start.Column, integerValue);
return true;
}
}
value = default;
return false;
}
private Boolean MatchNull(
Scalar scalar,
out NullToken value)
{
// YAML 1.2 "core" schema https://yaml.org/spec/1.2/spec.html#id2804923
switch (scalar.Value ?? String.Empty)
{
case "":
case "null":
case "Null":
case "NULL":
case "~":
value = new NullToken(m_fileId, scalar.Start.Line, scalar.Start.Column);
return true;
}
value = default;
return false;
}
private void ThrowInvalidValue(
Scalar scalar,
String tag)
{
throw new NotSupportedException($"The value '{scalar.Value}' on line {scalar.Start.Line} and column {scalar.Start.Column} is invalid for the type '{scalar.Tag}'");
}
private const String c_booleanTag = "tag:yaml.org,2002:bool";
private const String c_floatTag = "tag:yaml.org,2002:float";
private const String c_integerTag = "tag:yaml.org,2002:int";
private const String c_nullTag = "tag:yaml.org,2002:null";
private const String c_stringTag = "tag:yaml.org,2002:string";
private readonly Int32? m_fileId;
private readonly Parser m_parser;
private ParsingEvent m_current;
}
}
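A minimal sketch of driving the reader by hand; the type is internal (TemplateReader is its normal consumer) and fileId 1 is hypothetical:

    using (var input = new StringReader("name: CI\non: push\n"))
    {
        var reader = new YamlObjectReader(fileId: 1, input: input);
        reader.ValidateStart();                           // StreamStart + DocumentStart
        reader.AllowMappingStart(out MappingToken root);  // root mapping, carries file/line/column
        reader.AllowLiteral(out LiteralToken key);        // StringToken "name"
        reader.AllowLiteral(out LiteralToken value);      // StringToken "CI" (plain scalar, core-schema resolution)
        // ... remaining pairs, then AllowMappingEnd() and ValidateEnd()
    }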

View File

@@ -9,6 +9,7 @@
"properties": {
"on": "any",
"name": "string",
"defaults": "workflow-defaults",
"env": "workflow-env",
"jobs": "jobs"
}
@@ -38,6 +39,7 @@
"context": [
"github",
"strategy",
"needs",
"matrix",
"secrets",
"steps",
@@ -66,6 +68,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"inputs",
@@ -89,7 +92,8 @@
"context": [
"github",
"strategy",
"matrix"
"matrix",
"needs"
],
"one-of": [
"string",
@@ -112,6 +116,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"job",
@@ -121,6 +126,23 @@
"string": {}
},
"workflow-defaults": {
"mapping": {
"properties": {
"run": "workflow-defaults-run"
}
}
},
"workflow-defaults-run": {
"mapping": {
"properties": {
"shell": "non-empty-string",
"working-directory": "non-empty-string"
}
}
},
"workflow-env": {
"context": [
"github",
@@ -143,16 +165,21 @@
"mapping": {
"properties": {
"needs": "needs",
"if": "string",
"if": "job-if",
"strategy": "strategy",
"name": "string-strategy-context",
"runs-on": "runs-on",
"runs-on": {
"type": "runs-on",
"required": true
},
"timeout-minutes": "number-strategy-context",
"cancel-timeout-minutes": "number-strategy-context",
"continue-on-error": "boolean",
"continue-on-error": "boolean-strategy-context",
"container": "container",
"services": "services",
"env": "job-env",
"outputs": "job-outputs",
"defaults": "job-defaults",
"steps": "steps"
}
}
@@ -165,9 +192,22 @@
]
},
"job-if": {
"context": [
"github",
"needs",
"always(0,0)",
"failure(0,MAX)",
"cancelled(0,0)",
"success(0,MAX)"
],
"string": {}
},
"strategy": {
"context": [
"github"
"github",
"needs"
],
"mapping": {
"properties": {
@@ -233,24 +273,23 @@
"context": [
"github",
"strategy",
"matrix"
"matrix",
"needs"
],
"one-of": [
"runs-on-string",
"non-empty-string",
"sequence-of-non-empty-string",
"runs-on-mapping"
]
},
"runs-on-string": {
"string": {
"require-non-empty": true
}
},
"runs-on-mapping": {
"mapping": {
"properties": {
"pool": "non-empty-string"
"pool": {
"type": "non-empty-string",
"required": true
}
}
}
},
@@ -260,7 +299,8 @@
"github",
"secrets",
"strategy",
"matrix"
"matrix",
"needs"
],
"mapping": {
"loose-key-type": "non-empty-string",
@@ -268,6 +308,37 @@
}
},
"job-defaults": {
"mapping": {
"properties": {
"run": "job-defaults-run"
}
}
},
"job-defaults-run": {
"context": [
"github",
"strategy",
"matrix",
"needs",
"env"
],
"mapping": {
"properties": {
"shell": "non-empty-string",
"working-directory": "non-empty-string"
}
}
},
"job-outputs": {
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string-runner-context"
}
},
"steps": {
"sequence": {
"item-type": "steps-item"
@@ -301,9 +372,12 @@
"properties": {
"name": "string-steps-context",
"id": "non-empty-string",
"if": "string",
"if": "step-if",
"timeout-minutes": "number-steps-context",
"run": "string-steps-context",
"run": {
"type": "string-steps-context",
"required": true
},
"continue-on-error": "boolean-steps-context",
"env": "step-env",
"working-directory": "string-steps-context",
@@ -317,9 +391,12 @@
"properties": {
"name": "string-steps-context-in-template",
"id": "non-empty-string",
"if": "string",
"if": "step-if-in-template",
"timeout-minutes": "number-steps-context-in-template",
"run": "string-steps-context-in-template",
"run": {
"type": "string-steps-context-in-template",
"required": true
},
"continue-on-error": "boolean-steps-context-in-template",
"env": "step-env-in-template",
"working-directory": "string-steps-context-in-template",
@@ -333,10 +410,13 @@
"properties": {
"name": "string-steps-context",
"id": "non-empty-string",
"if": "string",
"if": "step-if",
"continue-on-error": "boolean-steps-context",
"timeout-minutes": "number-steps-context",
"uses": "non-empty-string",
"uses": {
"type": "non-empty-string",
"required": true
},
"with": "step-with",
"env": "step-env"
}
@@ -348,16 +428,56 @@
"properties": {
"name": "string-steps-context-in-template",
"id": "non-empty-string",
"if": "string",
"if": "step-if-in-template",
"continue-on-error": "boolean-steps-context-in-template",
"timeout-minutes": "number-steps-context-in-template",
"uses": "non-empty-string",
"uses": {
"type": "non-empty-string",
"required": true
},
"with": "step-with-in-template",
"env": "step-env-in-template"
}
}
},
"step-if": {
"context": [
"github",
"strategy",
"matrix",
"needs",
"steps",
"job",
"runner",
"env",
"always(0,0)",
"failure(0,0)",
"cancelled(0,0)",
"success(0,0)"
],
"string": {}
},
"step-if-in-template": {
"context": [
"github",
"strategy",
"matrix",
"needs",
"steps",
"inputs",
"job",
"runner",
"env",
"always(0,0)",
"failure(0,0)",
"cancelled(0,0)",
"success(0,0)"
],
"string": {}
},
"steps-template-reference": {
"mapping": {
"properties": {
@@ -383,6 +503,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"job",
@@ -400,6 +521,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"inputs",
@@ -418,6 +540,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"job",
@@ -435,6 +558,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"inputs",
@@ -453,6 +577,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"job",
@@ -469,7 +594,8 @@
"context": [
"github",
"strategy",
"matrix"
"matrix",
"needs"
],
"one-of": [
"string",
@@ -493,7 +619,8 @@
"context": [
"github",
"strategy",
"matrix"
"matrix",
"needs"
],
"mapping": {
"loose-key-type": "non-empty-string",
@@ -505,7 +632,8 @@
"context": [
"github",
"strategy",
"matrix"
"matrix",
"needs"
],
"one-of": [
"non-empty-string",
@@ -521,23 +649,24 @@
},
"step-with-in-template": {
"context": [
"github",
"strategy",
"matrix",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"context": [
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"inputs",
"job",
"runner",
"env"
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "string"
}
},
"non-empty-string": {
"string": {
"require-non-empty": true
@@ -550,11 +679,22 @@
}
},
"boolean-strategy-context": {
"context": [
"github",
"strategy",
"matrix",
"needs"
],
"boolean": {}
},
"number-strategy-context": {
"context": [
"github",
"strategy",
"matrix"
"matrix",
"needs"
],
"number": {}
},
@@ -563,7 +703,8 @@
"context": [
"github",
"strategy",
"matrix"
"matrix",
"needs"
],
"string": {}
},
@@ -573,6 +714,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"job",
@@ -587,6 +729,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"inputs",
@@ -602,6 +745,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"job",
@@ -616,6 +760,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"inputs",
@@ -626,11 +771,27 @@
"number": {}
},
"string-runner-context": {
"context": [
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"job",
"runner",
"env"
],
"string": {}
},
"string-steps-context": {
"context": [
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"job",
@@ -645,6 +806,7 @@
"github",
"strategy",
"matrix",
"needs",
"secrets",
"steps",
"inputs",

View File

@@ -31,7 +31,7 @@ namespace GitHub.DistributedTask.WebApi
}
protected JobEvent(
String name,
String name,
Guid jobId)
{
this.Name = name;
@@ -123,11 +123,12 @@ namespace GitHub.DistributedTask.WebApi
Int64 requestId,
Guid jobId,
TaskResult result,
IDictionary<String, VariableValue> outputVariables)
Dictionary<String, VariableValue> outputs)
: base(JobEventTypes.JobCompleted, jobId)
{
this.RequestId = requestId;
this.Result = result;
this.Outputs = outputs;
}
[DataMember(EmitDefaultValue = false)]
@@ -143,6 +144,13 @@ namespace GitHub.DistributedTask.WebApi
get;
set;
}
[DataMember(EmitDefaultValue = false)]
public IDictionary<String, VariableValue> Outputs
{
get;
set;
}
}
[DataContract]
@@ -153,9 +161,9 @@ namespace GitHub.DistributedTask.WebApi
}
protected TaskEvent(
string name,
Guid jobId,
Guid taskId)
string name,
Guid jobId,
Guid taskId)
: base(name, jobId)
{
TaskId = taskId;
@@ -185,9 +193,9 @@ namespace GitHub.DistributedTask.WebApi
}
public override Object ReadJson(
JsonReader reader,
Type objectType,
Object existingValue,
JsonReader reader,
Type objectType,
Object existingValue,
JsonSerializer serializer)
{
var eventObject = JObject.Load(reader);
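A sketch of raising the completed event with outputs attached (the ids and output values are hypothetical):

    long requestId = 42;
    var jobId = Guid.NewGuid();
    var outputs = new Dictionary<string, VariableValue>(StringComparer.OrdinalIgnoreCase)
    {
        ["build-id"] = new VariableValue { Value = "12345" },
        ["artifact-name"] = new VariableValue { Value = "web-app" }
    };
    var completed = new JobCompletedEvent(requestId, jobId, TaskResult.Succeeded, outputs);
    // completed.Outputs travels with the event as the data member declared above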

View File

@@ -260,5 +260,8 @@ namespace GitHub.DistributedTask.WebApi
public static readonly Guid CheckpointResourcesLocationId = new Guid(CheckpointResourcesLocationIdString);
public const String CheckpointResourcesResource = "references";
public static readonly Guid RunnerAuthUrl = new Guid("{A82A119C-1E46-44B6-8D75-C82A79CF975B}");
public const string RunnerAuthUrlResource = "authurl";
}
}

View File

@@ -93,21 +93,12 @@ namespace GitHub.Runner.Common.Tests
Assert.Equal("123***123", _hc.SecretMasker.MaskSecrets("123Pass%20word%20123%21123"));
Assert.Equal("123***123", _hc.SecretMasker.MaskSecrets("123Pass&lt;word&gt;123!123"));
Assert.Equal("123***123", _hc.SecretMasker.MaskSecrets("123Pass''word''123!123"));
Assert.Equal("OlBh***Q==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($":Password123!"))));
Assert.Equal("YTpQ***E=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"a:Password123!"))));
Assert.Equal("OlBh***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($":Password123!"))));
Assert.Equal("YTpQ***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"a:Password123!"))));
Assert.Equal("YWI6***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"ab:Password123!"))));
Assert.Equal("YWJjOlBh***Q==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abc:Password123!"))));
Assert.Equal("YWJjZDpQ***E=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abcd:Password123!"))));
Assert.Equal("YWJjOlBh***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abc:Password123!"))));
Assert.Equal("YWJjZDpQ***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abcd:Password123!"))));
Assert.Equal("YWJjZGU6***", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abcde:Password123!"))));
Assert.Equal("***Og==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:"))));
Assert.Equal("***OmE=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:a"))));
Assert.Equal("***OmFi", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:ab"))));
Assert.Equal("***OmFiYw==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:abc"))));
Assert.Equal("***OmFiY2Q=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:abcd"))));
Assert.Equal("***OmFiY2Rl", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"Password123!:abcde"))));
Assert.Equal("OlBh***To=", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($":Password123!:"))));
Assert.Equal("YTpQ***E6YQ==", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"a:Password123!:a"))));
Assert.Equal("YWJjOlBh***Tph", _hc.SecretMasker.MaskSecrets(Convert.ToBase64String(Encoding.UTF8.GetBytes($"abc:Password123!:a"))));
}
finally
{

View File

@@ -175,8 +175,8 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
Assert.True(s.PoolId.Equals(_expectedPoolId));
Assert.True(s.WorkFolder.Equals(_expectedWorkFolder));
// validate GetAgentPoolsAsync gets called once with automation pool type
_runnerServer.Verify(x => x.GetAgentPoolsAsync(It.IsAny<string>(), It.Is<TaskAgentPoolType>(p => p == TaskAgentPoolType.Automation)), Times.Once);
// validate GetAgentPoolsAsync gets called twice with automation pool type
_runnerServer.Verify(x => x.GetAgentPoolsAsync(It.IsAny<string>(), It.Is<TaskAgentPoolType>(p => p == TaskAgentPoolType.Automation)), Times.Exactly(2));
_runnerServer.Verify(x => x.AddAgentAsync(It.IsAny<int>(), It.Is<TaskAgent>(a => a.Labels.Contains("self-hosted") && a.Labels.Contains(VarUtil.OS) && a.Labels.Contains(VarUtil.OSArchitecture))), Times.Once);
}

View File

@@ -33,7 +33,7 @@ namespace GitHub.Runner.Common.Tests.Listener
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = null;
Guid jobId = Guid.NewGuid();
var result = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "someJob", "someJob", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null);
var result = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "someJob", "someJob", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
result.ContextData["github"] = new Pipelines.ContextData.DictionaryContextData();
return result;
}

File diff suppressed because it is too large.

View File

@@ -43,7 +43,7 @@ namespace GitHub.Runner.Common.Tests.Listener
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = null;
Guid jobId = Guid.NewGuid();
return new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null);
return new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
}
private JobCancelMessage CreateJobCancelMessage()

View File

@@ -8,6 +8,7 @@ using Xunit;
using GitHub.Runner.Common.Util;
using System.Threading.Channels;
using GitHub.Runner.Sdk;
using System.Linq;
namespace GitHub.Runner.Common.Tests
{
@@ -81,6 +82,102 @@ namespace GitHub.Runner.Common.Tests
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task SetCIEnv()
{
using (TestHostContext hc = new TestHostContext(this))
{
var existingCI = Environment.GetEnvironmentVariable("CI");
try
{
// Clear out CI and make sure process invoker sets it.
Environment.SetEnvironmentVariable("CI", null);
Tracing trace = hc.GetTrace();
Int32 exitCode = -1;
var processInvoker = new ProcessInvokerWrapper();
processInvoker.Initialize(hc);
var stdout = new List<string>();
var stderr = new List<string>();
processInvoker.OutputDataReceived += (object sender, ProcessDataReceivedEventArgs e) =>
{
trace.Info(e.Data);
stdout.Add(e.Data);
};
processInvoker.ErrorDataReceived += (object sender, ProcessDataReceivedEventArgs e) =>
{
trace.Info(e.Data);
stderr.Add(e.Data);
};
#if OS_WINDOWS
exitCode = await processInvoker.ExecuteAsync("", "cmd.exe", "/c \"echo %CI%\"", null, CancellationToken.None);
#else
exitCode = await processInvoker.ExecuteAsync("", "bash", "-c \"echo $CI\"", null, CancellationToken.None);
#endif
trace.Info("Exit Code: {0}", exitCode);
Assert.Equal(0, exitCode);
Assert.Equal("true", stdout.First(x => !string.IsNullOrWhiteSpace(x)));
}
finally
{
Environment.SetEnvironmentVariable("CI", existingCI);
}
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Common")]
public async Task KeepExistingCIEnv()
{
using (TestHostContext hc = new TestHostContext(this))
{
var existingCI = Environment.GetEnvironmentVariable("CI");
try
{
// Clear CI from the parent environment, pass CI=false explicitly, and make sure process invoker preserves it.
Environment.SetEnvironmentVariable("CI", null);
Tracing trace = hc.GetTrace();
Int32 exitCode = -1;
var processInvoker = new ProcessInvokerWrapper();
processInvoker.Initialize(hc);
var stdout = new List<string>();
var stderr = new List<string>();
processInvoker.OutputDataReceived += (object sender, ProcessDataReceivedEventArgs e) =>
{
trace.Info(e.Data);
stdout.Add(e.Data);
};
processInvoker.ErrorDataReceived += (object sender, ProcessDataReceivedEventArgs e) =>
{
trace.Info(e.Data);
stderr.Add(e.Data);
};
#if OS_WINDOWS
exitCode = await processInvoker.ExecuteAsync("", "cmd.exe", "/c \"echo %CI%\"", new Dictionary<string, string>() { { "CI", "false" } }, CancellationToken.None);
#else
exitCode = await processInvoker.ExecuteAsync("", "bash", "-c \"echo $CI\"", new Dictionary<string, string>() { { "CI", "false" } }, CancellationToken.None);
#endif
trace.Info("Exit Code: {0}", exitCode);
Assert.Equal(0, exitCode);
Assert.Equal("false", stdout.First(x => !string.IsNullOrWhiteSpace(x)));
}
finally
{
Environment.SetEnvironmentVariable("CI", existingCI);
}
}
}
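// Editor's illustration (not part of this diff): the two tests above pin down one
// rule - the child process sees CI=true only when the caller did not already
// supply a CI value. A minimal sketch of that rule as a hypothetical helper
// (invented name, not the runner's actual ProcessInvoker logic):
private static Dictionary<string, string> BuildChildEnvironmentSketch(IDictionary<string, string> callerEnvironment)
{
    var environment = new Dictionary<string, string>();
    if (callerEnvironment != null)
    {
        foreach (var pair in callerEnvironment)
        {
            environment[pair.Key] = pair.Value;
        }
    }
    // SetCIEnv: nothing supplied, so CI defaults to "true".
    // KeepExistingCIEnv: CI=false was supplied, so it is left untouched.
    if (!environment.ContainsKey("CI"))
    {
        environment["CI"] = "true";
    }
    return environment;
}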
#if !OS_WINDOWS
// Run a process that normally takes 20 seconds to finish and cancel it.
[Fact]


@@ -198,7 +198,7 @@ namespace GitHub.Runner.Common.Tests
case WellKnownDirectory.Tools:
path = Environment.GetEnvironmentVariable("RUNNER_TOOL_CACHE");
if (string.IsNullOrEmpty(path))
{
path = Path.Combine(
@@ -279,6 +279,13 @@ namespace GitHub.Runner.Common.Tests
GetDirectory(WellKnownDirectory.Root),
".options");
break;
case WellKnownConfigFile.SetupInfo:
path = Path.Combine(
GetDirectory(WellKnownDirectory.Root),
".setup_info");
break;
default:
throw new NotSupportedException($"Unexpected well known config file: '{configFile}'");
}


@@ -1,8 +1,10 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.CompilerServices;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Container;
using Moq;
using Xunit;
using Pipelines = GitHub.DistributedTask.Pipelines;
@@ -11,47 +13,35 @@ namespace GitHub.Runner.Common.Tests.Worker
{
public sealed class ActionCommandManagerL0
{
private ActionCommandManager _commandManager;
private Mock<IExecutionContext> _ec;
private Mock<IExtensionManager> _extensionManager;
private Mock<IPipelineDirectoryManager> _pipelineDirectoryManager;
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void EnablePluginInternalCommand()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManger = new Mock<IExtensionManager>();
var directoryManager = new Mock<IPipelineDirectoryManager>();
var pluginCommand = new InternalPluginSetRepoPathCommandExtension();
pluginCommand.Initialize(_hc);
var envCommand = new SetEnvCommandExtension();
envCommand.Initialize(_hc);
extensionManger.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { pluginCommand, envCommand });
_hc.SetSingleton<IExtensionManager>(extensionManger.Object);
_hc.SetSingleton<IPipelineDirectoryManager>(directoryManager.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>()))
.Callback((Issue issue, string message) =>
{
_hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
});
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
commandManager.EnablePluginInternalCommand();
_commandManager.EnablePluginInternalCommand();
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath", null));
directoryManager.Verify(x => x.UpdateRepositoryDirectory(_ec.Object, "actions/runner", "somepath", true), Times.Once);
_pipelineDirectoryManager.Verify(x => x.UpdateRepositoryDirectory(_ec.Object, "actions/runner", "somepath", true), Times.Once);
}
}
@@ -60,47 +50,29 @@ namespace GitHub.Runner.Common.Tests.Worker
[Trait("Category", "Worker")]
public void DisablePluginInternalCommand()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManger = new Mock<IExtensionManager>();
var directoryManager = new Mock<IPipelineDirectoryManager>();
var pluginCommand = new InternalPluginSetRepoPathCommandExtension();
pluginCommand.Initialize(_hc);
var envCommand = new SetEnvCommandExtension();
envCommand.Initialize(_hc);
extensionManger.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { pluginCommand, envCommand });
_hc.SetSingleton<IExtensionManager>(extensionManger.Object);
_hc.SetSingleton<IPipelineDirectoryManager>(directoryManager.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>()))
.Callback((Issue issue, string message) =>
{
_hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
});
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
commandManager.EnablePluginInternalCommand();
_commandManager.EnablePluginInternalCommand();
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath", null));
commandManager.DisablePluginInternalCommand();
_commandManager.DisablePluginInternalCommand();
Assert.False(commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath"));
Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath", null));
directoryManager.Verify(x => x.UpdateRepositoryDirectory(_ec.Object, "actions/runner", "somepath", true), Times.Once);
_pipelineDirectoryManager.Verify(x => x.UpdateRepositoryDirectory(_ec.Object, "actions/runner", "somepath", true), Times.Once);
}
}
@@ -109,42 +81,27 @@ namespace GitHub.Runner.Common.Tests.Worker
[Trait("Category", "Worker")]
public void StopProcessCommand()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManger = new Mock<IExtensionManager>();
var pluginCommand = new InternalPluginSetRepoPathCommandExtension();
pluginCommand.Initialize(_hc);
var envCommand = new SetEnvCommandExtension();
envCommand.Initialize(_hc);
extensionManger.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { pluginCommand, envCommand });
_hc.SetSingleton<IExtensionManager>(extensionManger.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>()))
.Callback((Issue issue, string message) =>
{
_hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
});
_ec.Setup(x => x.EnvironmentVariables).Returns(new Dictionary<string, string>());
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[stop-commands]stopToken"));
Assert.False(commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar"));
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[stopToken]"));
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[stop-commands]stopToken", null));
Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[stopToken]", null));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
}
}
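// Editor's illustration (not part of this diff): the assertions above exercise a
// small gate - once "stop-commands" registers a token, every command except that
// bare token is treated as plain output, and the token itself re-enables
// processing. A hedged sketch of that rule with invented names (not the runner's
// actual ActionCommandManager):
private static bool ShouldProcessSketch(ref string stopToken, string commandName, string data)
{
    if (stopToken != null)
    {
        if (string.Equals(commandName, stopToken, StringComparison.Ordinal))
        {
            stopToken = null;      // "##[stopToken]" resumes command processing
            return true;
        }
        return false;              // e.g. "##[set-env ...]" is ignored while stopped
    }
    if (string.Equals(commandName, "stop-commands", StringComparison.Ordinal))
    {
        stopToken = data;          // "##[stop-commands]stopToken" suspends processing
        return true;
    }
    return true;                   // any other command is dispatched normally
}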
@@ -153,41 +110,29 @@ namespace GitHub.Runner.Common.Tests.Worker
[Trait("Category", "Worker")]
public void EchoProcessCommand()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManager = new Mock<IExtensionManager>();
var echoCommand = new EchoCommandExtension();
echoCommand.Initialize(_hc);
extensionManager.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { echoCommand });
_hc.SetSingleton<IExtensionManager>(extensionManager.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.SetupAllProperties();
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
Assert.False(_ec.Object.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::on"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::on", null));
Assert.True(_ec.Object.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::off"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::off", null));
Assert.False(_ec.Object.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::ON"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::ON", null));
Assert.True(_ec.Object.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::Off "));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::Off ", null));
Assert.False(_ec.Object.EchoOnActionCommand);
}
}
@@ -197,7 +142,7 @@ namespace GitHub.Runner.Common.Tests.Worker
[Trait("Category", "Worker")]
public void EchoProcessCommandDebugOn()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
// Set up a few things
// 1. Job request message (with ACTIONS_STEP_DEBUG = true)
@@ -205,7 +150,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -219,84 +164,135 @@ namespace GitHub.Runner.Common.Tests.Worker
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
_hc.SetSingleton(jobServerQueue.Object);
var extensionManager = new Mock<IExtensionManager>();
var echoCommand = new EchoCommandExtension();
echoCommand.Initialize(_hc);
extensionManager.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { echoCommand });
_hc.SetSingleton<IExtensionManager>(extensionManager.Object);
hc.SetSingleton(jobServerQueue.Object);
var configurationStore = new Mock<IConfigurationStore>();
configurationStore.Setup(x => x.GetSettings()).Returns(new RunnerSettings());
_hc.SetSingleton(configurationStore.Object);
hc.SetSingleton(configurationStore.Object);
var pagingLogger = new Mock<IPagingLogger>();
_hc.EnqueueInstance(pagingLogger.Object);
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
var _ec = new Runner.Worker.ExecutionContext();
_ec.Initialize(_hc);
hc.EnqueueInstance(pagingLogger.Object);
// Initialize the job (to exercise logic that sets EchoOnActionCommand)
_ec.InitializeJob(jobRequest, System.Threading.CancellationToken.None);
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
ec.InitializeJob(jobRequest, System.Threading.CancellationToken.None);
_ec.Complete();
ec.Complete();
Assert.True(_ec.EchoOnActionCommand);
Assert.True(ec.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec, "::echo::off"));
Assert.False(_ec.EchoOnActionCommand);
Assert.True(_commandManager.TryProcessCommand(ec, "::echo::off", null));
Assert.False(ec.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec, "::echo::on"));
Assert.True(_ec.EchoOnActionCommand);
Assert.True(_commandManager.TryProcessCommand(ec, "::echo::on", null));
Assert.True(ec.EchoOnActionCommand);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void EchoProcessCommandInvalid()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManager = new Mock<IExtensionManager>();
var echoCommand = new EchoCommandExtension();
echoCommand.Initialize(_hc);
extensionManager.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { echoCommand });
_hc.SetSingleton<IExtensionManager>(extensionManager.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.SetupAllProperties();
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
// Echo commands below are considered "processed", but are invalid
// 1. Invalid echo value
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::invalid"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::invalid", null));
Assert.Equal(TaskResult.Failed, _ec.Object.CommandResult);
Assert.False(_ec.Object.EchoOnActionCommand);
// 2. No value
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::", null));
Assert.Equal(TaskResult.Failed, _ec.Object.CommandResult);
Assert.False(_ec.Object.EchoOnActionCommand);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void AddMatcherTranslatesFilePath()
{
using (TestHostContext hc = CreateTestContext())
{
// Create a problem matcher config file
var hostDirectory = hc.GetDirectory(WellKnownDirectory.Temp);
var hostFile = Path.Combine(hostDirectory, "my-matcher.json");
Directory.CreateDirectory(hostDirectory);
var content = @"
{
""problemMatcher"": [
{
""owner"": ""my-matcher"",
""pattern"": [
{
""regexp"": ""^ERROR: (.+)$"",
""message"": 1
}
]
}
]
}";
File.WriteAllText(hostFile, content);
// Setup translation info
var container = new ContainerInfo();
var containerDirectory = "/some-container-directory";
var containerFile = Path.Combine(containerDirectory, "my-matcher.json");
container.AddPathTranslateMapping(hostDirectory, containerDirectory);
// Act
_commandManager.TryProcessCommand(_ec.Object, $"::add-matcher::{containerFile}", container);
// Assert
_ec.Verify(x => x.AddMatchers(It.IsAny<IssueMatchersConfig>()), Times.Once);
}
}
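// Editor's illustration (not part of this diff): AddMatcherTranslatesFilePath
// relies on ContainerInfo.AddPathTranslateMapping so that a matcher path reported
// from inside a container can be read from the host. A hedged sketch of that
// prefix substitution with an invented helper (not the runner's ContainerInfo):
private static string TranslateToHostPathSketch(string containerPath, string containerDirectory, string hostDirectory)
{
    // "/some-container-directory/my-matcher.json" -> "<host temp dir>/my-matcher.json"
    if (containerPath.StartsWith(containerDirectory, StringComparison.Ordinal))
    {
        return hostDirectory + containerPath.Substring(containerDirectory.Length);
    }
    return containerPath;          // no mapping registered; use the path as-is
}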
private TestHostContext CreateTestContext([CallerMemberName] string testName = "")
{
var hostContext = new TestHostContext(this, testName);
// Mock extension manager
_extensionManager = new Mock<IExtensionManager>();
var commands = new IActionCommandExtension[]
{
new AddMatcherCommandExtension(),
new EchoCommandExtension(),
new InternalPluginSetRepoPathCommandExtension(),
new SetEnvCommandExtension(),
};
foreach (var command in commands)
{
command.Initialize(hostContext);
}
_extensionManager.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>(commands));
hostContext.SetSingleton<IExtensionManager>(_extensionManager.Object);
// Mock pipeline directory manager
_pipelineDirectoryManager = new Mock<IPipelineDirectoryManager>();
hostContext.SetSingleton<IPipelineDirectoryManager>(_pipelineDirectoryManager.Object);
// Execution context
_ec = new Mock<IExecutionContext>();
// Command manager
_commandManager = new ActionCommandManager();
_commandManager.Initialize(hostContext);
return hostContext;
}
}
}


@@ -25,7 +25,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -101,7 +101,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,
@@ -152,7 +152,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TimelineReference timeline = new TimelineReference();
Guid jobId = Guid.NewGuid();
string jobName = "some job name";
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, jobName, jobName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
jobRequest.Resources.Repositories.Add(new Pipelines.RepositoryResource()
{
Alias = Pipelines.PipelineConstants.SelfAlias,


@@ -100,7 +100,7 @@ namespace GitHub.Runner.Common.Tests.Worker
};
Guid jobId = Guid.NewGuid();
_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), steps, null);
_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, "test", "test", null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), steps, null, null, null, null);
GitHubContext github = new GitHubContext();
github["repository"] = new Pipelines.ContextData.StringContextData("actions/runner");
_message.ContextData.Add("github", github);


@@ -63,7 +63,7 @@ namespace GitHub.Runner.Common.Tests.Worker
TaskOrchestrationPlanReference plan = new TaskOrchestrationPlanReference();
TimelineReference timeline = new Timeline(Guid.NewGuid());
Guid jobId = Guid.NewGuid();
_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, testName, testName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null);
_message = new Pipelines.AgentJobRequestMessage(plan, timeline, jobId, testName, testName, null, null, null, new Dictionary<string, VariableValue>(), new List<MaskHint>(), new Pipelines.JobResources(), new Pipelines.ContextData.DictionaryContextData(), new Pipelines.WorkspaceOptions(), new List<Pipelines.ActionStep>(), null, null, null, null);
_message.Variables[Constants.Variables.System.Culture] = "en-US";
_message.Resources.Endpoints.Add(new ServiceEndpoint()
{


@@ -973,8 +973,8 @@ namespace GitHub.Runner.Common.Tests.Worker
});
_commandManager = new Mock<IActionCommandManager>();
_commandManager.Setup(x => x.TryProcessCommand(It.IsAny<IExecutionContext>(), It.IsAny<string>()))
.Returns((IExecutionContext executionContext, string line) =>
_commandManager.Setup(x => x.TryProcessCommand(It.IsAny<IExecutionContext>(), It.IsAny<string>(), It.IsAny<ContainerInfo>()))
.Returns((IExecutionContext executionContext, string line, ContainerInfo container) =>
{
if (line.IndexOf("##[some-command]") >= 0)
{


@@ -11,6 +11,7 @@ using Xunit;
using GitHub.DistributedTask.Expressions2;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.ObjectTemplating.Tokens;
using GitHub.Runner.Common.Util;
namespace GitHub.Runner.Common.Tests.Worker
{
@@ -55,6 +56,9 @@ namespace GitHub.Runner.Common.Tests.Worker
_ec.Setup(x => x.PostJobSteps).Returns(new Stack<IStep>());
var trace = hc.GetTrace();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { trace.Info($"[{tag}]{message}"); });
_stepsRunner = new StepsRunner();
_stepsRunner.Initialize(hc);
return hc;
@@ -70,9 +74,9 @@ namespace GitHub.Runner.Common.Tests.Worker
// Arrange.
var variableSets = new[]
{
new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") }
new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") }
};
foreach (var variableSet in variableSets)
{
@@ -102,12 +106,12 @@ namespace GitHub.Runner.Common.Tests.Worker
// Arrange.
var variableSets = new[]
{
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "success()") },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "success() || failure()") },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "always()") },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "success()", true) },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "success() || failure()", true) },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "always()", true) }
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "success()") },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "always()") },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "success()", true) },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "success() || failure()", true) },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "always()", true) }
};
foreach (var variableSet in variableSets)
{
@@ -139,12 +143,12 @@ namespace GitHub.Runner.Common.Tests.Worker
{
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = false,
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
Expected = true,
},
};
@@ -178,27 +182,27 @@ namespace GitHub.Runner.Common.Tests.Worker
{
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = TaskResult.Succeeded,
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = TaskResult.Failed,
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = TaskResult.Failed,
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "always()") },
Expected = TaskResult.Failed,
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "always()", true) },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "always()", true) },
Expected = TaskResult.Succeeded,
},
};
@@ -232,47 +236,47 @@ namespace GitHub.Runner.Common.Tests.Worker
{
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Failed, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Failed, "success()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Succeeded, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = TaskResult.Succeeded
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Failed, "success()", continueOnError: true) },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true) },
Expected = TaskResult.Succeeded
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success() || failure()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
Expected = TaskResult.Succeeded
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "success()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = TaskResult.Succeeded
},
// Abandoned
@@ -310,17 +314,17 @@ namespace GitHub.Runner.Common.Tests.Worker
{
new
{
Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = false
},
new
{
Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
Expected = true
},
new
{
Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = true
}
};
@@ -351,9 +355,9 @@ namespace GitHub.Runner.Common.Tests.Worker
// Arrange.
var variableSets = new[]
{
new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
new[] { CreateStep(TaskResult.Canceled, "success()"), CreateStep(TaskResult.Succeeded, "always()") }
new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
new[] { CreateStep(hc, TaskResult.Canceled, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") }
};
foreach (var variableSet in variableSets)
{
@@ -387,8 +391,8 @@ namespace GitHub.Runner.Common.Tests.Worker
// Arrange.
var variableSets = new[]
{
new[] { CreateStep(TaskResult.Succeeded, "success()") },
new[] { CreateStep(TaskResult.Succeeded, "success()") },
new[] { CreateStep(hc, TaskResult.Succeeded, "success()") },
new[] { CreateStep(hc, TaskResult.Succeeded, "success()") },
};
foreach (var variableSet in variableSets)
{
@@ -416,7 +420,7 @@ namespace GitHub.Runner.Common.Tests.Worker
var env1 = new MappingToken(null, null, null);
env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
env1.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "env.test"));
var step1 = CreateStep(TaskResult.Succeeded, "success()", env: env1);
var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1);
_ec.Object.Result = null;
@@ -449,12 +453,12 @@ namespace GitHub.Runner.Common.Tests.Worker
var env1 = new MappingToken(null, null, null);
env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
env1.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "env.test"));
var step1 = CreateStep(TaskResult.Succeeded, "success()", env: env1);
var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1);
var env2 = new MappingToken(null, null, null);
env2.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "1000"));
env2.Add(new StringToken(null, null, null, "env3"), new BasicExpressionToken(null, null, null, "env.test"));
var step2 = CreateStep(TaskResult.Succeeded, "success()", env: env2);
var step2 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env2);
_ec.Object.Result = null;
@@ -477,29 +481,138 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
private Mock<IActionRunner> CreateStep(TaskResult result, string condition, Boolean continueOnError = false, MappingToken env = null)
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task PopulateEnvContextAfterSetupStepsContext()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var env1 = new MappingToken(null, null, null);
env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1, name: "foo", setOutput: true);
var env2 = new MappingToken(null, null, null);
env2.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "1000"));
env2.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "steps.foo.outputs.test"));
var step2 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env2);
_ec.Object.Result = null;
_ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
// Assert.
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
#if OS_WINDOWS
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("something", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("something"));
#else
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("something", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("something"));
#endif
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task StepContextOutcome()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", contextName: "step1");
var step2 = CreateStep(hc, TaskResult.Failed, "steps.step1.outcome == 'success'", continueOnError: true, contextName: "step2");
var step3 = CreateStep(hc, TaskResult.Succeeded, "steps.step1.outcome == 'success' && steps.step2.outcome == 'failure'", contextName: "step3");
_ec.Object.Result = null;
_ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object, step3.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
// Assert.
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
step1.Verify(x => x.RunAsync(), Times.Once);
step2.Verify(x => x.RunAsync(), Times.Once);
step3.Verify(x => x.RunAsync(), Times.Once);
Assert.Equal(TaskResult.Succeeded.ToActionResult().ToString(), _stepContext.GetScope(null)["step1"].AssertDictionary("")["outcome"].AssertString(""));
Assert.Equal(TaskResult.Succeeded.ToActionResult().ToString(), _stepContext.GetScope(null)["step1"].AssertDictionary("")["conclusion"].AssertString(""));
Assert.Equal(TaskResult.Failed.ToActionResult().ToString(), _stepContext.GetScope(null)["step2"].AssertDictionary("")["outcome"].AssertString(""));
Assert.Equal(TaskResult.Succeeded.ToActionResult().ToString(), _stepContext.GetScope(null)["step2"].AssertDictionary("")["conclusion"].AssertString(""));
Assert.Equal(TaskResult.Succeeded.ToActionResult().ToString(), _stepContext.GetScope(null)["step3"].AssertDictionary("")["outcome"].AssertString(""));
Assert.Equal(TaskResult.Succeeded.ToActionResult().ToString(), _stepContext.GetScope(null)["step3"].AssertDictionary("")["conclusion"].AssertString(""));
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task StepContextConclusion()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var step1 = CreateStep(hc, TaskResult.Succeeded, "false", contextName: "step1");
var step2 = CreateStep(hc, TaskResult.Failed, "steps.step1.conclusion == 'skipped'", continueOnError: true, contextName: "step2");
var step3 = CreateStep(hc, TaskResult.Succeeded, "steps.step1.outcome == 'skipped' && steps.step2.outcome == 'failure' && steps.step2.conclusion == 'success'", contextName: "step3");
_ec.Object.Result = null;
_ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object, step3.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
// Assert.
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
step1.Verify(x => x.RunAsync(), Times.Never);
step2.Verify(x => x.RunAsync(), Times.Once);
step3.Verify(x => x.RunAsync(), Times.Once);
Assert.Equal(TaskResult.Skipped.ToActionResult().ToString(), _stepContext.GetScope(null)["step1"].AssertDictionary("")["outcome"].AssertString(""));
Assert.Equal(TaskResult.Skipped.ToActionResult().ToString(), _stepContext.GetScope(null)["step1"].AssertDictionary("")["conclusion"].AssertString(""));
Assert.Equal(TaskResult.Failed.ToActionResult().ToString(), _stepContext.GetScope(null)["step2"].AssertDictionary("")["outcome"].AssertString(""));
Assert.Equal(TaskResult.Succeeded.ToActionResult().ToString(), _stepContext.GetScope(null)["step2"].AssertDictionary("")["conclusion"].AssertString(""));
Assert.Equal(TaskResult.Succeeded.ToActionResult().ToString(), _stepContext.GetScope(null)["step3"].AssertDictionary("")["outcome"].AssertString(""));
Assert.Equal(TaskResult.Succeeded.ToActionResult().ToString(), _stepContext.GetScope(null)["step3"].AssertDictionary("")["conclusion"].AssertString(""));
}
}
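// Editor's illustration (not part of this diff): the two steps-context tests above
// hinge on one distinction, also visible in the mocked Complete callback further
// down - "outcome" records the step's raw result, while "conclusion" records the
// result after continue-on-error is applied. A hypothetical restatement:
private static (TaskResult Outcome, TaskResult Conclusion) SummarizeStepResultSketch(TaskResult rawResult, bool continueOnError)
{
    var outcome = rawResult;
    var conclusion = (rawResult == TaskResult.Failed && continueOnError)
        ? TaskResult.Succeeded
        : rawResult;
    return (outcome, conclusion);
}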
private Mock<IActionRunner> CreateStep(TestHostContext hc, TaskResult result, string condition, Boolean continueOnError = false, MappingToken env = null, string name = "Test", bool setOutput = false, string contextName = null)
{
// Setup the step.
var step = new Mock<IActionRunner>();
step.Setup(x => x.Condition).Returns(condition);
step.Setup(x => x.ContinueOnError).Returns(new BooleanToken(null, null, null, continueOnError));
step.Setup(x => x.RunAsync()).Returns(Task.CompletedTask);
step.Setup(x => x.Action)
.Returns(new DistributedTask.Pipelines.ActionStep()
{
Name = "Test",
Name = name,
Id = Guid.NewGuid(),
Environment = env
Environment = env,
ContextName = contextName ?? "Test"
});
// Setup the step execution context.
var stepContext = new Mock<IExecutionContext>();
stepContext.SetupAllProperties();
stepContext.Setup(x => x.WriteDebug).Returns(true);
stepContext.Setup(x => x.Variables).Returns(_variables);
stepContext.Setup(x => x.EnvironmentVariables).Returns(_env);
stepContext.Setup(x => x.ExpressionValues).Returns(_contexts);
stepContext.Setup(x => x.JobContext).Returns(_jobContext);
stepContext.Setup(x => x.StepsContext).Returns(_stepContext);
stepContext.Setup(x => x.ContextName).Returns(step.Object.Action.ContextName);
stepContext.Setup(x => x.Complete(It.IsAny<TaskResult?>(), It.IsAny<string>(), It.IsAny<string>()))
.Callback((TaskResult? r, string currentOperation, string resultCode) =>
{
@@ -507,10 +620,24 @@ namespace GitHub.Runner.Common.Tests.Worker
{
stepContext.Object.Result = r;
}
_stepContext.SetOutcome("", stepContext.Object.ContextName, (stepContext.Object.Outcome ?? stepContext.Object.Result ?? TaskResult.Succeeded).ToActionResult().ToString());
_stepContext.SetConclusion("", stepContext.Object.ContextName, (stepContext.Object.Result ?? TaskResult.Succeeded).ToActionResult().ToString());
});
var trace = hc.GetTrace();
stepContext.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { trace.Info($"[{tag}]{message}"); });
stepContext.Object.Result = result;
step.Setup(x => x.ExecutionContext).Returns(stepContext.Object);
if (setOutput)
{
step.Setup(x => x.RunAsync()).Callback(() => { _stepContext.SetOutput(null, name, "test", "something", out string reference); }).Returns(Task.CompletedTask);
}
else
{
step.Setup(x => x.RunAsync()).Returns(Task.CompletedTask);
}
return step;
}


@@ -67,7 +67,7 @@ namespace GitHub.Runner.Common.Tests.Worker
new Pipelines.ContextData.DictionaryContextData()
},
};
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, JobId, jobName, jobName, new StringToken(null, null, null, "ubuntu"), sidecarContainers, null, variables, new List<MaskHint>(), resources, context, null, actions, null);
var jobRequest = new Pipelines.AgentJobRequestMessage(plan, timeline, JobId, jobName, jobName, new StringToken(null, null, null, "ubuntu"), sidecarContainers, null, variables, new List<MaskHint>(), resources, context, null, actions, null, null, null, null);
return jobRequest;
}


@@ -1,5 +1,16 @@
@setlocal
@echo off
rem add expected utils to path
IF EXIST "C:\Program Files\Git\usr\bin" (
SET PATH=C:\Program Files\Git\usr\bin;%PATH%
)
IF EXIST "C:\Program Files\Git\mingw64\bin" (
SET PATH=C:\Program Files\Git\mingw64\bin;%PATH%
)
IF EXIST "C:\Program Files\Git\bin" (
SET PATH=C:\Program Files\Git\bin;%PATH%
)
rem Check if SH_PATH is defined.
if defined SH_PATH (
goto run


@@ -1 +1 @@
2.165.0
2.165.2