Compare commits


59 Commits

Author SHA1 Message Date
TingluoHuang
c44b34e436 c 2020-03-24 16:14:02 -04:00
TingluoHuang
738602ee55 c 2020-03-22 22:37:23 -04:00
TingluoHuang
502b9f06f2 c 2020-03-21 22:25:54 -04:00
TingluoHuang
310530add6 c 2020-03-20 22:54:03 -04:00
TingluoHuang
de07637563 c 2020-03-06 21:28:10 -05:00
Josh Gross
94e7560ccd Add event type to credential call (#352)
* Add event type to credential call

* Move events to contants
2020-03-02 11:22:45 -05:00
Lokesh Gopu
d80ab095a5 Update runnerversion (#348)
* Update runnerversion

* Update runnerversion
2020-02-28 13:28:33 -05:00
Josh Gross
2efd6f70e2 Use the Uri Scheme in the register runner url (#345) 2020-02-25 18:30:33 -05:00
Lokesh Gopu
a6f144b014 Update Runner Register GitHub API URL to Support Org-level Runner (#339)
* Update GitHub API URL

* pr comments

* Updated GitHub API URL
2020-02-24 09:15:15 -05:00
eric sciple
5294a3ee06 commands translate file path from container action (#331) 2020-02-12 21:07:43 -05:00
Tingluo Huang
745b90a8b2 Revert "Update Base64 Encoders to deal with suffixes (#284)" (#330)
This reverts commit c45aebc9ab.
2020-02-12 14:26:30 -05:00
Tingluo Huang
0db908da8d Use authenticate endpoint for testing runner connection. (#311)
* use authenticate endpoint for testing runner connection.

* PR feedback.
2020-02-05 16:56:38 -05:00
Thomas Boop
68de3a94be Remove Temporary Build Step (#316)
* Remove Temporary Build Step

* Updated dev.sh to set path for find
2020-02-04 12:59:49 -05:00
Tingluo Huang
a0a590fb48 setup/evaluate env context after setup steps context. (#309) 2020-01-30 22:14:14 -05:00
Christopher Johnson
87a232c477 Fix windows directions in release notes (#307) 2020-01-29 12:58:09 -05:00
TingluoHuang
a3c2479a29 prepare 2.165.0 runner release. 2020-01-27 22:14:06 -05:00
Thomas Boop
c45aebc9ab Update Base64 Encoders to deal with suffixes (#284)
* Update Base64 Encoders to deal with suffixes

* Set UriDataEscape to public for unit tests
2020-01-27 21:38:31 -05:00
Thomas Boop
b676ab3d33 Add ADR For Base64 Masking Improvements (#297)
* Base64 Secret Masking ADR

* slight addendums

* Update and rename 0000-base64-masking-trailing-characters.md to 0297-base64-masking-trailing-characters.md
2020-01-27 21:38:01 -05:00
Tingluo Huang
0a6bac355d include step.env as part of env context. (#300) 2020-01-27 15:54:28 -05:00
Tingluo Huang
eb78d19b17 Set both http_proxy and HTTP_PROXY env for runner/worker processes. (#298) 2020-01-27 11:04:35 -05:00
David Kale
17970ad1f9 Set http_proxy and related env vars for containers (#304)
* WIP add in basic passthrough for proxy env vars

* Add http_proxy vars after container env is created
2020-01-27 10:56:18 -05:00
Alberto Gimeno
2e0e8eb822 Change prompt message when removing a runner (#303)
* Change prompt message to be consistent with the UI/API

* Update test
2020-01-26 18:40:06 -05:00
Tingluo Huang
2a506cc556 Update 0278-env-context.md (#299) 2020-01-22 20:09:09 -05:00
Tingluo Huang
43dd34820b Follow redirect link for download runner package
GitHub release assets download link returns 302 to storage.
2020-01-21 12:23:30 -05:00
David Kale
746c9d9ec0 Trace javascript action exit code instead of user logs (#290)
* Trace javascript action exit code instead of user logs

* Debug instead of trace
2020-01-21 11:23:30 -05:00
Tingluo Huang
fa2ecfcc4c Fix page log name isn't unqiue. (#295) 2020-01-21 11:08:37 -05:00
Alberto Gimeno
c59c0e2ded Support action.yaml file (#288)
* Support action.yaml file

* L0 tests.

* l0

Co-authored-by: Tingluo Huang <tingluohuang@github.com>
2020-01-20 12:22:59 -05:00
Tingluo Huang
7a382facb3 default post-job action's condition to always(). (#293) 2020-01-20 09:28:29 -05:00
Tingluo Huang
e9ae42693f change hashFiles() expression to use @actions/glob. (#268) 2020-01-19 10:32:13 -05:00
David Kale
9cafe8c028 Update --help (#282)
* Update --help

* Tiny indent

* Remove once option
2020-01-17 13:55:11 -05:00
Thomas Boop
1484c3fb03 Add Temporary Step to fix build on windows (#285)
* Add Temporary Step to fix build on windows

* Update workflow
2020-01-16 14:19:54 -05:00
eric sciple
53d632706d adr: command input echoing (#280) 2020-01-14 14:55:17 -05:00
eric sciple
d6179242ca adr: hashFiles expression function (#279) 2020-01-14 14:54:37 -05:00
eric sciple
0da38a6924 adr: add env context (#278) 2020-01-14 14:54:29 -05:00
eric sciple
b19e5d7924 ADR: Run action shell options (#277) 2020-01-14 14:54:20 -05:00
eric sciple
80ac4a8964 ADR: problem matchers (#276) 2020-01-14 14:54:04 -05:00
eric sciple
02639a2092 translate problem matcher file to host path (#272) 2020-01-13 15:24:57 -05:00
eric sciple
a727194742 allow container to be null/empty (#266) 2020-01-12 23:02:36 -05:00
eric sciple
a9c58d7398 Handle escaped '%' in commands data section (#200) 2020-01-12 03:30:26 -05:00
Bryan MacFarlane
e15414eb5e proxy support ADR (#263) 2020-01-09 11:52:04 -05:00
Tingluo Huang
4ab1e645c3 Fix typo in error message. (#260) 2020-01-07 15:13:05 -05:00
Tingluo Huang
584f6b6ca3 upload log on runner force kill worker. (#255) 2020-01-06 13:04:23 -05:00
Tingluo Huang
abc65839f3 detect source file path without using env. (#257) 2020-01-06 12:56:15 -05:00
Tingluo Huang
06292aa118 Expose whether debug is on/off via RUNNER_DEBUG. (#253) 2020-01-03 21:23:46 -05:00
Joseph Petersen
ac1a076a3b Treat warnings as errors (#249)
* Treat warnings as errors

* fix warnings
2019-12-21 09:51:41 -05:00
Joseph Petersen
300bc67950 ignore .idea folder (#248)
created by rider
2019-12-21 09:50:07 -05:00
Tim Heuer
289c7f36a2 Minor typo (#247)
capitalize .NET
2019-12-20 20:20:44 -05:00
Julio Barba
b89d7fb8ef Remove old "v1" artifact download/publish code (#212)
* Remove old v1 artifact download/publish code
* Remove the Build2 REST API SDK
2019-12-19 16:02:00 -05:00
Josh Gross
5fd705bb84 Update LICENSE (#242) 2019-12-19 15:22:49 -05:00
Julio Barba
9e37732401 Verify that has Windows service started successfully (#236) 2019-12-19 14:34:26 -05:00
Bryan MacFarlane
6c70d53eea clean up some unneeded dockerfiles 2019-12-19 09:16:43 -05:00
Bryan MacFarlane
f791e2d512 update contributions md 2019-12-19 09:03:14 -05:00
Bryan MacFarlane
f1e36651ad build workflow ignores md changes 2019-12-19 08:54:43 -05:00
Bryan MacFarlane
be24fea81b update issue templates 2019-12-19 08:34:17 -05:00
Bryan MacFarlane
84ca2c05ce update readme 2019-12-19 08:20:39 -05:00
Bryan MacFarlane
2249560cec update readme 2019-12-18 23:15:30 -05:00
Bryan MacFarlane
2d4b821abe update readme 2019-12-18 23:13:22 -05:00
Bryan MacFarlane
371bf8e607 update readme and contributions 2019-12-18 22:49:31 -05:00
Tingluo Huang
9ba11da490 move .sln file. (#238) 2019-12-18 20:13:57 -05:00
104 changed files with 7934 additions and 1927 deletions


@@ -1,10 +0,0 @@
## Runner Version and Platform
Version of your runner?
OS of the machine running the runner? OSX/Windows/Linux/...
## What's not working?
Please include error messages and screenshots.
## Runner and Worker's Diagnostic Logs
Logs are located in the runner's `_diag` folder. The runner logs are prefixed with `Runner_` and the worker logs are prefixed with `Worker_`. All sensitive information should already be masked out, but please double-check before pasting here.

.github/ISSUE_TEMPLATE/bug_report.md

@@ -0,0 +1,34 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: bug
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Run '....'
3. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
## Runner Version and Platform
Version of your runner?
OS of the machine running the runner? OSX/Windows/Linux/...
## What's not working?
Please include error messages and screenshots.
## Job Log Output
If applicable, include the relevant part of the job / step log output here. All sensitive information should already be masked out, but please double-check before pasting here.
## Runner and Worker's Diagnostic Logs
If applicable, add relevant diagnostic log information. Logs are located in the runner's `_diag` folder. The runner logs are prefixed with `Runner_` and the worker logs are prefixed with `Worker_`. Each job run correlates to a worker log. All sensitive information should already be masked out, but please double-check before pasting here.


@@ -0,0 +1,27 @@
---
name: Feature Request
about: Create a request to help us improve
title: ''
labels: enhancement
assignees: ''
---
Thank you 🙇‍♀ for wanting to create a feature in this repository. Before you do, please ensure you are filing the issue in the right place. Issues should only be opened here if the issue **relates to code in this repository**.
* If you have found a security issue [please submit it here](https://hackerone.com/github)
* If you have questions or issues with the service, writing workflows or actions, then please [visit the GitHub Community Forum's Actions Board](https://github.community/t5/GitHub-Actions/bd-p/actions)
* If you are having an issue or question about GitHub Actions then please [contact customer support](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-github-actions#contacting-support)
If you have a feature request that is relevant to this repository, the runner, then please include the information below:
**Describe the enhancement**
A clear and concise description of the feature or enhancement you need.
**Code Snippet**
If applicable, add a code snippet.
**Additional information**
Add any other context about the feature here.
NOTE: if the feature request has been agreed upon then the assignee will create an ADR. See docs/adrs/README.md


@@ -5,9 +5,13 @@ on:
branches:
- master
- releases/*
paths-ignore:
- '**.md'
pull_request:
branches:
- '*'
paths-ignore:
- '**.md'
jobs:
build:

.gitignore

@@ -1,12 +1,19 @@
# build output
**/bin
**/obj
**/libs
**/lib
# editors
**/*.xproj
**/*.xproj.user
**/.vs
**/.vscode
**/*.error
**/*.json.pretty
.idea/
# output
node_modules
_downloads
_layout
@@ -19,4 +26,3 @@ TestLogs
#generated
src/Runner.Sdk/BuildConstants.cs


@@ -1,5 +1,5 @@
The MIT License (MIT)
Copyright (c) Microsoft Corporation
Copyright (c) 2019 GitHub
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
@@ -17,4 +17,4 @@ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
SOFTWARE.


@@ -1,31 +1,25 @@
# GitHub Actions Runner
<p align="center">
<img src="docs/res/github-graph.png">
</p>
# GitHub Actions Runner
[![Actions Status](https://github.com/actions/runner/workflows/Runner%20CI/badge.svg)](https://github.com/actions/runner/actions)
The runner is the application that runs a job from a GitHub Actions workflow. The runner can run on the [hosted machine pools](https://github.com/actions/virtual-environments) or run on [self-hosted environments](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/about-self-hosted-runners).
## Get Started
![win](docs/res/win_sm.png) [Pre-reqs](docs/start/envwin.md) | [Download](https://github.com/actions/runner/releases/latest)
For more information about installing and using self-hosted runners, see [Adding self-hosted runners](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/adding-self-hosted-runners) and [Using self-hosted runners in a workflow](https://help.github.com/en/actions/automating-your-workflow-with-github-actions/using-self-hosted-runners-in-a-workflow)
![macOS](docs/res/apple_sm.png) [Pre-reqs](docs/start/envosx.md) | [Download](https://github.com/actions/runner/releases/latest)
Runner releases:
![linux](docs/res/linux_sm.png) [Pre-reqs](docs/start/envlinux.md) | [Download](https://github.com/actions/runner/releases/latest)
![win](docs/res/win_sm.png) [Pre-reqs](docs/start/envwin.md) | [Download](https://github.com/actions/runner/releases)
**Configure:**
![macOS](docs/res/apple_sm.png) [Pre-reqs](docs/start/envosx.md) | [Download](https://github.com/actions/runner/releases)
*MacOS and Linux*
```bash
./config.sh
```
*Windows*
```bash
config.cmd
```
![linux](docs/res/linux_sm.png) [Pre-reqs](docs/start/envlinux.md) | [Download](https://github.com/actions/runner/releases)
## Contribute
For developers that want to contribute, [read here](docs/contribute.md) on how to build and test.
We accept contributions in the form of issues and pull requests. [Read more here](docs/contribute.md) before contributing.


@@ -0,0 +1,61 @@
# ADR 263: Self Hosted Runner Proxies
**Date**: 2019-11-13
**Status**: Accepted
## Context
- Proxy support is required for some enterprises and organizations to start using their own self hosted runners
- While there is not a standard convention, many applications support setting proxies via the environment variables `http_proxy`, `https_proxy`, and `no_proxy`, such as curl, wget, perl, python, docker, git, R, etc.
- Some of these applications use `HTTPS_PROXY` versus `https_proxy`, but most understand or primarily support the lowercase variant
## Decision
We will update the Runner to use the conventional environment variables for proxies: `http_proxy`, `https_proxy` and `no_proxy` if they are set.
These are described in detail below:
- `https_proxy` a proxy URL for all https traffic. It may contain basic authentication credentials. For example:
- http://proxy.com
- http://127.0.0.1:8080
- http://user:password@proxy.com
- `http_proxy` a proxy URL for all http traffic. It may contain basic authentication credentials. For example:
- http://proxy.com
- http://127.0.0.1:8080
- http://user:password@proxy.com
- `no_proxy` a comma-separated list of hosts that should not use the proxy. An optional port may be specified
- `google.com`
- `yahoo.com:443`
- `google.com,bing.com`
We won't use `http_proxy` for https traffic when `https_proxy` is not set; this behavior lines up with libcurl-based tools (curl, git) and wget.
Otherwise, action authors and workflow users would need to adjust to differences between the runner's proxy convention and the tools used by their actions and scripts.
Example:
A customer sets `http_proxy=http://127.0.0.1:8888` and configures the runner against `https://github.com/owner/repo`. With an `https_proxy` -> `http_proxy` fallback, the runner would connect to the server without any problem. However, if the user runs `git push` to `https://github.com/owner/repo`, `git` won't use the proxy, since it requires `https_proxy` to be set for any https traffic.
> `golang`, `node.js` and other dev tools from the Linux community use `http_proxy` for both http and https traffic, based on my research.
A majority of our users are on Linux, where these variables are commonly required to be set by various programs. By reading these values, we simplify proxy setup for self-hosted runners and expose it in a way users are already familiar with.
A password provided for a proxy will be masked in the logs.
We will support the lowercase and uppercase variants, with lowercase taking priority if both are set.
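For illustration, a minimal sketch of this lookup in TypeScript (purely a sketch; the runner itself is implemented in C#, and the function name here is invented):
```typescript
// Lowercase wins over uppercase, and https traffic never falls back to http_proxy.
function proxyFor(requestUrl: string, env: Record<string, string | undefined>): string | undefined {
  const name = new URL(requestUrl).protocol === 'https:' ? 'https_proxy' : 'http_proxy'
  return env[name] ?? env[name.toUpperCase()]
}

// proxyFor('https://github.com/owner/repo', {http_proxy: 'http://127.0.0.1:8888'}) === undefined
```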
### No Proxy Format
While the exact handling of the `no_proxy` env var differs per application, most applications accept a comma-separated list of hosts. Some accept wildcard characters (*). We are going to do exact case-insensitive matches and not support wildcards at this time.
For example:
- example.com will match example.com, foo.example.com, foo.bar.example.com
- foo.example.com will match bar.foo.example.com and foo.example.com
We will not support IP addresses for `no_proxy`, only hostnames.
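A minimal sketch of this matching behavior (illustrative TypeScript; port handling is simplified here and only matches when the host string already includes the port):
```typescript
// Exact, case-insensitive host suffix matching against a comma-separated
// no_proxy list; no wildcards, hostnames only.
function bypassProxy(host: string, noProxy: string): boolean {
  const target = host.toLowerCase()
  return noProxy
    .split(',')
    .map(entry => entry.trim().toLowerCase())
    .filter(entry => entry.length > 0)
    .some(entry => target === entry || target.endsWith(`.${entry}`))
}

// bypassProxy('foo.bar.example.com', 'example.com,yahoo.com:443') === true
```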
## Consequences
1. Enterprises and organizations needing proxy support will be able to embrace self hosted runners
2. Users will need to set these environment variables before configuring the runner in order to use a proxy during configuration
3. The runner will read from the environment variables during config and runtime and use the provided proxy if it exists
4. Users may need to pass these environment variables into other applications if they do not natively take these variables
5. Action authors may need to update their workflows to react to these environment variables
6. We will document how to set these environment variables for runners and how the runner uses them
7. Like all other secrets, users will be able to relatively easily figure out the proxy password if they can modify a workflow file running on a self-hosted machine


@@ -0,0 +1,263 @@
# ADR 0276: Problem Matchers
**Date** 2019-06-05
**Status** Accepted
## Context
Compilation failures during a CI build should surface good error messages.
For example, the actual compile errors from the typescript compiler should bubble up as issues in the UI, not simply "tsc exited with exit code 1".
VSCode has an extensible model for solving this type of problem. VSCode allows users to configure which problem matchers to use when scanning output. For example, a user can apply the `tsc` problem matcher to receive a rich error output experience in VSCode when compiling their typescript project.
The problem-matcher concept fits well with "setup" actions. For example, the `setup-nodejs` action will download node.js, add it to the PATH, and register the `tsc` problem matcher. For the duration of the job, the `tsc` problem matcher will be applied against the output.
## Decision
### Registration
#### Using `##` command
`##[add-matcher]path-to-problem-matcher-config.json`
Using a `##` command allows for flexibility:
- Ad hoc scripts can register problem matchers
- Allows problem matchers to be conditionally registered
Note, if a matcher with the same name is registered a second time, it will clobber the first instance.
#### Unregister using `##` command
A way out for rare cases where scoping is a problem.
`##[remove-matcher]owner`
For this to be usable, the `owner` needs to be discoverable. Therefore, debug print the owner on registration.
### Single line matcher
Consider the output:
```
[...]
Build FAILED.
"C:\temp\problemmatcher\myproject\ConsoleApp1\ConsoleApp1.sln" (default target) (1) ->
"C:\temp\problemmatcher\myproject\ConsoleApp1\ConsoleApp1\ConsoleApp1.csproj" (default target) (2) ->
"C:\temp\problemmatcher\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj" (default target) (3) ->
(CoreCompile target) ->
Class1.cs(16,24): warning CS0612: 'ClassLibrary1.Helpers.MyHelper.Name' is obsolete [C:\temp\problemmatcher\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj]
"C:\temp\problemmatcher\myproject\ConsoleApp1\ConsoleApp1.sln" (default target) (1) ->
"C:\temp\problemmatcher\myproject\ConsoleApp1\ConsoleApp1\ConsoleApp1.csproj" (default target) (2) ->
"C:\temp\problemmatcher\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj" (default target) (3) ->
(CoreCompile target) ->
Helpers\MyHelper.cs(16,30): error CS1002: ; expected [C:\temp\problemmatcher\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj]
1 Warning(s)
1 Error(s)
```
The below match configuration uses a regular expression to discover problem lines, and the match groups are mapped into issue properties.
```json
"owner": "msbuild",
"pattern": [
{
"regexp": "^\\s*([^:]+)\\((\\d+),(\\d+)\\): (error|warning) ([^:]+): (.*) \\[(.+)\\]$",
"file": 1,
"line": 2,
"column": 3,
"severity": 4,
"code": 5,
"message": 6,
"fromPath": 7
}
]
}
```
The above output and match configuration produces the following matches:
```
line: Class1.cs(16,24): warning CS0612: 'ClassLibrary1.Helpers.MyHelper.Name' is obsolete [C:\myrepo\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj]
file: Class1.cs
line: 16
column: 24
severity: warning
code: CS0612
message: 'ClassLibrary1.Helpers.MyHelper.Name' is obsolete
fromPath: C:\myrepo\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj
```
```
line: Helpers\MyHelper.cs(16,30): error CS1002: ; expected [C:\myrepo\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj]
file: Helpers\MyHelper.cs
line: 16
column: 30
severity: error
code: CS1002
message: ; expected
fromPath: C:\myrepo\myproject\ConsoleApp1\ClassLibrary1\ClassLibrary1.csproj
```
Additionally, the line will appear red in the web UI (prefixed with `##[error]`).
Note, an error does not imply task failure. Exit codes communicate failure.
Note, strip color codes when evaluating regular expressions.
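To make the mapping concrete, here is a minimal sketch of applying a single pattern to one line of output (illustrative TypeScript only; the runner's actual implementation is C#):
```typescript
interface MatcherPattern {
  regexp: string
  file?: number
  line?: number
  column?: number
  severity?: number
  code?: number
  message?: number
  fromPath?: number
}

// Apply one pattern to one line and map the capture groups into issue
// properties, as in the msbuild example above.
function matchLine(pattern: MatcherPattern, outputLine: string): Record<string, string> | null {
  const match = new RegExp(pattern.regexp).exec(outputLine)
  if (!match) return null
  const issue: Record<string, string> = {}
  const keys = ['file', 'line', 'column', 'severity', 'code', 'message', 'fromPath'] as const
  for (const key of keys) {
    const groupIndex = pattern[key]
    if (groupIndex !== undefined) issue[key] = match[groupIndex]
  }
  return issue
}
```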
### Multi-line matcher
Consider the below output from ESLint in stylish mode. The file name is printed once, yet multiple error lines are printed.
```
test.js
1:0 error Missing "use strict" statement strict
5:10 error 'addOne' is defined but never used no-unused-vars
✖ 2 problems (2 errors, 0 warnings)
```
The below match configuration uses multiple regular expressions, one for each line.
The last pattern of a multi-line matcher can specify the `loop` property. This allows multiple errors to be discovered.
```json
"owner": "eslint-stylish",
"pattern": [
{
"regexp": "^([^\\s].*)$",
"file": 1
},
{
"regexp": "^\\s+(\\d+):(\\d+)\\s+(error|warning|info)\\s+(.*)\\s\\s+(.*)$",
"line": 1,
"column": 2,
"severity": 3,
"message": 4,
"code": 5,
"loop": true
}
]
}
```
The above output and match configuration produces two matches:
```
line: 1:0 error Missing "use strict" statement strict
file: test.js
line: 1
column: 0
severity: error
message: Missing "use strict" statement
code: strict
```
```
line: 5:10 error 'addOne' is defined but never used no-unused-vars
file: test.js
line: 5
column: 10
severity: error
message: 'addOne' is defined but never used
code: no-unused-vars
```
Note, in the above example only the error line will appear red in the web UI. The "file" line will not appear red.
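The loop semantics can be sketched as follows (illustrative TypeScript, simplified to this two-pattern case with assumed semantics; not the runner's C# implementation):
```typescript
// Once the first pattern captures a file, every subsequent line matching the
// loop pattern produces an issue for that file, until a line stops matching.
function matchStylish(lines: string[]): Array<Record<string, string>> {
  const filePattern = /^([^\s].*)$/
  const loopPattern = /^\s+(\d+):(\d+)\s+(error|warning|info)\s+(.*)\s\s+(.*)$/
  const issues: Array<Record<string, string>> = []
  let file: string | null = null
  for (const line of lines) {
    if (file) {
      const loopMatch = loopPattern.exec(line)
      if (loopMatch) {
        issues.push({
          file,
          line: loopMatch[1],
          column: loopMatch[2],
          severity: loopMatch[3],
          message: loopMatch[4],
          code: loopMatch[5]
        })
        continue
      }
    }
    const fileMatch = filePattern.exec(line)
    file = fileMatch ? fileMatch[1] : null
  }
  return issues
}
```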
### Other details
#### Configuration `owner`
The `owner` can be used to stomp over (replace) or remove a registered matcher.
#### Rooting the file
The goal of the file information is to provide a hyperlink in the UI.
Solving this problem means:
- Rooting the file when unrooted:
- Use the `fromPath` if specified (assume file path)
- Use the `github.workspace` (where the repo is cloned on disk)
- Match against a repository to determine the relative path within the repo
This is a place where we diverge from VSCode. VSCode task configurations are specific to the local workspace (the workspace root is known or can be specified). We're solving a more generic problem, so we need more information - specifically the `fromPath` property - in order to accurately root the path.
In order to avoid creating inaccurate hyperlinks on the error issues, the agent will verify the file exists and is in the main repository. Otherwise omit the file property from the error issue and debug trace what happened.
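A minimal sketch of that rooting and verification logic (illustrative TypeScript; function name and exact checks are assumptions, the real code lives in the runner's C# worker):
```typescript
import * as fs from 'fs'
import * as path from 'path'

// Root an unrooted file using fromPath (when present) or the workspace, then
// drop the file property entirely if it doesn't exist or sits outside the repo.
function rootIssueFile(file: string, workspace: string, fromPath?: string): string | undefined {
  let rooted = file
  if (!path.isAbsolute(rooted)) {
    const base = fromPath ? path.dirname(fromPath) : workspace
    rooted = path.join(base, rooted)
  }
  const resolved = path.resolve(rooted)
  const repoRoot = path.resolve(workspace) + path.sep
  if (!resolved.startsWith(repoRoot) || !fs.existsSync(resolved)) {
    return undefined // omit the file rather than create an inaccurate hyperlink
  }
  return path.relative(workspace, resolved)
}
```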
#### Supported severity levels
Ordinal ignore case:
- `warning`
- `error`
Coalesce empty with "error". For any other values, omit logging an issue and debug trace what happened.
#### Default severity level
Problem matchers are unable to interpret severity strings other than `warning` and `error`. The `severity` match group expects `warning` or `error` (case insensitive).
However, some tools indicate error/warning in different ways. For example, `flake8` uses codes like `E100`, `W200`, and `F300` (error, warning, and fatal, respectively).
Therefore, allow a property `severity`, sibling to `owner`, which identifies the default severity for the problem matcher. This allows two problem matchers to be registered - one for warnings and one for errors.
For example, given the following `flake8` output:
```
./bootcamp/settings.py:156:80: E501 line too long (94 > 79 characters)
./bootcamp/settings.py:165:5: F403 'from local_settings import *' used; unable to detect undefined names
```
Two problem matchers can be used:
```json
{
"problemMatcher": [
{
"owner": "flake8",
"pattern": [
{
"regexp": "^(.+):(\\d+):(\\d+): ([EF]\\d+) (.+)$",
"file": 1,
"line": 2,
"column": 3,
"code": 4,
"message": 5
}
]
},
{
"owner": "flake8-warnings",
"severity": "warning",
"pattern": [
{
"regexp": "^(.+):(\\d+):(\\d+): (W\\d+) (.+)$",
"file": 1,
"line": 2,
"column": 3,
"code": 4,
"message": 5
}
]
}
]
}
```
#### Mitigate regular expression denial of service (ReDos)
If a matcher exceeds a 1-second timeout when processing a line, retry up to two more times (three attempts total).
After three unsuccessful attempts, warn and eject the matcher. The matcher will not run again for the duration of the job.
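A minimal sketch of the eject bookkeeping only (illustrative TypeScript; the per-line 1-second timeout enforcement itself is assumed to happen elsewhere):
```typescript
class MatcherTimeouts {
  private readonly failures = new Map<string, number>()

  // Record one timed-out attempt for a matcher; returns true when the matcher
  // should be ejected (three unsuccessful attempts) and not run again this job.
  recordTimeout(owner: string): boolean {
    const count = (this.failures.get(owner) ?? 0) + 1
    this.failures.set(owner, count)
    return count >= 3
  }
}
```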
### Where we diverge from VSCode
- We added the `fromPath` concept for rooting paths. This is done differently in VSCode, since a task is the scope (root path well known). For us, the job is the scope.
- VSCode allows additional activation info for background tasks that are always running (e.g. recompile on file changes). They allow regular expressions to define when the matcher scope begins and ends. This is an interesting concept that we could leverage to help solve our scoping problem.
## Consequences
- Setup actions should register problem matchers


@@ -0,0 +1,93 @@
# ADR 0277: Run action shell option
**Date** 2019-07-09
**Status** Accepted
## Context
Run actions run scripts using a platform-specific shell:
`bash -eo pipefail` on non-Windows, and `cmd.exe /c /d /s` on Windows.
The `shell` option overrides this to allow different flags or completely different shells/interpreters.
A small example is:
```yml
jobs:
bash-job:
actions:
- run: echo "Hello"
shell: bash
python-job:
actions:
- run: print("Hello")
shell: python {0}
```
## Decision
___
### Shell option
The keyword being used is `shell`
`shell` can be either:
1. Builtins / Explicitly supported keywords. It is useful to support at least `cmd`, and `powershell` on Windows. Because `cmd my_cmd_script` and `powershell my_ps1_script` are not valid the same way many Linux/cross-platform interpreters are, e.g. `bash myscript` or `python myscript`. Those tools (and potentially others) also require the correct file extension to run, or must be run in a particular way to get the exit codes consistently, so we must have first class knowledge about them. We provide default templates for these keywords as follows:
- `cmd`: Default is: `%ComSpec% /D /E:ON /V:OFF /S /C "CALL "{0}""` where the script name is automatically appended with `.cmd` and substituted for `{0}`
- Note this is equivalent to the default Windows behavior if no shell option is given
- `pwsh`: Default is: `pwsh -command "& '{0}'"` where the script is automatically appended with `.ps1`
- `powershell`: Default is: `powershell -command "& '{0}'"` where the script is automatically appended with `.ps1`
- `bash`: Uses `bash --noprofile --norc -eo pipefail {0}`
- The default behavior on non-Windows if no shell is given is to attempt this first
- `sh`: Uses `sh -e {0}`
- This is the default behavior on non-Windows if no shell is given, AND `bash` (see above) was not located on the PATH
- `python`: `python {0}`
- **NOTE**: The exact command run may vary by machine. We only provide default arguments and command format for the listed shells. While the above behavior is expected on hosted machines, private runners may vary. For example, `sh` (or other commands) may actually be a link to `/bin/dash`, `/bin/bash`, or another shell
1. A template string: `command [...options] {0} [...more_options]`
- As above, the file name of the temporary script will be templated in. This gives users more control to have options at any location relative to the script path
- The first whitespace-delimited word of the string will be interpreted as the command
- e.g. `python {0} arg1 arg2` or similar can be used if passing args is needed. Some shells will require other options after the filename for various reasons
Note that (1) simply provides defaults that are executed with the same mechanism as (2). That is:
- A temporary script file is generated, and the path to that file is templated into the string at `{0}`
- The first word of the formatted string is assumed to be a command, and we attempt to locate its full path
- The fully qualified path to the command, plus the remaining arguments, is executed
- e.g. `shell: bash` expands to `/bin/bash --noprofile --norc -eo pipefail /runner/_layout/_work/_temp/f8d4fb2b-19d9-47e6-a786-4cc538d52761.sh` on my private runner
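A minimal sketch of the mechanism in the list above (illustrative TypeScript; the runner does this in C#, and quoting of the template is simplified here):
```typescript
import {execFileSync} from 'child_process'
import * as fs from 'fs'
import * as os from 'os'
import * as path from 'path'

// Write the step script to a temp file, substitute its path for {0}, treat the
// first whitespace-delimited word as the command, and run the rest as arguments.
function runWithShellTemplate(template: string, scriptContents: string, extension: string): void {
  const scriptPath = path.join(os.tmpdir(), `step-${Date.now()}${extension}`)
  fs.writeFileSync(scriptPath, scriptContents)
  const [command, ...args] = template.replace('{0}', scriptPath).split(/\s+/)
  execFileSync(command, args, {stdio: 'inherit'}) // the runner also resolves the command's full path
}

// runWithShellTemplate('bash --noprofile --norc -eo pipefail {0}', 'echo "Hello"', '.sh')
```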
At this time, **THE LIST OF WELL-KNOWN SHELL OPTIONS IS**:
- cmd - Windows (hosted vs2017, vs2019) only
- powershell - Windows (hosted vs2017, vs2019) only
- sh - All hosted platforms
- pwsh - All hosted platforms
- bash - All hosted platforms
- python - All hosted platforms. Can use setup-python to configure which python will be used
___
### Containers
For container jobs, `shell` should just work the same as above, transparently. We will simply `exec` the command in the job container, passing the same arguments in
___
### Exit codes / Error action preference
For builtin shells, we provide defaults that make the most sense for CI, running within Actions, and being executed by our runner
bash/sh:
- Fail-fast behavior using `set -eo pipefail` is the default for the `bash` and `sh` builtins, and the default when no shell option is given on non-Windows platforms
- Users can opt out of fail-fast and take full control easily by providing a template string to the shell option, e.g. `bash {0}`.
- sh-like shells exit with the exit code of the last command executed in a script, and that is our default behavior; the runner reports the step as failed/succeeded based on this exit code
powershell/pwsh
- Fail-fast behavior when possible. For `pwsh` and `powershell` builtins, we will prepend `$ErrorActionPreference = 'stop'` to script contents
- We append `if ((Test-Path -LiteralPath variable:\LASTEXITCODE)) { exit $LASTEXITCODE }` to powershell scripts to get Action statuses to reflect the script's last exit code
- Users can always opt out by not using the builtins, and providing a shell option like: `pwsh -File {0}`, or `powershell -Command "& '{0}'"`, depending on need
cmd
- There doesn't seem to be a way to fully opt in to fail-fast behavior other than writing your script to check each error code and respond accordingly, so we can't provide that behavior by default; it is completely up to the user to write this behavior into their script
- cmd.exe will exit (return the error code to the runner) with the errorlevel of the last program it executed. This is internally consistent with the previous default behavior (sh, pwsh) and is the cmd.exe default, so we keep that behavior
## Consequences
Valid `shell` options will depend on the hosted images. We will need to maintain tight image compatibility
First-class support for a shell will require a major version schema change to modify. We cannot remove or modify the behavior of a well-known supported option. However, adding first-class support for new shells is backwards compatible. For instance, we can add a well-known `python` option, because non-well-known options would have always needed to include `{0}`, e.g. `python {0}`


@@ -0,0 +1,60 @@
# ADR 0278: Env Context
**Date**: 2019-09-30
**Status**: Accepted
## Context
Users want to reference environment variables defined in the workflow YAML file in an action's inputs, display name, and condition.
## Decision
### Add `env` context in the runner
The runner will create and populate the `env` context for every job execution using the following logic:
1. On job start, create the `env` context with any environment variables in the job message; these are the env values defined in the workflow- and job-level `env` sections of the customer's YAML file.
2. Update the `env` context when the customer uses `::set-env::` to set an env var at the runner level.
3. Update the `env` context with the step's `env` block before each step runs.
The `env` context is only available in the runner; customers can't use the `env` context in any server-side evaluation, just like the `runner` context.
Example yaml:
```yaml
env:
env1: 10
env2: 20
env3: 30
jobs:
build:
env:
env1: 100
env2: 200
runs-on: ubuntu-latest
steps:
- run: |
echo ${{ env.env1 }} // 1000
echo $env1 // 1000
echo $env2 // 200
echo $env3 // 30
if: env.env2 == 200 // true
name: ${{ env.env1 }}_${{ env.env2 }} //1000_200
env:
env1: 1000
```
### Don't populate the `env` context with environment variables from the runner machine.
With job containers and container actions, the `env` context may not have the value the customer expects, which will cause confusion.
Ex:
```yaml
build:
runs-on: ubuntu-latest <- $USER=runner in hosted machine
container: ubuntu:16.04 <- $USER=root in container
steps:
- run: echo ${{env.USER}} <- what should customer expect this output? runner/root
- uses: docker://ubuntu:18.04
with:
args: echo ${{env.USER}} <- what should customer expect this output? runner/root
```


@@ -0,0 +1,71 @@
# ADR 0279: HashFiles Expression Function
**Date**: 2019-09-30
**Status**: Accepted
## Context
The first-party action `actions/cache` needs an input, an explicit `key`, used for restoring and saving the cache. For package caching, the most common `key` might be the hash of the contents of all `package-lock.json` files under the `node_modules` folder.
There are several different ways to get the hash `key` input for the `actions/cache` action.
1. Customers calculate the `key` themselves in a separate step; customers won't like this since it requires an extra step to use the cache feature
```yaml
steps:
- run: |
hash=some_linux_hash_method(file1, file2, file3)
echo ::set-output name=hash::$hash
id: createHash
- uses: actions/cache@v1
with:
key: ${{ steps.createHash.outputs.hash }}
```
2. Make the `key` input of `actions/cache` follow a certain convention to calculate the hash; this limits the `key` input to a format customers may not want.
```yaml
steps:
- uses: actions/cache@v1
with:
key: ${{ runner.os }}|${{ github.workspace }}|**/package-lock.json
```
## Decision
### Add a hashFiles() function to the expression engine for calculating file hashes
`hashFiles()` will only be allowed on the runner side since it needs to read files on disk; using `hashFiles()` in any server-side evaluated expression will cause runtime errors.
`hashFiles()` will only support hashing files under `$GITHUB_WORKSPACE` since the expression is evaluated on the runner; if the customer uses a job container or container action, the runner won't have access to the file system inside the container.
`hashFiles()` will only take one parameter:
- `hashFiles('**/package-lock.json')` // Search files under $GITHUB_WORKSPACE and calculate a hash for them
**Question: Do we need to support more than one match pattern?**
Ex: `hashFiles('**/package-lock.json', '!toolkit/core/package-lock.json', '!toolkit/io/package-lock.json')`
Answer: Only support a single match pattern for GA; we can always add more later.
This gives customers a better experience with the `actions/cache` action's input.
```yaml
steps:
- uses: actions/cache@v1
with:
key: ${{hashFiles('**/package-lock.json')}}-${{github.ref}}-${{runner.os}}
```
For search pattern, we will use basic globbing (`*` `?` and `[]`) and globstar (`**`).
Additional pattern details:
- Root relative paths with `github.workspace` (the main repo)
- Make `*` match files that start with `.`
- Case insensitive on Windows
- Accept `\` or `/` path separators on Windows
Hashing logic:
1. Get all files under `$GITHUB_WORKSPACE`.
2. Use the search pattern to filter the files. (The search pattern only applies to file paths, not folder paths.)
3. Sort all matched files by full file path in alphabetical order.
4. Use the SHA256 algorithm to hash each matched file and store the hash result.
5. Use SHA256 to hash all stored file hashes to get the final 64-character hash result (sketched below).
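A minimal sketch of these steps, using `@actions/glob` as the packaged hashFiles script does (illustrative and simplified, not the shipped implementation):
```typescript
import * as crypto from 'crypto'
import * as fs from 'fs'
import * as glob from '@actions/glob'

// Glob files, keep those under the workspace, sort by full path, hash each
// file with SHA256, then hash the concatenation of those hashes.
async function hashFiles(matchPattern: string, workspace: string): Promise<string> {
  const globber = await glob.create(matchPattern)
  const files = (await globber.glob())
    .filter(f => f.startsWith(workspace) && fs.statSync(f).isFile())
    .sort()
  const combined = crypto.createHash('sha256')
  for (const file of files) {
    combined.update(crypto.createHash('sha256').update(fs.readFileSync(file)).digest())
  }
  return combined.digest('hex') // 64 hex characters
}
```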
**Question: Should we include the folder structure info into the hash?**
Answer: No


@@ -0,0 +1,30 @@
# ADR 0280: Echoing of Command Input
**Date**: 2019-11-04
**Status**: Accepted
## Context
Command echoing as a default behavior tends to clutter the user logs, so we want to swap to a system where users have to opt in to see this information.
Commands will still be echoed when there are errors processing them. This is so the end user has more context on why the command failed, to help with troubleshooting.
Echo output in the user logs can be explicitly controlled by the new commands `::echo::on` and `::echo::off`. By default, echoing is enabled if the `ACTIONS_STEP_DEBUG` secret is enabled; otherwise echoing is disabled.
## Decision
- The only commands that currently echo output are
- `remove-matcher`
- `add-matcher`
- `add-path`
- These will no longer echo the command, if processed successfully
- All commands echo the input when any of these conditions is fulfilled:
1. When such commands fail with an error
2. When `::echo::on` is set
3. When the `ACTIONS_STEP_DEBUG` is set, and echoing hasn't been explicitly disabled with `::echo::off`
- There are a few commands that won't be echoed, even when echo is enabled. These are (as of 2019/11/04):
- `add-mask`
- `debug`
- `warning`
- `error`
- The commands above will not echo, either because echoing the command would leak secrets (e.g. `add-mask`), or because it would not add any additional troubleshooting information to the logs (e.g. `debug`). It's expected that future commands will follow these "echo-suppressing" guidelines as well. Echo-suppressed commands are still free to output other information to the logs, as deemed fit. (A sketch of the decision follows.)
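A minimal sketch of that decision (illustrative TypeScript; names are invented, not the runner's actual identifiers):
```typescript
const neverEcho = new Set(['add-mask', 'debug', 'warning', 'error'])

// echoState: null if neither ::echo::on nor ::echo::off has been seen yet.
function shouldEchoCommand(
  command: string,
  processingFailed: boolean,
  echoState: boolean | null,
  stepDebug: boolean // ACTIONS_STEP_DEBUG
): boolean {
  if (neverEcho.has(command)) return false // would leak secrets or add no value
  if (processingFailed) return true
  if (echoState !== null) return echoState
  return stepDebug
}
```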


@@ -0,0 +1,48 @@
# ADR 0297: Base64 Masking Trailing Characters
**Date** 2020-01-21
**Status** Proposed
## Context
The Runner registers a number of Value Encoders, which mask various encodings of a provided secret. Currently, we register 3 base64 encoders:
- The base64 encoded secret
- The secret with the first character removed then base64 encoded
- The secret with the first two characters removed then base64 encoded
This gives us good coverage across the board for secrets and secrets with a prefix (i.e. `base64($user:$pass)`).
However, we don't have great coverage for cases where the secret has a string appended to it before it is base64 encoded (e.g. `base64($pass\n)`).
Most notably, we've seen this as a result of user error, where a user accidentally appends a newline or space character before encoding their secret in base64.
## Decision
### Trim end characters
We are going to modify all existing base64 encoders to trim information before registering as a secret.
We will trim:
- `=` from the end of all base64 strings. This is a padding character that contains no information.
- Based on the number of `=`'s at the end of a base64 string, a malicious user could predict the length of the original secret modulo 3.
- If a user saw `***==`, they would know the secret could be 1,4,7,10... characters.
- If a string contains `=` we will also trim the last non-padding character from the base64 secret.
- This character can change if a string is appended to the secret before the encoding.
### Register a fourth encoder
We will also add back in the original base64 encoded secret encoder for four total encoders:
- The base64 encoded secret
- The base64 encoded secret trimmed
- The secret with the first character removed then base64 encoded and trimmed
- The secret with the first two characters removed then base64 encoded and trimmed
This allows us to fully cover the most common scenario where a user base64 encodes their secret and expects the entire thing to be masked.
This will result in us only revealing length or bit information when a prefix or suffix is added to a secret before encoding.
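A minimal sketch of the trimming rule and the four registered values (illustrative TypeScript; the runner's encoders are C#):
```typescript
// Drop '=' padding; when padding was present, also drop the last non-padding
// character, since it can change when a suffix is appended before encoding.
function trimBase64(encoded: string): string {
  const withoutPadding = encoded.replace(/=+$/, '')
  return withoutPadding.length < encoded.length ? withoutPadding.slice(0, -1) : withoutPadding
}

function base64MaskValues(secret: string): string[] {
  const b64 = (s: string): string => Buffer.from(s, 'utf8').toString('base64')
  return [
    b64(secret),                      // base64 encoded secret
    trimBase64(b64(secret)),          // base64 encoded secret, trimmed
    trimBase64(b64(secret.slice(1))), // first char removed, encoded, trimmed
    trimBase64(b64(secret.slice(2)))  // first two chars removed, encoded, trimmed
  ]
}
```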
## Consequences
- In the case where a secret has a prefix or suffix added before base64 encoding, we may now reveal up to 20 bits of information and the length of the original string modulo 3, rather than the original 16 bits and no length information
- Secrets with a suffix appended before encoding will now be masked across the board. Previously they were only masked if the secret was a multiple of 3 characters
- Performance will suffer in a negligible way

docs/adrs/README.md

@@ -0,0 +1,19 @@
# ADRs
ADR, short for "Architecture Decision Record", is a way of capturing important architectural decisions, along with their context and consequences.
This folder includes ADRs for the actions runner. ADRs are proposed in the form of a pull request, and they commonly follow this format:
* **Title**: short present tense imperative phrase, less than 50 characters, like a git commit message.
* **Status**: proposed, accepted, rejected, deprecated, superseded, etc.
* **Context**: what is the issue that we're seeing that is motivating this decision or change.
* **Decision**: what is the change that we're actually proposing or doing.
* **Consequences**: what becomes easier or more difficult to do because of this change.
---
- More information about ADRs can be found [here](https://github.com/joelparkerhenderson/architecture_decision_record).


@@ -1,10 +1,31 @@
# Contribution guide for developers
# Contributions
## Required Dev Dependencies
We welcome contributions in the form of issues and pull requests. We view the contributions and the process as the same for GitHub and external contributors.
> IMPORTANT: Building your own runner is critical for the dev inner loop process when contributing changes. However, only runners built and distributed by GitHub (releases) are supported in production. Be aware that workflows and orchestrations run service side with the runner being a remote process to run steps. For that reason, the service can pull the runner forward so customizations can be lost.
## Issues
Log issues for both bugs and enhancement requests. Logging issues is important for the open community.
Issues in this repository should be for the runner application. Note that the virtual machine images (including the developer toolsets) installed on the Actions hosted machine pools are located [in this repository](https://github.com/actions/virtual-environments)
## Enhancements and Feature Requests
We ask that we have agreement on taking a change before significant time and effort are invested in code changes.
1. Create a feature request. Once agreed, we will take the enhancement
2. Create an ADR to agree on the details of the change.
An ADR is an Architectural Decision Record. This allows consensus on the direction forward and also serves as a record of the change and motivation. [Read more here](adrs/README.md)
## Development Life Cycle
### Required Dev Dependencies
![Win](res/win_sm.png) Git for Windows [Install Here](https://git-scm.com/downloads) (needed for dev sh script)
## To Build, Test, Layout
### To Build, Test, Layout
Navigate to the `src` directory and run the following command:
@@ -29,12 +50,12 @@ cd ./src
./dev.(sh/cmd) test # run all unit tests before git commit/push
```
## Editors
### Editors
[Using Visual Studio 2019](https://www.visualstudio.com/vs/)
[Using Visual Studio Code](https://code.visualstudio.com/)
[Using Visual Studio 2019](https://www.visualstudio.com/vs/)
## Styling
### Styling
We use the .NET Foundation and CoreCLR style guidelines [located here](https://github.com/dotnet/corefx/blob/master/Documentation/coding-guidelines/coding-style.md)


@@ -28,7 +28,7 @@ Execute ./bin/installdependencies.sh to install any missing Dotnet Core 3.0 depe
```
You can easily correct the problem by executing `./bin/installdependencies.sh`.
The `installdependencies.sh` script should install all required dependencies on all supported Linux versions
> Note: The `installdependencies.sh` script will try to use the default package management mechanism on your Linux flavor (ex. `yum`/`apt-get`/`apt`). You might need to deal with error coming from the package management mechanism related to your setup, like [#1353](https://github.com/Microsoft/vsts-agent/issues/1353)
> Note: The `installdependencies.sh` script will try to use the default package management mechanism on your Linux flavor (ex. `yum`/`apt-get`/`apt`).
### Full dependencies list


@@ -9,4 +9,4 @@
- Windows Server 2016 64-bit
- Windows Server 2019 64-bit
## [More .Net Core Prerequisites Information](https://docs.microsoft.com/en-us/dotnet/core/windows-prerequisites?tabs=netcore30)
## [More .NET Core Prerequisites Information](https://docs.microsoft.com/en-us/dotnet/core/windows-prerequisites?tabs=netcore30)


@@ -1,7 +0,0 @@
FROM mcr.microsoft.com/dotnet/core/runtime-deps:2.1
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
curl \
git \
&& rm -rf /var/lib/apt/lists/*


@@ -1,150 +0,0 @@
FROM centos:6
# Install dependencies
RUN yum install -y \
centos-release-SCL \
epel-release \
wget \
unzip \
&& \
rpm --import http://linuxsoft.cern.ch/cern/slc6X/x86_64/RPM-GPG-KEY-cern && \
wget -O /etc/yum.repos.d/slc6-devtoolset.repo http://linuxsoft.cern.ch/cern/devtoolset/slc6-devtoolset.repo && \
yum install -y \
"perl(Time::HiRes)" \
autoconf \
cmake \
cmake3 \
devtoolset-2-toolchain \
doxygen \
expat-devel \
gcc \
gcc-c++ \
gdb \
gettext-devel \
krb5-devel \
libedit-devel \
libidn-devel \
libmetalink-devel \
libnghttp2-devel \
libssh2-devel \
libunwind-devel \
libuuid-devel \
lttng-ust-devel \
lzma \
ncurses-devel \
openssl-devel \
perl-devel \
python-argparse \
python27 \
readline-devel \
swig \
xz \
zlib-devel \
&& \
yum clean all
# Build and install clang and lldb 3.9.1
RUN wget ftp://sourceware.org/pub/binutils/snapshots/binutils-2.29.1.tar.xz && \
wget http://releases.llvm.org/3.9.1/cfe-3.9.1.src.tar.xz && \
wget http://releases.llvm.org/3.9.1/llvm-3.9.1.src.tar.xz && \
wget http://releases.llvm.org/3.9.1/lldb-3.9.1.src.tar.xz && \
wget http://releases.llvm.org/3.9.1/compiler-rt-3.9.1.src.tar.xz && \
\
tar -xf binutils-2.29.1.tar.xz && \
tar -xf llvm-3.9.1.src.tar.xz && \
mkdir llvm-3.9.1.src/tools/clang && \
mkdir llvm-3.9.1.src/tools/lldb && \
mkdir llvm-3.9.1.src/projects/compiler-rt && \
tar -xf cfe-3.9.1.src.tar.xz --strip 1 -C llvm-3.9.1.src/tools/clang && \
tar -xf lldb-3.9.1.src.tar.xz --strip 1 -C llvm-3.9.1.src/tools/lldb && \
tar -xf compiler-rt-3.9.1.src.tar.xz --strip 1 -C llvm-3.9.1.src/projects/compiler-rt && \
rm binutils-2.29.1.tar.xz && \
rm cfe-3.9.1.src.tar.xz && \
rm lldb-3.9.1.src.tar.xz && \
rm llvm-3.9.1.src.tar.xz && \
rm compiler-rt-3.9.1.src.tar.xz && \
\
mkdir llvmbuild && \
cd llvmbuild && \
scl enable python27 devtoolset-2 \
' \
cmake3 \
-DCMAKE_CXX_COMPILER=/opt/rh/devtoolset-2/root/usr/bin/g++ \
-DCMAKE_C_COMPILER=/opt/rh/devtoolset-2/root/usr/bin/gcc \
-DCMAKE_LINKER=/opt/rh/devtoolset-2/root/usr/bin/ld \
-DCMAKE_BUILD_TYPE=Release \
-DLLVM_LIBDIR_SUFFIX=64 \
-DLLVM_ENABLE_EH=1 \
-DLLVM_ENABLE_RTTI=1 \
-DLLVM_BINUTILS_INCDIR=../binutils-2.29.1/include \
../llvm-3.9.1.src \
&& \
make -j $(($(getconf _NPROCESSORS_ONLN)+1)) && \
make install \
' && \
cd .. && \
rm -r llvmbuild && \
rm -r llvm-3.9.1.src && \
rm -r binutils-2.29.1
# Build and install curl 7.45.0
RUN wget https://curl.haxx.se/download/curl-7.45.0.tar.lzma && \
tar -xf curl-7.45.0.tar.lzma && \
rm curl-7.45.0.tar.lzma && \
cd curl-7.45.0 && \
scl enable python27 devtoolset-2 \
' \
./configure \
--disable-dict \
--disable-ftp \
--disable-gopher \
--disable-imap \
--disable-ldap \
--disable-ldaps \
--disable-libcurl-option \
--disable-manual \
--disable-pop3 \
--disable-rtsp \
--disable-smb \
--disable-smtp \
--disable-telnet \
--disable-tftp \
--enable-ipv6 \
--enable-optimize \
--enable-symbol-hiding \
--with-ca-bundle=/etc/pki/tls/certs/ca-bundle.crt \
--with-nghttp2 \
--with-gssapi \
--with-ssl \
--without-librtmp \
&& \
make install \
' && \
cd .. && \
rm -r curl-7.45.0
# Install ICU 57.1
RUN wget http://download.icu-project.org/files/icu4c/57.1/icu4c-57_1-RHEL6-x64.tgz && \
tar -xf icu4c-57_1-RHEL6-x64.tgz -C / && \
rm icu4c-57_1-RHEL6-x64.tgz
# Compile and install a version of the git that supports the features that cli repo build needs
# NOTE: The git needs to be built after the curl so that it can use the libcurl to add https
# protocol support.
RUN \
wget https://www.kernel.org/pub/software/scm/git/git-2.9.5.tar.gz && \
tar -xf git-2.9.5.tar.gz && \
rm git-2.9.5.tar.gz && \
cd git-2.9.5 && \
make configure && \
./configure --prefix=/usr/local --without-tcltk && \
make -j $(nproc --all) all && \
make install && \
cd .. && \
rm -r git-2.9.5
ENV LD_LIBRARY_PATH=/usr/local/lib


@@ -1,17 +1,30 @@
## Features
- Remove runner flow: Change from PAT to "deletion token" in prompt (#225)
- Expose github.run_id and github.run_number to action runtime env. (#224)
- Expose whether debug is on/off via RUNNER_DEBUG. (#253)
- Upload log on runner when worker get killed due to cancellation timeout. (#255)
- Update config.sh/cmd --help documentation (#282)
- Set http_proxy and related env vars for job/service containers (#304)
- Set both http_proxy and HTTP_PROXY env for runner/worker processes. (#298)
## Bugs
- Clean up error messages for container scenarios (#221)
- Pick shell from prependpath (#231)
- Verify runner Windows service has started successfully after configuration (#236)
- Detect source file path in L0 without using env. (#257)
- Handle escaped '%' in commands data section (#200)
- Allow container to be null/empty during matrix expansion (#266)
- Translate problem matcher file to host path (#272)
- Change hashFiles() expression function to use @actions/glob. (#268)
- Default post-job action's condition to always(). (#293)
- Support action.yaml file as action's entry file (#288)
- Trace javascript action exit code to debug instead of user logs (#290)
- Change prompt message when removing a runner to line up with the GitHub.com UI (#303)
- Include step.env as part of env context. (#300)
- Update Base64 Encoders to deal with suffixes (#284)
## Misc
- Runner code cleanup (#218 #227, #228, #229, #230)
- Consume dotnet core 3.1 in runner. (#213)
- Move .sln file under ./src (#238)
- Treat warnings as errors during compile (#249)
## Windows x64
We recommend configuring the runner under "<DRIVE>:\actions-runner". This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows
We recommend configuring the runner in a root folder of the Windows drive (e.g. "C:\actions-runner"). This will help avoid issues related to service identity folder permissions and long file path restrictions on Windows
```
// Create a folder under the drive root
mkdir \actions-runner ; cd \actions-runner
@@ -19,7 +32,7 @@ mkdir \actions-runner ; cd \actions-runner
Invoke-WebRequest -Uri https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-win-x64-<RUNNER_VERSION>.zip -OutFile actions-runner-win-x64-<RUNNER_VERSION>.zip
// Extract the installer
Add-Type -AssemblyName System.IO.Compression.FileSystem ;
[System.IO.Compression.ZipFile]::ExtractToDirectory("$HOME\Downloads\actions-runner-win-x64-<RUNNER_VERSION>.zip", "$PWD")
[System.IO.Compression.ZipFile]::ExtractToDirectory("$PWD\actions-runner-win-x64-<RUNNER_VERSION>.zip", "$PWD")
```
## OSX
@@ -28,7 +41,7 @@ Add-Type -AssemblyName System.IO.Compression.FileSystem ;
// Create a folder
mkdir actions-runner && cd actions-runner
// Download the latest runner package
curl -O https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-osx-x64-<RUNNER_VERSION>.tar.gz
curl -O -L https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-osx-x64-<RUNNER_VERSION>.tar.gz
// Extract the installer
tar xzf ./actions-runner-osx-x64-<RUNNER_VERSION>.tar.gz
```
@@ -39,7 +52,7 @@ tar xzf ./actions-runner-osx-x64-<RUNNER_VERSION>.tar.gz
// Create a folder
mkdir actions-runner && cd actions-runner
// Download the latest runner package
curl -O https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz
curl -O -L https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz
// Extract the installer
tar xzf ./actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz
```
@@ -50,7 +63,7 @@ tar xzf ./actions-runner-linux-x64-<RUNNER_VERSION>.tar.gz
// Create a folder
mkdir actions-runner && cd actions-runner
// Download the latest runner package
curl -O https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz
curl -O -L https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz
// Extract the installer
tar xzf ./actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz
```
@@ -61,7 +74,7 @@ tar xzf ./actions-runner-linux-arm64-<RUNNER_VERSION>.tar.gz
// Create a folder
mkdir actions-runner && cd actions-runner
// Download the latest runner package
curl -O https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-arm-<RUNNER_VERSION>.tar.gz
curl -O -L https://github.com/actions/runner/releases/download/v<RUNNER_VERSION>/actions-runner-linux-arm-<RUNNER_VERSION>.tar.gz
// Extract the installer
tar xzf ./actions-runner-linux-arm-<RUNNER_VERSION>.tar.gz
```


@@ -3,23 +3,23 @@ Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio Version 16
VisualStudioVersion = 16.0.29411.138
MinimumVisualStudioVersion = 10.0.40219.1
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Common", "src\Runner.Common\Runner.Common.csproj", "{084289A3-CD7A-42E0-9219-4348B4B7E19B}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Common", "Runner.Common\Runner.Common.csproj", "{084289A3-CD7A-42E0-9219-4348B4B7E19B}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Listener", "src\Runner.Listener\Runner.Listener.csproj", "{7D461AEE-BF2A-4855-BD96-56921160B36A}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Listener", "Runner.Listener\Runner.Listener.csproj", "{7D461AEE-BF2A-4855-BD96-56921160B36A}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.PluginHost", "src\Runner.PluginHost\Runner.PluginHost.csproj", "{D0320EB1-CB6D-4179-BFDC-2F2B664A370C}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.PluginHost", "Runner.PluginHost\Runner.PluginHost.csproj", "{D0320EB1-CB6D-4179-BFDC-2F2B664A370C}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Plugins", "src\Runner.Plugins\Runner.Plugins.csproj", "{C23AFD6F-4DCD-4243-BC61-865BE31B9168}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Plugins", "Runner.Plugins\Runner.Plugins.csproj", "{C23AFD6F-4DCD-4243-BC61-865BE31B9168}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Sdk", "src\Runner.Sdk\Runner.Sdk.csproj", "{D0484633-DA97-4C34-8E47-1DADE212A57A}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Sdk", "Runner.Sdk\Runner.Sdk.csproj", "{D0484633-DA97-4C34-8E47-1DADE212A57A}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "RunnerService", "src\Runner.Service\Windows\RunnerService.csproj", "{D12EBD71-0464-46D0-8394-40BCFBA0A6F2}"
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "RunnerService", "Runner.Service\Windows\RunnerService.csproj", "{D12EBD71-0464-46D0-8394-40BCFBA0A6F2}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Worker", "src\Runner.Worker\Runner.Worker.csproj", "{C2F5B9FA-2621-411F-8EB2-273ED276F503}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Runner.Worker", "Runner.Worker\Runner.Worker.csproj", "{C2F5B9FA-2621-411F-8EB2-273ED276F503}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Sdk", "src\Sdk\Sdk.csproj", "{D2EE812B-E4DF-49BB-AE87-12BC49949B5F}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Sdk", "Sdk\Sdk.csproj", "{D2EE812B-E4DF-49BB-AE87-12BC49949B5F}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Test", "src\Test\Test.csproj", "{C932061F-F6A1-4F1E-B854-A6C6B30DC3EF}"
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Test", "Test\Test.csproj", "{C932061F-F6A1-4F1E-B854-A6C6B30DC3EF}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution


@@ -46,4 +46,9 @@
<PropertyGroup Condition="'$(Configuration)' == 'Debug'">
<DefineConstants>$(DefineConstants);DEBUG</DefineConstants>
</PropertyGroup>
<!-- Treat warnings as errors -->
<PropertyGroup>
<TreatWarningsAsErrors>true</TreatWarningsAsErrors>
</PropertyGroup>
</Project>


@@ -0,0 +1,3 @@
dist/
lib/
node_modules/


@@ -0,0 +1,59 @@
{
"plugins": ["jest", "@typescript-eslint"],
"extends": ["plugin:github/es6"],
"parser": "@typescript-eslint/parser",
"parserOptions": {
"ecmaVersion": 9,
"sourceType": "module",
"project": "./tsconfig.json"
},
"rules": {
"eslint-comments/no-use": "off",
"import/no-namespace": "off",
"no-console": "off",
"no-unused-vars": "off",
"@typescript-eslint/no-unused-vars": "error",
"@typescript-eslint/explicit-member-accessibility": ["error", {"accessibility": "no-public"}],
"@typescript-eslint/no-require-imports": "error",
"@typescript-eslint/array-type": "error",
"@typescript-eslint/await-thenable": "error",
"@typescript-eslint/ban-ts-ignore": "error",
"camelcase": "off",
"@typescript-eslint/camelcase": "error",
"@typescript-eslint/class-name-casing": "error",
"@typescript-eslint/explicit-function-return-type": ["error", {"allowExpressions": true}],
"@typescript-eslint/func-call-spacing": ["error", "never"],
"@typescript-eslint/generic-type-naming": ["error", "^[A-Z][A-Za-z]*$"],
"@typescript-eslint/no-array-constructor": "error",
"@typescript-eslint/no-empty-interface": "error",
"@typescript-eslint/no-explicit-any": "error",
"@typescript-eslint/no-extraneous-class": "error",
"@typescript-eslint/no-for-in-array": "error",
"@typescript-eslint/no-inferrable-types": "error",
"@typescript-eslint/no-misused-new": "error",
"@typescript-eslint/no-namespace": "error",
"@typescript-eslint/no-non-null-assertion": "warn",
"@typescript-eslint/no-object-literal-type-assertion": "error",
"@typescript-eslint/no-unnecessary-qualifier": "error",
"@typescript-eslint/no-unnecessary-type-assertion": "error",
"@typescript-eslint/no-useless-constructor": "error",
"@typescript-eslint/no-var-requires": "error",
"@typescript-eslint/prefer-for-of": "warn",
"@typescript-eslint/prefer-function-type": "warn",
"@typescript-eslint/prefer-includes": "error",
"@typescript-eslint/prefer-interface": "error",
"@typescript-eslint/prefer-string-starts-ends-with": "error",
"@typescript-eslint/promise-function-async": "error",
"@typescript-eslint/require-array-sort-compare": "error",
"@typescript-eslint/restrict-plus-operands": "error",
"semi": "off",
"@typescript-eslint/semi": ["error", "never"],
"@typescript-eslint/type-annotation-spacing": "error",
"@typescript-eslint/unbound-method": "error"
},
"env": {
"node": true,
"es6": true,
"jest/globals": true
}
}

View File

@@ -0,0 +1,3 @@
dist/
lib/
node_modules/

View File

@@ -0,0 +1,11 @@
{
"printWidth": 80,
"tabWidth": 2,
"useTabs": false,
"semi": false,
"singleQuote": true,
"trailingComma": "none",
"bracketSpacing": false,
"arrowParens": "avoid",
"parser": "typescript"
}

View File

@@ -0,0 +1 @@
To update hashFiles under `Misc/layoutbin` run `npm install && npm run all`

File diff suppressed because it is too large

View File

@@ -0,0 +1,35 @@
{
"name": "hashFiles",
"version": "1.0.0",
"description": "GitHub Actions HashFiles() expression function",
"main": "lib/hashFiles.js",
"scripts": {
"build": "tsc",
"format": "prettier --write **/*.ts",
"format-check": "prettier --check **/*.ts",
"lint": "eslint src/**/*.ts",
"pack": "ncc build -o ../../layoutbin/hashFiles",
"all": "npm run build && npm run format && npm run lint && npm run pack"
},
"repository": {
"type": "git",
"url": "git+https://github.com/actions/runner.git"
},
"keywords": [
"actions"
],
"author": "GitHub Actions",
"license": "MIT",
"dependencies": {
"@actions/glob": "^0.1.0"
},
"devDependencies": {
"@types/node": "^12.7.12",
"@typescript-eslint/parser": "^2.8.0",
"@zeit/ncc": "^0.20.5",
"eslint": "^5.16.0",
"eslint-plugin-github": "^2.0.0",
"prettier": "^1.19.1",
"typescript": "^3.6.4"
}
}

View File

@@ -0,0 +1,55 @@
import * as glob from '@actions/glob'
import * as crypto from 'crypto'
import * as fs from 'fs'
import * as stream from 'stream'
import * as util from 'util'
import * as path from 'path'
async function run(): Promise<void> {
// arg0 -> node
// arg1 -> hashFiles.js
// env[followSymbolicLinks] = true/null
// env[patterns] -> glob patterns
let followSymbolicLinks = false
const matchPatterns = process.env.patterns || ''
if (process.env.followSymbolicLinks === 'true') {
console.log('Follow symbolic links')
followSymbolicLinks = true
}
console.log(`Match Pattern: ${matchPatterns}`)
let hasMatch = false
const githubWorkspace = process.cwd()
const result = crypto.createHash('sha256')
let count = 0
const globber = await glob.create(matchPatterns, {followSymbolicLinks})
for await (const file of globber.globGenerator()) {
console.log(file)
if (!file.startsWith(`${githubWorkspace}${path.sep}`)) {
console.log(`Ignore '${file}' since it is not under GITHUB_WORKSPACE.`)
continue
}
if (fs.statSync(file).isDirectory()) {
console.log(`Skip directory '${file}'.`)
continue
}
const hash = crypto.createHash('sha256')
const pipeline = util.promisify(stream.pipeline)
await pipeline(fs.createReadStream(file), hash)
result.write(hash.digest())
count++
if (!hasMatch) {
hasMatch = true
}
}
result.end()
if (hasMatch) {
console.log(`Find ${count} files to hash.`)
console.error(`__OUTPUT__${result.digest('hex')}__OUTPUT__`)
} else {
console.error(`__OUTPUT____OUTPUT__`)
}
}
run()
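
The script above communicates with its caller through environment variables (`patterns`, `followSymbolicLinks`) and prints the combined hash to stderr wrapped in `__OUTPUT__` markers. The following is a minimal standalone sketch of how a C# caller could drive that protocol; it is not the runner's actual invocation code, and the script path, workspace path, and pattern are placeholders.

```csharp
using System;
using System.Diagnostics;
using System.Text.RegularExpressions;

class HashFilesInvokeDemo
{
    static void Main()
    {
        // Path to the packed script and the workspace directory are placeholders.
        var psi = new ProcessStartInfo("node", "hashFiles/index.js")
        {
            WorkingDirectory = "/path/to/workspace",   // the script treats its cwd as GITHUB_WORKSPACE
            RedirectStandardOutput = true,
            RedirectStandardError = true,
            UseShellExecute = false
        };
        psi.Environment["patterns"] = "**/package-lock.json";   // env[patterns] -> glob patterns
        psi.Environment["followSymbolicLinks"] = "true";         // env[followSymbolicLinks] = true/null

        using (var process = Process.Start(psi))
        {
            string stderr = process.StandardError.ReadToEnd();
            process.WaitForExit();

            // The script writes the hash (or an empty string when nothing matched) to stderr
            // wrapped in __OUTPUT__ markers; stdout only carries progress logging.
            var match = Regex.Match(stderr, "__OUTPUT__([0-9a-f]*)__OUTPUT__");
            Console.WriteLine(match.Success ? $"hash: '{match.Groups[1].Value}'" : "no __OUTPUT__ marker found");
        }
    }
}
```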

View File

@@ -0,0 +1,12 @@
{
"compilerOptions": {
"target": "es6", /* Specify ECMAScript target version: 'ES3' (default), 'ES5', 'ES2015', 'ES2016', 'ES2017', 'ES2018', 'ES2019' or 'ESNEXT'. */
"module": "commonjs", /* Specify module code generation: 'none', 'commonjs', 'amd', 'system', 'umd', 'es2015', or 'ESNext'. */
"outDir": "./lib", /* Redirect output structure to the directory. */
"rootDir": "./src", /* Specify the root directory of input files. Use to control the output directory structure with --outDir. */
"strict": true, /* Enable all strict type-checking options. */
"noImplicitAny": true, /* Raise error on expressions and declarations with an implied 'any' type. */
"esModuleInterop": true /* Enables emit interoperability between CommonJS and ES Modules via creation of namespace objects for all imports. Implies 'allowSyntheticDefaultImports'. */
},
"exclude": ["node_modules", "**/*.test.ts"]
}

File diff suppressed because it is too large

View File

@@ -3,8 +3,6 @@
<packageSources>
<!--To inherit the global NuGet package sources remove the <clear/> line below -->
<clear />
<add key="dotnet-core" value="https://www.myget.org/F/dotnet-core/api/v3/index.json" />
<add key="dotnet-buildtools" value="https://www.myget.org/F/dotnet-buildtools/api/v3/index.json" />
<add key="api.nuget.org" value="https://api.nuget.org/v3/index.json" />
</packageSources>
</configuration>

View File

@@ -9,26 +9,27 @@ namespace GitHub.Runner.Common
{
private static readonly EscapeMapping[] _escapeMappings = new[]
{
new EscapeMapping(token: "%", replacement: "%25"),
new EscapeMapping(token: ";", replacement: "%3B"),
new EscapeMapping(token: "\r", replacement: "%0D"),
new EscapeMapping(token: "\n", replacement: "%0A"),
new EscapeMapping(token: "]", replacement: "%5D"),
new EscapeMapping(token: "%", replacement: "%25"),
};
private static readonly EscapeMapping[] _escapeDataMappings = new[]
{
new EscapeMapping(token: "\r", replacement: "%0D"),
new EscapeMapping(token: "\n", replacement: "%0A"),
new EscapeMapping(token: "%", replacement: "%25"),
};
private static readonly EscapeMapping[] _escapePropertyMappings = new[]
{
new EscapeMapping(token: "%", replacement: "%25"),
new EscapeMapping(token: "\r", replacement: "%0D"),
new EscapeMapping(token: "\n", replacement: "%0A"),
new EscapeMapping(token: ":", replacement: "%3A"),
new EscapeMapping(token: ",", replacement: "%2C"),
new EscapeMapping(token: "%", replacement: "%25"),
};
private readonly Dictionary<string, string> _properties = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
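
The change above moves the `%` mapping to the front of each table so `%` is escaped before any replacement that introduces a literal `%`. A minimal sketch (my own names, not the runner's types) of why the ordering matters:

```csharp
using System;
using System.Collections.Generic;

class EscapeOrderDemo
{
    // Apply token replacements in order, the same general pattern the command formatter uses.
    static string Escape(string value, IEnumerable<(string token, string replacement)> mappings)
    {
        foreach (var (token, replacement) in mappings)
        {
            value = value.Replace(token, replacement);
        }
        return value;
    }

    static void Main()
    {
        var percentFirst = new[] { ("%", "%25"), (";", "%3B") };
        var percentLast  = new[] { (";", "%3B"), ("%", "%25") };

        Console.WriteLine(Escape("a;b", percentFirst)); // a%3Bb   -- correct
        Console.WriteLine(Escape("a;b", percentLast));  // a%253Bb -- the '%' introduced by '%3B' got escaped again
    }
}
```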

View File

@@ -136,6 +136,12 @@ namespace GitHub.Runner.Common
}
}
public static class RunnerEvent
{
public static readonly string Register = "register";
public static readonly string Remove = "remove";
}
public static class Pipeline
{
public static class Path
@@ -162,7 +168,8 @@ namespace GitHub.Runner.Common
public static class Path
{
public static readonly string ActionsDirectory = "_actions";
public static readonly string ActionManifestFile = "action.yml";
public static readonly string ActionManifestYmlFile = "action.yml";
public static readonly string ActionManifestYamlFile = "action.yaml";
public static readonly string BinDirectory = "bin";
public static readonly string DiagDirectory = "_diag";
public static readonly string ExternalsDirectory = "externals";

View File

@@ -46,6 +46,7 @@ namespace GitHub.Runner.Common
Add<T>(extensions, "GitHub.Runner.Worker.SetOutputCommandExtension, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.SaveStateCommandExtension, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.AddPathCommandExtension, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.RefreshTokenCommandExtension, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.AddMaskCommandExtension, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.AddMatcherCommandExtension, Runner.Worker");
Add<T>(extensions, "GitHub.Runner.Worker.RemoveMatcherCommandExtension, Runner.Worker");

View File

@@ -22,6 +22,7 @@ namespace GitHub.Runner.Common
Task<List<TimelineRecord>> UpdateTimelineRecordsAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, IEnumerable<TimelineRecord> records, CancellationToken cancellationToken);
Task RaisePlanEventAsync<T>(Guid scopeIdentifier, string hubName, Guid planId, T eventData, CancellationToken cancellationToken) where T : JobEvent;
Task<Timeline> GetTimelineAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, CancellationToken cancellationToken);
Task<GitHubToken> RefreshGitHubTokenAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid jobId, CancellationToken cancellationToken);
}
public sealed class JobServer : RunnerService, IJobServer
@@ -113,5 +114,11 @@ namespace GitHub.Runner.Common
CheckConnection();
return _taskClient.GetTimelineAsync(scopeIdentifier, hubName, planId, timelineId, includeRecords: true, cancellationToken: cancellationToken);
}
public Task<GitHubToken> RefreshGitHubTokenAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid jobId, CancellationToken cancellationToken)
{
CheckConnection();
return _taskClient.RefreshTokenAsync(scopeIdentifier, hubName, planId, jobId, cancellationToken: cancellationToken);
}
}
}

View File

@@ -24,7 +24,6 @@ namespace GitHub.Runner.Common
private Guid _timelineId;
private Guid _timelineRecordId;
private string _pageId;
private FileStream _pageData;
private StreamWriter _pageWriter;
private int _byteCount;
@@ -40,7 +39,6 @@ namespace GitHub.Runner.Common
{
base.Initialize(hostContext);
_totalLines = 0;
_pageId = Guid.NewGuid().ToString();
_pagesFolder = Path.Combine(hostContext.GetDirectory(WellKnownDirectory.Diag), PagingFolder);
_jobServerQueue = HostContext.GetService<IJobServerQueue>();
Directory.CreateDirectory(_pagesFolder);
@@ -102,7 +100,7 @@ namespace GitHub.Runner.Common
{
EndPage();
_byteCount = 0;
_dataFileName = Path.Combine(_pagesFolder, $"{_pageId}_{++_pageCount}.log");
_dataFileName = Path.Combine(_pagesFolder, $"{_timelineId}_{_timelineRecordId}_{++_pageCount}.log");
_pageData = new FileStream(_dataFileName, FileMode.CreateNew);
_pageWriter = new StreamWriter(_pageData, System.Text.Encoding.UTF8);
}
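
With this change the page file name embeds the timeline id and timeline record id instead of a random per-logger GUID, which is what lets the listener attribute leftover pages to the right step later (see TryUploadUnfinishedLogs further down in this diff). A small sketch of the convention, with placeholder ids:

```csharp
using System;
using System.IO;

class PageLogNameDemo
{
    static void Main()
    {
        // Placeholder ids; in the runner these come from the job's timeline and timeline record.
        Guid timelineId = Guid.NewGuid();
        Guid recordId = Guid.NewGuid();
        int pageCount = 1;

        // Build a page file name the same way the new PagingLogger code does: {timelineId}_{recordId}_{page}.log
        string fileName = $"{timelineId}_{recordId}_{pageCount}.log";

        // Parse it back the way the listener later does, splitting the name on '_'.
        string[] parts = Path.GetFileNameWithoutExtension(fileName).Split('_', StringSplitOptions.RemoveEmptyEntries);
        Console.WriteLine($"timeline={Guid.Parse(parts[0])} record={Guid.Parse(parts[1])} page={int.Parse(parts[2])}");
    }
}
```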

View File

@@ -190,7 +190,7 @@ namespace GitHub.Runner.Listener
{
return GetArgOrPrompt(
name: Constants.Runner.CommandLine.Args.Token,
description: "Enter runner deletion token:",
description: "Enter runner remove token:",
defaultValue: string.Empty,
validator: Validators.NonEmptyValidator);
}
@@ -291,7 +291,7 @@ namespace GitHub.Runner.Listener
if (!string.IsNullOrEmpty(result))
{
// After reading the arg from the input command-line args, remove it from the Arg dictionary.
// This helps if a bad arg value was passed on the command line: when ConfigurationManager asks CommandSettings a second time,
// it will prompt for input instead of continuing to use the bad value.
_trace.Info($"Remove {name} from Arg dictionary.");
RemoveArg(name);

View File

@@ -1,19 +1,18 @@
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Sdk;
using GitHub.Services.Common;
using GitHub.Services.OAuth;
using GitHub.Services.WebApi;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Security.Cryptography;
using System.Threading.Tasks;
using System.Runtime.InteropServices;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Runtime.InteropServices;
using System.Security.Cryptography;
using System.Threading.Tasks;
namespace GitHub.Runner.Listener.Configuration
{
@@ -109,7 +108,7 @@ namespace GitHub.Runner.Listener.Configuration
{
runnerSettings.GitHubUrl = inputUrl;
var githubToken = command.GetRunnerRegisterToken();
GitHubAuthResult authResult = await GetTenantCredential(inputUrl, githubToken);
GitHubAuthResult authResult = await GetTenantCredential(inputUrl, githubToken, Constants.RunnerEvent.Register);
runnerSettings.ServerUrl = authResult.TenantUrl;
creds = authResult.ToVssCredentials();
Trace.Info("cred retrieved via GitHub auth");
@@ -277,12 +276,15 @@ namespace GitHub.Runner.Listener.Configuration
throw new NotSupportedException("Message queue listen OAuth token.");
}
// Testing agent connection, detect any protential connection issue, like local clock skew that cause OAuth token expired.
// Testing agent connection, detect any potential connection issue, like local clock skew that cause OAuth token expired.
var credMgr = HostContext.GetService<ICredentialManager>();
VssCredentials credential = credMgr.LoadCredentials();
try
{
await _runnerServer.ConnectAsync(new Uri(runnerSettings.ServerUrl), credential);
// ConnectAsync() hits _apis/connectionData which is an anonymous endpoint
// Need to hit an authenticate endpoint to trigger OAuth token exchange.
await _runnerServer.GetAgentPoolsAsync();
_term.WriteSuccessMessage("Runner connection is good");
}
catch (VssOAuthTokenRequestException ex) when (ex.Message.Contains("Current server time is"))
@@ -373,7 +375,7 @@ namespace GitHub.Runner.Listener.Configuration
else
{
var githubToken = command.GetRunnerDeletionToken();
GitHubAuthResult authResult = await GetTenantCredential(settings.GitHubUrl, githubToken);
GitHubAuthResult authResult = await GetTenantCredential(settings.GitHubUrl, githubToken, Constants.RunnerEvent.Remove);
creds = authResult.ToVssCredentials();
Trace.Info("cred retrieved via GitHub auth");
}
@@ -517,17 +519,23 @@ namespace GitHub.Runner.Listener.Configuration
}
}
private async Task<GitHubAuthResult> GetTenantCredential(string githubUrl, string githubToken)
private async Task<GitHubAuthResult> GetTenantCredential(string githubUrl, string githubToken, string runnerEvent)
{
var gitHubUrl = new UriBuilder(githubUrl);
var githubApiUrl = $"https://api.{gitHubUrl.Host}/repos/{gitHubUrl.Path.Trim('/')}/actions-runners/registration";
var gitHubUrlBuilder = new UriBuilder(githubUrl);
var githubApiUrl = $"{gitHubUrlBuilder.Scheme}://api.{gitHubUrlBuilder.Host}/actions/runner-registration";
using (var httpClientHandler = HostContext.CreateHttpClientHandler())
using (var httpClient = new HttpClient(httpClientHandler))
{
httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("RemoteAuth", githubToken);
httpClient.DefaultRequestHeaders.UserAgent.Add(HostContext.UserAgent);
httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/vnd.github.shuri-preview+json"));
var response = await httpClient.PostAsync(githubApiUrl, new StringContent("", null, "application/json"));
var bodyObject = new Dictionary<string, string>()
{
{"url", githubUrl},
{"runner_event", runnerEvent}
};
var response = await httpClient.PostAsync(githubApiUrl, new StringContent(StringUtil.ConvertToJson(bodyObject), null, "application/json"));
if (response.IsSuccessStatusCode)
{
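
For reference, a standalone sketch of the registration request the configuration code above assembles: POST to `{scheme}://api.{host}/actions/runner-registration` with a `RemoteAuth` token and a JSON body of `url` plus `runner_event`. The values below are placeholders, and JSON serialization uses System.Text.Json here rather than the runner's StringUtil helper.

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class RunnerRegistrationDemo
{
    static async Task Main()
    {
        // Placeholder inputs; in the runner these come from --url, the registration/remove token,
        // and Constants.RunnerEvent.Register or Constants.RunnerEvent.Remove.
        string githubUrl = "https://github.com/my-org/my-repo";
        string githubToken = "REGISTRATION-TOKEN";
        string runnerEvent = "register";

        var uriBuilder = new UriBuilder(githubUrl);
        string apiUrl = $"{uriBuilder.Scheme}://api.{uriBuilder.Host}/actions/runner-registration";

        using var httpClient = new HttpClient();
        httpClient.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("RemoteAuth", githubToken);
        httpClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/vnd.github.shuri-preview+json"));

        // Body shape matches the diff: {"url": "...", "runner_event": "register"|"remove"}.
        string body = JsonSerializer.Serialize(new Dictionary<string, string>
        {
            ["url"] = githubUrl,
            ["runner_event"] = runnerEvent
        });

        var response = await httpClient.PostAsync(apiUrl, new StringContent(body, Encoding.UTF8, "application/json"));
        Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
    }
}
```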

View File

@@ -678,6 +678,17 @@ namespace GitHub.Runner.Listener.Configuration
if (service != null)
{
service.Start();
try
{
_term.WriteLine("Waiting for service to start...");
service.WaitForStatus(ServiceControllerStatus.Running, TimeSpan.FromSeconds(60));
}
catch (System.ServiceProcess.TimeoutException)
{
throw new InvalidOperationException($"Cannot start the service {serviceName} in a timely fashion.");
}
_term.WriteLine($"Service {serviceName} started successfully");
}
else

View File

@@ -568,6 +568,10 @@ namespace GitHub.Runner.Listener
{
Trace.Info("worker process has been killed.");
}
// When the worker doesn't exit within the cancel timeout, the runner kills the worker process, so the worker won't finish uploading its job logs.
// The runner will try to upload these logs at this time.
await TryUploadUnfinishedLogs(message);
}
Trace.Info($"finish job request for job {message.JobId} with result: {resultOnAbandonOrCancel}");
@@ -712,6 +716,121 @@ namespace GitHub.Runner.Listener
}
}
// Best effort upload any logs for this job.
private async Task TryUploadUnfinishedLogs(Pipelines.AgentJobRequestMessage message)
{
Trace.Entering();
var logFolder = Path.Combine(HostContext.GetDirectory(WellKnownDirectory.Diag), PagingLogger.PagingFolder);
if (!Directory.Exists(logFolder))
{
return;
}
var logs = Directory.GetFiles(logFolder);
if (logs.Length == 0)
{
return;
}
try
{
var systemConnection = message.Resources.Endpoints.SingleOrDefault(x => string.Equals(x.Name, WellKnownServiceEndpointNames.SystemVssConnection));
ArgUtil.NotNull(systemConnection, nameof(systemConnection));
var jobServer = HostContext.GetService<IJobServer>();
VssCredentials jobServerCredential = VssUtil.GetVssCredential(systemConnection);
VssConnection jobConnection = VssUtil.CreateConnection(systemConnection.Url, jobServerCredential);
await jobServer.ConnectAsync(jobConnection);
var timeline = await jobServer.GetTimelineAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, message.Timeline.Id, CancellationToken.None);
var updatedRecords = new List<TimelineRecord>();
var logPages = new Dictionary<Guid, Dictionary<int, string>>();
var logRecords = new Dictionary<Guid, TimelineRecord>();
foreach (var log in logs)
{
var logName = Path.GetFileNameWithoutExtension(log);
var logNameParts = logName.Split('_', StringSplitOptions.RemoveEmptyEntries);
if (logNameParts.Length != 3)
{
Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_GUID_INT'.");
continue;
}
var logPageSeparator = logName.IndexOf('_');
var logRecordId = Guid.Empty;
var pageNumber = 0;
if (!Guid.TryParse(logNameParts[0], out Guid timelineId) || timelineId != timeline.Id)
{
Trace.Warning($"log file '{log}' is not belongs to current job");
continue;
}
if (!Guid.TryParse(logNameParts[1], out logRecordId))
{
Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_GUID_INT'.");
continue;
}
if (!int.TryParse(logNameParts[2], out pageNumber))
{
Trace.Warning($"log file '{log}' doesn't follow naming convension 'GUID_GUID_INT'.");
continue;
}
var record = timeline.Records.FirstOrDefault(x => x.Id == logRecordId);
if (record != null)
{
if (!logPages.ContainsKey(record.Id))
{
logPages[record.Id] = new Dictionary<int, string>();
logRecords[record.Id] = record;
}
logPages[record.Id][pageNumber] = log;
}
}
foreach (var pages in logPages)
{
var record = logRecords[pages.Key];
if (record.Log == null)
{
// Create the log
record.Log = await jobServer.CreateLogAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, new TaskLog(String.Format(@"logs\{0:D}", record.Id)), default(CancellationToken));
// Need to post timeline record updates to reflect the log creation
updatedRecords.Add(record.Clone());
}
for (var i = 1; i <= pages.Value.Count; i++)
{
var logFile = pages.Value[i];
// Upload the contents
using (FileStream fs = File.Open(logFile, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
var logUploaded = await jobServer.AppendLogContentAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, record.Log.Id, fs, default(CancellationToken));
}
Trace.Info($"Uploaded unfinished log '{logFile}' for current job.");
IOUtil.DeleteFile(logFile);
}
}
if (updatedRecords.Count > 0)
{
await jobServer.UpdateTimelineRecordsAsync(message.Plan.ScopeIdentifier, message.Plan.PlanType, message.Plan.PlanId, message.Timeline.Id, updatedRecords, CancellationToken.None);
}
}
catch (Exception ex)
{
// Ignore any error during log upload since it's best effort
Trace.Error(ex);
}
}
// TODO: We need to send detailInfo back to DT in order to add an issue for the job
private async Task CompleteJobRequestAsync(int poolId, Pipelines.AgentJobRequestMessage message, Guid lockToken, TaskResult result, string detailInfo = null)
{

View File

@@ -451,16 +451,38 @@ namespace GitHub.Runner.Listener
ext = "sh";
#endif
_term.WriteLine($@"
Commands:,
.{separator}config.{ext} Configures the runner
.{separator}config.{ext} remove Unconfigures the runner
.{separator}run.{ext} Runs the runner interactively. Does not require any options.
Commands:
.{separator}config.{ext} Configures the runner
.{separator}config.{ext} remove Unconfigures the runner
.{separator}run.{ext} Runs the runner interactively. Does not require any options.
Options:
--help Prints the help for each command
--version Prints the runner version
--commit Prints the runner commit
--help Prints the help for each command
");
Config Options:
--unattended Disable interactive prompts for missing arguments. Defaults will be used for missing options
--url string Repository to add the runner to. Required if unattended
--token string Registration token. Required if unattended
--name string Name of the runner to configure (default {Environment.MachineName ?? "myrunner"})
--work string Relative runner work directory (default {Constants.Path.WorkDirectory})
--replace Replace any existing runner with the same name (default false)");
#if OS_WINDOWS
_term.WriteLine($@" --runasservice Run the runner as a service");
_term.WriteLine($@" --windowslogonaccount string Account to run the service as. Requires runasservice");
_term.WriteLine($@" --windowslogonpassword string Password for the service account. Requires runasservice");
#endif
_term.WriteLine($@"
Examples:
Configure a runner non-interactively:
.{separator}config.{ext} --unattended --url <url> --token <token>
Configure a runner non-interactively, replacing any existing runner with the same name:
.{separator}config.{ext} --unattended --url <url> --token <token> --replace [--name <name>]");
#if OS_WINDOWS
_term.WriteLine($@" Configure a runner to run as a service:");
_term.WriteLine($@" .{separator}config.{ext} --url <url> --token <token> --runasservice");
#endif
}
}
}

View File

@@ -1,58 +0,0 @@
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Runner.Sdk;
using GitHub.Services.WebApi;
using GitHub.Build.WebApi;
namespace GitHub.Runner.Plugins.Artifact
{
// A client wrapper interacting with Build's Artifact API
public class BuildServer
{
private readonly BuildHttpClient _buildHttpClient;
public BuildServer(VssConnection connection)
{
ArgUtil.NotNull(connection, nameof(connection));
_buildHttpClient = connection.GetClient<BuildHttpClient>();
}
// Associate the specified artifact with a build, along with custom data.
public async Task<BuildArtifact> AssociateArtifact(
Guid projectId,
int pipelineId,
string jobId,
string name,
string type,
string data,
Dictionary<string, string> propertiesDictionary,
CancellationToken cancellationToken = default(CancellationToken))
{
BuildArtifact artifact = new BuildArtifact()
{
Name = name,
Source = jobId,
Resource = new ArtifactResource()
{
Data = data,
Type = type,
Properties = propertiesDictionary
}
};
return await _buildHttpClient.CreateArtifactAsync(artifact, projectId, pipelineId, cancellationToken: cancellationToken);
}
// Get named artifact from a build
public async Task<BuildArtifact> GetArtifact(
Guid projectId,
int pipelineId,
string name,
CancellationToken cancellationToken)
{
return await _buildHttpClient.GetArtifactAsync(projectId, pipelineId, name, cancellationToken: cancellationToken);
}
}
}

View File

@@ -3,7 +3,6 @@ using System.Collections.Generic;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Build.WebApi;
using GitHub.Services.Common;
using GitHub.Runner.Sdk;
@@ -40,70 +39,31 @@ namespace GitHub.Runner.Plugins.Artifact
targetPath = Path.IsPathFullyQualified(targetPath) ? targetPath : Path.GetFullPath(Path.Combine(defaultWorkingDirectory, targetPath));
// Project ID
Guid projectId = new Guid(context.Variables.GetValueOrDefault(BuildVariables.TeamProjectId)?.Value ?? Guid.Empty.ToString());
// Build ID
string buildIdStr = context.Variables.GetValueOrDefault(BuildVariables.BuildId)?.Value ?? string.Empty;
string buildIdStr = context.Variables.GetValueOrDefault(SdkConstants.Variables.Build.BuildId)?.Value ?? string.Empty;
if (!int.TryParse(buildIdStr, out int buildId))
{
throw new ArgumentException($"Run Id is not an Int32: {buildIdStr}");
}
// Determine whether to call Pipelines or Build endpoint to publish artifact based on variable setting
string usePipelinesArtifactEndpointVar = context.Variables.GetValueOrDefault("Runner.UseActionsArtifactsApis")?.Value;
bool.TryParse(usePipelinesArtifactEndpointVar, out bool usePipelinesArtifactEndpoint);
string containerPath;
long containerId;
context.Output($"Downloading artifact '{artifactName}' to: '{targetPath}'");
if (usePipelinesArtifactEndpoint)
// Definition ID is a dummy value only used by HTTP client routing purposes
int definitionId = 1;
var pipelinesHelper = new PipelinesServer(context.VssConnection);
var actionsStorageArtifact = await pipelinesHelper.GetActionsStorageArtifact(definitionId, buildId, artifactName, token);
if (actionsStorageArtifact == null)
{
context.Debug("Downloading artifact using v2 endpoint");
// Definition ID is a dummy value only used by HTTP client routing purposes
int definitionId = 1;
var pipelinesHelper = new PipelinesServer(context.VssConnection);
var actionsStorageArtifact = await pipelinesHelper.GetActionsStorageArtifact(definitionId, buildId, artifactName, token);
if (actionsStorageArtifact == null)
{
throw new Exception($"The actions storage artifact for '{artifactName}' could not be found, or is no longer available");
}
containerPath = actionsStorageArtifact.Name; // In actions storage artifacts, name equals the path
containerId = actionsStorageArtifact.ContainerId;
}
else
{
context.Debug("Downloading artifact using v1 endpoint");
BuildServer buildHelper = new BuildServer(context.VssConnection);
BuildArtifact buildArtifact = await buildHelper.GetArtifact(projectId, buildId, artifactName, token);
if (string.Equals(buildArtifact.Resource.Type, "Container", StringComparison.OrdinalIgnoreCase) ||
// Artifact was published by Pipelines endpoint, check new type here to handle rollback scenario
string.Equals(buildArtifact.Resource.Type, "Actions_Storage", StringComparison.OrdinalIgnoreCase))
{
string containerUrl = buildArtifact.Resource.Data;
string[] parts = containerUrl.Split(new[] { '/' }, 3);
if (parts.Length < 3 || !long.TryParse(parts[1], out containerId))
{
throw new ArgumentOutOfRangeException($"Invalid container url '{containerUrl}' for artifact '{buildArtifact.Name}'");
}
containerPath = parts[2];
}
else
{
throw new NotSupportedException($"Invalid artifact type: {buildArtifact.Resource.Type}");
}
throw new Exception($"The actions storage artifact for '{artifactName}' could not be found, or is no longer available");
}
FileContainerServer fileContainerServer = new FileContainerServer(context.VssConnection, projectId, containerId, containerPath);
string containerPath = actionsStorageArtifact.Name; // In actions storage artifacts, name equals the path
long containerId = actionsStorageArtifact.ContainerId;
FileContainerServer fileContainerServer = new FileContainerServer(context.VssConnection, projectId: new Guid(), containerId, containerPath);
await fileContainerServer.DownloadFromContainerAsync(context, targetPath, token);
context.Output("Artifact download finished.");

View File

@@ -4,9 +4,7 @@ using System.IO;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Build.WebApi;
using GitHub.Services.Common;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Sdk;
namespace GitHub.Runner.Plugins.Artifact
@@ -45,11 +43,8 @@ namespace GitHub.Runner.Plugins.Artifact
throw new ArgumentException($"Artifact name is not valid: {artifactName}. It cannot contain '\\', '/', \"', ':', '<', '>', '|', '*', and '?'");
}
// Project ID
Guid projectId = new Guid(context.Variables.GetValueOrDefault(BuildVariables.TeamProjectId)?.Value ?? Guid.Empty.ToString());
// Build ID
string buildIdStr = context.Variables.GetValueOrDefault(BuildVariables.BuildId)?.Value ?? string.Empty;
string buildIdStr = context.Variables.GetValueOrDefault(SdkConstants.Variables.Build.BuildId)?.Value ?? string.Empty;
if (!int.TryParse(buildIdStr, out int buildId))
{
throw new ArgumentException($"Run Id is not an Int32: {buildIdStr}");
@@ -65,7 +60,7 @@ namespace GitHub.Runner.Plugins.Artifact
}
// Container ID
string containerIdStr = context.Variables.GetValueOrDefault(BuildVariables.ContainerId)?.Value ?? string.Empty;
string containerIdStr = context.Variables.GetValueOrDefault(SdkConstants.Variables.Build.ContainerId)?.Value ?? string.Empty;
if (!long.TryParse(containerIdStr, out long containerId))
{
throw new ArgumentException($"Container Id is not an Int64: {containerIdStr}");
@@ -73,7 +68,7 @@ namespace GitHub.Runner.Plugins.Artifact
context.Output($"Uploading artifact '{artifactName}' from '{fullPath}' for run #{buildId}");
FileContainerServer fileContainerHelper = new FileContainerServer(context.VssConnection, projectId, containerId, artifactName);
FileContainerServer fileContainerHelper = new FileContainerServer(context.VssConnection, projectId: Guid.Empty, containerId, artifactName);
var propertiesDictionary = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
long size = 0;
@@ -89,38 +84,20 @@ namespace GitHub.Runner.Plugins.Artifact
// if any of the results were successful, make sure to attach them to the build
finally
{
// Determine whether to call Pipelines or Build endpoint to publish artifact based on variable setting
string usePipelinesArtifactEndpointVar = context.Variables.GetValueOrDefault("Runner.UseActionsArtifactsApis")?.Value;
bool.TryParse(usePipelinesArtifactEndpointVar, out bool usePipelinesArtifactEndpoint);
// Definition ID is a dummy value only used by HTTP client routing purposes
int definitionId = 1;
if (usePipelinesArtifactEndpoint)
{
// Definition ID is a dummy value only used by HTTP client routing purposes
int definitionId = 1;
PipelinesServer pipelinesHelper = new PipelinesServer(context.VssConnection);
PipelinesServer pipelinesHelper = new PipelinesServer(context.VssConnection);
var artifact = await pipelinesHelper.AssociateActionsStorageArtifactAsync(
definitionId,
buildId,
containerId,
artifactName,
size,
token);
var artifact = await pipelinesHelper.AssociateActionsStorageArtifactAsync(
definitionId,
buildId,
containerId,
artifactName,
size,
token);
context.Output($"Associated artifact {artifactName} ({artifact.ContainerId}) with run #{buildId}");
context.Debug($"Associated artifact using v2 endpoint");
}
else
{
string fileContainerFullPath = StringUtil.Format($"#/{containerId}/{artifactName}");
BuildServer buildHelper = new BuildServer(context.VssConnection);
string jobId = context.Variables.GetValueOrDefault(WellKnownDistributedTaskVariables.JobId).Value ?? string.Empty;
var artifact = await buildHelper.AssociateArtifact(projectId, buildId, jobId, artifactName, ArtifactResourceTypes.Container, fileContainerFullPath, propertiesDictionary, token);
context.Output($"Associated artifact {artifactName} ({artifact.Id}) with run #{buildId}");
context.Debug($"Associated artifact using v1 endpoint");
}
context.Output($"Associated artifact {artifactName} ({artifact.ContainerId}) with run #{buildId}");
}
}
}

View File

@@ -21,6 +21,7 @@ namespace GitHub.Runner.Sdk
private string _httpsProxyAddress;
private string _httpsProxyUsername;
private string _httpsProxyPassword;
private string _noProxyString;
private readonly List<ByPassInfo> _noProxyList = new List<ByPassInfo>();
private readonly HashSet<string> _noProxyUnique = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
@@ -33,6 +34,7 @@ namespace GitHub.Runner.Sdk
public string HttpsProxyAddress => _httpsProxyAddress;
public string HttpsProxyUsername => _httpsProxyUsername;
public string HttpsProxyPassword => _httpsProxyPassword;
public string NoProxyString => _noProxyString;
public List<ByPassInfo> NoProxyList => _noProxyList;
@@ -71,6 +73,10 @@ namespace GitHub.Runner.Sdk
{
_httpProxyAddress = proxyHttpUri.AbsoluteUri;
// Set both environment variables since some tools support both casings (curl, wget) while others support only one (docker)
Environment.SetEnvironmentVariable("HTTP_PROXY", _httpProxyAddress);
Environment.SetEnvironmentVariable("http_proxy", _httpProxyAddress);
// the proxy url looks like http://[user:pass@]127.0.0.1:8888
var userInfo = Uri.UnescapeDataString(proxyHttpUri.UserInfo).Split(':', 2, StringSplitOptions.RemoveEmptyEntries);
if (userInfo.Length == 2)
@@ -97,6 +103,10 @@ namespace GitHub.Runner.Sdk
{
_httpsProxyAddress = proxyHttpsUri.AbsoluteUri;
// Set both environment variables since some tools support both casings (curl, wget) while others support only one (docker)
Environment.SetEnvironmentVariable("HTTPS_PROXY", _httpsProxyAddress);
Environment.SetEnvironmentVariable("https_proxy", _httpsProxyAddress);
// the proxy url looks like http://[user:pass@]127.0.0.1:8888
var userInfo = Uri.UnescapeDataString(proxyHttpsUri.UserInfo).Split(':', 2, StringSplitOptions.RemoveEmptyEntries);
if (userInfo.Length == 2)
@@ -121,6 +131,12 @@ namespace GitHub.Runner.Sdk
if (!string.IsNullOrEmpty(noProxyList))
{
_noProxyString = noProxyList;
// Set both environment variables since some tools support both casings (curl, wget) while others support only one (docker)
Environment.SetEnvironmentVariable("NO_PROXY", noProxyList);
Environment.SetEnvironmentVariable("no_proxy", noProxyList);
var noProxyListSplit = noProxyList.Split(',', StringSplitOptions.RemoveEmptyEntries);
foreach (string noProxy in noProxyListSplit)
{
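
The comments in this diff note that different tools read different casings of the proxy variables, so the runner exports both. A minimal sketch of the resulting environment and of splitting the bypass list, with placeholder values:

```csharp
using System;

class ProxyEnvDemo
{
    static void Main()
    {
        // Placeholder proxy settings; in the runner these come from the http_proxy / no_proxy configuration.
        string httpProxy = "http://127.0.0.1:8888";
        string noProxy = "example.com, 127.0.0.1,localhost";

        // Export both casings so tools that read either one (curl, wget, docker) see the proxy.
        Environment.SetEnvironmentVariable("HTTP_PROXY", httpProxy);
        Environment.SetEnvironmentVariable("http_proxy", httpProxy);
        Environment.SetEnvironmentVariable("NO_PROXY", noProxy);
        Environment.SetEnvironmentVariable("no_proxy", noProxy);

        // Split the bypass list on ',' the same way the runner does, dropping empty entries.
        string[] bypass = noProxy.Split(',', StringSplitOptions.RemoveEmptyEntries);
        foreach (string entry in bypass)
        {
            Console.WriteLine($"bypass proxy for: '{entry.Trim()}'");
        }
    }
}
```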

View File

@@ -0,0 +1,19 @@
using System;
namespace GitHub.Runner.Sdk
{
public class SdkConstants
{
public static class Variables
{
public static class Build
{
// Legacy "build" variables historically used by the runner
// DO NOT add new variables here -- instead use either the Actions or Runner namespaces
public const String BuildId = "build.buildId";
public const String BuildNumber = "build.buildNumber";
public const String ContainerId = "build.containerId";
}
}
}
}

View File

@@ -1,6 +1,7 @@
using GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Common.Util;
using GitHub.Runner.Worker.Container;
using System;
using System.Collections.Generic;
using System.IO;
@@ -15,14 +16,14 @@ namespace GitHub.Runner.Worker
{
void EnablePluginInternalCommand();
void DisablePluginInternalCommand();
bool TryProcessCommand(IExecutionContext context, string input);
bool TryProcessCommand(IExecutionContext context, string input, ContainerInfo container);
}
public sealed class ActionCommandManager : RunnerService, IActionCommandManager
{
private const string _stopCommand = "stop-commands";
private readonly Dictionary<string, IActionCommandExtension> _commandExtensions = new Dictionary<string, IActionCommandExtension>(StringComparer.OrdinalIgnoreCase);
private HashSet<string> _registeredCommands = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
private readonly HashSet<string> _registeredCommands = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
private readonly object _commandSerializeLock = new object();
private bool _stopProcessCommand = false;
private string _stopToken = null;
@@ -58,7 +59,7 @@ namespace GitHub.Runner.Worker
_registeredCommands.Remove("internal-set-repo-path");
}
public bool TryProcessCommand(IExecutionContext context, string input)
public bool TryProcessCommand(IExecutionContext context, string input, ContainerInfo container)
{
if (string.IsNullOrEmpty(input))
{
@@ -114,7 +115,7 @@ namespace GitHub.Runner.Worker
try
{
extension.ProcessCommand(context, input, actionCommand);
extension.ProcessCommand(context, input, actionCommand, container);
}
catch (Exception ex)
{
@@ -140,7 +141,7 @@ namespace GitHub.Runner.Worker
string Command { get; }
bool OmitEcho { get; }
void ProcessCommand(IExecutionContext context, string line, ActionCommand command);
void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container);
}
public sealed class InternalPluginSetRepoPathCommandExtension : RunnerService, IActionCommandExtension
@@ -150,7 +151,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (!command.Properties.TryGetValue(SetRepoPathCommandProperties.repoFullName, out string repoFullName) || string.IsNullOrEmpty(repoFullName))
{
@@ -180,7 +181,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (!command.Properties.TryGetValue(SetEnvCommandProperties.Name, out string envName) || string.IsNullOrEmpty(envName))
{
@@ -205,7 +206,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (!command.Properties.TryGetValue(SetOutputCommandProperties.Name, out string outputName) || string.IsNullOrEmpty(outputName))
{
@@ -229,7 +230,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (!command.Properties.TryGetValue(SaveStateCommandProperties.Name, out string stateName) || string.IsNullOrEmpty(stateName))
{
@@ -253,7 +254,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
if (string.IsNullOrWhiteSpace(command.Data))
{
@@ -279,7 +280,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
ArgUtil.NotNullOrEmpty(command.Data, "path");
context.PrependPath.RemoveAll(x => string.Equals(x, command.Data, StringComparison.CurrentCulture));
@@ -287,6 +288,30 @@ namespace GitHub.Runner.Worker
}
}
public sealed class RefreshTokenCommandExtension : RunnerService, IActionCommandExtension
{
public string Command => "refresh-token";
public bool OmitEcho => false;
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
ArgUtil.NotNullOrEmpty(command.Data, "token file");
var runnerTemp = HostContext.GetDirectory(WellKnownDirectory.Temp);
var tempFile = Path.Combine(runnerTemp, command.Data);
if (!tempFile.StartsWith(runnerTemp + Path.DirectorySeparatorChar))
{
throw new Exception($"'{command.Data}' has to be under runner temp directory '{runnerTemp}'");
}
Trace.Info("Here");
var githubToken = context.GetGitHubToken().GetAwaiter().GetResult();
File.WriteAllText(tempFile, githubToken);
Trace.Info("HereAgain");
}
}
public sealed class AddMatcherCommandExtension : RunnerService, IActionCommandExtension
{
public string Command => "add-matcher";
@@ -294,7 +319,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
var file = command.Data;
@@ -306,9 +331,9 @@ namespace GitHub.Runner.Worker
}
// Translate file path back from container path
if (context.Container != null)
if (container != null)
{
file = context.Container.TranslateToHostPath(file);
file = container.TranslateToHostPath(file);
}
// Root the path
@@ -341,7 +366,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
command.Properties.TryGetValue(RemoveMatcherCommandProperties.Owner, out string owner);
var file = command.Data;
@@ -369,9 +394,9 @@ namespace GitHub.Runner.Worker
else
{
// Translate file path back from container path
if (context.Container != null)
if (container != null)
{
file = context.Container.TranslateToHostPath(file);
file = container.TranslateToHostPath(file);
}
// Root the path
@@ -409,7 +434,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string inputLine, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string inputLine, ActionCommand command, ContainerInfo container)
{
context.Debug(command.Data);
}
@@ -437,7 +462,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string inputLine, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string inputLine, ActionCommand command, ContainerInfo container)
{
command.Properties.TryGetValue(IssueCommandProperties.File, out string file);
command.Properties.TryGetValue(IssueCommandProperties.Line, out string line);
@@ -454,10 +479,10 @@ namespace GitHub.Runner.Worker
{
issue.Category = "Code";
if (context.Container != null)
if (container != null)
{
// Translate file path back from container path
file = context.Container.TranslateToHostPath(file);
file = container.TranslateToHostPath(file);
command.Properties[IssueCommandProperties.File] = file;
}
@@ -517,7 +542,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
var data = this is GroupCommandExtension ? command.Data : string.Empty;
context.Output($"##[{Command}]{data}");
@@ -531,7 +556,7 @@ namespace GitHub.Runner.Worker
public Type ExtensionType => typeof(IActionCommandExtension);
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command)
public void ProcessCommand(IExecutionContext context, string line, ActionCommand command, ContainerInfo container)
{
ArgUtil.NotNullOrEmpty(command.Data, "value");
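
The new ContainerInfo parameter threaded through these extensions exists because commands such as add-matcher, remove-matcher, and issue commands report file paths as seen inside the container that ran the step; the runner must map them back to host paths before it can open them. A hedged illustration of that mapping (the real logic is ContainerInfo.TranslateToHostPath; the mount roots below are made up):

```csharp
using System;

class ContainerPathDemo
{
    // Hypothetical stand-in for ContainerInfo.TranslateToHostPath: rewrite a path reported
    // from inside the job container back to the matching host path.
    static string TranslateToHostPath(string path, string containerRoot, string hostRoot)
    {
        return path.StartsWith(containerRoot, StringComparison.OrdinalIgnoreCase)
            ? hostRoot + path.Substring(containerRoot.Length)
            : path;
    }

    static void Main()
    {
        // e.g. a problem-matcher file path emitted by an ::add-matcher:: command inside a container
        string containerPath = "/github/workspace/.github/matchers.json";
        Console.WriteLine(TranslateToHostPath(containerPath, "/github/workspace", "/home/runner/work/repo/repo"));
        // -> /home/runner/work/repo/repo/.github/matchers.json
    }
}
```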

View File

@@ -21,11 +21,24 @@ using PipelineTemplateConstants = GitHub.DistributedTask.Pipelines.ObjectTemplat
namespace GitHub.Runner.Worker
{
public class PrepareResult
{
public PrepareResult(List<JobExtensionRunner> containerSetupSteps, Dictionary<Guid, IActionRunner> preStepTracker)
{
this.ContainerSetupSteps = containerSetupSteps;
this.PreStepTracker = preStepTracker;
}
public List<JobExtensionRunner> ContainerSetupSteps { get; set; }
public Dictionary<Guid, IActionRunner> PreStepTracker { get; set; }
}
[ServiceLocator(Default = typeof(ActionManager))]
public interface IActionManager : IRunnerService
{
Dictionary<Guid, ContainerInfo> CachedActionContainers { get; }
Task<List<JobExtensionRunner>> PrepareActionsAsync(IExecutionContext executionContext, IEnumerable<Pipelines.JobStep> steps);
Task<PrepareResult> PrepareActionsAsync(IExecutionContext executionContext, IEnumerable<Pipelines.JobStep> steps);
Definition LoadAction(IExecutionContext executionContext, Pipelines.ActionStep action);
}
@@ -33,13 +46,13 @@ namespace GitHub.Runner.Worker
{
private const int _defaultFileStreamBufferSize = 4096;
//81920 is the default used by System.IO.Stream.CopyTo and is under the large object heap threshold (85k).
private const int _defaultCopyBufferSize = 81920;
private readonly Dictionary<Guid, ContainerInfo> _cachedActionContainers = new Dictionary<Guid, ContainerInfo>();
public Dictionary<Guid, ContainerInfo> CachedActionContainers => _cachedActionContainers;
public async Task<List<JobExtensionRunner>> PrepareActionsAsync(IExecutionContext executionContext, IEnumerable<Pipelines.JobStep> steps)
public async Task<PrepareResult> PrepareActionsAsync(IExecutionContext executionContext, IEnumerable<Pipelines.JobStep> steps)
{
ArgUtil.NotNull(executionContext, nameof(executionContext));
ArgUtil.NotNull(steps, nameof(steps));
@@ -49,6 +62,7 @@ namespace GitHub.Runner.Worker
Dictionary<string, List<Guid>> imagesToBuild = new Dictionary<string, List<Guid>>(StringComparer.OrdinalIgnoreCase);
Dictionary<string, ActionContainer> imagesToBuildInfo = new Dictionary<string, ActionContainer>(StringComparer.OrdinalIgnoreCase);
List<JobExtensionRunner> containerSetupSteps = new List<JobExtensionRunner>();
Dictionary<Guid, IActionRunner> preStepTracker = new Dictionary<Guid, IActionRunner>();
IEnumerable<Pipelines.ActionStep> actions = steps.OfType<Pipelines.ActionStep>();
// TODO: Deprecate the PREVIEW_ACTION_TOKEN
@@ -111,6 +125,21 @@ namespace GitHub.Runner.Worker
imagesToBuildInfo[setupInfo.ActionRepository] = setupInfo;
}
}
var repoAction = action.Reference as Pipelines.RepositoryPathReference;
if (repoAction.RepositoryType != Pipelines.PipelineConstants.SelfAlias)
{
var definition = LoadAction(executionContext, action);
if (definition.Data.Execution.HasInit)
{
var actionRunner = HostContext.CreateService<IActionRunner>();
actionRunner.Action = action;
actionRunner.Stage = ActionRunStage.Pre;
actionRunner.Condition = definition.Data.Execution.InitCondition;
preStepTracker[action.Id] = actionRunner;
}
}
}
}
@@ -147,7 +176,7 @@ namespace GitHub.Runner.Worker
}
#endif
return containerSetupSteps;
return new PrepareResult(containerSetupSteps, preStepTracker);
}
public Definition LoadAction(IExecutionContext executionContext, Pipelines.ActionStep action)
@@ -162,6 +191,7 @@ namespace GitHub.Runner.Worker
Data = new ActionDefinitionData()
};
// bool localAction = false;
if (action.Reference.Type == Pipelines.ActionSourceType.ContainerRegistry)
{
Trace.Info("Load action that reference container from registry.");
@@ -180,6 +210,7 @@ namespace GitHub.Runner.Worker
var repoAction = action.Reference as Pipelines.RepositoryPathReference;
if (string.Equals(repoAction.RepositoryType, Pipelines.PipelineConstants.SelfAlias, StringComparison.OrdinalIgnoreCase))
{
// localAction = true;
actionDirectory = executionContext.GetGitHubContext("workspace");
if (!string.IsNullOrEmpty(repoAction.Path))
{
@@ -198,14 +229,21 @@ namespace GitHub.Runner.Worker
Trace.Info($"Load action that reference repository from '{actionDirectory}'");
definition.Directory = actionDirectory;
string manifestFile = Path.Combine(actionDirectory, "action.yml");
string manifestFile = Path.Combine(actionDirectory, Constants.Path.ActionManifestYmlFile);
string manifestFileYaml = Path.Combine(actionDirectory, Constants.Path.ActionManifestYamlFile);
string dockerFile = Path.Combine(actionDirectory, "Dockerfile");
string dockerFileLowerCase = Path.Combine(actionDirectory, "dockerfile");
if (File.Exists(manifestFile))
if (File.Exists(manifestFile) || File.Exists(manifestFileYaml))
{
var manifestManager = HostContext.GetService<IActionManifestManager>();
definition.Data = manifestManager.Load(executionContext, manifestFile);
if (File.Exists(manifestFile))
{
definition.Data = manifestManager.Load(executionContext, manifestFile);
}
else
{
definition.Data = manifestManager.Load(executionContext, manifestFileYaml);
}
Trace.Verbose($"Action friendly name: '{definition.Data.Name}'");
Trace.Verbose($"Action description: '{definition.Data.Description}'");
@@ -232,6 +270,11 @@ namespace GitHub.Runner.Worker
Trace.Info($"Action container env: {StringUtil.ConvertToJson(containerAction.Environment)}.");
}
if (!string.IsNullOrEmpty(containerAction.Init))
{
Trace.Info($"Action container init entrypoint: {containerAction.Init}.");
}
if (!string.IsNullOrEmpty(containerAction.EntryPoint))
{
Trace.Info($"Action container entrypoint: {containerAction.EntryPoint}.");
@@ -251,6 +294,7 @@ namespace GitHub.Runner.Worker
else if (definition.Data.Execution.ExecutionType == ActionExecutionType.NodeJS)
{
var nodeAction = definition.Data.Execution as NodeJSActionExecutionData;
Trace.Info($"Action init node.js file: {nodeAction.Init ?? "N/A"}.");
Trace.Info($"Action node.js file: {nodeAction.Script}.");
Trace.Info($"Action cleanup node.js file: {nodeAction.Cleanup ?? "N/A"}.");
}
@@ -314,7 +358,7 @@ namespace GitHub.Runner.Worker
else
{
var fullPath = IOUtil.ResolvePath(actionDirectory, "."); // resolve full path without access filesystem.
throw new NotSupportedException($"Can't find 'action.yml' or 'Dockerfile' under '{fullPath}'. Did you forget to run actions/checkout before running your local action?");
throw new NotSupportedException($"Can't find 'action.yml', 'action.yaml' or 'Dockerfile' under '{fullPath}'. Did you forget to run actions/checkout before running your local action?");
}
}
else if (action.Reference.Type == Pipelines.ActionSourceType.Script)
@@ -477,7 +521,7 @@ namespace GitHub.Runner.Worker
int retryCount = 0;
// Allow up to 20 * 60s for any action to be downloaded from github graph.
int timeoutSeconds = 20 * 60;
while (retryCount < 3)
{
@@ -655,12 +699,21 @@ namespace GitHub.Runner.Worker
// find the docker file or action.yml file
var dockerFile = Path.Combine(actionEntryDirectory, "Dockerfile");
var dockerFileLowerCase = Path.Combine(actionEntryDirectory, "dockerfile");
var actionManifest = Path.Combine(actionEntryDirectory, "action.yml");
if (File.Exists(actionManifest))
var actionManifest = Path.Combine(actionEntryDirectory, Constants.Path.ActionManifestYmlFile);
var actionManifestYaml = Path.Combine(actionEntryDirectory, Constants.Path.ActionManifestYamlFile);
if (File.Exists(actionManifest) || File.Exists(actionManifestYaml))
{
executionContext.Debug($"action.yml for action: '{actionManifest}'.");
var manifestManager = HostContext.GetService<IActionManifestManager>();
var actionDefinitionData = manifestManager.Load(executionContext, actionManifest);
ActionDefinitionData actionDefinitionData = null;
if (File.Exists(actionManifest))
{
actionDefinitionData = manifestManager.Load(executionContext, actionManifest);
}
else
{
actionDefinitionData = manifestManager.Load(executionContext, actionManifestYaml);
}
if (actionDefinitionData.Execution.ExecutionType == ActionExecutionType.Container)
{
@@ -720,7 +773,7 @@ namespace GitHub.Runner.Worker
else
{
var fullPath = IOUtil.ResolvePath(actionEntryDirectory, "."); // resolve full path without access filesystem.
throw new InvalidOperationException($"Can't find 'action.yml' or 'Dockerfile' under '{fullPath}'. Did you forget to run actions/checkout before running your local action?");
throw new InvalidOperationException($"Can't find 'action.yml', 'action.yaml' or 'Dockerfile' under '{fullPath}'. Did you forget to run actions/checkout before running your local action?");
}
}
}
@@ -756,6 +809,8 @@ namespace GitHub.Runner.Worker
{
public override ActionExecutionType ExecutionType => ActionExecutionType.Container;
public override bool HasInit => !string.IsNullOrEmpty(Init);
public override bool HasMain => !string.IsNullOrEmpty(EntryPoint);
public override bool HasCleanup => !string.IsNullOrEmpty(Cleanup);
public string Image { get; set; }
@@ -766,6 +821,8 @@ namespace GitHub.Runner.Worker
public MappingToken Environment { get; set; }
public string Init { get; set; }
public string Cleanup { get; set; }
}
@@ -773,10 +830,14 @@ namespace GitHub.Runner.Worker
{
public override ActionExecutionType ExecutionType => ActionExecutionType.NodeJS;
public override bool HasInit => !string.IsNullOrEmpty(Init);
public override bool HasMain => !string.IsNullOrEmpty(Script);
public override bool HasCleanup => !string.IsNullOrEmpty(Cleanup);
public string Script { get; set; }
public string Init { get; set; }
public string Cleanup { get; set; }
}
@@ -784,6 +845,9 @@ namespace GitHub.Runner.Worker
{
public override ActionExecutionType ExecutionType => ActionExecutionType.Plugin;
public override bool HasInit => false;
public override bool HasMain => true;
public override bool HasCleanup => !string.IsNullOrEmpty(Cleanup);
public string Plugin { get; set; }
@@ -794,16 +858,20 @@ namespace GitHub.Runner.Worker
public sealed class ScriptActionExecutionData : ActionExecutionData
{
public override ActionExecutionType ExecutionType => ActionExecutionType.Script;
public override bool HasInit => false;
public override bool HasMain => true;
public override bool HasCleanup => false;
}
public abstract class ActionExecutionData
{
private string _initCondition = $"{Constants.Expressions.Always}()";
private string _cleanupCondition = $"{Constants.Expressions.Always}()";
public abstract ActionExecutionType ExecutionType { get; }
public abstract bool HasInit { get; }
public abstract bool HasMain { get; }
public abstract bool HasCleanup { get; }
public string CleanupCondition
@@ -811,6 +879,12 @@ namespace GitHub.Runner.Worker
get { return _cleanupCondition; }
set { _cleanupCondition = value; }
}
public string InitCondition
{
get { return _initCondition; }
set { _initCondition = value; }
}
}
public class ContainerSetupInfo

View File

@@ -116,7 +116,7 @@ namespace GitHub.Runner.Worker
if (actionDefinition.Execution == null)
{
executionContext.Debug($"Loaded action.yml file: {StringUtil.ConvertToJson(actionDefinition)}");
throw new ArgumentException($"Top level 'run:' section is required for {manifestFile}");
throw new ArgumentException($"Top level 'runs:' section is required for {manifestFile}");
}
else
{
@@ -280,6 +280,9 @@ namespace GitHub.Runner.Worker
var envToken = default(MappingToken);
var mainToken = default(StringToken);
var pluginToken = default(StringToken);
var preToken = default(StringToken);
var preEntrypointToken = default(StringToken);
var preIfToken = default(StringToken);
var postToken = default(StringToken);
var postEntrypointToken = default(StringToken);
var postIfToken = default(StringToken);
@@ -318,6 +321,15 @@ namespace GitHub.Runner.Worker
case "post-if":
postIfToken = run.Value.AssertString("post-if");
break;
case "pre":
preToken = run.Value.AssertString("pre");
break;
case "pre-entrypoint":
preEntrypointToken = run.Value.AssertString("pre-entrypoint");
break;
case "pre-if":
preIfToken = run.Value.AssertString("pre-if");
break;
default:
Trace.Info($"Ignore run property {runsKey}.");
break;
@@ -340,8 +352,10 @@ namespace GitHub.Runner.Worker
Arguments = argsToken,
EntryPoint = entrypointToken?.Value,
Environment = envToken,
Init = preEntrypointToken?.Value,
InitCondition = preIfToken?.Value ?? "always()",
Cleanup = postEntrypointToken?.Value,
CleanupCondition = postIfToken?.Value
CleanupCondition = postIfToken?.Value ?? "always()"
};
}
}
@@ -356,8 +370,10 @@ namespace GitHub.Runner.Worker
return new NodeJSActionExecutionData()
{
Script = mainToken.Value,
Init = preToken?.Value,
InitCondition = preIfToken?.Value ?? "always()",
Cleanup = postToken?.Value,
CleanupCondition = postIfToken?.Value
CleanupCondition = postIfToken?.Value ?? "always()"
};
}
}

View File

@@ -18,6 +18,7 @@ namespace GitHub.Runner.Worker
{
public enum ActionRunStage
{
Pre,
Main,
Post,
}
@@ -81,20 +82,18 @@ namespace GitHub.Runner.Worker
ActionExecutionData handlerData = definition.Data?.Execution;
ArgUtil.NotNull(handlerData, nameof(handlerData));
if (handlerData.HasInit &&
Action.Reference is Pipelines.RepositoryPathReference repoAction &&
string.Equals(repoAction.RepositoryType, Pipelines.PipelineConstants.SelfAlias, StringComparison.OrdinalIgnoreCase))
{
ExecutionContext.Warning($"`pre` execution is not supported for local action from '{repoAction.Path}'");
}
// The action has post cleanup defined.
// we need to create a timeline record for it and add it to the step list that the StepsRunner is using
if (handlerData.HasCleanup && Stage == ActionRunStage.Main)
if (handlerData.HasCleanup && (Stage == ActionRunStage.Pre || Stage == ActionRunStage.Main))
{
string postDisplayName = null;
if (this.DisplayName.StartsWith(PipelineTemplateConstants.RunDisplayPrefix))
{
postDisplayName = $"Post {this.DisplayName.Substring(PipelineTemplateConstants.RunDisplayPrefix.Length)}";
}
else
{
postDisplayName = $"Post {this.DisplayName}";
}
string postDisplayName = $"Post {this.DisplayName}";
var repositoryReference = Action.Reference as RepositoryPathReference;
var pathString = string.IsNullOrEmpty(repositoryReference.Path) ? string.Empty : $"/{repositoryReference.Path}";
var repoString = string.IsNullOrEmpty(repositoryReference.Ref) ? $"{repositoryReference.Name}{pathString}" :
@@ -108,7 +107,7 @@ namespace GitHub.Runner.Worker
actionRunner.Condition = handlerData.CleanupCondition;
actionRunner.DisplayName = postDisplayName;
ExecutionContext.RegisterPostJobStep($"{actionRunner.Action.Name}_post", actionRunner);
ExecutionContext.RegisterPostJobStep(actionRunner);
}
IStepHost stepHost = HostContext.CreateService<IDefaultStepHost>();
@@ -179,17 +178,15 @@ namespace GitHub.Runner.Worker
ExecutionContext.Debug("Loading env");
var environment = new Dictionary<String, String>(VarUtil.EnvironmentVariableKeyComparer);
// Apply environment set using ##[set-env] first since these are job level env
foreach (var env in ExecutionContext.EnvironmentVariables)
#if OS_WINDOWS
var envContext = ExecutionContext.ExpressionValues["env"] as DictionaryContextData;
#else
var envContext = ExecutionContext.ExpressionValues["env"] as CaseSensitiveDictionaryContextData;
#endif
// Apply environment from the env context; it contains the job-level env and the action's env block
foreach (var env in envContext)
{
environment[env.Key] = env.Value ?? string.Empty;
}
// Apply action's env block later.
var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(Action.Environment, ExecutionContext.ExpressionValues, VarUtil.EnvironmentVariableKeyComparer);
foreach (var env in actionEnvironment)
{
environment[env.Key] = env.Value ?? string.Empty;
environment[env.Key] = env.Value.ToString();
}
// Apply the action's intra-action state last

View File

@@ -2,9 +2,9 @@
using System.Collections.Generic;
using System.IO;
using GitHub.Runner.Common.Util;
using Pipelines = GitHub.DistributedTask.Pipelines;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using Pipelines = GitHub.DistributedTask.Pipelines;
namespace GitHub.Runner.Worker.Container
{
@@ -19,7 +19,6 @@ namespace GitHub.Runner.Worker.Container
public ContainerInfo()
{
}
public ContainerInfo(IHostContext hostContext, Pipelines.JobContainer container, bool isJobContainer = true, string networkAlias = null)
@@ -64,6 +63,8 @@ namespace GitHub.Runner.Worker.Container
UserMountVolumes[volume] = volume;
}
}
UpdateWebProxyEnv(hostContext.WebProxy);
}
public string ContainerId { get; set; }
@@ -223,6 +224,26 @@ namespace GitHub.Runner.Worker.Container
{
_pathMappings.Insert(0, new PathMapping(hostCommonPath, containerCommonPath));
}
private void UpdateWebProxyEnv(RunnerWebProxy webProxy)
{
// Set common forms of proxy variables if configured in Runner and not set directly by container.env
if (!String.IsNullOrEmpty(webProxy.HttpProxyAddress))
{
ContainerEnvironmentVariables.TryAdd("HTTP_PROXY", webProxy.HttpProxyAddress);
ContainerEnvironmentVariables.TryAdd("http_proxy", webProxy.HttpProxyAddress);
}
if (!String.IsNullOrEmpty(webProxy.HttpsProxyAddress))
{
ContainerEnvironmentVariables.TryAdd("HTTPS_PROXY", webProxy.HttpsProxyAddress);
ContainerEnvironmentVariables.TryAdd("https_proxy", webProxy.HttpsProxyAddress);
}
if (!String.IsNullOrEmpty(webProxy.NoProxyString))
{
ContainerEnvironmentVariables.TryAdd("NO_PROXY", webProxy.NoProxyString);
ContainerEnvironmentVariables.TryAdd("no_proxy", webProxy.NoProxyString);
}
}
}
public class MountVolume
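A side note on the proxy handling above: TryAdd only writes a key when it is not already present, so values a workflow author sets through container.env win over the runner-level proxy settings. A minimal standalone sketch of that behavior (the dictionary contents and proxy addresses below are made up for illustration):

using System;
using System.Collections.Generic;

class TryAddDemo
{
    static void Main()
    {
        // Simulates ContainerEnvironmentVariables after container.env has been applied.
        var containerEnv = new Dictionary<string, string>
        {
            ["HTTP_PROXY"] = "http://user-defined-proxy:8080" // set via container.env
        };

        // Runner-level proxy settings only fill in missing keys.
        containerEnv.TryAdd("HTTP_PROXY", "http://runner-proxy:3128"); // ignored, key already exists
        containerEnv.TryAdd("http_proxy", "http://runner-proxy:3128"); // added, key was missing

        Console.WriteLine(containerEnv["HTTP_PROXY"]); // http://user-defined-proxy:8080
        Console.WriteLine(containerEnv["http_proxy"]); // http://runner-proxy:3128
    }
}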

View File

@@ -49,7 +49,7 @@ namespace GitHub.Runner.Worker
data: data);
executionContext.Debug($"Register post job cleanup for stopping/deleting containers.");
executionContext.RegisterPostJobStep(nameof(StopContainersAsync), postJobStep);
executionContext.RegisterPostJobStep(postJobStep);
// Check whether we are inside a container.
// Our container feature requires to map working directory from host to the container.

View File

@@ -20,6 +20,8 @@ using System.Text;
using System.Collections;
using ObjectTemplating = GitHub.DistributedTask.ObjectTemplating;
using Pipelines = GitHub.DistributedTask.Pipelines;
using GitHub.DistributedTask.Expressions2;
using System.Diagnostics;
namespace GitHub.Runner.Worker
{
@@ -97,7 +99,9 @@ namespace GitHub.Runner.Worker
// others
void ForceTaskComplete();
void RegisterPostJobStep(string refName, IStep step);
void RegisterPostJobStep(IStep step);
Task<string> GetGitHubToken();
Task UpdateGitHubTokenInContext();
}
public sealed class ExecutionContext : RunnerService, IExecutionContext
@@ -109,6 +113,7 @@ namespace GitHub.Runner.Worker
private readonly object _loggerLock = new object();
private readonly HashSet<string> _outputvariables = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
private readonly object _matchersLock = new object();
public readonly List<Task> _getGitHubTokenTasks = new List<Task>();
private event OnMatcherChanged _onMatcherChanged;
@@ -129,6 +134,13 @@ namespace GitHub.Runner.Worker
// only job level ExecutionContext will track throttling delay.
private long _totalThrottlingDelayInMilliseconds = 0;
private Guid _jobId;
private Guid _scopeIdentifier;
private string _hubName;
private Guid _planId;
private Stopwatch _githubTokenExpireTimer = new Stopwatch();
public Guid Id => _record.Id;
public string ScopeName { get; private set; }
public string ContextName { get; private set; }
@@ -153,6 +165,9 @@ namespace GitHub.Runner.Worker
// Only job level ExecutionContext has PostJobSteps
public Stack<IStep> PostJobSteps { get; private set; }
// Only job level ExecutionContext has StepsWithPostRegisted
public HashSet<Guid> StepsWithPostRegisted { get; private set; }
public bool EchoOnActionCommand { get; set; }
@@ -238,9 +253,68 @@ namespace GitHub.Runner.Worker
});
}
public void RegisterPostJobStep(string refName, IStep step)
public Task<string> GetGitHubToken()
{
step.ExecutionContext = Root.CreatePostChild(step.DisplayName, refName, IntraActionState);
Trace.Info($"Try get new GITHUB_TOKEN");
return Root.RefreshGitHubToken();
}
public async Task UpdateGitHubTokenInContext()
{
try
{
await Root.EnsureGitHubToken();
}
catch (Exception ex)
{
this.Warning($"Fail to get a new GITHUB_TOKEN, error: {ex.Message}");
this.Debug(ex.ToString());
}
}
private async Task<string> RefreshGitHubToken()
{
Trace.Info("Request new GITHUB_TOKEN");
var jobServer = HostContext.GetService<IJobServer>();
var githubToken = await jobServer.RefreshGitHubTokenAsync(_scopeIdentifier, _hubName, _planId, _jobId, CancellationToken.None);
if (!string.IsNullOrEmpty(githubToken?.Token))
{
// register secret masker
Trace.Info("Register secret masker for new GITHUB_TOKEN");
HostContext.SecretMasker.AddValue(githubToken.Token);
return githubToken.Token;
}
else
{
throw new InvalidOperationException("Get empty GTIHUB_TOKEN.");
}
}
private async Task EnsureGitHubToken()
{
// Refresh the GITHUB_TOKEN every 50 minutes, since the token is only valid for 60 minutes by default.
if (_githubTokenExpireTimer.Elapsed.TotalMinutes > 50)
{
var githubToken = await this.RefreshGitHubToken();
var secretsContext = ExpressionValues["secrets"] as DictionaryContextData;
secretsContext["GITHUB_TOKEN"] = new StringContextData(githubToken);
var githubContext = ExpressionValues["github"] as GitHubContext;
githubContext["token"] = new StringContextData(githubToken);
_githubTokenExpireTimer.Restart();
}
}
public void RegisterPostJobStep(IStep step)
{
if (step is IActionRunner actionRunner && !Root.StepsWithPostRegisted.Add(actionRunner.Action.Id))
{
Trace.Info($"'post' of '{actionRunner.DisplayName}' already push to post step stack.");
return;
}
step.ExecutionContext = Root.CreatePostChild(step.DisplayName, ScopeName, ContextName, IntraActionState);
Root.PostJobSteps.Push(step);
}
@@ -568,6 +642,12 @@ namespace GitHub.Runner.Worker
}
}
// Expression functions
if (Variables.GetBoolean("System.HashFilesV2") == true)
{
ExpressionConstants.UpdateFunction<Handlers.HashFiles>("hashFiles", 1, byte.MaxValue);
}
// Expression values
if (message.ContextData?.Count > 0)
{
@@ -601,6 +681,7 @@ namespace GitHub.Runner.Worker
ExpressionValues["env"] = new CaseSensitiveDictionaryContextData();
#endif
_githubTokenExpireTimer.Start();
// Prepend Path
PrependPath = new List<string>();
@@ -611,6 +692,14 @@ namespace GitHub.Runner.Worker
// PostJobSteps for job ExecutionContext
PostJobSteps = new Stack<IStep>();
// StepsWithPostRegisted for job ExecutionContext
StepsWithPostRegisted = new HashSet<Guid>();
_scopeIdentifier = message.Plan.ScopeIdentifier;
_hubName = message.Plan.PlanType;
_jobId = message.JobId;
_planId = message.Plan.PlanId;
// Job timeline record.
InitializeTimelineRecord(
timelineId: message.Timeline.Id,
@@ -811,7 +900,7 @@ namespace GitHub.Runner.Worker
}
}
private IExecutionContext CreatePostChild(string displayName, string refName, Dictionary<string, string> intraActionState)
private IExecutionContext CreatePostChild(string displayName, string scopeName, string contextName, Dictionary<string, string> intraActionState)
{
if (!_expandedForPostJob)
{
@@ -820,7 +909,8 @@ namespace GitHub.Runner.Worker
_childTimelineRecordOrder = _childTimelineRecordOrder * 2;
}
return CreateChild(Guid.NewGuid(), displayName, refName, null, null, intraActionState, _childTimelineRecordOrder - Root.PostJobSteps.Count);
var newGuid = Guid.NewGuid();
return CreateChild(newGuid, displayName, newGuid.ToString("N"), scopeName, contextName, intraActionState, _childTimelineRecordOrder - Root.PostJobSteps.Count);
}
}

View File

@@ -0,0 +1,126 @@
using System;
using System.IO;
using GitHub.DistributedTask.Expressions2.Sdk;
using GitHub.DistributedTask.Pipelines.ContextData;
using GitHub.DistributedTask.Pipelines.ObjectTemplating;
using GitHub.Runner.Sdk;
using System.Reflection;
using System.Threading;
using System.Collections.Generic;
namespace GitHub.Runner.Worker.Handlers
{
public class FunctionTrace : ITraceWriter
{
private GitHub.DistributedTask.Expressions2.ITraceWriter _trace;
public FunctionTrace(GitHub.DistributedTask.Expressions2.ITraceWriter trace)
{
_trace = trace;
}
public void Info(string message)
{
_trace.Info(message);
}
public void Verbose(string message)
{
_trace.Info(message);
}
}
public sealed class HashFiles : Function
{
protected sealed override Object EvaluateCore(
EvaluationContext context,
out ResultMemory resultMemory)
{
resultMemory = null;
var templateContext = context.State as DistributedTask.ObjectTemplating.TemplateContext;
ArgUtil.NotNull(templateContext, nameof(templateContext));
templateContext.ExpressionValues.TryGetValue(PipelineTemplateConstants.GitHub, out var githubContextData);
ArgUtil.NotNull(githubContextData, nameof(githubContextData));
var githubContext = githubContextData as DictionaryContextData;
ArgUtil.NotNull(githubContext, nameof(githubContext));
githubContext.TryGetValue(PipelineTemplateConstants.Workspace, out var workspace);
var workspaceData = workspace as StringContextData;
ArgUtil.NotNull(workspaceData, nameof(workspaceData));
string githubWorkspace = workspaceData.Value;
bool followSymlink = false;
List<string> patterns = new List<string>();
var firstParameter = true;
foreach (var parameter in Parameters)
{
var parameterString = parameter.Evaluate(context).ConvertToString();
if (firstParameter)
{
firstParameter = false;
if (parameterString.StartsWith("--"))
{
if (string.Equals(parameterString, "--follow-symbolic-links", StringComparison.OrdinalIgnoreCase))
{
followSymlink = true;
continue;
}
else
{
throw new ArgumentOutOfRangeException($"Invalid glob option {parameterString}, avaliable option: '--follow-symbolic-links'.");
}
}
}
patterns.Add(parameterString);
}
context.Trace.Info($"Search root directory: '{githubWorkspace}'");
context.Trace.Info($"Search pattern: '{string.Join(", ", patterns)}'");
string binDir = Path.GetDirectoryName(Assembly.GetEntryAssembly().Location);
string runnerRoot = new DirectoryInfo(binDir).Parent.FullName;
string node = Path.Combine(runnerRoot, "externals", "node12", "bin", $"node{IOUtil.ExeExtension}");
string hashFilesScript = Path.Combine(binDir, "hashFiles");
var hashResult = string.Empty;
var p = new ProcessInvoker(new FunctionTrace(context.Trace));
p.ErrorDataReceived += ((_, data) =>
{
if (!string.IsNullOrEmpty(data.Data) && data.Data.StartsWith("__OUTPUT__") && data.Data.EndsWith("__OUTPUT__"))
{
hashResult = data.Data.Substring(10, data.Data.Length - 20);
context.Trace.Info($"Hash result: '{hashResult}'");
}
else
{
context.Trace.Info(data.Data);
}
});
p.OutputDataReceived += ((_, data) =>
{
context.Trace.Info(data.Data);
});
var env = new Dictionary<string, string>();
if (followSymlink)
{
env["followSymbolicLinks"] = "true";
}
env["patterns"] = string.Join(Environment.NewLine, patterns);
int exitCode = p.ExecuteAsync(workingDirectory: githubWorkspace,
fileName: node,
arguments: $"\"{hashFilesScript.Replace("\"", "\\\"")}\"",
environment: env,
requireExitCodeZero: false,
cancellationToken: new CancellationTokenSource(TimeSpan.FromSeconds(120)).Token).GetAwaiter().GetResult();
if (exitCode != 0)
{
throw new InvalidOperationException($"hashFiles('{ExpressionUtility.StringEscape(string.Join(", ", patterns))}') failed. Fail to hash files under directory '{githubWorkspace}'");
}
return hashResult;
}
}
}
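The magic numbers in the __OUTPUT__ handling above come from the marker being ten characters long on each side of the hash. A small standalone sketch of that extraction, assuming (as the handler does) that the node script prints the hash wrapped in the markers:

using System;

class OutputSentinelDemo
{
    const string Marker = "__OUTPUT__"; // 10 characters

    static string TryExtractHash(string line)
    {
        if (!string.IsNullOrEmpty(line) &&
            line.Length >= 2 * Marker.Length &&
            line.StartsWith(Marker) &&
            line.EndsWith(Marker))
        {
            // Strip the 10-character marker from both ends.
            return line.Substring(Marker.Length, line.Length - 2 * Marker.Length);
        }
        return null;
    }

    static void Main()
    {
        Console.WriteLine(TryExtractHash("__OUTPUT__4e07408562be__OUTPUT__")); // 4e07408562be
        Console.WriteLine(TryExtractHash("regular log line") ?? "(no hash)");
    }
}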

View File

@@ -82,6 +82,10 @@ namespace GitHub.Runner.Worker.Handlers
container.ContainerEntryPoint = Inputs.GetValueOrDefault("entryPoint");
}
}
else if (stage == ActionRunStage.Pre)
{
container.ContainerEntryPoint = Data.Init;
}
else if (stage == ActionRunStage.Post)
{
container.ContainerEntryPoint = Data.Cleanup;
@@ -189,13 +193,13 @@ namespace GitHub.Runner.Worker.Handlers
container.ContainerEnvironmentVariables[variable.Key] = container.TranslateToContainerPath(variable.Value);
}
using (var stdoutManager = new OutputManager(ExecutionContext, ActionCommandManager))
using (var stderrManager = new OutputManager(ExecutionContext, ActionCommandManager))
using (var stdoutManager = new OutputManager(ExecutionContext, ActionCommandManager, container))
using (var stderrManager = new OutputManager(ExecutionContext, ActionCommandManager, container))
{
var runExitCode = await dockerManger.DockerRun(ExecutionContext, container, stdoutManager.OnDataReceived, stderrManager.OnDataReceived);
ExecutionContext.Debug($"Docker Action run completed with exit code {runExitCode}");
if (runExitCode != 0)
{
ExecutionContext.Error($"Docker run failed with exit code {runExitCode}");
ExecutionContext.Result = TaskResult.Failed;
}
}

View File

@@ -60,6 +60,10 @@ namespace GitHub.Runner.Worker.Handlers
{
target = Data.Script;
}
else if (stage == ActionRunStage.Pre)
{
target = Data.Init;
}
else if (stage == ActionRunStage.Post)
{
target = Data.Cleanup;
@@ -122,9 +126,9 @@ namespace GitHub.Runner.Worker.Handlers
else
{
var exitCode = await step;
ExecutionContext.Debug($"Node Action run completed with exit code {exitCode}");
if (exitCode != 0)
{
ExecutionContext.Error($"Node run failed with exit code {exitCode}");
ExecutionContext.Result = TaskResult.Failed;
}
}

View File

@@ -6,6 +6,7 @@ using System.Linq;
using System.Text.RegularExpressions;
using GitHub.Runner.Common;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker.Container;
using DTWebApi = GitHub.DistributedTask.WebApi;
namespace GitHub.Runner.Worker.Handlers
@@ -17,6 +18,7 @@ namespace GitHub.Runner.Worker.Handlers
private const string _timeoutKey = "GITHUB_ACTIONS_RUNNER_ISSUE_MATCHER_TIMEOUT";
private static readonly Regex _colorCodeRegex = new Regex(@"\x0033\[[0-9;]*m?", RegexOptions.Compiled | RegexOptions.CultureInvariant);
private readonly IActionCommandManager _commandManager;
private readonly ContainerInfo _container;
private readonly IExecutionContext _executionContext;
private readonly int _failsafe = 50;
private readonly object _matchersLock = new object();
@@ -25,10 +27,11 @@ namespace GitHub.Runner.Worker.Handlers
// Mapping that indicates whether a directory belongs to the workflow repository
private readonly Dictionary<string, string> _directoryMap = new Dictionary<string, string>();
public OutputManager(IExecutionContext executionContext, IActionCommandManager commandManager)
public OutputManager(IExecutionContext executionContext, IActionCommandManager commandManager, ContainerInfo container = null)
{
_executionContext = executionContext;
_commandManager = commandManager;
_container = container ?? executionContext.Container;
// Recursion failsafe (test override)
var failsafeString = Environment.GetEnvironmentVariable("RUNNER_TEST_GET_REPOSITORY_PATH_FAILSAFE");
@@ -81,7 +84,7 @@ namespace GitHub.Runner.Worker.Handlers
{
// This does not need to be inside of a critical section.
// The logging queues and command handlers are thread-safe.
if (_commandManager.TryProcessCommand(_executionContext, line))
if (_commandManager.TryProcessCommand(_executionContext, line, _container))
{
return;
}
@@ -257,6 +260,7 @@ namespace GitHub.Runner.Worker.Handlers
if (!string.IsNullOrWhiteSpace(match.File))
{
var file = match.File;
var translate = _container != null;
// Root using fromPath
if (!string.IsNullOrWhiteSpace(match.FromPath) && !Path.IsPathFullyQualified(file))
@@ -275,11 +279,19 @@ namespace GitHub.Runner.Worker.Handlers
ArgUtil.NotNullOrEmpty(workspace, "workspace");
file = Path.Combine(workspace, file);
translate = false;
}
// Remove relative pathing and normalize slashes
file = Path.GetFullPath(file);
// Translate to host
if (translate)
{
file = _container.TranslateToHostPath(file);
file = Path.GetFullPath(file);
}
// Check whether the file exists
if (File.Exists(file))
{
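For context on the translate branch above: TranslateToHostPath is assumed here to be a prefix substitution over the container's registered path mappings, so a file path reported by an issue matcher inside the container can be checked with File.Exists on the host. A rough sketch under that assumption (the mapping values are hypothetical):

using System;
using System.Collections.Generic;

class PathTranslationSketch
{
    // Hypothetical mapping table; the real values come from ContainerInfo's path mappings.
    static readonly List<(string HostPath, string ContainerPath)> Mappings = new()
    {
        ("/home/runner/_work/repo/repo", "/github/workspace")
    };

    static string TranslateToHostPath(string path)
    {
        foreach (var (hostPath, containerPath) in Mappings)
        {
            if (path.StartsWith(containerPath, StringComparison.OrdinalIgnoreCase))
            {
                // Swap the container prefix for the host prefix.
                return hostPath + path.Substring(containerPath.Length);
            }
        }
        return path; // no mapping matched; leave the path untouched
    }

    static void Main()
    {
        // A matcher inside the container reported this file path.
        Console.WriteLine(TranslateToHostPath("/github/workspace/src/Program.cs"));
        // -> /home/runner/_work/repo/repo/src/Program.cs
    }
}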

View File

@@ -116,8 +116,8 @@ namespace GitHub.Runner.Worker
// Download actions not already in the cache
Trace.Info("Downloading actions");
var actionManager = HostContext.GetService<IActionManager>();
var prepareSteps = await actionManager.PrepareActionsAsync(context, message.Steps);
preJobSteps.AddRange(prepareSteps);
var prepareResult = await actionManager.PrepareActionsAsync(context, message.Steps);
preJobSteps.AddRange(prepareResult.ContainerSetupSteps);
// Add start-container steps, record and stop-container steps
if (jobContext.Container != null || jobContext.ServiceContainers.Count > 0)
@@ -158,9 +158,23 @@ namespace GitHub.Runner.Worker
actionRunner.TryEvaluateDisplayName(contextData, context);
jobSteps.Add(actionRunner);
if (prepareResult.PreStepTracker.TryGetValue(step.Id, out var preStep))
{
Trace.Info($"Adding pre-{action.DisplayName}.");
preStep.TryEvaluateDisplayName(contextData, context);
preStep.DisplayName = $"Pre {preStep.DisplayName}";
preJobSteps.Add(preStep);
}
}
}
var intraActionStates = new Dictionary<Guid, Dictionary<string, string>>();
foreach (var preStep in prepareResult.PreStepTracker)
{
intraActionStates[preStep.Key] = new Dictionary<string, string>();
}
// Create execution context for pre-job steps
foreach (var step in preJobSteps)
{
@@ -171,6 +185,12 @@ namespace GitHub.Runner.Worker
Guid stepId = Guid.NewGuid();
extensionStep.ExecutionContext = jobContext.CreateChild(stepId, extensionStep.DisplayName, null, null, stepId.ToString("N"));
}
else if (step is IActionRunner actionStep)
{
ArgUtil.NotNull(actionStep, step.DisplayName);
Guid stepId = Guid.NewGuid();
actionStep.ExecutionContext = jobContext.CreateChild(stepId, actionStep.DisplayName, null, actionStep.Action.ScopeName, actionStep.Action.ContextName, intraActionStates[actionStep.Action.Id]);
}
}
// Create execution context for job steps
@@ -179,7 +199,14 @@ namespace GitHub.Runner.Worker
if (step is IActionRunner actionStep)
{
ArgUtil.NotNull(actionStep, step.DisplayName);
actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, actionStep.Action.ScopeName, actionStep.Action.ContextName);
if (intraActionStates.ContainsKey(actionStep.Action.Id))
{
actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, actionStep.Action.ScopeName, actionStep.Action.ContextName, intraActionStates[actionStep.Action.Id]);
}
else
{
actionStep.ExecutionContext = jobContext.CreateChild(actionStep.Action.Id, actionStep.DisplayName, actionStep.Action.Name, actionStep.Action.ScopeName, actionStep.Action.ContextName);
}
}
}

View File

@@ -107,6 +107,11 @@ namespace GitHub.Runner.Worker
return await CompleteJobAsync(jobServer, jobContext, message, TaskResult.Failed);
}
if (jobContext.WriteDebug)
{
jobContext.SetRunnerContext("debug", "1");
}
jobContext.SetRunnerContext("os", VarUtil.OS);
string toolsDirectory = HostContext.GetDirectory(WellKnownDirectory.Tools);

View File

@@ -75,16 +75,40 @@ namespace GitHub.Runner.Worker
// Start
step.ExecutionContext.Start();
// Set GITHUB_ACTION
if (step is IActionRunner actionStep)
{
step.ExecutionContext.SetGitHubContext("action", actionStep.Action.Name);
}
await step.ExecutionContext.UpdateGitHubTokenInContext();
// Initialize scope
if (InitializeScope(step, scopeInputs))
{
// Populate env context for each step
Trace.Info("Initialize Env context for step");
#if OS_WINDOWS
var envContext = new DictionaryContextData();
#else
var envContext = new CaseSensitiveDictionaryContextData();
#endif
step.ExecutionContext.ExpressionValues["env"] = envContext;
foreach (var pair in step.ExecutionContext.EnvironmentVariables)
{
envContext[pair.Key] = new StringContextData(pair.Value ?? string.Empty);
}
if (step is IActionRunner actionStep)
{
// Set GITHUB_ACTION
step.ExecutionContext.SetGitHubContext("action", actionStep.Action.Name);
// Evaluate and merge action's env block to env context
var templateTrace = step.ExecutionContext.ToTemplateTraceWriter();
var schema = new PipelineTemplateSchemaFactory().CreateSchema();
var templateEvaluator = new PipelineTemplateEvaluator(templateTrace, schema);
var actionEnvironment = templateEvaluator.EvaluateStepEnvironment(actionStep.Action.Environment, step.ExecutionContext.ExpressionValues, VarUtil.EnvironmentVariableKeyComparer);
foreach (var env in actionEnvironment)
{
envContext[env.Key] = new StringContextData(env.Value ?? string.Empty);
}
}
var expressionManager = HostContext.GetService<IExpressionManager>();
try
{

View File

@@ -2,7 +2,6 @@
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using GitHub.Build.WebApi;
using GitHub.DistributedTask.WebApi;
using GitHub.DistributedTask.Logging;
using GitHub.DistributedTask.Pipelines.ContextData;
@@ -63,7 +62,7 @@ namespace GitHub.Runner.Worker
// DO NOT add file path variables here.
// All file path variables need to be retrieved and set through ExecutionContext, so it can handle container file path translation.
public string Build_Number => Get(BuildVariables.BuildNumber);
public string Build_Number => Get(SdkConstants.Variables.Build.BuildNumber);
#if OS_WINDOWS
public bool Retain_Default_Encoding => false;

View File

@@ -43,6 +43,8 @@
"entrypoint": "non-empty-string",
"args": "container-runs-args",
"env": "container-runs-env",
"pre-entrypoint": "non-empty-string",
"pre-if": "non-empty-string",
"post-entrypoint": "non-empty-string",
"post-if": "non-empty-string"
}
@@ -67,6 +69,8 @@
"properties": {
"using": "non-empty-string",
"main": "non-empty-string",
"pre": "non-empty-string",
"pre-if": "non-empty-string",
"post": "non-empty-string",
"post-if": "non-empty-string"
}

View File

@@ -1,14 +0,0 @@
using System;
using GitHub.Services.Common;
namespace GitHub.Build.WebApi
{
public static class ArtifactResourceTypes
{
/// <summary>
/// Build container reference
/// E.g. #/2121/drop
/// </summary>
public const String Container = "Container";
}
}

View File

@@ -1,61 +0,0 @@
using System;
using System.Collections.Generic;
using System.Globalization;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
using GitHub.Services.WebApi.Patch;
using GitHub.Services.WebApi.Patch.Json;
namespace GitHub.Build.WebApi
{
public class BuildHttpClient : BuildHttpClientBase
{
static BuildHttpClient()
{
}
public BuildHttpClient(
Uri baseUrl,
VssCredentials credentials)
: base(baseUrl, credentials)
{
}
public BuildHttpClient(
Uri baseUrl,
VssCredentials credentials,
VssHttpRequestSettings settings)
: base(baseUrl, credentials, settings)
{
}
public BuildHttpClient(
Uri baseUrl,
VssCredentials credentials,
params DelegatingHandler[] handlers)
: base(baseUrl, credentials, handlers)
{
}
public BuildHttpClient(
Uri baseUrl,
VssCredentials credentials,
VssHttpRequestSettings settings,
params DelegatingHandler[] handlers)
: base(baseUrl, credentials, settings, handlers)
{
}
public BuildHttpClient(
Uri baseUrl,
HttpMessageHandler pipeline,
Boolean disposeHandler)
: base(baseUrl, pipeline, disposeHandler)
{
}
}
}

View File

@@ -1,10 +0,0 @@
using System;
namespace GitHub.Build.WebApi
{
public static class BuildResourceIds
{
public const String AreaId = "5D6898BB-45EC-463F-95F9-54D49C71752E";
public const String AreaName = "build";
}
}

View File

@@ -1,13 +0,0 @@
using System;
namespace GitHub.Build.WebApi
{
public static class BuildVariables
{
public const String TeamProjectId = "system.teamProjectId";
public const String BuildId = "build.buildId";
public const String BuildNumber = "build.buildNumber";
public const String ContainerId = "build.containerId";
}
}

View File

@@ -1,54 +0,0 @@
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
using GitHub.Services.WebApi;
namespace GitHub.Build.WebApi
{
[DataContract]
public class ArtifactResource : BaseSecuredObject
{
public ArtifactResource()
{
}
public ArtifactResource(
ISecuredObject securedObject)
: base(securedObject)
{
}
/// <summary>
/// The type of the resource: File container, version control folder, UNC path, etc.
/// </summary>
[DataMember(EmitDefaultValue = false)]
public String Type
{
get;
set;
}
/// <summary>
/// Type-specific data about the artifact.
/// </summary>
/// <remarks>
/// For example, "#/10002/5/drop", "$/drops/5", "\\myshare\myfolder\mydrops\5"
/// </remarks>
[DataMember(EmitDefaultValue = false)]
public String Data
{
get;
set;
}
/// <summary>
/// Type-specific properties of the artifact.
/// </summary>
[DataMember(IsRequired = false, EmitDefaultValue = false)]
public Dictionary<String, String> Properties
{
get;
set;
}
}
}

View File

@@ -1,63 +0,0 @@
using System;
using System.Runtime.Serialization;
using GitHub.Services.WebApi;
namespace GitHub.Build.WebApi
{
/// <summary>
/// Represents an artifact produced by a build.
/// </summary>
[DataContract]
public class BuildArtifact : BaseSecuredObject
{
public BuildArtifact()
{
}
internal BuildArtifact(
ISecuredObject securedObject)
: base(securedObject)
{
}
/// <summary>
/// The artifact ID.
/// </summary>
[DataMember]
public Int32 Id
{
get;
set;
}
/// <summary>
/// The name of the artifact.
/// </summary>
[DataMember]
public String Name
{
get;
set;
}
/// <summary>
/// The artifact source, which will be the ID of the job that produced this artifact.
/// </summary>
[DataMember]
public String Source
{
get;
set;
}
/// <summary>
/// The actual resource.
/// </summary>
[DataMember]
public ArtifactResource Resource
{
get;
set;
}
}
}

View File

@@ -1,107 +0,0 @@
/*
* ---------------------------------------------------------
* Copyright(C) Microsoft Corporation. All rights reserved.
* ---------------------------------------------------------
*/
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using GitHub.Services.Common;
using GitHub.Services.WebApi;
namespace GitHub.Build.WebApi
{
[ResourceArea(BuildResourceIds.AreaId)]
public abstract class BuildHttpClientBase : VssHttpClientBase
{
public BuildHttpClientBase(Uri baseUrl, VssCredentials credentials)
: base(baseUrl, credentials)
{
}
public BuildHttpClientBase(Uri baseUrl, VssCredentials credentials, VssHttpRequestSettings settings)
: base(baseUrl, credentials, settings)
{
}
public BuildHttpClientBase(Uri baseUrl, VssCredentials credentials, params DelegatingHandler[] handlers)
: base(baseUrl, credentials, handlers)
{
}
public BuildHttpClientBase(Uri baseUrl, VssCredentials credentials, VssHttpRequestSettings settings, params DelegatingHandler[] handlers)
: base(baseUrl, credentials, settings, handlers)
{
}
public BuildHttpClientBase(Uri baseUrl, HttpMessageHandler pipeline, bool disposeHandler)
: base(baseUrl, pipeline, disposeHandler)
{
}
/// <summary>
/// [Preview API] Associates an artifact with a build.
/// </summary>
/// <param name="artifact">The artifact.</param>
/// <param name="project">Project ID</param>
/// <param name="buildId">The ID of the build.</param>
/// <param name="userState"></param>
/// <param name="cancellationToken">The cancellation token to cancel operation.</param>
public virtual Task<BuildArtifact> CreateArtifactAsync(
BuildArtifact artifact,
Guid project,
int buildId,
object userState = null,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("POST");
Guid locationId = new Guid("1db06c96-014e-44e1-ac91-90b2d4b3e984");
object routeValues = new { project = project, buildId = buildId };
HttpContent content = new ObjectContent<BuildArtifact>(artifact, new VssJsonMediaTypeFormatter(true));
return SendAsync<BuildArtifact>(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(5.2, 5),
userState: userState,
cancellationToken: cancellationToken,
content: content);
}
/// <summary>
/// [Preview API] Gets a specific artifact for a build.
/// </summary>
/// <param name="project">Project ID</param>
/// <param name="buildId">The ID of the build.</param>
/// <param name="artifactName">The name of the artifact.</param>
/// <param name="userState"></param>
/// <param name="cancellationToken">The cancellation token to cancel operation.</param>
public virtual Task<BuildArtifact> GetArtifactAsync(
Guid project,
int buildId,
string artifactName,
object userState = null,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("GET");
Guid locationId = new Guid("1db06c96-014e-44e1-ac91-90b2d4b3e984");
object routeValues = new { project = project, buildId = buildId };
List<KeyValuePair<string, string>> queryParams = new List<KeyValuePair<string, string>>();
queryParams.Add("artifactName", artifactName);
return SendAsync<BuildArtifact>(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(5.2, 5),
queryParameters: queryParams,
userState: userState,
cancellationToken: cancellationToken);
}
}
}

View File

@@ -5,7 +5,7 @@ using GitHub.DistributedTask.Expressions2.Sdk.Functions;
namespace GitHub.DistributedTask.Expressions2
{
internal static class ExpressionConstants
public static class ExpressionConstants
{
static ExpressionConstants()
{
@@ -24,6 +24,12 @@ namespace GitHub.DistributedTask.Expressions2
WellKnownFunctions.Add(name, new FunctionInfo<T>(name, minParameters, maxParameters));
}
public static void UpdateFunction<T>(String name, Int32 minParameters, Int32 maxParameters)
where T : Function, new()
{
WellKnownFunctions[name] = new FunctionInfo<T>(name, minParameters, maxParameters);
}
internal static readonly String False = "false";
internal static readonly String Infinity = "Infinity";
internal static readonly Int32 MaxDepth = 50;

View File

@@ -317,5 +317,40 @@ namespace GitHub.DistributedTask.WebApi
userState: userState,
cancellationToken: cancellationToken);
}
/// <summary>
/// [Preview API]
/// </summary>
/// <param name="scopeIdentifier">The project GUID to scope the request</param>
/// <param name="hubName">The name of the server hub: "build" for the Build server or "rm" for the Release Management server</param>
/// <param name="planId"></param>
/// <param name="jobId"></param>
/// <param name="userState"></param>
/// <param name="cancellationToken">The cancellation token to cancel operation.</param>
public virtual Task<GitHubToken> RefreshTokenAsync(
Guid scopeIdentifier,
string hubName,
Guid planId,
Guid jobId,
object userState = null,
CancellationToken cancellationToken = default)
{
HttpMethod httpMethod = new HttpMethod("GET");
Guid locationId = new Guid("8aa8aff7-751b-496e-be8d-b7818770efb3");
object routeValues = new { scopeIdentifier = scopeIdentifier, hubName = hubName, planId = planId };
List<KeyValuePair<string, string>> queryParams = new List<KeyValuePair<string, string>>();
queryParams.Add("jobId", jobId.ToString());
return SendAsync<GitHubToken>(
httpMethod,
locationId,
routeValues: routeValues,
version: new ApiResourceVersion(6.0, 1),
queryParameters: queryParams,
userState: userState,
cancellationToken: cancellationToken);
}
}
}

View File

@@ -35,6 +35,19 @@ namespace GitHub.DistributedTask.Pipelines.ContextData
throw new ArgumentException($"Unexpected type '{value?.GetType().Name}' encountered while reading '{objectDescription}'. The type '{nameof(DictionaryContextData)}' was expected.");
}
[EditorBrowsable(EditorBrowsableState.Never)]
public static CaseSensitiveDictionaryContextData AssertCaseSensitiveDictionary(
this PipelineContextData value,
String objectDescription)
{
if (value is CaseSensitiveDictionaryContextData dictionary)
{
return dictionary;
}
throw new ArgumentException($"Unexpected type '{value?.GetType().Name}' encountered while reading '{objectDescription}'. The type '{nameof(CaseSensitiveDictionaryContextData)}' was expected.");
}
[EditorBrowsable(EditorBrowsableState.Never)]
public static StringContextData AssertString(
this PipelineContextData value,

View File

@@ -154,6 +154,11 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
if (value is StringToken containerLiteral)
{
if (String.IsNullOrEmpty(containerLiteral.Value))
{
return null;
}
result.Image = containerLiteral.Value;
}
else

View File

@@ -472,7 +472,7 @@
"matrix"
],
"one-of": [
"non-empty-string",
"string",
"container-mapping"
]
},
@@ -497,10 +497,22 @@
],
"mapping": {
"loose-key-type": "non-empty-string",
"loose-value-type": "container"
"loose-value-type": "services-container"
}
},
"services-container": {
"context": [
"github",
"strategy",
"matrix"
],
"one-of": [
"non-empty-string",
"container-mapping"
]
},
"container-env": {
"mapping": {
"loose-key-type": "non-empty-string",

View File

@@ -4,6 +4,17 @@ using System.Runtime.Serialization;
namespace GitHub.DistributedTask.WebApi
{
[DataContract]
public class GitHubToken
{
[DataMember(EmitDefaultValue = false)]
public string Token { get; set; }
[DataMember(EmitDefaultValue = false)]
public DateTime Expires_at { get; set; }
}
[DataContract]
public class Issue
{

View File

@@ -1,344 +0,0 @@
namespace AsyncFixer
{
}
namespace GitHub.Actions.Pipelines.WebApi
{
}
namespace GitHub.Actions.Pipelines.WebApi.Contracts
{
}
namespace GitHub.Build.WebApi
{
}
namespace GitHub.DistributedTask.Expressions
{
}
namespace GitHub.DistributedTask.Expressions2
{
}
namespace GitHub.DistributedTask.Expressions2.Sdk
{
}
namespace GitHub.DistributedTask.Expressions2.Sdk.Functions
{
}
namespace GitHub.DistributedTask.Expressions2.Sdk.Operators
{
}
namespace GitHub.DistributedTask.Expressions2.Tokens
{
}
namespace GitHub.DistributedTask.Logging
{
}
namespace GitHub.DistributedTask.ObjectTemplating
{
}
namespace GitHub.DistributedTask.ObjectTemplating.Schema
{
}
namespace GitHub.DistributedTask.ObjectTemplating.Tokens
{
}
namespace GitHub.DistributedTask.Pipelines
{
}
namespace GitHub.DistributedTask.Pipelines.ContextData
{
}
namespace GitHub.DistributedTask.Pipelines.Expressions
{
}
namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
{
}
namespace GitHub.DistributedTask.Pipelines.Validation
{
}
namespace GitHub.DistributedTask.WebApi
{
}
namespace GitHub.GraphProfile.WebApi
{
}
namespace GitHub.Services
{
}
namespace GitHub.Services.Account
{
}
namespace GitHub.Services.ActivityStatistic
{
}
namespace GitHub.Services.Auditing
{
}
namespace GitHub.Services.AzureFrontDoor
{
}
namespace GitHub.Services.CentralizedFeature
{
}
namespace GitHub.Services.ClientNotification
{
}
namespace GitHub.Services.Commerce
{
}
namespace GitHub.Services.Common
{
}
namespace GitHub.Services.Common.ClientStorage
{
}
namespace GitHub.Services.Common.Diagnostics
{
}
namespace GitHub.Services.Common.Internal
{
}
namespace GitHub.Services.Compliance
{
}
namespace GitHub.Services.ContentSecurityPolicy
{
}
namespace GitHub.Services.DelegatedAuthorization
{
}
namespace GitHub.Services.Directories.DirectoryService
{
}
namespace GitHub.Services.FeatureAvailability
{
}
namespace GitHub.Services.FileContainer
{
}
namespace GitHub.Services.FileContainer.Client
{
}
namespace GitHub.Services.FormInput
{
}
namespace GitHub.Services.GitHubConnector
{
}
namespace GitHub.Services.Graph
{
}
namespace GitHub.Services.Graph.Client
{
}
namespace GitHub.Services.GroupLicensingRule
{
}
namespace GitHub.Services.Health
{
}
namespace GitHub.Services.HostAcquisition
{
}
namespace GitHub.Services.Identity
{
}
namespace GitHub.Services.Identity.Client
{
}
namespace GitHub.Services.Identity.Mru
{
}
namespace GitHub.Services.IdentityPicker
{
}
namespace GitHub.Services.Invitation
{
}
namespace GitHub.Services.Licensing
{
}
namespace GitHub.Services.Location
{
}
namespace GitHub.Services.Location.Client
{
}
namespace GitHub.Services.MarketingPreferences
{
}
namespace GitHub.Services.Notification
{
}
namespace GitHub.Services.OAuth
{
}
namespace GitHub.Services.OAuthWhitelist
{
}
namespace GitHub.Services.Operations
{
}
namespace GitHub.Services.Organization
{
}
namespace GitHub.Services.PermissionLevel
{
}
namespace GitHub.Services.Profile
{
}
namespace GitHub.Services.Security
{
}
namespace GitHub.Services.ServicePrincipal
{
}
namespace GitHub.Services.Servicing
{
}
namespace GitHub.Services.Settings
{
}
namespace GitHub.Services.TokenAdmin.Client
{
}
namespace GitHub.Services.TokenRevocation
{
}
namespace GitHub.Services.Tokens
{
}
namespace GitHub.Services.Tokens.TokenAdmin.Client
{
}
namespace GitHub.Services.TokenSigningKeyLifecycle
{
}
namespace GitHub.Services.UserMapping
{
}
namespace GitHub.Services.Users
{
}
namespace GitHub.Services.WebApi
{
}
namespace GitHub.Services.WebApi.Exceptions
{
}
namespace GitHub.Services.WebApi.Internal
{
}
namespace GitHub.Services.WebApi.Jwt
{
}
namespace GitHub.Services.WebApi.Location
{
}
namespace GitHub.Services.WebApi.Patch
{
}
namespace GitHub.Services.WebApi.Patch.Json
{
}
namespace GitHub.Services.WebApi.Utilities
{
}
namespace GitHub.Services.WebApi.Utilities.Internal
{
}
namespace GitHub.Services.WebApi.Xml
{
}
namespace GitHub.Services.WebPlatform
{
}
namespace GitHub.Services.Zeus
{
}

View File

@@ -5,10 +5,8 @@ using System.Diagnostics;
using System.Linq;
using System.Reflection;
using System.Runtime.Serialization;
using System.Xml;
using System.Xml.Serialization;
using GitHub.Services.Common;
using GitHub.Services.Graph.Client;
namespace GitHub.Services.WebApi.Xml
{

View File

@@ -1,432 +0,0 @@
$ErrorActionPreference = "Stop"
$runnerRepo = Read-Host -Prompt "actions/runner repository root"
if (!(Test-Path -LiteralPath "$runnerRepo/src")) {
Write-Error "$runnerRepo should contains a /src folder"
return 1
}
$gitHubSdkFolder = Join-Path -Path "$runnerRepo/src" -ChildPath "Sdk"
$vsoRepo = $PWD
while ($true) {
if (Test-Path -LiteralPath "$vsoRepo/init.cmd") {
break;
}
else {
$vsoRepo = (Get-Item $vsoRepo).Parent.FullName
}
}
$targetFolders = @(
# "Common"
# "WebApi"
# "AadAuthentication"
# "DTContracts"
# "DTGenerated"
# "DTLogging"
# "DTExpressions"
# "DTExpressions2"
# "DTObjectTemplating"
# "DTPipelines"
# "DTWebApi"
# "Resources"
# "BuildWebApi"
# "CoreWebApi"
# "ArtifactWebApi"
# "ArtifactContentTelemetry"
# "ArtifactContent"
# "BlobStoreWebApi"
# "BlobStoreCommonTelemetry"
# "BlobStoreCommon"
)
$sourceFolders = @{
# "Vssf\Client\Common" = "Common";
# "Vssf\Client\WebApi" = "WebApi";
# "DistributedTask\Shared\Common\Contracts" = "DTContracts";
# "DistributedTask\Client\WebApi\Generated" = "DTGenerated";
# "DistributedTask\Client\WebApi\Logging" = "DTLogging";
# "DistributedTask\Client\WebApi\Expressions" = "DTExpressions";
# "Actions\Runtime\Client\WebApi\Expressions2" = "DTExpressions2";
# "Actions\Runtime\Client\WebApi\ObjectTemplating" = "DTObjectTemplating";
# "Actions\Runtime\Client\WebApi\Pipelines" = "DTPipelines";
# "DistributedTask\Client\WebApi\WebApi" = "DTWebApi";
# "..\obj\Debug.AnyCPU\Vssf.Client\MS.VS.Services.Common\EmbeddedVersionInfo.cs" = "Common\EmbeddedVersionInfo.cs";
# "Vssf\InteractiveClient\Client\Authentication\VssAadToken.cs" = "AadAuthentication";
# "Vssf\InteractiveClient\Client\Authentication\VssAadTokenProvider.cs" = "AadAuthentication";
# "Vssf\InteractiveClient\Client\Authentication\VssAadCredential.cs" = "AadAuthentication";
# "Vssf\InteractiveClient\Client\VssAadSettings.cs" = "AadAuthentication";
# "Vssf\InteractiveClient\Client\Authentication\VssFederatedCredential.cs" = "AadAuthentication";
# "Vssf\InteractiveClient\Client\Authentication\VssFederatedToken.cs" = "AadAuthentication";
# "Vssf\InteractiveClient\Client\Authentication\VssFederatedTokenProvider.cs" = "AadAuthentication";
# "Vssf\InteractiveClient\Client\Authentication\Utility\CookieUtility.cs" = "AadAuthentication";
# "Actions\Runtime\Client\WebApi\Pipelines\ObjectTemplating\workflow-v1.0.json" = "DTPipelines";
# "Tfs\Client\Build2\Api" = "BuildWebApi";
# "Tfs\Client\Core" = "CoreWebApi";
# "ArtifactServices\Client\WebApi" = "ArtifactWebApi";
# "ArtifactServices\Shared\Content.Common.Telemetry" = "ArtifactContentTelemetry";
# "ArtifactServices\Shared\Content.Common" = "ArtifactContent";
# "BlobStore\Client\WebApi" = "BlobStoreWebApi";
# "ArtifactServices\Shared\BlobStore.Common.Telemetry" = "BlobStoreCommonTelemetry";
# "ArtifactServices\Shared\BlobStore.Common" = "BlobStoreCommon";
}
$extraFiles = @(
# "BlobStoreCommon\BlobStore.Common\AzureStorageOperationTraceAdapter.cs"
# "BlobStoreCommon\BlobStore.Common\BlobIdentifierHelperExtensions.cs"
# "BlobStoreCommon\BlobStore.Common\BlobIdentifierHexConverter.cs"
# "BlobStoreCommon\BlobStore.Common\EdgeCacheUrlBuilder.cs"
# "BlobStoreCommon\BlobStore.Common\Exceptions.cs"
# "BlobStoreCommon\BlobStore.Common\IDownloader.cs"
# "BlobStoreCommon\BlobStore.Common\InternalsVisibleTo.cs"
# "BlobStoreCommon\BlobStore.Common\IUrlSigner.cs"
# "BlobStoreCommon\BlobStore.Common\ManagedParallelBlobDownloader.cs"
# "BlobStoreCommon\BlobStore.Common\NullableExtensions.cs"
# "BlobStoreCommon\BlobStore.Common\ObjectExtensions.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\InstrumentationManifest.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\InstrumentationManifestException.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\IPerformanceDataFacade.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\ManifestCounters.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\NoopPerfCounter.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\NoopPerformanceDataFacade.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\PerfCounter.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\PerfCounterSet.cs"
# "BlobStoreCommon\BlobStore.Common\PerfCounters\PerformanceDataFacade.cs"
# "BlobStoreCommon\BlobStore.Common\ReceiptSecretConstants.cs"
# "BlobStoreCommon\BlobStore.Common\SecurityDefinitions.cs"
# "BlobStoreCommon\BlobStore.Common\VsoHashAlgorithm.cs"
# "BlobStoreCommonTelemetry\BlobStore.Common.Telemetry\InternalsVisibleTo.cs"
# "ArtifactContentTelemetry\Content.Common.Telemetry\InternalsVisibleTo.cs"
# "ArtifactContentTelemetry\Content.Common.Telemetry\Telemetry\NoopClientTelemetry.cs"
# "ArtifactContentTelemetry\Content.Common.Telemetry\Telemetry\TelemetryPlatformSpecificNetFramework.cs"
# "ArtifactContentTelemetry\Content.Common.Telemetry\Telemetry\TelemetryEnvironmentHelper.cs"
# "ArtifactContent\Content.Common\AsyncEnumerator\AsyncEnumeratorExceptionWrapper.cs"
# "ArtifactContent\Content.Common\AsyncEnumerator\IEnumeratorExceptionMapper.cs"
# "ArtifactContent\Content.Common\AsyncEnumerator\AsyncEnumeratorWithCursor.cs"
# "ArtifactContent\Content.Common\Authentication\AadAcquireTokenException.cs"
# "ArtifactContent\Content.Common\Authentication\AadErrorHandlingPolicy.cs"
# "ArtifactContent\Content.Common\Authentication\CredentialProvider\CredentialProviderException.cs"
# "ArtifactContent\Content.Common\Authentication\CredentialProvider\CredentialProviderLoader.cs"
# "ArtifactContent\Content.Common\Authentication\CredentialProvider\CredentialProviderManager.cs"
# "ArtifactContent\Content.Common\Authentication\CredentialProvider\CredentialResponse.cs"
# "ArtifactContent\Content.Common\Authentication\CredentialProvider\ICredentialProvider.cs"
# "ArtifactContent\Content.Common\Authentication\CredentialProvider\PluginCredentialProvider.cs"
# "ArtifactContent\Content.Common\Authentication\LocalTokenCacheArgs.cs"
# "ArtifactContent\Content.Common\Authentication\TestableAuthenticationContext.cs"
# "ArtifactContent\Content.Common\Authentication\VsoAadConstants.cs"
# "ArtifactContent\Content.Common\Authentication\VsoCredentialHelper.cs"
# "ArtifactContent\Content.Common\AutoKillProcessHandle.cs"
# "ArtifactContent\Content.Common\ConcurrencyConsolidator.cs"
# "ArtifactContent\Content.Common\EnumUtilities.cs"
# "ArtifactContent\Content.Common\EquatableTuple.cs"
# "ArtifactContent\Content.Common\FileVersionHelpers.cs"
# "ArtifactContent\Content.Common\Histogram.cs"
# "ArtifactContent\Content.Common\InternalsVisibleTo.cs"
# "ArtifactContent\Content.Common\InUseDetection.cs"
# "ArtifactContent\Content.Common\IteratorPartition.cs"
# "ArtifactContent\Content.Common\Json\ByteArrayAsBase64JsonConvertor.cs"
# "ArtifactContent\Content.Common\Json\ByteArrayAsHexJsonConvertor.cs"
# "ArtifactContent\Content.Common\Json\JsonEnumerator.cs"
# "ArtifactContent\Content.Common\Json\JsonNormalizer.cs"
# "ArtifactContent\Content.Common\Json\JsonProperty.cs"
# "ArtifactContent\Content.Common\Json\JsonStream.cs"
# "ArtifactContent\Content.Common\Json\JsonWrite.cs"
# "ArtifactContent\Content.Common\Json\JsonWriterStream.cs"
# "ArtifactContent\Content.Common\Json\JsonWrites.cs"
# "ArtifactContent\Content.Common\Json\ULongJsonConverter.cs"
# "ArtifactContent\Content.Common\Kvp.cs"
# "ArtifactContent\Content.Common\Operations\SecureStringConverter.cs"
# "ArtifactContent\Content.Common\PagedEnumerator.cs"
# "ArtifactContent\Content.Common\PerformanceInfo.cs"
# "ArtifactContent\Content.Common\ReaderWriterLockSlimExtensions.cs"
# "ArtifactContent\Content.Common\ReadOnlySet.cs"
# "ArtifactContent\Content.Common\RetrievalOptions.cs"
# "ArtifactContent\Content.Common\ServiceInstanceTypes.cs"
# "ArtifactContent\Content.Common\ServicePointHelper.cs"
# "ArtifactContent\Content.Common\ShardableLocator.cs"
# "ArtifactContent\Content.Common\StringExtensions.cs"
# "ArtifactContent\Content.Common\ThreadLocalRandom.cs"
# "ArtifactContent\Content.Common\ThreadPoolHelper.cs"
# "ArtifactContent\Content.Common\Tracing\AppTraceListener.cs"
# "ArtifactContent\Content.Common\Tracing\ArtifactServicesTraceSource.cs"
# "ArtifactContent\Content.Common\Tracing\ConsoleMessageUtil.cs"
# "ArtifactContent\Content.Common\Tracing\ConsoleTraceListener.cs"
# "ArtifactContent\Content.Common\Tracing\FileTraceListener.cs"
# "ArtifactContent\Content.Common\Tracing\InMemoryLog.cs"
# "ArtifactContent\Content.Common\Tracing\InMemoryTraceListener.cs"
# "Common\Common\CommandLine\Argument.cs"
# "Common\Common\CommandLine\AttributeBasedOperationModeHandlerFactory.cs"
# "Common\Common\CommandLine\AttributeBasedOptionParserAdapter.cs"
# "Common\Common\CommandLine\BasicParser.cs"
# "Common\Common\CommandLine\CommandLineLexer.cs"
# "Common\Common\CommandLine\Enumerations.cs"
# "Common\Common\CommandLine\Exceptions.cs"
# "Common\Common\CommandLine\Extensions.cs"
# "Common\Common\CommandLine\IEnumerable.cs"
# "Common\Common\CommandLine\OperationHandler.cs"
# "Common\Common\CommandLine\OperationHandlerFactory.cs"
# "Common\Common\CommandLine\OperationModeAttribute.cs"
# "Common\Common\CommandLine\Option.cs"
# "Common\Common\CommandLine\OptionAttribute.cs"
# "Common\Common\CommandLine\OptionParser.cs"
# "Common\Common\CommandLine\OptionReader.cs"
# "Common\Common\CommandLine\ResponseFileOptionReader.cs"
# "Common\Common\CommandLine\Validation\DefaultValidation.cs"
# "Common\Common\CommandLine\Validation\IOptionValidation.cs"
# "Common\Common\CommandLine\Validation\OptionExistsFilter.cs"
# "Common\Common\CommandLine\Validation\OptionMustExist.cs"
# "Common\Common\CommandLine\Validation\OptionRequiresSpecificValue.cs"
# "Common\Common\CommandLine\Validation\OptionsAreMutuallyExclusive.cs"
# "Common\Common\CommandLine\Validation\OptionsAreMutuallyInclusive.cs"
# "Common\Common\CommandLine\Validation\OptionValidation.cs"
# "Common\Common\CommandLine\Validation\OptionValidationFilter.cs"
# "Common\Common\CommandLine\Validation\OptionValueFilter.cs"
# "Common\Common\CommandLine\ValueConverters\CsvCollectionConverter.cs"
# "Common\Common\CommandLine\ValueConverters\EnumConverter.cs"
# "Common\Common\CommandLine\ValueConverters\IValueConvertible.cs"
# "Common\Common\CommandLine\ValueConverters\UriConverter.cs"
# "Common\Common\CommandLine\ValueConverters\ValueConverter.cs"
# "Common\Common\ExternalProviders\IExternalProviderHttpRequester.cs"
# "Common\Common\Performance\PerformanceNativeMethods.cs"
# "Common\Common\TokenStorage\RegistryToken.cs"
# "Common\Common\TokenStorage\RegistryTokenStorage.cs"
# "Common\Common\TokenStorage\RegistryTokenStorageHelper.cs"
# "Common\Common\TokenStorage\VssTokenStorageFactory.cs"
# "Common\Common\Utility\CredentialsCacheManager.cs"
# "Common\Common\Utility\EncryptionUtility.cs"
# "Common\Common\Utility\EnumerableUtility.cs"
# "Common\Common\Utility\EnvironmentWrapper.cs"
# "Common\Common\Utility\ExceptionExtentions.cs"
# "Common\Common\Utility\NativeMethods.cs"
# "Common\Common\Utility\OSDetails.cs"
# "Common\Common\Utility\DateTimeUtility.cs"
# "Common\Common\Utility\PasswordUtility.cs"
# "Common\Common\Utility\RegistryHelper.cs"
# "Common\Common\Utility\SerializationHelper.cs"
# "Common\Common\Utility\Csv\CsvException.cs"
# "Common\Common\Utility\Csv\CsvConfiguration.cs"
# "Common\Common\Utility\Csv\CsvWriter.cs"
# "Common\Common\VssEnvironment.cs"
# "WebApi\WebApi\AssemblyAttributes.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\ExpiringToken.cs"
# "WebApi\WebApi\Contracts\ExternalEvent\Comparers\ExternalGitIssueComparer.cs"
# "WebApi\WebApi\Contracts\ExternalEvent\ExternalGitExtensions.cs"
# "WebApi\WebApi\Contracts\ExternalEvent\Comparers\ExternalGitPullRequestComparer.cs"
# "WebApi\WebApi\Contracts\ExternalEvent\Comparers\ExternalGitCommitComparer.cs"
# "WebApi\WebApi\Contracts\ExternalEvent\ExternalGitIssueEvent.cs"
# "WebApi\WebApi\Contracts\ExternalEvent\Comparers\ExternalGitRepoComparer.cs"
# "WebApi\WebApi\Contracts\ExternalEvent\ExternalGitCommitCommentEvent.cs"
# "WebApi\WebApi\Contracts\PermissionLevel\Client\PagedPermissionLevelAssignment.cs"
# "WebApi\WebApi\Contracts\PermissionLevel\Client\PermissionLevelAssignment.cs"
# "WebApi\WebApi\Contracts\PermissionLevel\Enumerations.cs"
# "WebApi\WebApi\Contracts\PermissionLevel\Client\PermissionLevelDefinition.cs"
# "WebApi\WebApi\Contracts\Tokens\PATAddedEvent.cs"
# "WebApi\WebApi\Contracts\Tokens\SshKeyAddedEvent.cs"
# "WebApi\WebApi\Contracts\Tokens\ExpiringTokenEvent.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\PATAddedEvent.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\SshKeyAddedEvent.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\ExpiringTokenEvent.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\DelegatedAuthMigrationStatus.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\DelegatedAuthorizationMigrationBase.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\TokenDelegatedAuthorizationAccessKeyPublicDataMigration.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\TokenDelegatedAuthorizationAccessMigration.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\TokenDelegatedAuthorizationMigration.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\TokenDelegatedAuthorizationAccessKeyMigration.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\TokenDelegatedAuthorizationRegistrationMigration.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\TokenDelegatedAuthorizationRegistrationRedirectLocationMigration.cs"
# "WebApi\WebApi\Contracts\DelegatedAuthorization\Migration\TokenDelegatedHostAuthorizationMigration.cs"
# "WebApi\WebApi\Contracts\OAuthWhitelist\OAuthWhitelistEntry.cs"
# "WebApi\WebApi\Contracts\TokenAdmin\PatRevokedEvent.cs"
# "WebApi\WebApi\Contracts\TokenAdmin\TokenAdministrationRevocation.cs"
# "WebApi\WebApi\Contracts\TokenAdmin\TokenAdminPagedSessionTokens.cs"
# "WebApi\WebApi\Contracts\TokenAdmin\TokenAdminRevocation.cs"
# "WebApi\WebApi\Contracts\TokenAdmin\TokenAdminRevocationRule.cs"
# "WebApi\WebApi\Exceptions\AuditLogExceptions.cs"
# "WebApi\WebApi\Exceptions\AadExceptions.cs"
# "WebApi\WebApi\Exceptions\PermissionLevelExceptions.cs"
# "WebApi\WebApi\HttpClients\CsmResourceProviderHttpClient.cs"
# "WebApi\WebApi\HttpClients\Generated\CsmResourceProviderHttpClientBase.cs"
# "WebApi\WebApi\HttpClients\Generated\OAuthWhitelistHttpClient.cs"
# "WebApi\WebApi\HttpClients\Generated\TokenAdminHttpClient.cs"
# "WebApi\WebApi\HttpClients\Generated\TokenAdministrationHttpClient.cs"
# "WebApi\WebApi\HttpClients\Generated\TokenExpirationHttpClient.cs"
# "WebApi\WebApi\HttpClients\Generated\TokenMigrationHttpClient.cs"
# "WebApi\WebApi\HttpClients\Generated\PermissionLevelHttpClient.cs"
# "WebApi\WebApi\HttpClients\CommerceHostHelperHttpClient.cs"
# "WebApi\WebApi\Utilities\DelegatedAuthComparers.cs"
# "WebApi\WebApi\Utilities\HttpHeadersExtensions.cs"
# "WebApi\WebApi\VssClientCertificateManager.cs"
# "WebApi\WebApi\VssClientEnvironment.cs"
# "WebApi\WebApi\VssSoapMediaTypeFormatter.cs"
)
$resourceFiles = @{
# "ExpressionResources" = "DistributedTask\Client\WebApi\Expressions\ExpressionResources.resx";
# "PipelineStrings" = "DistributedTask\Client\WebApi\Pipelines\PipelineStrings.resx";
# "CommonResources" = "Vssf\Client\Common\Resources.resx";
# "IdentityResources" = "Vssf\Client\WebApi\Resources\IdentityResources.resx";
# "JwtResources" = "Vssf\Client\WebApi\Resources\JwtResources.resx";
# "WebApiResources" = "Vssf\Client\WebApi\Resources\WebApiResources.resx";
# "DataImportResources" = "Vssf\Client\WebApi\Resources\DataImportResources.resx";
# "PatchResources" = "Vssf\Client\WebApi\Resources\PatchResources.resx";
# "AccountResources" = "Vssf\Client\WebApi\Resources\AccountResources.resx";
# "TemplateStrings" = "DistributedTask\Client\WebApi\ObjectTemplating\TemplateStrings.resx";
# "GraphResources" = "Vssf\Client\WebApi\Resources\GraphResources.resx";
# "FileContainerResources" = "Vssf\Client\WebApi\Resources\FileContainerResources.resx";
# "LocationResources" = "Vssf\Client\WebApi\Resources\LocationResources.resx";
# "CommerceResources" = "Vssf\Client\WebApi\Resources\CommerceResources.resx";
# "SecurityResources" = "Vssf\Client\WebApi\Resources\SecurityResources.resx";
# "WebPlatformResources" = "Vssf\Client\WebApi\Resources\WebPlatformResources.resx";
# "ZeusWebApiResources" = "Vssf\Client\WebApi\Resources\ZeusWebApiResources.resx";
# "NameResolutionResources" = "Vssf\Client\WebApi\Resources\NameResolutionResources.resx";
# "PartitioningResources" = "Vssf\Client\WebApi\Resources\PartitioningResources.resx";
# "WebApiResources" = "Tfs\Client\Core\Resources\WebApiResources.resx";
# "BlobStoreResources" = "BlobStore\Client\WebApi\Resources.resx"
# "ContentResources" = "ArtifactServices\Shared\Content.Common\Resources.resx"
# "BlobStoreCommonResources" = "ArtifactServices\Shared\BlobStore.Common\Resources.resx"
}
$resourceNamespace = @{
# "ExpressionResources" = "Microsoft.TeamFoundation.DistributedTask.Expressions";
# "PipelineStrings" = "Microsoft.TeamFoundation.DistributedTask.Pipelines";
# "CommonResources" = "Microsoft.VisualStudio.Services.Common.Internal";
# "IdentityResources" = "Microsoft.VisualStudio.Services.WebApi";
# "JwtResources" = "Microsoft.VisualStudio.Services.WebApi";
# "WebApiResources" = "Microsoft.VisualStudio.Services.WebApi";
# "DataImportResources" = "Microsoft.VisualStudio.Services.WebApi";
# "PatchResources" = "Microsoft.VisualStudio.Services.WebApi";
# "AccountResources" = "Microsoft.VisualStudio.Services.WebApi";
# "TemplateStrings" = "Microsoft.TeamFoundation.DistributedTask.ObjectTemplating";
# "GraphResources" = "Microsoft.VisualStudio.Services.WebApi";
# "FileContainerResources" = "Microsoft.VisualStudio.Services.WebApi";
# "LocationResources" = "Microsoft.VisualStudio.Services.WebApi";
# "CommerceResources" = "Microsoft.VisualStudio.Services.WebApi";
# "SecurityResources" = "Microsoft.VisualStudio.Services.WebApi";
# "WebPlatformResources" = "Microsoft.VisualStudio.Services.WebApi";
# "ZeusWebApiResources" = "Microsoft.VisualStudio.Services.WebApi";
# "NameResolutionResources" = "Microsoft.VisualStudio.Services.WebApi";
# "PartitioningResources" = "Microsoft.VisualStudio.Services.WebApi";
# "WebApiResources" = "Microsoft.TeamFoundation.Core.WebApi";
# "ContentResources" = "Microsoft.VisualStudio.Services.Content.Common";
# "BlobStoreCommonResources" = "Microsoft.VisualStudio.Services.BlobStore.Common";
# "BlobStoreResources" = "Microsoft.VisualStudio.Services.BlobStore.WebApi";
}
foreach ($folder in $targetFolders) {
Write-Host "Recreate $gitHubSdkFolder\$folder"
if (Test-Path -LiteralPath "$gitHubSdkFolder\$folder") {
Remove-Item -LiteralPath "$gitHubSdkFolder\$folder" -Force -Recurse
}
New-Item -Path $gitHubSdkFolder -Name $folder -ItemType "directory" -Force
}
foreach ($sourceFolder in $sourceFolders.Keys) {
$copySource = Join-Path -Path $vsoRepo -ChildPath $sourceFolder
$copyDest = Join-Path -Path $gitHubSdkFolder -ChildPath $sourceFolders[$sourceFolder]
Write-Host "Copy $copySource to $copyDest"
Copy-Item -Path $copySource -Destination $copyDest -Filter "*.cs" -Recurse -Force
}
Write-Host "Delete extra none NetStandard files"
foreach ($extraFile in $extraFiles) {
Remove-Item -LiteralPath "$gitHubSdkFolder\$extraFile" -Force
}
Write-Host "Generate C# file for resx files"
foreach ($resourceFile in $resourceFiles.Keys) {
Write-Host "Generate file for $resourceFile"
$stringBuilder = New-Object System.Text.StringBuilder
$file = $resourceFiles[$resourceFile]
$xml = [xml](Get-Content -LiteralPath "$vsoRepo\$file")
$null = $stringBuilder.AppendLine('using System.Globalization;')
$null = $stringBuilder.AppendLine('')
$namespace = $resourceNamespace[$resourceFile]
$null = $stringBuilder.AppendLine("namespace $namespace")
$null = $stringBuilder.AppendLine('{')
$null = $stringBuilder.AppendLine(" public static class $resourceFile")
$null = $stringBuilder.AppendLine(' {')
foreach ($data in $xml.root.data) {
$i = 0
$args = ""
$inputs = ""
while ($true) {
if ($data.value.Contains("{$i}") -or $data.value.Contains("{$i" + ":")) {
if ($i -eq 0) {
$args = "object arg$i"
$inputs = "arg$i"
}
else {
$args = $args + ", " + "object arg$i"
$inputs = $inputs + ", " + "arg$i"
}
$i++
}
else {
break
}
}
$null = $stringBuilder.AppendLine("")
$null = $stringBuilder.AppendLine(" public static string $($data.name)($($args))")
$null = $stringBuilder.AppendLine(" {")
$null = $stringBuilder.AppendLine(@"
const string Format = @"$($data.value.Replace('"', '""'))";
"@)
if ($i -eq 0) {
$null = $stringBuilder.AppendLine(" return Format;")
}
else {
$null = $stringBuilder.AppendLine(" return string.Format(CultureInfo.CurrentCulture, Format, $inputs);")
}
$null = $stringBuilder.AppendLine(" }")
}
$null = $stringBuilder.AppendLine(" }")
$null = $stringBuilder.AppendLine("}")
# Write Resources.g.cs.
$genResourceFile = Join-Path -Path $gitHubSdkFolder -ChildPath "Resources\$resourceFile.g.cs"
[System.IO.File]::WriteAllText($genResourceFile, ($stringBuilder.ToString()), ([System.Text.Encoding]::UTF8))
}
# Print out all namespaces
Write-Host "Rename namespaces:"
$namespaces = New-Object 'System.Collections.Generic.HashSet[string]'
$sourceFiles = Get-ChildItem -LiteralPath $gitHubSdkFolder -Filter "*.cs" -Recurse -Force -File
foreach ($file in $sourceFiles) {
foreach ($line in Get-Content $file.FullName) {
if ($line.StartsWith("namespace ")) {
$namespace = $line.Substring("namespace ".Length)
if ($namespaces.Add($namespace)) {
Write-Host $namespace
}
}
}
}
# Rename all namespaces to GitHub
$allSourceFiles = Get-ChildItem -LiteralPath $gitHubSdkFolder -Filter "*.cs" -Recurse -Force -File
foreach ($file in $allSourceFiles) {
$stringBuilder = New-Object System.Text.StringBuilder
foreach ($line in Get-Content $file.FullName) {
if ($line.Contains("Microsoft.VisualStudio")) {
$line = $line.Replace("Microsoft.VisualStudio", "GitHub");
}
elseif ($line.Contains("Microsoft.Azure.DevOps")) {
$line = $line.Replace("Microsoft.Azure.DevOps", "GitHub");
}
elseif ($line.Contains("Microsoft.TeamFoundation")) {
$line = $line.Replace("Microsoft.TeamFoundation", "GitHub");
}
$null = $stringBuilder.AppendLine($line)
}
[System.IO.File]::WriteAllText($file.FullName, ($stringBuilder.ToString()), ([System.Text.Encoding]::UTF8))
}
Write-Host "Done"

View File

@@ -498,7 +498,7 @@ namespace GitHub.Runner.Common.Tests
_promptManager
.Setup(x => x.ReadValue(
Constants.Runner.CommandLine.Args.Token, // argName
"Enter runner deletion token:", // description
"Enter runner remove token:", // description
true, // secret
string.Empty, // defaultValue
Validators.NonEmptyValidator, // validator

View File

@@ -175,8 +175,8 @@ namespace GitHub.Runner.Common.Tests.Listener.Configuration
Assert.True(s.PoolId.Equals(_expectedPoolId));
Assert.True(s.WorkFolder.Equals(_expectedWorkFolder));
// validate GetAgentPoolsAsync gets called once with automation pool type
_runnerServer.Verify(x => x.GetAgentPoolsAsync(It.IsAny<string>(), It.Is<TaskAgentPoolType>(p => p == TaskAgentPoolType.Automation)), Times.Once);
// validate GetAgentPoolsAsync gets called twice with automation pool type
_runnerServer.Verify(x => x.GetAgentPoolsAsync(It.IsAny<string>(), It.Is<TaskAgentPoolType>(p => p == TaskAgentPoolType.Automation)), Times.Exactly(2));
_runnerServer.Verify(x => x.AddAgentAsync(It.IsAny<int>(), It.Is<TaskAgent>(a => a.Labels.Contains("self-hosted") && a.Labels.Contains(VarUtil.OS) && a.Labels.Contains(VarUtil.OSArchitecture))), Times.Once);
}

View File

@@ -260,7 +260,7 @@ namespace GitHub.Runner.Common.Tests
var proc = await processInvoker.ExecuteAsync("", "bash", "-c \"cat /proc/$$/oom_score_adj\"", null, false, null, false, null, false, false,
highPriorityProcess: false,
cancellationToken: tokenSource.Token);
Assert.Equal(oomScoreAdj, 500);
Assert.Equal(500, oomScoreAdj);
}
catch (OperationCanceledException)
{
@@ -293,12 +293,12 @@ namespace GitHub.Runner.Common.Tests
};
try
{
var proc = await processInvoker.ExecuteAsync("", "bash", "-c \"cat /proc/$$/oom_score_adj\"",
var proc = await processInvoker.ExecuteAsync("", "bash", "-c \"cat /proc/$$/oom_score_adj\"",
new Dictionary<string, string> { {"PIPELINE_JOB_OOMSCOREADJ", "1234"} },
false, null, false, null, false, false,
highPriorityProcess: false,
cancellationToken: tokenSource.Token);
Assert.Equal(oomScoreAdj, 1234);
Assert.Equal(1234, oomScoreAdj);
}
catch (OperationCanceledException)
{
@@ -336,7 +336,7 @@ namespace GitHub.Runner.Common.Tests
var proc = await processInvoker.ExecuteAsync("", "bash", "-c \"cat /proc/$$/oom_score_adj\"", null, false, null, false, null, false, false,
highPriorityProcess: true,
cancellationToken: tokenSource.Token);
Assert.Equal(oomScoreAdj, 123);
Assert.Equal(123, oomScoreAdj);
}
catch (OperationCanceledException)
{

View File

@@ -1,8 +1,7 @@
using GitHub.Runner.Common.Util;
using System.IO;
using System.IO;
using Xunit;
using System;
using GitHub.Runner.Sdk;
using System.Runtime.CompilerServices;
namespace GitHub.Runner.Common.Tests
{
@@ -21,9 +20,16 @@ namespace GitHub.Runner.Common.Tests
return projectDir;
}
public static string GetTestFilePath([CallerFilePath] string path = null)
{
return path;
}
public static string GetSrcPath()
{
string srcDir = Environment.GetEnvironmentVariable("GITHUB_RUNNER_SRC_DIR");
string L0dir = Path.GetDirectoryName(GetTestFilePath());
string testDir = Path.GetDirectoryName(L0dir);
string srcDir = Path.GetDirectoryName(testDir);
ArgUtil.Directory(srcDir, nameof(srcDir));
Assert.Equal(Src, Path.GetFileName(srcDir));
return srcDir;

View File

@@ -51,6 +51,19 @@ namespace GitHub.Runner.Common.Tests.Worker
Assert.True(ActionCommand.TryParse(message, commands, out verify));
Assert.True(IsEqualCommand(hc, test, verify));
message = "";
test = null;
verify = null;
//##[do-something k1=%253B=%250D=%250A=%255D;]%253B-%250D-%250A-%255D
message = "##[do-something k1=%253B=%250D=%250A=%255D;]%253B-%250D-%250A-%255D";
test = new ActionCommand("do-something")
{
Data = "%3B-%0D-%0A-%5D",
};
test.Properties.Add("k1", "%3B=%0D=%0A=%5D");
Assert.True(ActionCommand.TryParse(message, commands, out verify));
Assert.True(IsEqualCommand(hc, test, verify));
message = "";
test = null;
verify = null;
@@ -109,7 +122,7 @@ namespace GitHub.Runner.Common.Tests.Worker
message = "";
test = null;
verify = null;
//::do-something k1=%3B=%0D=%0A=%5D;::%3B-%0D-%0A-%5D
//::do-something k1=;=%2C=%0D=%0A=]=%3A,::;-%0D-%0A-]-:-,
message = "::do-something k1=;=%2C=%0D=%0A=]=%3A,::;-%0D-%0A-]-:-,";
test = new ActionCommand("do-something")
{
@@ -119,6 +132,19 @@ namespace GitHub.Runner.Common.Tests.Worker
Assert.True(ActionCommand.TryParseV2(message, commands, out verify));
Assert.True(IsEqualCommand(hc, test, verify));
message = "";
test = null;
verify = null;
//::do-something k1=;=%252C=%250D=%250A=]=%253A,::;-%250D-%250A-]-:-,
message = "::do-something k1=;=%252C=%250D=%250A=]=%253A,::;-%250D-%250A-]-:-,";
test = new ActionCommand("do-something")
{
Data = ";-%0D-%0A-]-:-,",
};
test.Properties.Add("k1", ";=%2C=%0D=%0A=]=%3A");
Assert.True(ActionCommand.TryParseV2(message, commands, out verify));
Assert.True(IsEqualCommand(hc, test, verify));
message = "";
test = null;
verify = null;

View File

@@ -1,8 +1,10 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.CompilerServices;
using GitHub.DistributedTask.WebApi;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Container;
using Moq;
using Xunit;
using Pipelines = GitHub.DistributedTask.Pipelines;
@@ -11,47 +13,35 @@ namespace GitHub.Runner.Common.Tests.Worker
{
public sealed class ActionCommandManagerL0
{
private ActionCommandManager _commandManager;
private Mock<IExecutionContext> _ec;
private Mock<IExtensionManager> _extensionManager;
private Mock<IPipelineDirectoryManager> _pipelineDirectoryManager;
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void EnablePluginInternalCommand()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManger = new Mock<IExtensionManager>();
var directoryManager = new Mock<IPipelineDirectoryManager>();
var pluginCommand = new InternalPluginSetRepoPathCommandExtension();
pluginCommand.Initialize(_hc);
var envCommand = new SetEnvCommandExtension();
envCommand.Initialize(_hc);
extensionManger.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { pluginCommand, envCommand });
_hc.SetSingleton<IExtensionManager>(extensionManger.Object);
_hc.SetSingleton<IPipelineDirectoryManager>(directoryManager.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>()))
.Callback((Issue issue, string message) =>
{
_hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
});
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
commandManager.EnablePluginInternalCommand();
_commandManager.EnablePluginInternalCommand();
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath", null));
directoryManager.Verify(x => x.UpdateRepositoryDirectory(_ec.Object, "actions/runner", "somepath", true), Times.Once);
_pipelineDirectoryManager.Verify(x => x.UpdateRepositoryDirectory(_ec.Object, "actions/runner", "somepath", true), Times.Once);
}
}
@@ -60,47 +50,29 @@ namespace GitHub.Runner.Common.Tests.Worker
[Trait("Category", "Worker")]
public void DisablePluginInternalCommand()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManger = new Mock<IExtensionManager>();
var directoryManager = new Mock<IPipelineDirectoryManager>();
var pluginCommand = new InternalPluginSetRepoPathCommandExtension();
pluginCommand.Initialize(_hc);
var envCommand = new SetEnvCommandExtension();
envCommand.Initialize(_hc);
extensionManger.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { pluginCommand, envCommand });
_hc.SetSingleton<IExtensionManager>(extensionManger.Object);
_hc.SetSingleton<IPipelineDirectoryManager>(directoryManager.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>()))
.Callback((Issue issue, string message) =>
{
_hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
});
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
commandManager.EnablePluginInternalCommand();
_commandManager.EnablePluginInternalCommand();
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath", null));
commandManager.DisablePluginInternalCommand();
_commandManager.DisablePluginInternalCommand();
Assert.False(commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath"));
Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[internal-set-repo-path repoFullName=actions/runner;workspaceRepo=true]somepath", null));
directoryManager.Verify(x => x.UpdateRepositoryDirectory(_ec.Object, "actions/runner", "somepath", true), Times.Once);
_pipelineDirectoryManager.Verify(x => x.UpdateRepositoryDirectory(_ec.Object, "actions/runner", "somepath", true), Times.Once);
}
}
@@ -109,42 +81,27 @@ namespace GitHub.Runner.Common.Tests.Worker
[Trait("Category", "Worker")]
public void StopProcessCommand()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManger = new Mock<IExtensionManager>();
var pluginCommand = new InternalPluginSetRepoPathCommandExtension();
pluginCommand.Initialize(_hc);
var envCommand = new SetEnvCommandExtension();
envCommand.Initialize(_hc);
extensionManger.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { pluginCommand, envCommand });
_hc.SetSingleton<IExtensionManager>(extensionManger.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.Setup(x => x.AddIssue(It.IsAny<Issue>(), It.IsAny<string>()))
.Callback((Issue issue, string message) =>
{
_hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
hc.GetTrace().Info($"{issue.Type} {issue.Message} {message ?? string.Empty}");
});
_ec.Setup(x => x.EnvironmentVariables).Returns(new Dictionary<string, string>());
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[stop-commands]stopToken"));
Assert.False(commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar"));
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[stopToken]"));
Assert.True(commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[stop-commands]stopToken", null));
Assert.False(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[stopToken]", null));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "##[set-env name=foo]bar", null));
}
}
@@ -153,41 +110,29 @@ namespace GitHub.Runner.Common.Tests.Worker
[Trait("Category", "Worker")]
public void EchoProcessCommand()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManager = new Mock<IExtensionManager>();
var echoCommand = new EchoCommandExtension();
echoCommand.Initialize(_hc);
extensionManager.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { echoCommand });
_hc.SetSingleton<IExtensionManager>(extensionManager.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.SetupAllProperties();
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
Assert.False(_ec.Object.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::on"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::on", null));
Assert.True(_ec.Object.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::off"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::off", null));
Assert.False(_ec.Object.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::ON"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::ON", null));
Assert.True(_ec.Object.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::Off "));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::Off ", null));
Assert.False(_ec.Object.EchoOnActionCommand);
}
}
@@ -197,7 +142,7 @@ namespace GitHub.Runner.Common.Tests.Worker
[Trait("Category", "Worker")]
public void EchoProcessCommandDebugOn()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
// Set up a few things
// 1. Job request message (with ACTIONS_STEP_DEBUG = true)
@@ -219,84 +164,135 @@ namespace GitHub.Runner.Common.Tests.Worker
var jobServerQueue = new Mock<IJobServerQueue>();
jobServerQueue.Setup(x => x.QueueTimelineRecordUpdate(It.IsAny<Guid>(), It.IsAny<TimelineRecord>()));
_hc.SetSingleton(jobServerQueue.Object);
var extensionManager = new Mock<IExtensionManager>();
var echoCommand = new EchoCommandExtension();
echoCommand.Initialize(_hc);
extensionManager.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { echoCommand });
_hc.SetSingleton<IExtensionManager>(extensionManager.Object);
hc.SetSingleton(jobServerQueue.Object);
var configurationStore = new Mock<IConfigurationStore>();
configurationStore.Setup(x => x.GetSettings()).Returns(new RunnerSettings());
_hc.SetSingleton(configurationStore.Object);
hc.SetSingleton(configurationStore.Object);
var pagingLogger = new Mock<IPagingLogger>();
_hc.EnqueueInstance(pagingLogger.Object);
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
var _ec = new Runner.Worker.ExecutionContext();
_ec.Initialize(_hc);
hc.EnqueueInstance(pagingLogger.Object);
// Initialize the job (to exercise logic that sets EchoOnActionCommand)
_ec.InitializeJob(jobRequest, System.Threading.CancellationToken.None);
var ec = new Runner.Worker.ExecutionContext();
ec.Initialize(hc);
ec.InitializeJob(jobRequest, System.Threading.CancellationToken.None);
_ec.Complete();
ec.Complete();
Assert.True(_ec.EchoOnActionCommand);
Assert.True(ec.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec, "::echo::off"));
Assert.False(_ec.EchoOnActionCommand);
Assert.True(_commandManager.TryProcessCommand(ec, "::echo::off", null));
Assert.False(ec.EchoOnActionCommand);
Assert.True(commandManager.TryProcessCommand(_ec, "::echo::on"));
Assert.True(_ec.EchoOnActionCommand);
Assert.True(_commandManager.TryProcessCommand(ec, "::echo::on", null));
Assert.True(ec.EchoOnActionCommand);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void EchoProcessCommandInvalid()
{
using (TestHostContext _hc = new TestHostContext(this))
using (TestHostContext hc = CreateTestContext())
{
var extensionManager = new Mock<IExtensionManager>();
var echoCommand = new EchoCommandExtension();
echoCommand.Initialize(_hc);
extensionManager.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>() { echoCommand });
_hc.SetSingleton<IExtensionManager>(extensionManager.Object);
Mock<IExecutionContext> _ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>()))
.Returns((string tag, string line) =>
{
_hc.GetTrace().Info($"{tag} {line}");
hc.GetTrace().Info($"{tag} {line}");
return 1;
});
_ec.SetupAllProperties();
ActionCommandManager commandManager = new ActionCommandManager();
commandManager.Initialize(_hc);
// Echo commands below are considered "processed", but are invalid
// 1. Invalid echo value
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::invalid"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::invalid", null));
Assert.Equal(TaskResult.Failed, _ec.Object.CommandResult);
Assert.False(_ec.Object.EchoOnActionCommand);
// 2. No value
Assert.True(commandManager.TryProcessCommand(_ec.Object, "::echo::"));
Assert.True(_commandManager.TryProcessCommand(_ec.Object, "::echo::", null));
Assert.Equal(TaskResult.Failed, _ec.Object.CommandResult);
Assert.False(_ec.Object.EchoOnActionCommand);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void AddMatcherTranslatesFilePath()
{
using (TestHostContext hc = CreateTestContext())
{
// Create a problem matcher config file
var hostDirectory = hc.GetDirectory(WellKnownDirectory.Temp);
var hostFile = Path.Combine(hostDirectory, "my-matcher.json");
Directory.CreateDirectory(hostDirectory);
var content = @"
{
""problemMatcher"": [
{
""owner"": ""my-matcher"",
""pattern"": [
{
""regexp"": ""^ERROR: (.+)$"",
""message"": 1
}
]
}
]
}";
File.WriteAllText(hostFile, content);
// Setup translation info
var container = new ContainerInfo();
var containerDirectory = "/some-container-directory";
var containerFile = Path.Combine(containerDirectory, "my-matcher.json");
container.AddPathTranslateMapping(hostDirectory, containerDirectory);
// Act
_commandManager.TryProcessCommand(_ec.Object, $"::add-matcher::{containerFile}", container);
// Assert
_ec.Verify(x => x.AddMatchers(It.IsAny<IssueMatchersConfig>()), Times.Once);
}
}
private TestHostContext CreateTestContext([CallerMemberName] string testName = "")
{
var hostContext = new TestHostContext(this, testName);
// Mock extension manager
_extensionManager = new Mock<IExtensionManager>();
var commands = new IActionCommandExtension[]
{
new AddMatcherCommandExtension(),
new EchoCommandExtension(),
new InternalPluginSetRepoPathCommandExtension(),
new SetEnvCommandExtension(),
};
foreach (var command in commands)
{
command.Initialize(hostContext);
}
_extensionManager.Setup(x => x.GetExtensions<IActionCommandExtension>())
.Returns(new List<IActionCommandExtension>(commands));
hostContext.SetSingleton<IExtensionManager>(_extensionManager.Object);
// Mock pipeline directory manager
_pipelineDirectoryManager = new Mock<IPipelineDirectoryManager>();
hostContext.SetSingleton<IPipelineDirectoryManager>(_pipelineDirectoryManager.Object);
// Execution context
_ec = new Mock<IExecutionContext>();
// Command manager
_commandManager = new ActionCommandManager();
_commandManager.Initialize(hostContext);
return hostContext;
}
}
}

View File

@@ -54,11 +54,11 @@ namespace GitHub.Runner.Common.Tests.Worker
};
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
//Assert
Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.Image, "ubuntu:16.04");
Assert.Equal(actionId, (steps[0].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal("ubuntu:16.04", (steps[0].Data as ContainerSetupInfo).Container.Image);
}
finally
{
@@ -125,7 +125,7 @@ namespace GitHub.Runner.Common.Tests.Worker
Directory.CreateDirectory(Path.GetDirectoryName(watermarkFile));
File.WriteAllText(watermarkFile, DateTime.UtcNow.ToString());
Directory.CreateDirectory(Path.Combine(Path.GetDirectoryName(watermarkFile), "notexist"));
File.Copy(Path.Combine(Environment.GetEnvironmentVariable("GITHUB_RUNNER_SRC_DIR"), "Test", TestDataFolderName, "dockerfileaction.yml"), Path.Combine(Path.GetDirectoryName(watermarkFile), "notexist", "action.yml"));
File.Copy(Path.Combine(TestUtil.GetSrcPath(), "Test", TestDataFolderName, "dockerfileaction.yml"), Path.Combine(Path.GetDirectoryName(watermarkFile), "notexist", "action.yml"));
//Act
await _actionManager.PrepareActionsAsync(_ec.Object, actions);
@@ -164,7 +164,7 @@ namespace GitHub.Runner.Common.Tests.Worker
};
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
Assert.True(steps.Count == 0);
}
@@ -203,10 +203,10 @@ namespace GitHub.Runner.Common.Tests.Worker
var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "repositoryactionwithdockerfile");
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory, actionDir);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.Dockerfile, Path.Combine(actionDir, "Dockerfile"));
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
Assert.Equal(actionId, (steps[0].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal(actionDir, (steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory);
Assert.Equal(Path.Combine(actionDir, "Dockerfile"), (steps[0].Data as ContainerSetupInfo).Container.Dockerfile);
}
finally
{
@@ -243,11 +243,11 @@ namespace GitHub.Runner.Common.Tests.Worker
var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "repositoryactionwithdockerfileinrelativepath");
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory, actionDir);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.Dockerfile, Path.Combine(actionDir, "images/cli", "Dockerfile"));
Assert.Equal(actionId, (steps[0].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal(actionDir, (steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory);
Assert.Equal(Path.Combine(actionDir, "images/cli", "Dockerfile"), (steps[0].Data as ContainerSetupInfo).Container.Dockerfile);
}
finally
{
@@ -282,11 +282,11 @@ namespace GitHub.Runner.Common.Tests.Worker
var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "repositoryactionwithdockerfileinrelativepath");
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory, actionDir);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.Dockerfile, Path.Combine(actionDir, "Dockerfile"));
Assert.Equal(actionId, (steps[0].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal(actionDir, (steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory);
Assert.Equal(Path.Combine(actionDir, "Dockerfile"), (steps[0].Data as ContainerSetupInfo).Container.Dockerfile);
}
finally
{
@@ -322,11 +322,11 @@ namespace GitHub.Runner.Common.Tests.Worker
var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "RepositoryActionWithActionfile_DockerfileRelativePath");
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory, actionDir);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.Dockerfile, Path.Combine(actionDir, "images/Dockerfile"));
Assert.Equal(actionId, (steps[0].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal(actionDir, (steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory);
Assert.Equal(Path.Combine(actionDir, "images/Dockerfile"), (steps[0].Data as ContainerSetupInfo).Container.Dockerfile);
}
finally
{
@@ -362,10 +362,49 @@ namespace GitHub.Runner.Common.Tests.Worker
var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "RepositoryActionWithActionfile_DockerHubImage");
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
Assert.Equal(actionId, (steps[0].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal("ubuntu:18.04", (steps[0].Data as ContainerSetupInfo).Container.Image);
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async void PrepareActions_RepositoryActionWithActionYamlFile_DockerHubImage()
{
try
{
//Arrange
Setup();
var actionId = Guid.NewGuid();
var actions = new List<Pipelines.ActionStep>
{
new Pipelines.ActionStep()
{
Name = "action",
Id = actionId,
Reference = new Pipelines.RepositoryPathReference()
{
Name = "TingluoHuang/runner_L0",
Ref = "RepositoryActionWithActionYamlFile_DockerHubImage",
RepositoryType = "GitHub"
}
}
};
var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "RepositoryActionWithActionYamlFile_DockerHubImage");
//Act
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.Image, "ubuntu:18.04");
Assert.Equal("ubuntu:18.04", (steps[0].Data as ContainerSetupInfo).Container.Image);
}
finally
{
@@ -401,11 +440,11 @@ namespace GitHub.Runner.Common.Tests.Worker
var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "repositoryactionwithactionfileanddockerfile");
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory, actionDir);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.Dockerfile, Path.Combine(actionDir, "Dockerfile"));
Assert.Equal(actionId, (steps[0].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal(actionDir, (steps[0].Data as ContainerSetupInfo).Container.WorkingDirectory);
Assert.Equal(Path.Combine(actionDir, "Dockerfile"), (steps[0].Data as ContainerSetupInfo).Container.Dockerfile);
}
finally
{
@@ -518,33 +557,33 @@ namespace GitHub.Runner.Common.Tests.Worker
};
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
//Assert
Assert.Equal((steps[0].Data as ContainerSetupInfo).StepIds[0], actionId1);
Assert.Equal((steps[0].Data as ContainerSetupInfo).Container.Image, "ubuntu:16.04");
Assert.Equal(actionId1, (steps[0].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal("ubuntu:16.04", (steps[0].Data as ContainerSetupInfo).Container.Image);
Assert.True((steps[1].Data as ContainerSetupInfo).StepIds.Contains(actionId2));
Assert.True((steps[1].Data as ContainerSetupInfo).StepIds.Contains(actionId3));
Assert.True((steps[1].Data as ContainerSetupInfo).StepIds.Contains(actionId4));
Assert.Equal((steps[1].Data as ContainerSetupInfo).Container.Image, "ubuntu:18.04");
Assert.Contains(actionId2, (steps[1].Data as ContainerSetupInfo).StepIds);
Assert.Contains(actionId3, (steps[1].Data as ContainerSetupInfo).StepIds);
Assert.Contains(actionId4, (steps[1].Data as ContainerSetupInfo).StepIds);
Assert.Equal("ubuntu:18.04", (steps[1].Data as ContainerSetupInfo).Container.Image);
var actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "repositoryactionwithdockerfile");
Assert.Equal((steps[2].Data as ContainerSetupInfo).StepIds[0], actionId5);
Assert.Equal((steps[2].Data as ContainerSetupInfo).Container.WorkingDirectory, actionDir);
Assert.Equal((steps[2].Data as ContainerSetupInfo).Container.Dockerfile, Path.Combine(actionDir, "Dockerfile"));
Assert.Equal(actionId5, (steps[2].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal(actionDir, (steps[2].Data as ContainerSetupInfo).Container.WorkingDirectory);
Assert.Equal(Path.Combine(actionDir, "Dockerfile"), (steps[2].Data as ContainerSetupInfo).Container.Dockerfile);
actionDir = Path.Combine(_hc.GetDirectory(WellKnownDirectory.Actions), "TingluoHuang", "runner_L0", "repositoryactionwithdockerfileinrelativepath");
Assert.True((steps[3].Data as ContainerSetupInfo).StepIds.Contains(actionId6));
Assert.True((steps[3].Data as ContainerSetupInfo).StepIds.Contains(actionId7));
Assert.Equal((steps[3].Data as ContainerSetupInfo).Container.WorkingDirectory, actionDir);
Assert.Equal((steps[3].Data as ContainerSetupInfo).Container.Dockerfile, Path.Combine(actionDir, "Dockerfile"));
Assert.Contains(actionId6, (steps[3].Data as ContainerSetupInfo).StepIds);
Assert.Contains(actionId7, (steps[3].Data as ContainerSetupInfo).StepIds);
Assert.Equal(actionDir, (steps[3].Data as ContainerSetupInfo).Container.WorkingDirectory);
Assert.Equal(Path.Combine(actionDir, "Dockerfile"), (steps[3].Data as ContainerSetupInfo).Container.Dockerfile);
Assert.Equal((steps[4].Data as ContainerSetupInfo).StepIds[0], actionId8);
Assert.Equal((steps[4].Data as ContainerSetupInfo).Container.WorkingDirectory, actionDir);
Assert.Equal((steps[4].Data as ContainerSetupInfo).Container.Dockerfile, Path.Combine(actionDir, "images/cli", "Dockerfile"));
Assert.Equal(actionId8, (steps[4].Data as ContainerSetupInfo).StepIds[0]);
Assert.Equal(actionDir, (steps[4].Data as ContainerSetupInfo).Container.WorkingDirectory);
Assert.Equal(Path.Combine(actionDir, "images/cli", "Dockerfile"), (steps[4].Data as ContainerSetupInfo).Container.Dockerfile);
}
finally
{
@@ -579,7 +618,7 @@ namespace GitHub.Runner.Common.Tests.Worker
};
//Act
var steps = await _actionManager.PrepareActionsAsync(_ec.Object, actions);
var steps = (await _actionManager.PrepareActionsAsync(_ec.Object, actions)).ContainerSetupSteps;
// node.js based action doesn't need any extra steps to build/pull containers.
Assert.True(steps.Count == 0);
@@ -590,6 +629,47 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async void PrepareActions_RepositoryActionWithInvalidWrapperActionfile_Node()
{
try
{
//Arrange
Setup();
var actionId = Guid.NewGuid();
var actions = new List<Pipelines.ActionStep>
{
new Pipelines.ActionStep()
{
Name = "action",
Id = actionId,
Reference = new Pipelines.RepositoryPathReference()
{
Name = "TingluoHuang/runner_L0",
Ref = "RepositoryActionWithInvalidWrapperActionfile_Node",
RepositoryType = "GitHub"
}
}
};
//Act
try
{
await _actionManager.PrepareActionsAsync(_ec.Object, actions);
}
catch (ArgumentNullException e)
{
Assert.Contains("Entry javascript fils is not provided.", e.Message);
}
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -925,6 +1005,87 @@ runs:
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void LoadsNodeActionDefinitionYaml()
{
try
{
// Arrange.
Setup();
const string Content = @"
# Container action
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'GitHub'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print ""{greeting}, World!"" on stdout'
required: true
default: 'Hello'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'node12'
main: 'task.js'
";
Pipelines.ActionStep instance;
string directory;
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master");
string file = Path.Combine(directory, Constants.Path.ActionManifestYamlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, Content);
instance = new Pipelines.ActionStep()
{
Id = Guid.NewGuid(),
Reference = new Pipelines.RepositoryPathReference()
{
Name = "GitHub/actions",
Ref = "master",
RepositoryType = Pipelines.RepositoryTypes.GitHub
}
};
// Act.
Definition definition = _actionManager.LoadAction(_ec.Object, instance);
// Assert.
Assert.NotNull(definition);
Assert.Equal(directory, definition.Directory);
Assert.NotNull(definition.Data);
Assert.NotNull(definition.Data.Inputs); // inputs
Dictionary<string, string> inputDefaults = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
foreach (var input in definition.Data.Inputs)
{
var name = input.Key.AssertString("key").Value;
var value = input.Value.AssertScalar("value").ToString();
_hc.GetTrace().Info($"Default: {name} = {value}");
inputDefaults[name] = value;
}
Assert.Equal(2, inputDefaults.Count);
Assert.True(inputDefaults.ContainsKey("greeting"));
Assert.Equal("Hello", inputDefaults["greeting"]);
Assert.True(string.IsNullOrEmpty(inputDefaults["entryPoint"]));
Assert.NotNull(definition.Data.Execution); // execution
Assert.NotNull((definition.Data.Execution as NodeJSActionExecutionData));
Assert.Equal("task.js", (definition.Data.Execution as NodeJSActionExecutionData).Script);
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -1433,7 +1594,7 @@ runs:
private void CreateAction(string yamlContent, out Pipelines.ActionStep instance, out string directory)
{
directory = Path.Combine(_workFolder, Constants.Path.ActionsDirectory, "GitHub/actions".Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar), "master");
string file = Path.Combine(directory, Constants.Path.ActionManifestFile);
string file = Path.Combine(directory, Constants.Path.ActionManifestYmlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, yamlContent);
instance = new Pipelines.ActionStep()
@@ -1451,7 +1612,7 @@ runs:
private void CreateSelfRepoAction(string yamlContent, out Pipelines.ActionStep instance, out string directory)
{
directory = Path.Combine(_workFolder, "actions", "actions");
string file = Path.Combine(directory, Constants.Path.ActionManifestFile);
string file = Path.Combine(directory, Constants.Path.ActionManifestYmlFile);
Directory.CreateDirectory(Path.GetDirectoryName(file));
File.WriteAllText(file, yamlContent);
instance = new Pipelines.ActionStep()

View File

@@ -63,6 +63,52 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void Load_ContainerAction_Dockerfile_Pre()
{
try
{
//Arrange
Setup();
var actionManifest = new ActionManifestManager();
actionManifest.Initialize(_hc);
//Act
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "dockerfileaction_init.yml"));
//Assert
Assert.Equal("Hello World", result.Name);
Assert.Equal("Greet the world and record the time", result.Description);
Assert.Equal(2, result.Inputs.Count);
Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
Assert.Equal(ActionExecutionType.Container, result.Execution.ExecutionType);
var containerAction = result.Execution as ContainerActionExecutionData;
Assert.Equal("Dockerfile", containerAction.Image);
Assert.Equal("main.sh", containerAction.EntryPoint);
Assert.Equal("init.sh", containerAction.Init);
Assert.Equal("success()", containerAction.InitCondition);
Assert.Equal("bzz", containerAction.Arguments[0].ToString());
Assert.Equal("Token", containerAction.Environment[0].Key.ToString());
Assert.Equal("foo", containerAction.Environment[0].Value.ToString());
Assert.Equal("Url", containerAction.Environment[1].Key.ToString());
Assert.Equal("bar", containerAction.Environment[1].Value.ToString());
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -109,6 +155,98 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void Load_ContainerAction_Dockerfile_Pre_DefaultCondition()
{
try
{
//Arrange
Setup();
var actionManifest = new ActionManifestManager();
actionManifest.Initialize(_hc);
//Act
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "dockerfileaction_init_default.yml"));
//Assert
Assert.Equal("Hello World", result.Name);
Assert.Equal("Greet the world and record the time", result.Description);
Assert.Equal(2, result.Inputs.Count);
Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
Assert.Equal(ActionExecutionType.Container, result.Execution.ExecutionType);
var containerAction = result.Execution as ContainerActionExecutionData;
Assert.Equal("Dockerfile", containerAction.Image);
Assert.Equal("main.sh", containerAction.EntryPoint);
Assert.Equal("init.sh", containerAction.Init);
Assert.Equal("always()", containerAction.InitCondition);
Assert.Equal("bzz", containerAction.Arguments[0].ToString());
Assert.Equal("Token", containerAction.Environment[0].Key.ToString());
Assert.Equal("foo", containerAction.Environment[0].Value.ToString());
Assert.Equal("Url", containerAction.Environment[1].Key.ToString());
Assert.Equal("bar", containerAction.Environment[1].Value.ToString());
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void Load_ContainerAction_Dockerfile_Post_DefaultCondition()
{
try
{
//Arrange
Setup();
var actionManifest = new ActionManifestManager();
actionManifest.Initialize(_hc);
//Act
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "dockerfileaction_cleanup_default.yml"));
//Assert
Assert.Equal("Hello World", result.Name);
Assert.Equal("Greet the world and record the time", result.Description);
Assert.Equal(2, result.Inputs.Count);
Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
Assert.Equal(ActionExecutionType.Container, result.Execution.ExecutionType);
var containerAction = result.Execution as ContainerActionExecutionData;
Assert.Equal("Dockerfile", containerAction.Image);
Assert.Equal("main.sh", containerAction.EntryPoint);
Assert.Equal("cleanup.sh", containerAction.Cleanup);
Assert.Equal("always()", containerAction.CleanupCondition);
Assert.Equal("bzz", containerAction.Arguments[0].ToString());
Assert.Equal("Token", containerAction.Environment[0].Key.ToString());
Assert.Equal("foo", containerAction.Environment[0].Value.ToString());
Assert.Equal("Url", containerAction.Environment[1].Key.ToString());
Assert.Equal("bar", containerAction.Environment[1].Value.ToString());
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -275,6 +413,94 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void Load_NodeAction_Pre()
{
try
{
//Arrange
Setup();
var actionManifest = new ActionManifestManager();
actionManifest.Initialize(_hc);
//Act
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "nodeaction_init.yml"));
//Assert
Assert.Equal("Hello World", result.Name);
Assert.Equal("Greet the world and record the time", result.Description);
Assert.Equal(2, result.Inputs.Count);
Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
Assert.Equal(1, result.Deprecated.Count);
Assert.True(result.Deprecated.ContainsKey("greeting"));
result.Deprecated.TryGetValue("greeting", out string value);
Assert.Equal("This property has been deprecated", value);
Assert.Equal(ActionExecutionType.NodeJS, result.Execution.ExecutionType);
var nodeAction = result.Execution as NodeJSActionExecutionData;
Assert.Equal("main.js", nodeAction.Script);
Assert.Equal("init.js", nodeAction.Init);
Assert.Equal("cancelled()", nodeAction.InitCondition);
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void Load_NodeAction_Init_DefaultCondition()
{
try
{
//Arrange
Setup();
var actionManifest = new ActionManifestManager();
actionManifest.Initialize(_hc);
//Act
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "nodeaction_init_default.yml"));
//Assert
Assert.Equal("Hello World", result.Name);
Assert.Equal("Greet the world and record the time", result.Description);
Assert.Equal(2, result.Inputs.Count);
Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
Assert.Equal(1, result.Deprecated.Count);
Assert.True(result.Deprecated.ContainsKey("greeting"));
result.Deprecated.TryGetValue("greeting", out string value);
Assert.Equal("This property has been deprecated", value);
Assert.Equal(ActionExecutionType.NodeJS, result.Execution.ExecutionType);
var nodeAction = result.Execution as NodeJSActionExecutionData;
Assert.Equal("main.js", nodeAction.Script);
Assert.Equal("init.js", nodeAction.Init);
Assert.Equal("always()", nodeAction.InitCondition);
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -319,6 +545,50 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public void Load_NodeAction_Cleanup_DefaultCondition()
{
try
{
//Arrange
Setup();
var actionManifest = new ActionManifestManager();
actionManifest.Initialize(_hc);
//Act
var result = actionManifest.Load(_ec.Object, Path.Combine(TestUtil.GetTestDataPath(), "nodeaction_cleanup_default.yml"));
//Assert
Assert.Equal("Hello World", result.Name);
Assert.Equal("Greet the world and record the time", result.Description);
Assert.Equal(2, result.Inputs.Count);
Assert.Equal("greeting", result.Inputs[0].Key.AssertString("key").Value);
Assert.Equal("Hello", result.Inputs[0].Value.AssertString("value").Value);
Assert.Equal("entryPoint", result.Inputs[1].Key.AssertString("key").Value);
Assert.Equal("", result.Inputs[1].Value.AssertString("value").Value);
Assert.Equal(1, result.Deprecated.Count);
Assert.True(result.Deprecated.ContainsKey("greeting"));
result.Deprecated.TryGetValue("greeting", out string value);
Assert.Equal("This property has been deprecated", value);
Assert.Equal(ActionExecutionType.NodeJS, result.Execution.ExecutionType);
var nodeAction = result.Execution as NodeJSActionExecutionData;
Assert.Equal("main.js", nodeAction.Script);
Assert.Equal("cleanup.js", nodeAction.Cleanup);
Assert.Equal("always()", nodeAction.CleanupCondition);
}
finally
{
Teardown();
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]

View File

@@ -314,6 +314,12 @@ namespace GitHub.Runner.Common.Tests.Worker
githubContext.Add("event", JToken.Parse("{\"foo\":\"bar\"}").ToPipelineContextData());
_context.Add("github", githubContext);
#if OS_WINDOWS
_context["env"] = new DictionaryContextData();
#else
_context["env"] = new CaseSensitiveDictionaryContextData();
#endif
_ec = new Mock<IExecutionContext>();
_ec.Setup(x => x.ExpressionValues).Returns(_context);
_ec.Setup(x => x.IntraActionState).Returns(new Dictionary<string, string>());

View File

@@ -199,20 +199,20 @@ namespace GitHub.Runner.Common.Tests.Worker
var postRunner1 = hc.CreateService<IActionRunner>();
postRunner1.Action = new Pipelines.ActionStep() { Name = "post1", DisplayName = "Test 1", Reference = new Pipelines.RepositoryPathReference() { Name = "actions/action" } };
postRunner1.Action = new Pipelines.ActionStep() { Id = Guid.NewGuid(), Name = "post1", DisplayName = "Test 1", Reference = new Pipelines.RepositoryPathReference() { Name = "actions/action" } };
postRunner1.Stage = ActionRunStage.Post;
postRunner1.Condition = "always()";
postRunner1.DisplayName = "post1";
var postRunner2 = hc.CreateService<IActionRunner>();
postRunner2.Action = new Pipelines.ActionStep() { Name = "post2", DisplayName = "Test 2", Reference = new Pipelines.RepositoryPathReference() { Name = "actions/action" } };
postRunner2.Action = new Pipelines.ActionStep() { Id = Guid.NewGuid(), Name = "post2", DisplayName = "Test 2", Reference = new Pipelines.RepositoryPathReference() { Name = "actions/action" } };
postRunner2.Stage = ActionRunStage.Post;
postRunner2.Condition = "always()";
postRunner2.DisplayName = "post2";
action1.RegisterPostJobStep("post1", postRunner1);
action2.RegisterPostJobStep("post2", postRunner2);
action1.RegisterPostJobStep(postRunner1);
action2.RegisterPostJobStep(postRunner2);
Assert.NotNull(jobContext.JobSteps);
Assert.NotNull(jobContext.PostJobSteps);

View File

@@ -144,7 +144,7 @@ namespace GitHub.Runner.Common.Tests.Worker
jobExtension.Initialize(hc);
_actionManager.Setup(x => x.PrepareActionsAsync(It.IsAny<IExecutionContext>(), It.IsAny<IEnumerable<Pipelines.JobStep>>()))
.Returns(Task.FromResult(new List<JobExtensionRunner>()));
.Returns(Task.FromResult(new PrepareResult(new List<JobExtensionRunner>(), new Dictionary<Guid, IActionRunner>())));
List<IStep> result = await jobExtension.InitializeJob(_jobEc, _message);
@@ -179,7 +179,7 @@ namespace GitHub.Runner.Common.Tests.Worker
jobExtension.Initialize(hc);
_actionManager.Setup(x => x.PrepareActionsAsync(It.IsAny<IExecutionContext>(), It.IsAny<IEnumerable<Pipelines.JobStep>>()))
.Returns(Task.FromResult(new List<JobExtensionRunner>() { new JobExtensionRunner(null, "", "prepare1", null), new JobExtensionRunner(null, "", "prepare2", null) }));
.Returns(Task.FromResult(new PrepareResult(new List<JobExtensionRunner>() { new JobExtensionRunner(null, "", "prepare1", null), new JobExtensionRunner(null, "", "prepare2", null) }, new Dictionary<Guid, IActionRunner>())));
List<IStep> result = await jobExtension.InitializeJob(_jobEc, _message);

View File

@@ -8,6 +8,7 @@ using System.Threading.Tasks;
using System.Runtime.CompilerServices;
using GitHub.Runner.Sdk;
using GitHub.Runner.Worker;
using GitHub.Runner.Worker.Container;
using GitHub.Runner.Worker.Handlers;
using Moq;
using Xunit;
@@ -748,6 +749,130 @@ namespace GitHub.Runner.Common.Tests.Worker
Environment.SetEnvironmentVariable("RUNNER_TEST_GET_REPOSITORY_PATH_FAILSAFE", "");
}
#if OS_LINUX
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async void MatcherFile_JobContainer()
{
var matchers = new IssueMatchersConfig
{
Matchers =
{
new IssueMatcherConfig
{
Owner = "my-matcher-1",
Patterns = new[]
{
new IssuePatternConfig
{
Pattern = @"(.+): (.+)",
File = 1,
Message = 2,
},
},
},
},
};
var container = new ContainerInfo();
using (var hostContext = Setup(matchers: matchers, jobContainer: container))
using (_outputManager)
{
// Setup github.workspace, github.repository
var workDirectory = hostContext.GetDirectory(WellKnownDirectory.Work);
ArgUtil.NotNullOrEmpty(workDirectory, nameof(workDirectory));
Directory.CreateDirectory(workDirectory);
var workspaceDirectory = Path.Combine(workDirectory, "workspace");
Directory.CreateDirectory(workspaceDirectory);
_executionContext.Setup(x => x.GetGitHubContext("workspace")).Returns(workspaceDirectory);
_executionContext.Setup(x => x.GetGitHubContext("repository")).Returns("my-org/workflow-repo");
// Setup a git repository
await CreateRepository(hostContext, workspaceDirectory, "https://github.com/my-org/workflow-repo");
// Create test files
var file = Path.Combine(workspaceDirectory, "some-file.txt");
File.WriteAllText(file, "");
// Add translation path
container.AddPathTranslateMapping(workspaceDirectory, "/container/path/to/workspace");
// Process
Process($"/container/path/to/workspace/some-file.txt: some error 1");
Process($"some-file.txt: some error 2");
Assert.Equal(2, _issues.Count);
Assert.Equal("some error 1", _issues[0].Item1.Message);
Assert.Equal("some-file.txt", _issues[0].Item1.Data["file"]);
Assert.Equal("some error 2", _issues[1].Item1.Message);
Assert.Equal("some-file.txt", _issues[1].Item1.Data["file"]);
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async void MatcherFile_StepContainer()
{
var matchers = new IssueMatchersConfig
{
Matchers =
{
new IssueMatcherConfig
{
Owner = "my-matcher-1",
Patterns = new[]
{
new IssuePatternConfig
{
Pattern = @"(.+): (.+)",
File = 1,
Message = 2,
},
},
},
},
};
var container = new ContainerInfo();
using (var hostContext = Setup(matchers: matchers, stepContainer: container))
using (_outputManager)
{
// Setup github.workspace, github.repository
var workDirectory = hostContext.GetDirectory(WellKnownDirectory.Work);
ArgUtil.NotNullOrEmpty(workDirectory, nameof(workDirectory));
Directory.CreateDirectory(workDirectory);
var workspaceDirectory = Path.Combine(workDirectory, "workspace");
Directory.CreateDirectory(workspaceDirectory);
_executionContext.Setup(x => x.GetGitHubContext("workspace")).Returns(workspaceDirectory);
_executionContext.Setup(x => x.GetGitHubContext("repository")).Returns("my-org/workflow-repo");
// Setup a git repository
await CreateRepository(hostContext, workspaceDirectory, "https://github.com/my-org/workflow-repo");
// Create test files
var file = Path.Combine(workspaceDirectory, "some-file.txt");
File.WriteAllText(file, "");
// Add translation path
container.AddPathTranslateMapping(workspaceDirectory, "/container/path/to/workspace");
// Process
Process($"/container/path/to/workspace/some-file.txt: some error 1");
Process($"some-file.txt: some error 2");
Assert.Equal(2, _issues.Count);
Assert.Equal("some error 1", _issues[0].Item1.Message);
Assert.Equal("some-file.txt", _issues[0].Item1.Data["file"]);
Assert.Equal("some error 2", _issues[1].Item1.Message);
Assert.Equal("some-file.txt", _issues[1].Item1.Data["file"]);
}
}
#endif
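Both tests above register a translation from the host workspace to /container/path/to/workspace and expect file paths reported from inside the container to resolve back to the repository on the host. A minimal prefix-mapping sketch of that idea, with hypothetical names (it is not the runner's ContainerInfo):

using System;
using System.Collections.Generic;

public class PathTranslationSketch
{
    private readonly List<(string HostPath, string ContainerPath)> _mappings =
        new List<(string HostPath, string ContainerPath)>();

    public void AddMapping(string hostPath, string containerPath)
        => _mappings.Add((hostPath.TrimEnd('/'), containerPath.TrimEnd('/')));

    // Rewrites a container path (e.g. /container/path/to/workspace/some-file.txt)
    // to its host equivalent so the file can be checked against the workspace repository.
    public string ToHostPath(string path)
    {
        foreach (var (host, container) in _mappings)
        {
            if (path == container || path.StartsWith(container + "/", StringComparison.Ordinal))
            {
                return host + path.Substring(container.Length);
            }
        }

        return path; // not under a mapped prefix; leave unchanged
    }
}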
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
@@ -806,7 +931,9 @@ namespace GitHub.Runner.Common.Tests.Worker
private TestHostContext Setup(
[CallerMemberName] string name = "",
IssueMatchersConfig matchers = null)
IssueMatchersConfig matchers = null,
ContainerInfo jobContainer = null,
ContainerInfo stepContainer = null)
{
matchers?.Validate();
@@ -824,6 +951,8 @@ namespace GitHub.Runner.Common.Tests.Worker
.Returns(true);
_executionContext.Setup(x => x.Variables)
.Returns(_variables);
_executionContext.Setup(x => x.Container)
.Returns(jobContainer);
_executionContext.Setup(x => x.GetMatchers())
.Returns(matchers?.Matchers ?? new List<IssueMatcherConfig>());
_executionContext.Setup(x => x.Add(It.IsAny<OnMatcherChanged>()))
@@ -844,8 +973,8 @@ namespace GitHub.Runner.Common.Tests.Worker
});
_commandManager = new Mock<IActionCommandManager>();
_commandManager.Setup(x => x.TryProcessCommand(It.IsAny<IExecutionContext>(), It.IsAny<string>()))
.Returns((IExecutionContext executionContext, string line) =>
_commandManager.Setup(x => x.TryProcessCommand(It.IsAny<IExecutionContext>(), It.IsAny<string>(), It.IsAny<ContainerInfo>()))
.Returns((IExecutionContext executionContext, string line, ContainerInfo container) =>
{
if (line.IndexOf("##[some-command]") >= 0)
{
@@ -856,7 +985,7 @@ namespace GitHub.Runner.Common.Tests.Worker
return false;
});
_outputManager = new OutputManager(_executionContext.Object, _commandManager.Object);
_outputManager = new OutputManager(_executionContext.Object, _commandManager.Object, stepContainer);
return hostContext;
}

View File

@@ -19,6 +19,7 @@ namespace GitHub.Runner.Common.Tests.Worker
private Mock<IExecutionContext> _ec;
private StepsRunner _stepsRunner;
private Variables _variables;
private Dictionary<string, string> _env;
private DictionaryContextData _contexts;
private JobContext _jobContext;
private StepsContext _stepContext;
@@ -32,6 +33,11 @@ namespace GitHub.Runner.Common.Tests.Worker
_variables = new Variables(
hostContext: hc,
copy: variablesToCopy);
_env = new Dictionary<string, string>()
{
{"env1", "1"},
{"test", "github_actions"}
};
_ec = new Mock<IExecutionContext>();
_ec.SetupAllProperties();
_ec.Setup(x => x.Variables).Returns(_variables);
@@ -64,9 +70,9 @@ namespace GitHub.Runner.Common.Tests.Worker
// Arrange.
var variableSets = new[]
{
new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") }
new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") }
};
foreach (var variableSet in variableSets)
{
@@ -96,12 +102,12 @@ namespace GitHub.Runner.Common.Tests.Worker
// Arrange.
var variableSets = new[]
{
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "success()") },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "success() || failure()") },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Succeeded, "always()") },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "success()", true) },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "success() || failure()", true) },
new[] { CreateStep(TaskResult.Failed, "success()", true), CreateStep(TaskResult.Failed, "always()", true) }
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "success()") },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Succeeded, "always()") },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "success()", true) },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "success() || failure()", true) },
new[] { CreateStep(hc, TaskResult.Failed, "success()", true), CreateStep(hc, TaskResult.Failed, "always()", true) }
};
foreach (var variableSet in variableSets)
{
@@ -133,12 +139,12 @@ namespace GitHub.Runner.Common.Tests.Worker
{
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = false,
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
Expected = true,
},
};
@@ -172,27 +178,27 @@ namespace GitHub.Runner.Common.Tests.Worker
{
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = TaskResult.Succeeded,
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = TaskResult.Failed,
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = TaskResult.Failed,
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "always()") },
Expected = TaskResult.Failed,
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "always()", true) },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "always()", true) },
Expected = TaskResult.Succeeded,
},
};
@@ -226,47 +232,47 @@ namespace GitHub.Runner.Common.Tests.Worker
{
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Failed, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Failed, "success()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Succeeded, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = TaskResult.Succeeded
},
new
{
Steps = new[] { CreateStep(TaskResult.Failed, "success()", continueOnError: true), CreateStep(TaskResult.Failed, "success()", continueOnError: true) },
Steps = new[] { CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true), CreateStep(hc, TaskResult.Failed, "success()", continueOnError: true) },
Expected = TaskResult.Succeeded
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success() || failure()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
Expected = TaskResult.Succeeded
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Failed, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Failed, "success()") },
Expected = TaskResult.Failed
},
new
{
Steps = new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
Steps = new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = TaskResult.Succeeded
},
// Abandoned
@@ -304,17 +310,17 @@ namespace GitHub.Runner.Common.Tests.Worker
{
new
{
Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success()") },
Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success()") },
Expected = false
},
new
{
Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "success() || failure()") },
Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "success() || failure()") },
Expected = true
},
new
{
Step = new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
Step = new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
Expected = true
}
};
@@ -345,9 +351,9 @@ namespace GitHub.Runner.Common.Tests.Worker
// Arrange.
var variableSets = new[]
{
new[] { CreateStep(TaskResult.Succeeded, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
new[] { CreateStep(TaskResult.Failed, "success()"), CreateStep(TaskResult.Succeeded, "always()") },
new[] { CreateStep(TaskResult.Canceled, "success()"), CreateStep(TaskResult.Succeeded, "always()") }
new[] { CreateStep(hc, TaskResult.Succeeded, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
new[] { CreateStep(hc, TaskResult.Failed, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") },
new[] { CreateStep(hc, TaskResult.Canceled, "success()"), CreateStep(hc, TaskResult.Succeeded, "always()") }
};
foreach (var variableSet in variableSets)
{
@@ -381,8 +387,8 @@ namespace GitHub.Runner.Common.Tests.Worker
// Arrange.
var variableSets = new[]
{
new[] { CreateStep(TaskResult.Succeeded, "success()") },
new[] { CreateStep(TaskResult.Succeeded, "success()") },
new[] { CreateStep(hc, TaskResult.Succeeded, "success()") },
new[] { CreateStep(hc, TaskResult.Succeeded, "success()") },
};
foreach (var variableSet in variableSets)
{
@@ -399,18 +405,134 @@ namespace GitHub.Runner.Common.Tests.Worker
}
}
private Mock<IStep> CreateStep(TaskResult result, string condition, Boolean continueOnError = false)
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task StepEnvOverrideJobEnvContext()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var env1 = new MappingToken(null, null, null);
env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
env1.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "env.test"));
var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1);
_ec.Object.Result = null;
_ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
// Assert.
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
#if OS_WINDOWS
Assert.Equal("100", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("100"));
Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("github_actions"));
#else
Assert.Equal("100", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("100"));
Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("github_actions"));
#endif
}
}
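The asserts above pin the precedence being tested: the step-level env block overrides the job-level environment when the env context is built, with case-insensitive keys on Windows and case-sensitive keys elsewhere. A minimal sketch of that merge, assuming only the behavior asserted here (not the StepsRunner implementation):

using System;
using System.Collections.Generic;

public static class EnvContextSketch
{
    // comparer: StringComparer.OrdinalIgnoreCase on Windows, StringComparer.Ordinal otherwise,
    // mirroring the #if OS_WINDOWS branches in the asserts above.
    public static Dictionary<string, string> BuildForStep(
        IDictionary<string, string> jobEnv,
        IDictionary<string, string> stepEnv,
        StringComparer comparer)
    {
        var env = new Dictionary<string, string>(comparer);
        foreach (var pair in jobEnv) { env[pair.Key] = pair.Value; }  // job-level env seeds the context
        foreach (var pair in stepEnv) { env[pair.Key] = pair.Value; } // step-level env wins on conflicts
        return env;
    }
}

// e.g. job {env1=1, test=github_actions} overlaid with step {env1=100, env2=github_actions}
// yields {env1=100, test=github_actions, env2=github_actions}, matching the asserts above.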
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task PopulateEnvContextForEachStep()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var env1 = new MappingToken(null, null, null);
env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
env1.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "env.test"));
var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1);
var env2 = new MappingToken(null, null, null);
env2.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "1000"));
env2.Add(new StringToken(null, null, null, "env3"), new BasicExpressionToken(null, null, null, "env.test"));
var step2 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env2);
_ec.Object.Result = null;
_ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
// Assert.
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
#if OS_WINDOWS
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env3"].AssertString("github_actions"));
Assert.False(_ec.Object.ExpressionValues["env"].AssertDictionary("env").ContainsKey("env2"));
#else
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("github_actions", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env3"].AssertString("github_actions"));
Assert.False(_ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env").ContainsKey("env2"));
#endif
}
}
[Fact]
[Trait("Level", "L0")]
[Trait("Category", "Worker")]
public async Task PopulateEnvContextAfterSetupStepsContext()
{
using (TestHostContext hc = CreateTestContext())
{
// Arrange.
var env1 = new MappingToken(null, null, null);
env1.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "100"));
var step1 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env1, name: "foo", setOutput: true);
var env2 = new MappingToken(null, null, null);
env2.Add(new StringToken(null, null, null, "env1"), new StringToken(null, null, null, "1000"));
env2.Add(new StringToken(null, null, null, "env2"), new BasicExpressionToken(null, null, null, "steps.foo.outputs.test"));
var step2 = CreateStep(hc, TaskResult.Succeeded, "success()", env: env2);
_ec.Object.Result = null;
_ec.Setup(x => x.JobSteps).Returns(new Queue<IStep>(new[] { step1.Object, step2.Object }));
// Act.
await _stepsRunner.RunAsync(jobContext: _ec.Object);
// Assert.
Assert.Equal(TaskResult.Succeeded, _ec.Object.Result ?? TaskResult.Succeeded);
#if OS_WINDOWS
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("something", _ec.Object.ExpressionValues["env"].AssertDictionary("env")["env2"].AssertString("something"));
#else
Assert.Equal("1000", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env1"].AssertString("1000"));
Assert.Equal("something", _ec.Object.ExpressionValues["env"].AssertCaseSensitiveDictionary("env")["env2"].AssertString("something"));
#endif
}
}
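This test also pins ordering: a later step's env expressions can read steps.foo.outputs.test only because the env context is evaluated after the previous step has written its outputs into the steps context. A small self-contained sketch of that ordering, with illustrative names only (not the runner's StepsRunner or StepsContext):

using System;
using System.Collections.Generic;
using System.Linq;

public static class StepOrderingSketch
{
    public static void Main()
    {
        // Flattened stand-in for the steps context: "steps.<name>.outputs.<key>" -> value.
        var outputs = new Dictionary<string, string>();

        // Each step carries an env block resolved lazily, plus a body that may publish outputs.
        var steps = new (string Name, Dictionary<string, Func<string>> Env, Action Body)[]
        {
            ("foo",
                new Dictionary<string, Func<string>> { ["env1"] = () => "100" },
                () => outputs["steps.foo.outputs.test"] = "something"),
            ("bar",
                new Dictionary<string, Func<string>> { ["env2"] = () => outputs["steps.foo.outputs.test"] },
                () => { }),
        };

        foreach (var step in steps)
        {
            // 1. Evaluate this step's env now, after earlier steps have updated the outputs.
            var resolvedEnv = step.Env.ToDictionary(p => p.Key, p => p.Value());

            // 2. Run the step body, which may add outputs for later steps to reference.
            step.Body();

            Console.WriteLine($"{step.Name}: {string.Join(", ", resolvedEnv.Select(p => $"{p.Key}={p.Value}"))}");
        }
        // Prints: foo: env1=100
        //         bar: env2=something
    }
}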
private Mock<IActionRunner> CreateStep(TestHostContext hc, TaskResult result, string condition, Boolean continueOnError = false, MappingToken env = null, string name = "Test", bool setOutput = false)
{
// Setup the step.
var step = new Mock<IStep>();
var step = new Mock<IActionRunner>();
step.Setup(x => x.Condition).Returns(condition);
step.Setup(x => x.ContinueOnError).Returns(new BooleanToken(null, null, null, continueOnError));
step.Setup(x => x.RunAsync()).Returns(Task.CompletedTask);
step.Setup(x => x.Action)
.Returns(new DistributedTask.Pipelines.ActionStep()
{
Name = name,
Id = Guid.NewGuid(),
Environment = env
});
// Setup the step execution context.
var stepContext = new Mock<IExecutionContext>();
stepContext.SetupAllProperties();
stepContext.Setup(x => x.WriteDebug).Returns(true);
stepContext.Setup(x => x.Variables).Returns(_variables);
stepContext.Setup(x => x.EnvironmentVariables).Returns(_env);
stepContext.Setup(x => x.ExpressionValues).Returns(_contexts);
stepContext.Setup(x => x.JobContext).Returns(_jobContext);
stepContext.Setup(x => x.StepsContext).Returns(_stepContext);
@@ -422,13 +544,24 @@ namespace GitHub.Runner.Common.Tests.Worker
stepContext.Object.Result = r;
}
});
var trace = hc.GetTrace();
stepContext.Setup(x => x.Write(It.IsAny<string>(), It.IsAny<string>())).Callback((string tag, string message) => { trace.Info($"[{tag}]{message}"); });
stepContext.Object.Result = result;
step.Setup(x => x.ExecutionContext).Returns(stepContext.Object);
if (setOutput)
{
step.Setup(x => x.RunAsync()).Callback(() => { _stepContext.SetOutput(null, name, "test", "something", out string reference); }).Returns(Task.CompletedTask);
}
else
{
step.Setup(x => x.RunAsync()).Returns(Task.CompletedTask);
}
return step;
}
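The setOutput branch above relies on Moq's Callback/Returns chaining: the callback records an output as a side effect while the mocked RunAsync still hands back a completed task. A stripped-down version of that pattern, with a hypothetical IRunnable interface standing in for IActionRunner:

using System.Collections.Generic;
using System.Threading.Tasks;
using Moq;

public interface IRunnable
{
    Task RunAsync();
}

public static class CallbackChainSketch
{
    public static Mock<IRunnable> CreateStepThatPublishes(IDictionary<string, string> outputs)
    {
        var step = new Mock<IRunnable>();
        step.Setup(x => x.RunAsync())
            .Callback(() => outputs["test"] = "something") // side effect visible to later steps
            .Returns(Task.CompletedTask);                  // the mocked run completes immediately
        return step;
    }
}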
private string FormatSteps(IEnumerable<Mock<IStep>> steps)
private string FormatSteps(IEnumerable<Mock<IActionRunner>> steps)
{
return String.Join(
" ; ",

View File

@@ -0,0 +1,26 @@
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'Test Corporation'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print "{greeting}, World!" on stdout'
required: true
default: 'Hello'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'docker'
image: 'Dockerfile'
args:
- 'bzz'
entrypoint: 'main.sh'
env:
Token: foo
Url: bar
post-entrypoint: 'cleanup.sh'

View File

@@ -0,0 +1,27 @@
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'Test Corporation'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print "{greeting}, World!" on stdout'
required: true
default: 'Hello'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'docker'
image: 'Dockerfile'
args:
- 'bzz'
entrypoint: 'main.sh'
env:
Token: foo
Url: bar
pre-entrypoint: 'init.sh'
pre-if: 'success()'

View File

@@ -0,0 +1,26 @@
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'Test Corporation'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print "{greeting}, World!" on stdout'
required: true
default: 'Hello'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'docker'
image: 'Dockerfile'
args:
- 'bzz'
entrypoint: 'main.sh'
env:
Token: foo
Url: bar
pre-entrypoint: 'init.sh'

View File

@@ -0,0 +1,21 @@
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'Test Corporation'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print "{greeting}, World!" on stdout'
required: true
default: 'Hello'
deprecationMessage: 'This property has been deprecated'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'node12'
main: 'main.js'
post: 'cleanup.js'

View File

@@ -0,0 +1,22 @@
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'Test Corporation'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print "{greeting}, World!" on stdout'
required: true
default: 'Hello'
deprecationMessage: 'This property has been deprecated'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'node12'
main: 'main.js'
pre: 'init.js'
pre-if: 'cancelled()'

View File

@@ -0,0 +1,21 @@
name: 'Hello World'
description: 'Greet the world and record the time'
author: 'Test Corporation'
inputs:
greeting: # id of input
description: 'The greeting we choose - will print "{greeting}, World!" on stdout'
required: true
default: 'Hello'
deprecationMessage: 'This property has been deprecated'
entryPoint: # id of input
description: 'optional docker entrypoint overwrite.'
required: false
outputs:
time: # id of output
description: 'The time we did the greeting'
icon: 'hello.svg' # vector art to display in the GitHub Marketplace
color: 'green' # optional, decorates the entry in the GitHub Marketplace
runs:
using: 'node12'
main: 'main.js'
pre: 'init.js'
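The manifests above are fixtures for the action.yaml support and the new pre/post entry points (post-entrypoint, pre-entrypoint, and pre-if for docker actions; post, pre, and pre-if for node12 actions). The runner reads them with its own manifest reader; purely to illustrate the shape of the schema, here is a hedged sketch that reads the same keys with YamlDotNet, with all type and property names invented for the example:

using System.IO;
using YamlDotNet.Serialization;

public class ActionManifestSketch
{
    [YamlMember(Alias = "name")]
    public string Name { get; set; }

    [YamlMember(Alias = "description")]
    public string Description { get; set; }

    [YamlMember(Alias = "runs")]
    public RunsSection Runs { get; set; }

    public class RunsSection
    {
        [YamlMember(Alias = "using")]
        public string Using { get; set; }          // "docker" or "node12" in the fixtures above

        // docker actions
        [YamlMember(Alias = "image")]
        public string Image { get; set; }
        [YamlMember(Alias = "entrypoint")]
        public string Entrypoint { get; set; }
        [YamlMember(Alias = "pre-entrypoint")]
        public string PreEntrypoint { get; set; }
        [YamlMember(Alias = "post-entrypoint")]
        public string PostEntrypoint { get; set; }

        // node12 actions
        [YamlMember(Alias = "main")]
        public string Main { get; set; }
        [YamlMember(Alias = "pre")]
        public string Pre { get; set; }
        [YamlMember(Alias = "post")]
        public string Post { get; set; }

        // condition guarding the pre step, e.g. 'success()' or 'cancelled()'
        [YamlMember(Alias = "pre-if")]
        public string PreIf { get; set; }
    }

    public static ActionManifestSketch Load(string path)
    {
        var deserializer = new DeserializerBuilder()
            .IgnoreUnmatchedProperties() // skip inputs/outputs/icon/color for this sketch
            .Build();
        return deserializer.Deserialize<ActionManifestSketch>(File.ReadAllText(path));
    }
}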

Some files were not shown because too many files have changed in this diff.