Compare commits


1 Commit

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| Tingluo Huang | b831e03e8c | Release 2.302.0 runner. | 2023-02-14 10:04:56 -05:00 |
17 changed files with 146 additions and 325 deletions

View File

@@ -27,7 +27,7 @@ jobs:
 # Initializes the CodeQL tools for scanning.
 - name: Initialize CodeQL
-uses: github/codeql-action/init@v2
+uses: github/codeql-action/init@v1
 # Override language selection by uncommenting this and choosing your languages
 # with:
 # languages: go, javascript, csharp, python, cpp, java
@@ -38,4 +38,4 @@ jobs:
 working-directory: src
 - name: Perform CodeQL Analysis
-uses: github/codeql-action/analyze@v2
+uses: github/codeql-action/analyze@v1

View File

@@ -18,6 +18,7 @@ jobs:
 uses: github/super-linter@v4
 env:
 DEFAULT_BRANCH: ${{ github.base_ref }}
+DISABLE_ERRORS: true
 EDITORCONFIG_FILE_NAME: .editorconfig
 LINTER_RULES_PATH: /src/
 VALIDATE_ALL_CODEBASE: false

View File

@@ -1,38 +0,0 @@
# ADR 000: Update Proxy Behavior of Self-Hosted Runners
**Date**: 2023-02-21
**Status**: Pending
## Context
Today, the different user-accessible building blocks of GitHub Actions implement proxy behaviour with significant functional differences.
Users could realistically run all of the examples below and would reasonably expect their proxy settings to have the same networking effect across all of them.
Consider a user running `actions/actions-runner-controller` (ARC), which starts instances of `actions/runner`, which in turn run node.js actions built with `actions/toolkit` (a minimal sketch of such an explicit override follows this list):
- ARC and its controller and listener pods in k8s will follow the `golang` defaults for proxy behaviour
- The `runner` overrides the default proxy behaviour of C# and implements it [explicitly](https://github.com/actions/runner/blob/main/src/Runner.Sdk/RunnerWebProxy.cs), but currently does so differently from `toolkit`
- `toolkit` overrides the default proxy behaviour of node.js and implements it [explicitly](https://github.com/actions/toolkit/blob/main/packages/http-client/src/proxy.ts), but currently does so differently from `runner`
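For illustration only, and not the actual `RunnerWebProxy` or `toolkit` code: a minimal C# sketch of what such an explicit override typically does, reading `https_proxy`/`no_proxy` from the environment instead of relying on the platform defaults. The helper name and the splitting rules are assumptions.
```csharp
using System;

// Hypothetical sketch: read proxy settings straight from the environment,
// rather than trusting the platform defaults.
static class ProxyEnvSketch
{
    public static (string HttpsProxy, string[] BypassList) Read()
    {
        // Lower-case variants are commonly preferred, with upper-case as a fallback.
        string httpsProxy = Environment.GetEnvironmentVariable("https_proxy")
                            ?? Environment.GetEnvironmentVariable("HTTPS_PROXY");

        string noProxy = Environment.GetEnvironmentVariable("no_proxy")
                         ?? Environment.GetEnvironmentVariable("NO_PROXY")
                         ?? string.Empty;

        // no_proxy is a comma-separated list of hosts, IPs, or CIDR blocks;
        // which of those forms are honoured is exactly where the components diverge.
        string[] bypassList = noProxy.Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries);
        for (int i = 0; i < bypassList.Length; i++)
        {
            bypassList[i] = bypassList[i].Trim();
        }

        return (httpsProxy, bypassList);
    }
}
```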
## Example 1 - ARC
A user wants to create a scale set in ARC. They provide the following settings when creating the ARC scale set:
- `https_proxy=https://someproxy.company.com`
- `no_proxy=8.8.8.8,192.168.1.1/32`
The environment variables are propagated to all actors, but they behave differently (a sketch of the divergent `no_proxy` matching follows this list):
- *ARC operators and listener pods* will: use the proxy but bypass it for `8.8.8.8` and the CIDR block `192.168.1.1/32`
- *The runner* will: use the proxy and ignore these `no_proxy` settings (there is no IP support in `runner` for `no_proxy`)
- *A node.js GitHub Action in a job executed by the runner* will: bypass the proxy for `8.8.8.8` but still use it for the CIDR block `192.168.1.1/32`
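A hedged sketch of why the actors disagree here (the helper below is hypothetical, not taken from `runner` or `toolkit`): a bypass check that only does exact host/IP matching will honour `8.8.8.8` but silently keep proxying `192.168.1.1/32`, while a check with no IP support at all proxies both.
```csharp
using System;

static class NoProxyMatchSketch
{
    // Returns true if the request host should bypass the proxy.
    public static bool ShouldBypass(string host, string[] noProxyEntries)
    {
        foreach (var entry in noProxyEntries)
        {
            // Exact host/IP comparison covers entries like "8.8.8.8".
            if (string.Equals(host, entry, StringComparison.OrdinalIgnoreCase))
            {
                return true;
            }

            // A CIDR entry such as "192.168.1.1/32" never matches here because
            // nothing parses the "/32" suffix, so that traffic still goes through
            // the proxy even though the user listed it in no_proxy.
        }

        return false;
    }
}
```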
## Example 2 - Self-Hosted runner
Given the following setting when creating an ARC Scale Set:
- `https_proxy=someproxy.company.com`
- *The runner* will: silently ignore the `https_proxy` value because it lacks a protocol scheme (missing `https://`)
- *A node.js GitHub Action in a job executed by the runner* will: throw an exception because the proxy URL cannot be parsed (missing `https://`); see the parsing sketch below
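A sketch of the scheme problem in this example (assumed behaviour, not the actual implementations): strict absolute-URI parsing rejects `someproxy.company.com` because it has no scheme, and each component then picks its own failure mode, either silently falling back to direct connections or throwing.
```csharp
using System;

static class ProxyUrlParseSketch
{
    // "someproxy.company.com" fails the absolute-URI check (no scheme),
    // so this variant quietly ignores the proxy setting.
    public static Uri ParseOrNull(string value)
    {
        if (Uri.TryCreate(value, UriKind.Absolute, out var uri) &&
            (uri.Scheme == Uri.UriSchemeHttp || uri.Scheme == Uri.UriSchemeHttps))
        {
            return uri;
        }

        return null;
    }

    // The other failure mode: surface the misconfiguration to the user.
    public static Uri ParseOrThrow(string value)
    {
        return ParseOrNull(value)
               ?? throw new FormatException($"Proxy value '{value}' has no scheme such as https://");
    }
}
```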
## Decisions
## Consequences

View File

@@ -158,11 +158,3 @@ cat (Runner/Worker)_TIMESTAMP.log # view your log file
We use the .NET Foundation and CoreCLR style guidelines [located here](
https://github.com/dotnet/corefx/blob/master/Documentation/coding-guidelines/coding-style.md)
### Format C# Code
To format both staged and unstaged .cs files
```
cd ./src
./dev.(cmd|sh) format
```

View File

@@ -1,6 +1,7 @@
 ## Features
 - Add support for ghe.com domain (#2420)
 - Add docker cli to the runner image. (#2425)
+- Uploading step logs to Results service (#2422)
 ## Bugs
 - Fix URL construction bug for RunService (#2396)

View File

@@ -1 +1 @@
-<Update to ./src/runnerversion when creating release>
+2.302.0

View File

@@ -18,20 +18,6 @@ while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symli
 done
 DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )"
-# Wait for docker to start
-if [ ! -z "$RUNNER_WAIT_FOR_DOCKER_IN_SECONDS" ]; then
-if [ "$RUNNER_WAIT_FOR_DOCKER_IN_SECONDS" -gt 0 ]; then
-echo "Waiting for docker to be ready."
-for i in $(seq "$RUNNER_WAIT_FOR_DOCKER_IN_SECONDS"); do
-if docker ps > /dev/null 2>&1; then
-echo "Docker is ready."
-break
-fi
-"$DIR"/safe_sleep.sh 1
-done
-fi
-fi
 updateFile="update.finished"
 "$DIR"/bin/Runner.Listener run $*

View File

@@ -32,7 +32,6 @@ namespace GitHub.Runner.Common
 Task<TaskAttachment> CreateAttachmentAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, Guid timelineRecordId, String type, String name, Stream uploadStream, CancellationToken cancellationToken);
 Task CreateStepSummaryAsync(string planId, string jobId, Guid stepId, string file, CancellationToken cancellationToken);
 Task CreateResultsStepLogAsync(string planId, string jobId, Guid stepId, string file, bool finalize, bool firstBlock, long lineCount, CancellationToken cancellationToken);
-Task CreateResultsJobLogAsync(string planId, string jobId, string file, bool finalize, bool firstBlock, long lineCount, CancellationToken cancellationToken);
 Task<TaskLog> CreateLogAsync(Guid scopeIdentifier, string hubName, Guid planId, TaskLog log, CancellationToken cancellationToken);
 Task<Timeline> CreateTimelineAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, CancellationToken cancellationToken);
 Task<List<TimelineRecord>> UpdateTimelineRecordsAsync(Guid scopeIdentifier, string hubName, Guid planId, Guid timelineId, IEnumerable<TimelineRecord> records, CancellationToken cancellationToken);
@@ -336,14 +335,6 @@ namespace GitHub.Runner.Common
 throw new InvalidOperationException("Results client is not initialized.");
 }
-public Task CreateResultsJobLogAsync(string planId, string jobId, string file, bool finalize, bool firstBlock, long lineCount, CancellationToken cancellationToken)
-{
-if (_resultsClient != null)
-{
-return _resultsClient.UploadResultsJobLogAsync(planId, jobId, file, finalize, firstBlock, lineCount, cancellationToken: cancellationToken);
-}
-throw new InvalidOperationException("Results client is not initialized.");
-}
 public Task<TaskLog> CreateLogAsync(Guid scopeIdentifier, string hubName, Guid planId, TaskLog log, CancellationToken cancellationToken)
 {

View File

@@ -20,7 +20,7 @@ namespace GitHub.Runner.Common
 void Start(Pipelines.AgentJobRequestMessage jobRequest);
 void QueueWebConsoleLine(Guid stepRecordId, string line, long? lineNumber = null);
 void QueueFileUpload(Guid timelineId, Guid timelineRecordId, string type, string name, string path, bool deleteSource);
-void QueueResultsUpload(Guid timelineRecordId, string name, string path, string type, bool deleteSource, bool finalize, bool firstBlock, long totalLines);
+void QueueResultsUpload(Guid timelineRecordId, string name, string path, string type, bool deleteSource, bool finalize, bool firstBlock, long totalLines = 0);
 void QueueTimelineRecordUpdate(Guid timelineId, TimelineRecord timelineRecord);
 }
@@ -84,9 +84,6 @@ namespace GitHub.Runner.Common
 private bool _webConsoleLineAggressiveDequeue = true;
 private bool _firstConsoleOutputs = true;
-private bool _resultsClientInitiated = false;
-private delegate Task ResultsFileUploadHandler(ResultsUploadFileInfo file);
 public override void Initialize(IHostContext hostContext)
 {
 base.Initialize(hostContext);
@@ -112,9 +109,9 @@
 {
 Trace.Info("Initializing results client");
 _jobServer.InitializeResultsClient(new Uri(resultsReceiverEndpoint), accessToken);
-_resultsClientInitiated = true;
 }
 if (_queueInProcess)
 {
 Trace.Info("No-opt, all queue process tasks are running.");
@@ -233,23 +230,11 @@
 _fileUploadQueue.Enqueue(newFile);
 }
-public void QueueResultsUpload(Guid timelineRecordId, string name, string path, string type, bool deleteSource, bool finalize, bool firstBlock, long totalLines)
+public void QueueResultsUpload(Guid recordId, string name, string path, string type, bool deleteSource, bool finalize, bool firstBlock, long totalLines)
 {
-if (!_resultsClientInitiated)
+if (recordId == _jobTimelineRecordId)
 {
-Trace.Verbose("Skipping results upload");
-try
-{
-if (deleteSource)
-{
-File.Delete(path);
-}
-}
-catch (Exception ex)
-{
-Trace.Info("Catch exception during delete skipped results upload file.");
-Trace.Error(ex);
-}
+Trace.Verbose("Skipping job log {0} for record {1}", path, recordId);
 return;
 }
@@ -261,14 +246,14 @@
 Type = type,
 PlanId = _planId.ToString(),
 JobId = _jobTimelineRecordId.ToString(),
-RecordId = timelineRecordId,
+RecordId = recordId,
 DeleteSource = deleteSource,
 Finalize = finalize,
 FirstBlock = firstBlock,
 TotalLines = totalLines,
 };
-Trace.Verbose("Enqueue results file upload queue: file '{0}' attach to job {1} step {2}", newFile.Path, _jobTimelineRecordId, timelineRecordId);
+Trace.Verbose("Enqueue results file upload queue: file '{0}' attach to job {1} step {2}", newFile.Path, _jobTimelineRecordId, recordId);
 _resultsFileUploadQueue.Enqueue(newFile);
 }
@@ -503,11 +488,6 @@
 Trace.Info($"Got a step log file to send to results service.");
 await UploadResultsStepLogFile(file);
 }
-else if (file.RecordId == _jobTimelineRecordId)
-{
-Trace.Info($"Got a job log file to send to results service.");
-await UploadResultsJobLogFile(file);
-}
 }
 }
 catch (Exception ex)
@@ -816,43 +796,42 @@ namespace GitHub.Runner.Common
private async Task UploadSummaryFile(ResultsUploadFileInfo file) private async Task UploadSummaryFile(ResultsUploadFileInfo file)
{ {
Trace.Info($"Starting to upload summary file to results service {file.Name}, {file.Path}"); bool uploadSucceed = false;
ResultsFileUploadHandler summaryHandler = async (file) => try
{ {
await _jobServer.CreateStepSummaryAsync(file.PlanId, file.JobId, file.RecordId, file.Path, CancellationToken.None); // Upload the step summary
}; Trace.Info($"Starting to upload summary file to results service {file.Name}, {file.Path}");
var cancellationTokenSource = new CancellationTokenSource();
await _jobServer.CreateStepSummaryAsync(file.PlanId, file.JobId, file.RecordId, file.Path, cancellationTokenSource.Token);
await UploadResultsFile(file, summaryHandler); uploadSucceed = true;
}
finally
{
if (uploadSucceed && file.DeleteSource)
{
try
{
File.Delete(file.Path);
}
catch (Exception ex)
{
Trace.Info("Catch exception during delete success results uploaded summary file.");
Trace.Error(ex);
}
}
}
} }
private async Task UploadResultsStepLogFile(ResultsUploadFileInfo file) private async Task UploadResultsStepLogFile(ResultsUploadFileInfo file)
{
Trace.Info($"Starting upload of step log file to results service {file.Name}, {file.Path}");
ResultsFileUploadHandler stepLogHandler = async (file) =>
{
await _jobServer.CreateResultsStepLogAsync(file.PlanId, file.JobId, file.RecordId, file.Path, file.Finalize, file.FirstBlock, file.TotalLines, CancellationToken.None);
};
await UploadResultsFile(file, stepLogHandler);
}
private async Task UploadResultsJobLogFile(ResultsUploadFileInfo file)
{
Trace.Info($"Starting upload of job log file to results service {file.Name}, {file.Path}");
ResultsFileUploadHandler jobLogHandler = async (file) =>
{
await _jobServer.CreateResultsJobLogAsync(file.PlanId, file.JobId, file.Path, file.Finalize, file.FirstBlock, file.TotalLines, CancellationToken.None);
};
await UploadResultsFile(file, jobLogHandler);
}
private async Task UploadResultsFile(ResultsUploadFileInfo file, ResultsFileUploadHandler uploadHandler)
{ {
bool uploadSucceed = false; bool uploadSucceed = false;
try try
{ {
await uploadHandler(file); Trace.Info($"Starting upload of step log file to results service {file.Name}, {file.Path}");
var cancellationTokenSource = new CancellationTokenSource();
await _jobServer.CreateResultsStepLogAsync(file.PlanId, file.JobId, file.RecordId, file.Path, file.Finalize, file.FirstBlock, file.TotalLines, cancellationTokenSource.Token);
uploadSucceed = true; uploadSucceed = true;
} }
finally finally

View File

@@ -21,12 +21,6 @@ namespace GitHub.Runner.Common
// 8 MB // 8 MB
public const int PageSize = 8 * 1024 * 1024; public const int PageSize = 8 * 1024 * 1024;
// For Results
public static string BlocksFolder = "blocks";
// 2 MB
public const int BlockSize = 2 * 1024 * 1024;
private Guid _timelineId; private Guid _timelineId;
private Guid _timelineRecordId; private Guid _timelineRecordId;
private FileStream _pageData; private FileStream _pageData;
@@ -38,6 +32,12 @@ namespace GitHub.Runner.Common
private string _pagesFolder; private string _pagesFolder;
private IJobServerQueue _jobServerQueue; private IJobServerQueue _jobServerQueue;
// For Results
public static string BlocksFolder = "blocks";
// 2 MB
public const int BlockSize = 2 * 1024 * 1024;
private string _resultsDataFileName; private string _resultsDataFileName;
private FileStream _resultsBlockData; private FileStream _resultsBlockData;
private StreamWriter _resultsBlockWriter; private StreamWriter _resultsBlockWriter;
@@ -99,8 +99,8 @@ namespace GitHub.Runner.Common
} }
} }
var bytes = System.Text.Encoding.UTF8.GetByteCount(line); var bytes = System.Text.Encoding.UTF8.GetByteCount(line);
_byteCount += bytes; _byteCount += bytes;
_blockByteCount += bytes; _blockByteCount += bytes;
if (_byteCount >= PageSize) if (_byteCount >= PageSize)
{ {

View File

@@ -1,4 +1,4 @@
 using System;
 using System.Collections.Generic;
 using System.Globalization;
 using System.IO;
@@ -81,7 +81,7 @@ namespace GitHub.Runner.Worker
 // logging
 long Write(string tag, string message);
 void QueueAttachFile(string type, string name, string filePath);
 void QueueSummaryFile(string name, string filePath, Guid stepRecordId);
 // timeline record update methods
 void Start(string currentOperation = null);
@@ -871,7 +871,8 @@
 {
 throw new FileNotFoundException($"Can't upload (name:{name}) file: {filePath}. File does not exist.");
 }
-_jobServerQueue.QueueResultsUpload(stepRecordId, name, filePath, ChecksAttachmentType.StepSummary, deleteSource: false, finalize: true, firstBlock: true, totalLines: 0);
+_jobServerQueue.QueueResultsUpload(stepRecordId, name, filePath, ChecksAttachmentType.StepSummary, deleteSource: false, finalize: false, firstBlock: false);
 }
 // Add OnMatcherChanged
// Add OnMatcherChanged // Add OnMatcherChanged

View File

@@ -455,6 +455,7 @@ namespace GitHub.DistributedTask.Pipelines.ObjectTemplating
 private readonly String[] s_expressionValueNames = new[]
 {
 PipelineTemplateConstants.GitHub,
+PipelineTemplateConstants.Needs,
 PipelineTemplateConstants.Strategy,
 PipelineTemplateConstants.Matrix,
 PipelineTemplateConstants.Needs,

View File

@@ -1,4 +1,4 @@
 using GitHub.Services.Common;
 using GitHub.Services.WebApi;
 using System;
 using System.Runtime.Serialization;

View File

@@ -28,42 +28,6 @@ namespace GitHub.Services.Results.Contracts
public string BlobStorageType; public string BlobStorageType;
} }
[DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class StepSummaryMetadataCreate
{
[DataMember]
public string StepBackendId;
[DataMember]
public string WorkflowRunBackendId;
[DataMember]
public string WorkflowJobRunBackendId;
[DataMember]
public long Size;
[DataMember]
public string UploadedAt;
}
[DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class GetSignedJobLogsURLRequest
{
[DataMember]
public string WorkflowJobRunBackendId;
[DataMember]
public string WorkflowRunBackendId;
}
[DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class GetSignedJobLogsURLResponse
{
[DataMember]
public string LogsUrl;
[DataMember]
public string BlobStorageType;
}
[DataContract] [DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))] [JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class GetSignedStepLogsURLRequest public class GetSignedStepLogsURLRequest
@@ -83,36 +47,46 @@ namespace GitHub.Services.Results.Contracts
[DataMember] [DataMember]
public string LogsUrl; public string LogsUrl;
[DataMember] [DataMember]
public string BlobStorageType;
[DataMember]
public long SoftSizeLimit; public long SoftSizeLimit;
[DataMember]
public string BlobStorageType;
} }
[DataContract] [DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))] [JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class JobLogsMetadataCreate public class StepSummaryMetadataCreate
{ {
[DataMember]
public string StepBackendId;
[DataMember] [DataMember]
public string WorkflowRunBackendId; public string WorkflowRunBackendId;
[DataMember] [DataMember]
public string WorkflowJobRunBackendId; public string WorkflowJobRunBackendId;
[DataMember] [DataMember]
public string UploadedAt; public long Size;
[DataMember] [DataMember]
public long LineCount; public string UploadedAt;
}
[DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class CreateStepSummaryMetadataResponse
{
[DataMember]
public bool Ok;
} }
[DataContract] [DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))] [JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class StepLogsMetadataCreate public class StepLogsMetadataCreate
{ {
[DataMember]
public string StepBackendId;
[DataMember] [DataMember]
public string WorkflowRunBackendId; public string WorkflowRunBackendId;
[DataMember] [DataMember]
public string WorkflowJobRunBackendId; public string WorkflowJobRunBackendId;
[DataMember] [DataMember]
public string StepBackendId;
[DataMember]
public string UploadedAt; public string UploadedAt;
[DataMember] [DataMember]
public long LineCount; public long LineCount;
@@ -120,7 +94,7 @@ namespace GitHub.Services.Results.Contracts
[DataContract] [DataContract]
[JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))] [JsonObject(NamingStrategyType = typeof(SnakeCaseNamingStrategy))]
public class CreateMetadataResponse public class CreateStepLogsMetadataResponse
{ {
[DataMember] [DataMember]
public bool Ok; public bool Ok;

View File

@@ -24,85 +24,55 @@ namespace GitHub.Services.Results.Client
m_formatter = new JsonMediaTypeFormatter(); m_formatter = new JsonMediaTypeFormatter();
} }
// Get Sas URL calls public async Task<GetSignedStepSummaryURLResponse> GetStepSummaryUploadUrlAsync(string planId, string jobId, Guid stepId, CancellationToken cancellationToken)
private async Task<T> GetResultsSignedURLResponse<R, T>(Uri uri, CancellationToken cancellationToken, R request)
{ {
using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, uri)) var request = new GetSignedStepSummaryURLRequest()
{
WorkflowJobRunBackendId= jobId,
WorkflowRunBackendId= planId,
StepBackendId= stepId.ToString()
};
var stepSummaryUploadRequest = new Uri(m_resultsServiceUrl, "twirp/results.services.receiver.Receiver/GetStepSummarySignedBlobURL");
using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, stepSummaryUploadRequest))
{ {
requestMessage.Headers.Authorization = new AuthenticationHeaderValue("Bearer", m_token); requestMessage.Headers.Authorization = new AuthenticationHeaderValue("Bearer", m_token);
requestMessage.Headers.Accept.Add(MediaTypeWithQualityHeaderValue.Parse("application/json")); requestMessage.Headers.Accept.Add(MediaTypeWithQualityHeaderValue.Parse("application/json"));
using (HttpContent content = new ObjectContent<R>(request, m_formatter)) using (HttpContent content = new ObjectContent<GetSignedStepSummaryURLRequest>(request, m_formatter))
{ {
requestMessage.Content = content; requestMessage.Content = content;
using (var response = await SendAsync(requestMessage, HttpCompletionOption.ResponseContentRead, cancellationToken: cancellationToken)) using (var response = await SendAsync(requestMessage, HttpCompletionOption.ResponseContentRead, cancellationToken: cancellationToken))
{ {
return await ReadJsonContentAsync<T>(response, cancellationToken); return await ReadJsonContentAsync<GetSignedStepSummaryURLResponse>(response, cancellationToken);
} }
} }
} }
} }
private async Task<GetSignedStepSummaryURLResponse> GetStepSummaryUploadUrlAsync(string planId, string jobId, Guid stepId, CancellationToken cancellationToken) public async Task<GetSignedStepLogsURLResponse> GetStepLogUploadUrlAsync(string planId, string jobId, Guid stepId, CancellationToken cancellationToken)
{
var request = new GetSignedStepSummaryURLRequest()
{
WorkflowJobRunBackendId = jobId,
WorkflowRunBackendId = planId,
StepBackendId = stepId.ToString()
};
var getStepSummarySignedBlobURLEndpoint = new Uri(m_resultsServiceUrl, Constants.GetStepSummarySignedBlobURL);
return await GetResultsSignedURLResponse<GetSignedStepSummaryURLRequest, GetSignedStepSummaryURLResponse>(getStepSummarySignedBlobURLEndpoint, cancellationToken, request);
}
private async Task<GetSignedStepLogsURLResponse> GetStepLogUploadUrlAsync(string planId, string jobId, Guid stepId, CancellationToken cancellationToken)
{ {
var request = new GetSignedStepLogsURLRequest() var request = new GetSignedStepLogsURLRequest()
{ {
WorkflowJobRunBackendId = jobId, WorkflowJobRunBackendId= jobId,
WorkflowRunBackendId = planId, WorkflowRunBackendId= planId,
StepBackendId = stepId.ToString(), StepBackendId= stepId.ToString(),
}; };
var getStepLogsSignedBlobURLEndpoint = new Uri(m_resultsServiceUrl, Constants.GetStepLogsSignedBlobURL); var stepLogsUploadRequest = new Uri(m_resultsServiceUrl, "twirp/results.services.receiver.Receiver/GetStepLogsSignedBlobURL");
return await GetResultsSignedURLResponse<GetSignedStepLogsURLRequest, GetSignedStepLogsURLResponse>(getStepLogsSignedBlobURLEndpoint, cancellationToken, request); using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, stepLogsUploadRequest))
}
private async Task<GetSignedJobLogsURLResponse> GetJobLogUploadUrlAsync(string planId, string jobId, CancellationToken cancellationToken)
{
var request = new GetSignedJobLogsURLRequest()
{
WorkflowJobRunBackendId = jobId,
WorkflowRunBackendId = planId,
};
var getJobLogsSignedBlobURLEndpoint = new Uri(m_resultsServiceUrl, Constants.GetJobLogsSignedBlobURL);
return await GetResultsSignedURLResponse<GetSignedJobLogsURLRequest, GetSignedJobLogsURLResponse>(getJobLogsSignedBlobURLEndpoint, cancellationToken, request);
}
// Create metadata calls
private async Task CreateMetadata<R>(Uri uri, CancellationToken cancellationToken, R request, string timestamp)
{
using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, uri))
{ {
requestMessage.Headers.Authorization = new AuthenticationHeaderValue("Bearer", m_token); requestMessage.Headers.Authorization = new AuthenticationHeaderValue("Bearer", m_token);
requestMessage.Headers.Accept.Add(MediaTypeWithQualityHeaderValue.Parse("application/json")); requestMessage.Headers.Accept.Add(MediaTypeWithQualityHeaderValue.Parse("application/json"));
using (HttpContent content = new ObjectContent<R>(request, m_formatter)) using (HttpContent content = new ObjectContent<GetSignedStepLogsURLRequest>(request, m_formatter))
{ {
requestMessage.Content = content; requestMessage.Content = content;
using (var response = await SendAsync(requestMessage, HttpCompletionOption.ResponseContentRead, cancellationToken: cancellationToken)) using (var response = await SendAsync(requestMessage, HttpCompletionOption.ResponseContentRead, cancellationToken: cancellationToken))
{ {
var jsonResponse = await ReadJsonContentAsync<CreateMetadataResponse>(response, cancellationToken); return await ReadJsonContentAsync<GetSignedStepLogsURLResponse>(response, cancellationToken);
if (!jsonResponse.Ok)
{
throw new Exception($"Failed to mark {typeof(R).Name} upload as complete, status code: {response.StatusCode}, ok: {jsonResponse.Ok}, timestamp: {timestamp}");
}
} }
} }
} }
@@ -110,52 +80,73 @@ namespace GitHub.Services.Results.Client
private async Task StepSummaryUploadCompleteAsync(string planId, string jobId, Guid stepId, long size, CancellationToken cancellationToken) private async Task StepSummaryUploadCompleteAsync(string planId, string jobId, Guid stepId, long size, CancellationToken cancellationToken)
{ {
var timestamp = DateTime.UtcNow.ToString(Constants.TimestampFormat); var timestamp = DateTime.UtcNow.ToString("yyyy-MM-dd'T'HH:mm:ss.fffK");
var request = new StepSummaryMetadataCreate() var request = new StepSummaryMetadataCreate()
{ {
WorkflowJobRunBackendId = jobId, WorkflowJobRunBackendId= jobId,
WorkflowRunBackendId = planId, WorkflowRunBackendId= planId,
StepBackendId = stepId.ToString(), StepBackendId = stepId.ToString(),
Size = size, Size = size,
UploadedAt = timestamp UploadedAt = timestamp
}; };
var createStepSummaryMetadataEndpoint = new Uri(m_resultsServiceUrl, Constants.CreateStepSummaryMetadata); var stepSummaryUploadCompleteRequest = new Uri(m_resultsServiceUrl, "twirp/results.services.receiver.Receiver/CreateStepSummaryMetadata");
await CreateMetadata<StepSummaryMetadataCreate>(createStepSummaryMetadataEndpoint, cancellationToken, request, timestamp);
using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, stepSummaryUploadCompleteRequest))
{
requestMessage.Headers.Authorization = new AuthenticationHeaderValue("Bearer", m_token);
requestMessage.Headers.Accept.Add(MediaTypeWithQualityHeaderValue.Parse("application/json"));
using (HttpContent content = new ObjectContent<StepSummaryMetadataCreate>(request, m_formatter))
{
requestMessage.Content = content;
using (var response = await SendAsync(requestMessage, HttpCompletionOption.ResponseContentRead, cancellationToken: cancellationToken))
{
var jsonResponse = await ReadJsonContentAsync<CreateStepSummaryMetadataResponse>(response, cancellationToken);
if (!jsonResponse.Ok)
{
throw new Exception($"Failed to mark step summary upload as complete, status code: {response.StatusCode}, ok: {jsonResponse.Ok}, size: {size}, timestamp: {timestamp}");
}
}
}
}
} }
private async Task StepLogUploadCompleteAsync(string planId, string jobId, Guid stepId, long lineCount, CancellationToken cancellationToken) private async Task StepLogUploadCompleteAsync(string planId, string jobId, Guid stepId, long lineCount, CancellationToken cancellationToken)
{ {
var timestamp = DateTime.UtcNow.ToString(Constants.TimestampFormat); var timestamp = DateTime.UtcNow.ToString("yyyy-MM-dd'T'HH:mm:ss.fffK");
var request = new StepLogsMetadataCreate() var request = new StepLogsMetadataCreate()
{ {
WorkflowJobRunBackendId = jobId, WorkflowJobRunBackendId= jobId,
WorkflowRunBackendId = planId, WorkflowRunBackendId= planId,
StepBackendId = stepId.ToString(), StepBackendId = stepId.ToString(),
UploadedAt = timestamp, UploadedAt = timestamp,
LineCount = lineCount, LineCount = lineCount,
}; };
var createStepLogsMetadataEndpoint = new Uri(m_resultsServiceUrl, Constants.CreateStepLogsMetadata); var stepLogsUploadCompleteRequest = new Uri(m_resultsServiceUrl, "twirp/results.services.receiver.Receiver/CreateStepLogsMetadata");
await CreateMetadata<StepLogsMetadataCreate>(createStepLogsMetadataEndpoint, cancellationToken, request, timestamp);
}
private async Task JobLogUploadCompleteAsync(string planId, string jobId, long lineCount, CancellationToken cancellationToken) using (HttpRequestMessage requestMessage = new HttpRequestMessage(HttpMethod.Post, stepLogsUploadCompleteRequest))
{
var timestamp = DateTime.UtcNow.ToString(Constants.TimestampFormat);
var request = new JobLogsMetadataCreate()
{ {
WorkflowJobRunBackendId = jobId, requestMessage.Headers.Authorization = new AuthenticationHeaderValue("Bearer", m_token);
WorkflowRunBackendId = planId, requestMessage.Headers.Accept.Add(MediaTypeWithQualityHeaderValue.Parse("application/json"));
UploadedAt = timestamp,
LineCount = lineCount,
};
var createJobLogsMetadataEndpoint = new Uri(m_resultsServiceUrl, Constants.CreateJobLogsMetadata); using (HttpContent content = new ObjectContent<StepLogsMetadataCreate>(request, m_formatter))
await CreateMetadata<JobLogsMetadataCreate>(createJobLogsMetadataEndpoint, cancellationToken, request, timestamp); {
requestMessage.Content = content;
using (var response = await SendAsync(requestMessage, HttpCompletionOption.ResponseContentRead, cancellationToken: cancellationToken))
{
var jsonResponse = await ReadJsonContentAsync<CreateStepSummaryMetadataResponse>(response, cancellationToken);
if (!jsonResponse.Ok)
{
throw new Exception($"Failed to mark step log upload as complete, status code: {response.StatusCode}, ok: {jsonResponse.Ok}, timestamp: {timestamp}");
}
}
}
}
} }
private async Task<HttpResponseMessage> UploadBlockFileAsync(string url, string blobStorageType, FileStream file, CancellationToken cancellationToken) private async Task<HttpResponseMessage> UploadFileAsync(string url, string blobStorageType, FileStream file, CancellationToken cancellationToken)
{ {
// Upload the file to the url // Upload the file to the url
var request = new HttpRequestMessage(HttpMethod.Put, url) var request = new HttpRequestMessage(HttpMethod.Put, url)
@@ -165,7 +156,7 @@ namespace GitHub.Services.Results.Client
if (blobStorageType == BlobStorageTypes.AzureBlobStorage) if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
{ {
request.Content.Headers.Add(Constants.AzureBlobTypeHeader, Constants.AzureBlockBlob); request.Content.Headers.Add("x-ms-blob-type", "BlockBlob");
} }
using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken)) using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
@@ -186,7 +177,7 @@ namespace GitHub.Services.Results.Client
}; };
if (blobStorageType == BlobStorageTypes.AzureBlobStorage) if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
{ {
request.Content.Headers.Add(Constants.AzureBlobTypeHeader, Constants.AzureAppendBlob); request.Content.Headers.Add("x-ms-blob-type", "AppendBlob");
request.Content.Headers.Add("Content-Length", "0"); request.Content.Headers.Add("Content-Length", "0");
} }
@@ -199,7 +190,7 @@ namespace GitHub.Services.Results.Client
return response; return response;
} }
} }
private async Task<HttpResponseMessage> UploadAppendFileAsync(string url, string blobStorageType, FileStream file, bool finalize, long fileSize, CancellationToken cancellationToken) private async Task<HttpResponseMessage> UploadAppendFileAsync(string url, string blobStorageType, FileStream file, bool finalize, long fileSize, CancellationToken cancellationToken)
{ {
var comp = finalize ? "&comp=appendblock&seal=true" : "&comp=appendblock"; var comp = finalize ? "&comp=appendblock&seal=true" : "&comp=appendblock";
@@ -212,7 +203,7 @@ namespace GitHub.Services.Results.Client
if (blobStorageType == BlobStorageTypes.AzureBlobStorage) if (blobStorageType == BlobStorageTypes.AzureBlobStorage)
{ {
request.Content.Headers.Add("Content-Length", fileSize.ToString()); request.Content.Headers.Add("Content-Length", fileSize.ToString());
request.Content.Headers.Add(Constants.AzureBlobSealedHeader, finalize.ToString()); request.Content.Headers.Add("x-ms-blob-sealed", finalize.ToString());
} }
using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken)) using (var response = await SendAsync(request, HttpCompletionOption.ResponseHeadersRead, userState: null, cancellationToken))
@@ -245,7 +236,7 @@ namespace GitHub.Services.Results.Client
// Upload the file // Upload the file
using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true)) using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
{ {
var response = await UploadBlockFileAsync(uploadUrlResponse.SummaryUrl, uploadUrlResponse.BlobStorageType, fileStream, cancellationToken); var response = await UploadFileAsync(uploadUrlResponse.SummaryUrl, uploadUrlResponse.BlobStorageType, fileStream, cancellationToken);
} }
// Send step summary upload complete message // Send step summary upload complete message
@@ -262,6 +253,9 @@ namespace GitHub.Services.Results.Client
throw new Exception("Failed to get step log upload url"); throw new Exception("Failed to get step log upload url");
} }
// Do we want to throw an exception here or should we just be uploading/truncating the data
var fileSize = new FileInfo(file).Length;
// Create the Append blob // Create the Append blob
if (firstBlock) if (firstBlock)
{ {
@@ -269,7 +263,6 @@ namespace GitHub.Services.Results.Client
} }
// Upload content // Upload content
var fileSize = new FileInfo(file).Length;
using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true)) using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
{ {
var response = await UploadAppendFileAsync(uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, fileStream, finalize, fileSize, cancellationToken); var response = await UploadAppendFileAsync(uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, fileStream, finalize, fileSize, cancellationToken);
@@ -283,59 +276,8 @@ namespace GitHub.Services.Results.Client
} }
} }
// Handle file upload for job log
public async Task UploadResultsJobLogAsync(string planId, string jobId, string file, bool finalize, bool firstBlock, long lineCount, CancellationToken cancellationToken)
{
// Get the upload url
var uploadUrlResponse = await GetJobLogUploadUrlAsync(planId, jobId, cancellationToken);
if (uploadUrlResponse == null || uploadUrlResponse.LogsUrl == null)
{
throw new Exception("Failed to get job log upload url");
}
// Create the Append blob
if (firstBlock)
{
await CreateAppendFileAsync(uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, cancellationToken);
}
// Upload content
var fileSize = new FileInfo(file).Length;
using (var fileStream = new FileStream(file, FileMode.Open, FileAccess.Read, FileShare.Read, 4096, true))
{
var response = await UploadAppendFileAsync(uploadUrlResponse.LogsUrl, uploadUrlResponse.BlobStorageType, fileStream, finalize, fileSize, cancellationToken);
}
// Update metadata
if (finalize)
{
// Send step log upload complete message
await JobLogUploadCompleteAsync(planId, jobId, lineCount, cancellationToken);
}
}
private MediaTypeFormatter m_formatter; private MediaTypeFormatter m_formatter;
private Uri m_resultsServiceUrl; private Uri m_resultsServiceUrl;
private string m_token; private string m_token;
} }
// Constants specific to results
public static class Constants
{
public static readonly string TimestampFormat = "yyyy-MM-dd'T'HH:mm:ss.fffK";
public static readonly string ResultsReceiverTwirpEndpoint = "twirp/results.services.receiver.Receiver/";
public static readonly string GetStepSummarySignedBlobURL = ResultsReceiverTwirpEndpoint + "GetStepSummarySignedBlobURL";
public static readonly string CreateStepSummaryMetadata = ResultsReceiverTwirpEndpoint + "CreateStepSummaryMetadata";
public static readonly string GetStepLogsSignedBlobURL = ResultsReceiverTwirpEndpoint + "GetStepLogsSignedBlobURL";
public static readonly string CreateStepLogsMetadata = ResultsReceiverTwirpEndpoint + "CreateStepLogsMetadata";
public static readonly string GetJobLogsSignedBlobURL = ResultsReceiverTwirpEndpoint + "GetJobLogsSignedBlobURL";
public static readonly string CreateJobLogsMetadata = ResultsReceiverTwirpEndpoint + "CreateJobLogsMetadata";
public static readonly string AzureBlobSealedHeader = "x-ms-blob-sealed";
public static readonly string AzureBlobTypeHeader = "x-ms-blob-type";
public static readonly string AzureBlockBlob = "BlockBlob";
public static readonly string AzureAppendBlob = "AppendBlob";
}
} }

View File

@@ -203,13 +203,6 @@ function runtest ()
 dotnet msbuild -t:test -p:PackageRuntime="${RUNTIME_ID}" -p:BUILDCONFIG="${BUILD_CONFIG}" -p:RunnerVersion="${RUNNER_VERSION}" ./dir.proj || failed "failed tests"
 }
-function format()
-{
-heading "Formatting..."
-files="$(git status -s "*.cs" | awk '{print $2}' | tr '\n' ' ')"
-dotnet format ${SCRIPT_DIR}/ActionsRunner.sln --exclude / --include $files || failed "failed formatting"
-}
 function package ()
 {
 if [ ! -d "${LAYOUT_DIR}/bin" ]; then
@@ -367,9 +360,7 @@ case $DEV_CMD in
 "l") layout;;
 "package") package;;
 "p") package;;
-"format") format;;
-"f") format;;
-*) echo "Invalid cmd. Use build(b), test(t), layout(l), package(p), or format(f)";;
+*) echo "Invalid cmd. Use build(b), test(t), layout(l) or package(p)";;
 esac
 popd

View File

@@ -1 +1 @@
-2.302.1
+2.302.0