This repository was archived by the owner on Nov 1, 2023. It is now read-only.

Bump memmap2 from 0.7.1 to 0.8.0 in /src/agent #3526

Closed · 41 commits

Commits
c69deed
Release 8.7.1 (hotfix) (#3459)
AdamL-Microsoft Aug 29, 2023
c8986aa
Revert "Release 8.7.1 (hotfix) (#3459)" (#3468)
AdamL-Microsoft Aug 30, 2023
7b40402
Redo 8.7.1 (#3469)
AdamL-Microsoft Aug 30, 2023
d999603
Support custom ado fields that mark work items as duplicate (#3467)
kananb Aug 30, 2023
b2435b1
Update readme with archive message (#3408)
mgreisen Aug 31, 2023
b913074
Bump tokio from 1.30.0 to 1.32.0 in /src/proxy-manager (#3425)
dependabot[bot] Aug 31, 2023
14ab36e
Bump tokio from 1.30.0 to 1.32.0 in /src/agent (#3424)
dependabot[bot] Aug 31, 2023
f141050
Remove unnecessary method argument (#3473)
kananb Sep 1, 2023
d4319d2
Bump elsa from 1.8.1 to 1.9.0 in /src/agent (#3411)
dependabot[bot] Sep 4, 2023
93b16ec
Bump tempfile from 3.7.1 to 3.8.0 in /src/agent (#3437)
dependabot[bot] Sep 5, 2023
7f7ab37
Bump tempfile from 3.7.1 to 3.8.0 in /src/proxy-manager (#3436)
dependabot[bot] Sep 5, 2023
b2e6a07
Updating requirements.txt to accept >= onefuzztypes. (#3477)
nharper285 Sep 5, 2023
aa9c9ea
Bump notify from 6.0.1 to 6.1.1 in /src/agent (#3435)
dependabot[bot] Sep 5, 2023
74475cc
Bump azure_* crates (#3478)
Porges Sep 5, 2023
64699ed
Release 8.8.0 (#3466)
AdamL-Microsoft Sep 6, 2023
a3fb480
Bump clap from 4.3.21 to 4.4.2 in /src/agent (#3484)
dependabot[bot] Sep 6, 2023
59c52d6
Bump gimli from 0.27.3 to 0.28.0 in /src/agent (#3414)
dependabot[bot] Sep 6, 2023
dd9e266
Bump clap from 4.3.21 to 4.4.2 in /src/proxy-manager (#3474)
dependabot[bot] Sep 6, 2023
6e2cb14
Bump winreg from 0.50.0 to 0.51.0 in /src/agent (#3434)
dependabot[bot] Sep 6, 2023
d2d57a8
Starting integration tests (#3438)
tevoinea Sep 7, 2023
830b479
Fix sed checks for CLI versioning (#3486)
nharper285 Sep 7, 2023
896329d
Bump bytes from 1.4.0 to 1.5.0 in /src/agent (#3488)
dependabot[bot] Sep 10, 2023
d34138d
Improve area/iteration path validation (#3489)
kananb Sep 11, 2023
d009476
Improve handling of unexpected breakpoints (#3493)
tevoinea Sep 13, 2023
18f2b4a
Update azure_* crates (#3503)
Porges Sep 13, 2023
9ede0de
Fuzz coverage recording (#3322)
tevoinea Sep 14, 2023
cde6a19
Reporting coverage on task start up (#3502)
nharper285 Sep 14, 2023
1fb1563
Remove feature flag from heartbeat metrics. (#3505)
nharper285 Sep 14, 2023
c7a9827
Update archive notice. (#3507)
mgreisen Sep 15, 2023
58da7b4
Add onefuzz service version to job created events (#3504)
kananb Sep 20, 2023
60766e6
Tevoinea/add version checking in local tasks (#3517)
tevoinea Sep 21, 2023
e3c4a40
Create directories if they don't exist in the template (#3522)
tevoinea Sep 21, 2023
d1ccb1e
Support for retention policies on containers (#3501)
Porges Sep 26, 2023
7efea43
Bump rayon from 1.7.0 to 1.8.0 in /src/agent (#3520)
dependabot[bot] Sep 26, 2023
f3b7e20
Bump insta from 1.31.0 to 1.32.0 in /src/agent (#3521)
dependabot[bot] Sep 26, 2023
d2ba170
Disable `repro` and `debug` VM CLI commands. (#3494)
nharper285 Sep 27, 2023
2c8ecc9
Make modules case insenstive on windows (#3527)
tevoinea Sep 28, 2023
e12b41e
Update windows interceptor list (#3528)
tevoinea Sep 28, 2023
552df45
Template creation command (#3531)
tevoinea Sep 28, 2023
e3b1e0e
Terminate process on timeout in windows for the coverage task (#3529)
chkeita Sep 29, 2023
676a644
Bump memmap2 from 0.7.1 to 0.8.0 in /src/agent
dependabot[bot] Sep 29, 2023
28 changes: 18 additions & 10 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -79,16 +79,24 @@ jobs:
key: ${{env.ACTIONS_CACHE_KEY_DATE}} # additional key for cache-busting
workspaces: src/agent
- name: Linux Prereqs
if: runner.os == 'Linux' && steps.cache-agent-artifacts.outputs.cache-hit != 'true'
if: runner.os == 'Linux'
run: |
sudo apt-get -y update
sudo apt-get -y install libssl-dev libunwind-dev build-essential pkg-config
sudo apt-get -y install libssl-dev libunwind-dev build-essential pkg-config clang
- name: Clone onefuzz-samples
run: git clone https://github.com/microsoft/onefuzz-samples
- name: Prepare for agent integration tests
shell: bash
working-directory: ./onefuzz-samples/examples/simple-libfuzzer
run: |
make
mkdir -p ../../../src/agent/onefuzz-task/tests/targets/simple
cp fuzz.exe ../../../src/agent/onefuzz-task/tests/targets/simple/fuzz.exe
cp *.pdb ../../../src/agent/onefuzz-task/tests/targets/simple/ 2>/dev/null || :
- name: Install Rust Prereqs
if: steps.rust-build-cache.outputs.cache-hit != 'true' && steps.cache-agent-artifacts.outputs.cache-hit != 'true'
shell: bash
run: src/ci/rust-prereqs.sh
- run: src/ci/agent.sh
if: steps.cache-agent-artifacts.outputs.cache-hit != 'true'
shell: bash
- name: Upload coverage to Codecov
uses: codecov/codecov-action@v3
@@ -115,7 +123,7 @@ jobs:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: 3.7
python-version: "3.10"
- name: lint
shell: bash
run: src/ci/check-check-pr.sh
@@ -129,7 +137,7 @@ jobs:
shell: bash
- uses: actions/setup-python@v4
with:
python-version: 3.7
python-version: "3.10"
- uses: actions/download-artifact@v3
with:
name: artifact-onefuzztypes
@@ -182,7 +190,7 @@ jobs:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: 3.8
python-version: "3.10"
- name: lint
shell: bash
run: |
@@ -200,7 +208,7 @@ jobs:
- uses: actions/checkout@v3
- uses: actions/setup-python@v4
with:
python-version: 3.8
python-version: "3.10"
- name: lint
shell: bash
run: |
@@ -216,7 +224,7 @@ jobs:
- run: src/ci/set-versions.sh
- uses: actions/setup-python@v4
with:
python-version: 3.7
python-version: "3.10"
- run: src/ci/onefuzztypes.sh
- uses: actions/upload-artifact@v3
with:
@@ -473,7 +481,7 @@ jobs:
path: artifacts
- uses: actions/setup-python@v4
with:
python-version: 3.7
python-version: "3.10"
- name: Lint
shell: bash
run: |
26 changes: 26 additions & 0 deletions CHANGELOG.md
@@ -7,6 +7,32 @@ All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## 8.8.0

### Added

* Agent: Added Mariner Linux support for agent VMs [#3306](https://github.com/microsoft/onefuzz/pull/3306)
* Service: Added support for custom ado fields that mark work items as duplicate [#3467](https://github.com/microsoft/onefuzz/pull/3467)
* Service: Permanently store OneFuzz job result data - # crashing input, # regression crashing input, etc. - in Azure storage [#3380](https://github.com/microsoft/onefuzz/pull/3380), [#3439](https://github.com/microsoft/onefuzz/pull/3439)
* Service: Added validation for Iteration/AreaPath on notifications when a job is submitted with a notification config and for `onefuzz debug notification test_template` [#3386](https://github.com/microsoft/onefuzz/pull/3386)

### Changed

* Agent: Updated libfuzzer-fuzz basic template to include required args and make it match cli [#3429](https://github.com/microsoft/onefuzz/pull/3429)
* Agent: Downgraded some debug logs from warn to debug [#3450](https://github.com/microsoft/onefuzz/pull/3450)
* CLI: Removed CLI commands from the local fuzzing tasks as they can now be described via yaml template [#3428](https://github.com/microsoft/onefuzz/pull/3428)
* Service: AutoScale table entries are now deleted on VMSS shutdown [#3455](https://github.com/microsoft/onefuzz/pull/3455)

### Fixed

* Agent: Fixed local path generation [#3432](https://github.com/microsoft/onefuzz/pull/3432), [#3460](https://github.com/microsoft/onefuzz/pull/3460)

## 8.7.1

### Fixed

* Service: Removed deprecated Azure retention policy setting that was causing scaleset deployment errors [#3452](https://github.com/microsoft/onefuzz/pull/3452)

## 8.7.0

### Added
2 changes: 1 addition & 1 deletion CURRENT_VERSION
@@ -1 +1 @@
8.7.0
8.8.0
21 changes: 21 additions & 0 deletions README.md
@@ -1,5 +1,26 @@
# <img src="docs/onefuzz_text.svg" height="120" alt="OneFuzz" />

# :exclamation: IMPORTANT NOTICE :exclamation:

**_August 31, 2023_**.

**_Since September 2020 when OneFuzz was first open sourced, we’ve been on a journey to create a best-in-class orchestrator for running fuzzers, driving security and quality into our products._**


**_Initially launched by a small group in MSR, OneFuzz has now become a significant internal platform within Microsoft. As such, we are regretfully archiving the project to focus our attention on becoming a more deeply integrated service within the company. Unfortunately, we aren’t a large enough team to live in both the open-source world and the internal Microsoft world with its own unique set of requirements._**

**_Our current plan is to archive the project in the next few months. That means we’ll still be making updates for a little while. Of course, even after it’s archived, you’ll still be able to fork it and make the changes you need. Once we’ve decided on a specific date for archiving, we’ll update this readme._**

**_Thanks for taking the journey with us._**

**_The OneFuzz team._**

---
**_Update: September 15 2023:_**
**_Our current target to archive the project is September 30th, 2023._**

---

[![Onefuzz build status](https://github.com/microsoft/onefuzz/workflows/Build/badge.svg?branch=main)](https://github.com/microsoft/onefuzz/actions/workflows/ci.yml?query=branch%3Amain)

## A self-hosted Fuzzing-As-A-Service platform
4 changes: 4 additions & 0 deletions contrib/onefuzz-job-azure-devops-pipeline/ado-work-items.json
@@ -13,6 +13,10 @@
"System.AreaPath": "OneFuzz-Ado-Integration",
"System.Title": "{{report.task_id}}"
},
"ado_duplicate_fields": {
"System.Reason": "My custom value that means a work item is a duplicate",
"Custom.Work.Item.Field": "My custom value that means a work item is a duplicate"
},
"on_duplicate": {
"increment": [],
"comment": "DUP {{report.input_sha256}} <br> Repro Command: <br> <pre> {{ repro_cmd }} </pre> ",
7 changes: 7 additions & 0 deletions docs/notifications/ado.md
@@ -51,6 +51,13 @@ clickable, make it a link.
"System.Title": "{{ report.crash_site }} - {{ report.executable }}",
"Microsoft.VSTS.TCM.ReproSteps": "This is my call stack: <ul> {{ for item in report.call_stack }} <li> {{ item }} </li> {{ end }} </ul>"
},
"ado_duplicate_fields": {
"System.Reason": "My custom value that means a work item is a duplicate",
"Custom.Work.Item.Field": "My custom value that means a work item is a duplicate"
// note: the fields and values below are checked by default and don't need to be specified
// "System.Reason": "Duplicate"
// "Microsoft.VSTS.Common.ResolvedReason": "Duplicate"
},
"comment": "This is my comment. {{ report.input_sha256 }} {{ input_url }} <br> <pre>{{ repro_cmd }}</pre>",
"unique_fields": ["System.Title", "System.AreaPath"],
"on_duplicate": {
1 change: 1 addition & 0 deletions src/ApiService/ApiService/FeatureFlags.cs
@@ -8,4 +8,5 @@ public static class FeatureFlagConstants {
public const string EnableBlobRetentionPolicy = "EnableBlobRetentionPolicy";
public const string EnableDryRunBlobRetention = "EnableDryRunBlobRetention";
public const string EnableWorkItemCreation = "EnableWorkItemCreation";
public const string EnableContainerRetentionPolicies = "EnableContainerRetentionPolicies";
}
2 changes: 1 addition & 1 deletion src/ApiService/ApiService/Functions/Jobs.cs
@@ -83,7 +83,7 @@ private async Task<HttpResponseData> Post(HttpRequestData req, FunctionContext c
"job");
}

await _context.Events.SendEvent(new EventJobCreated(job.JobId, job.Config, job.UserInfo));
await _context.Events.SendEvent(new EventJobCreated(job.JobId, job.Config, job.UserInfo, _context.ServiceConfiguration.OneFuzzVersion));
return await RequestHandling.Ok(req, JobResponse.ForJob(job, taskInfo: null));
}

44 changes: 39 additions & 5 deletions src/ApiService/ApiService/Functions/QueueFileChanges.cs
@@ -1,5 +1,6 @@
using System.Text.Json;
using System.Text.Json.Nodes;
using System.Threading.Tasks;
using Azure.Core;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
@@ -54,14 +55,16 @@ public async Async.Task Run(
return;
}

var storageAccount = new ResourceIdentifier(topicElement.GetString()!);

try {
// Setting isLastRetryAttempt to false will rethrow any exceptions
// With the intention that the azure functions runtime will handle requeuing
// the message for us. The difference is for the poison queue, we're handling the
// requeuing ourselves because azure functions doesn't support retry policies
// for queue based functions.

var result = await FileAdded(fileChangeEvent, isLastRetryAttempt: false);
var result = await FileAdded(storageAccount, fileChangeEvent, isLastRetryAttempt: false);
if (!result.IsOk && result.ErrorV.Code == ErrorCode.ADO_WORKITEM_PROCESSING_DISABLED) {
await RequeueMessage(msg, TimeSpan.FromDays(1));
}
@@ -71,16 +74,47 @@ public async Async.Task Run(
}
}

private async Async.Task<OneFuzzResultVoid> FileAdded(JsonDocument fileChangeEvent, bool isLastRetryAttempt) {
private async Async.Task<OneFuzzResultVoid> FileAdded(ResourceIdentifier storageAccount, JsonDocument fileChangeEvent, bool isLastRetryAttempt) {
var data = fileChangeEvent.RootElement.GetProperty("data");
var url = data.GetProperty("url").GetString()!;
var parts = url.Split("/").Skip(3).ToList();

var container = parts[0];
var container = Container.Parse(parts[0]);
var path = string.Join('/', parts.Skip(1));

_log.LogInformation("file added : {Container} - {Path}", container, path);
return await _notificationOperations.NewFiles(Container.Parse(container), path, isLastRetryAttempt);
_log.LogInformation("file added : {Container} - {Path}", container.String, path);

var (_, result) = await (
ApplyRetentionPolicy(storageAccount, container, path),
_notificationOperations.NewFiles(container, path, isLastRetryAttempt));

return result;
}

private async Async.Task<bool> ApplyRetentionPolicy(ResourceIdentifier storageAccount, Container container, string path) {
if (await _context.FeatureManagerSnapshot.IsEnabledAsync(FeatureFlagConstants.EnableContainerRetentionPolicies)) {
// default retention period can be applied to the container
// if one exists, we will set the expiry date on the newly-created blob, if it doesn't already have one
var account = await _storage.GetBlobServiceClientForAccount(storageAccount);
var containerClient = account.GetBlobContainerClient(container.String);
var containerProps = await containerClient.GetPropertiesAsync();
var retentionPeriod = RetentionPolicyUtils.GetContainerRetentionPeriodFromMetadata(containerProps.Value.Metadata);
if (!retentionPeriod.IsOk) {
_log.LogError("invalid retention period: {Error}", retentionPeriod.ErrorV);
} else if (retentionPeriod.OkV is TimeSpan period) {
var blobClient = containerClient.GetBlobClient(path);
var tags = (await blobClient.GetTagsAsync()).Value.Tags;
var expiryDate = DateTime.UtcNow + period;
var tag = RetentionPolicyUtils.CreateExpiryDateTag(DateOnly.FromDateTime(expiryDate));
if (tags.TryAdd(tag.Key, tag.Value)) {
_ = await blobClient.SetTagsAsync(tags);
_log.LogInformation("applied container retention policy ({Policy}) to {Path}", period, path);
return true;
}
}
}

return false;
}

private async Async.Task RequeueMessage(string msg, TimeSpan? visibilityTimeout = null) {
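The `ApplyRetentionPolicy` change above reads the container's retention metadata and tags the newly-added blob with an expiry date, but only if the blob does not already carry one. A rough sketch of that decision flow follows; the function name and the simplified day-count metadata key are illustrative stand-ins, not the service's actual API (the real key holds an ISO-8601 duration, as shown in `RetentionPolicy.cs` below):

```python
from datetime import datetime, timedelta, timezone

EXPIRY_TAG = "Expiry"

def apply_retention_policy(container_metadata, blob_tags, now):
    """If the container declares a retention period (simplified here to a
    day count) and the blob has no expiry tag yet, add one.
    Returns the updated tag set; never overwrites an existing expiry."""
    period_days = container_metadata.get("retention_days")
    if period_days is None:
        return blob_tags              # no retention policy on this container
    if EXPIRY_TAG in blob_tags:
        return blob_tags              # an existing expiry date wins
    expiry = (now + timedelta(days=int(period_days))).date()
    return {**blob_tags, EXPIRY_TAG: expiry.isoformat()}
```

This mirrors the `tags.TryAdd(...)` behavior in the diff: a blob that already has an `Expiry` tag is left untouched.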
5 changes: 2 additions & 3 deletions src/ApiService/ApiService/Functions/QueueNodeHeartbeat.cs
@@ -41,9 +41,8 @@ public async Async.Task Run([QueueTrigger("node-heartbeat", Connection = "AzureW
var nodeHeartbeatEvent = new EventNodeHeartbeat(node.MachineId, node.ScalesetId, node.PoolName, node.State);
// TODO: do we still send the event if we fail to update the table?
await events.SendEvent(nodeHeartbeatEvent);
if (await _context.FeatureManagerSnapshot.IsEnabledAsync(FeatureFlagConstants.EnableCustomMetricTelemetry)) {
metrics.SendMetric(1, nodeHeartbeatEvent);
}
metrics.SendMetric(1, nodeHeartbeatEvent);


}
}
5 changes: 2 additions & 3 deletions src/ApiService/ApiService/Functions/QueueTaskHeartbeat.cs
@@ -45,8 +45,7 @@ public async Async.Task Run([QueueTrigger("task-heartbeat", Connection = "AzureW

var taskHeartBeatEvent = new EventTaskHeartbeat(newTask.JobId, newTask.TaskId, job.Config.Project, job.Config.Name, newTask.State, newTask.Config);
await _events.SendEvent(taskHeartBeatEvent);
if (await _context.FeatureManagerSnapshot.IsEnabledAsync(FeatureFlagConstants.EnableCustomMetricTelemetry)) {
_metrics.SendMetric(1, taskHeartBeatEvent);
}
_metrics.SendMetric(1, taskHeartBeatEvent);

}
}
2 changes: 2 additions & 0 deletions src/ApiService/ApiService/OneFuzzTypes/Enums.cs
@@ -49,6 +49,8 @@ public enum ErrorCode {
ADO_VALIDATION_MISSING_PAT_SCOPES = 492,
ADO_WORKITEM_PROCESSING_DISABLED = 494,
ADO_VALIDATION_INVALID_PATH = 495,
ADO_VALIDATION_INVALID_PROJECT = 496,
INVALID_RETENTION_PERIOD = 497,
// NB: if you update this enum, also update enums.py
}

3 changes: 2 additions & 1 deletion src/ApiService/ApiService/OneFuzzTypes/Events.cs
@@ -124,7 +124,8 @@ TaskConfig Config
public record EventJobCreated(
Guid JobId,
JobConfig Config,
StoredUserInfo? UserInfo
StoredUserInfo? UserInfo,
string OneFuzzVersion
) : BaseEvent();


4 changes: 3 additions & 1 deletion src/ApiService/ApiService/OneFuzzTypes/Model.cs
@@ -689,6 +689,7 @@ public record AdoTemplate(
List<string> UniqueFields,
Dictionary<string, string> AdoFields,
ADODuplicateTemplate OnDuplicate,
Dictionary<string, string>? AdoDuplicateFields = null,
string? Comment = null
) : NotificationTemplate {
public async Task<OneFuzzResultVoid> Validate() {
@@ -704,8 +705,9 @@ public record RenderedAdoTemplate(
List<string> UniqueFields,
Dictionary<string, string> AdoFields,
ADODuplicateTemplate OnDuplicate,
Dictionary<string, string>? AdoDuplicateFields = null,
string? Comment = null
) : AdoTemplate(BaseUrl, AuthToken, Project, Type, UniqueFields, AdoFields, OnDuplicate, Comment);
) : AdoTemplate(BaseUrl, AuthToken, Project, Type, UniqueFields, AdoFields, OnDuplicate, AdoDuplicateFields, Comment);

public record TeamsTemplate(SecretData<string> Url) : NotificationTemplate {
public Task<OneFuzzResultVoid> Validate() {
Original file line number Diff line number Diff line change
@@ -22,13 +22,12 @@ public NotificationOperations(ILogger<NotificationOperations> log, IOnefuzzConte

}
public async Async.Task<OneFuzzResultVoid> NewFiles(Container container, string filename, bool isLastRetryAttempt) {
var result = OneFuzzResultVoid.Ok;

// We don't want to store file added events for the events container because that causes an infinite loop
if (container == WellKnownContainers.Events) {
return result;
return Result.Ok();
}

var result = OneFuzzResultVoid.Ok;
var notifications = GetNotifications(container);
var hasNotifications = await notifications.AnyAsync();
var reportOrRegression = await _context.Reports.GetReportOrRegression(container, filename, expectReports: hasNotifications);
24 changes: 0 additions & 24 deletions src/ApiService/ApiService/onefuzzlib/RententionPolicy.cs

This file was deleted.

43 changes: 43 additions & 0 deletions src/ApiService/ApiService/onefuzzlib/RetentionPolicy.cs
@@ -0,0 +1,43 @@
using System.Xml;

namespace Microsoft.OneFuzz.Service;


public interface IRetentionPolicy {
DateOnly GetExpiryDate();
}

public class RetentionPolicyUtils {
public const string EXPIRY_TAG = "Expiry";
public static KeyValuePair<string, string> CreateExpiryDateTag(DateOnly expiryDate) =>
new(EXPIRY_TAG, expiryDate.ToString());

public static DateOnly? GetExpiryDateTagFromTags(IDictionary<string, string>? blobTags) {
if (blobTags != null &&
blobTags.TryGetValue(EXPIRY_TAG, out var expiryTag) &&
!string.IsNullOrWhiteSpace(expiryTag) &&
DateOnly.TryParse(expiryTag, out var expiryDate)) {
return expiryDate;
}
return null;
}

public static string CreateExpiredBlobTagFilter() => $@"""{EXPIRY_TAG}"" <= '{DateOnly.FromDateTime(DateTime.UtcNow)}'";

// NB: this must match the value used on the CLI side
public const string CONTAINER_RETENTION_KEY = "onefuzz_retentionperiod";

public static OneFuzzResult<TimeSpan?> GetContainerRetentionPeriodFromMetadata(IDictionary<string, string>? containerMetadata) {
if (containerMetadata is not null &&
containerMetadata.TryGetValue(CONTAINER_RETENTION_KEY, out var retentionString) &&
!string.IsNullOrWhiteSpace(retentionString)) {
try {
return Result.Ok<TimeSpan?>(XmlConvert.ToTimeSpan(retentionString));
} catch (Exception ex) {
return Error.Create(ErrorCode.INVALID_RETENTION_PERIOD, ex.Message);
}
}

return Result.Ok<TimeSpan?>(null);
}
}
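`XmlConvert.ToTimeSpan` in the code above expects the `onefuzz_retentionperiod` metadata value to be an XSD/ISO-8601 duration string such as `P30D` (30 days). A minimal Python sketch of that parse, covering only the day/time subset (the .NET call accepts the full grammar, including years and months):

```python
import re
from datetime import timedelta

# Day/time subset of ISO-8601 durations, e.g. "P30D" or "P1DT12H".
_DURATION = re.compile(
    r"^P(?:(?P<d>\d+)D)?"
    r"(?:T(?:(?P<h>\d+)H)?(?:(?P<m>\d+)M)?(?:(?P<s>\d+)S)?)?$"
)

def parse_retention_period(value):
    """Return a timedelta, or None for input this sketch cannot parse
    (the service maps that case to an INVALID_RETENTION_PERIOD error)."""
    m = _DURATION.match(value)
    if not m or not any(m.groupdict().values()):
        return None
    parts = {k: int(v or 0) for k, v in m.groupdict().items()}
    return timedelta(days=parts["d"], hours=parts["h"],
                     minutes=parts["m"], seconds=parts["s"])
```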
113 changes: 95 additions & 18 deletions src/ApiService/ApiService/onefuzzlib/notifications/Ado.cs
@@ -89,30 +89,97 @@ private static bool IsTransient(Exception e) {
return errorCodes.Any(errorStr.Contains);
}

private static async Async.Task<OneFuzzResultVoid> ValidatePath(string project, string path, TreeStructureGroup structureGroup, WorkItemTrackingHttpClient client) {
var pathType = (structureGroup == TreeStructureGroup.Areas) ? "Area" : "Iteration";
var pathParts = path.Split('\\');
if (!string.Equals(pathParts[0], project, StringComparison.OrdinalIgnoreCase)) {
public static OneFuzzResultVoid ValidateTreePath(IEnumerable<string> path, WorkItemClassificationNode? root) {
if (root is null) {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PROJECT, new string[] {
$"Path \"{string.Join('\\', path)}\" is invalid. The specified ADO project doesn't exist.",
"Double check the 'project' field in your ADO config.",
});
}

string treeNodeTypeName;
switch (root.StructureType) {
case TreeNodeStructureType.Area:
treeNodeTypeName = "Area";
break;
case TreeNodeStructureType.Iteration:
treeNodeTypeName = "Iteration";
break;
default:
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"Path root \"{root.Name}\" is an unsupported type. Expected Area or Iteration but got {root.StructureType}.",
});
}

// Validate path based on
// https://learn.microsoft.com/en-us/azure/devops/organizations/settings/about-areas-iterations?view=azure-devops#naming-restrictions
var maxNodeLength = 255;
var maxDepth = 13;
// Invalid characters from the link above plus the escape sequences (since they have backslashes and produce confusingly formatted errors if not caught here)
var invalidChars = new char[] { '/', ':', '*', '?', '"', '<', '>', '|', ';', '#', '$', '*', '{', '}', ',', '+', '=', '[', ']' };

// Ensure that none of the path parts are too long
var erroneous = path.FirstOrDefault(part => part.Length > maxNodeLength);
if (erroneous != null) {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"{treeNodeTypeName} Path \"{string.Join('\\', path)}\" is invalid. \"{erroneous}\" is too long. It must be less than {maxNodeLength} characters.",
"Learn more about naming restrictions here: https://learn.microsoft.com/en-us/azure/devops/organizations/settings/about-areas-iterations?view=azure-devops#naming-restrictions"
});
}

// Ensure that none of the path parts contain invalid characters
erroneous = path.FirstOrDefault(part => invalidChars.Any(part.Contains));
if (erroneous != null) {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"Path \"{path}\" is invalid. It must start with the project name, \"{project}\".",
$"Example: \"{project}\\{path}\".",
$"{treeNodeTypeName} Path \"{string.Join('\\', path)}\" is invalid. \"{erroneous}\" contains an invalid character ({string.Join(" ", invalidChars)}).",
"Make sure that the path is separated by backslashes (\\) and not forward slashes (/).",
"Learn more about naming restrictions here: https://learn.microsoft.com/en-us/azure/devops/organizations/settings/about-areas-iterations?view=azure-devops#naming-restrictions"
});
}

var current = await client.GetClassificationNodeAsync(project, structureGroup, depth: pathParts.Length - 1);
if (current == null) {
// Ensure no unicode control characters
erroneous = path.FirstOrDefault(part => part.Any(ch => char.IsControl(ch)));
if (erroneous != null) {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"{pathType} Path \"{path}\" is invalid. \"{project}\" is not a valid project.",
// More about control codes and their range here: https://en.wikipedia.org/wiki/Unicode_control_characters
$"{treeNodeTypeName} Path \"{string.Join('\\', path)}\" is invalid. \"{erroneous}\" contains a unicode control character (\\u0000 - \\u001F or \\u007F - \\u009F).",
"Make sure that you're path doesn't contain any escape characters (\\0 \\a \\b \\f \\n \\r \\t \\v).",
"Learn more about naming restrictions here: https://learn.microsoft.com/en-us/azure/devops/organizations/settings/about-areas-iterations?view=azure-devops#naming-restrictions"
});
}

foreach (var part in pathParts.Skip(1)) {
// Ensure that there aren't too many path parts
if (path.Count() > maxDepth) {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"{treeNodeTypeName} Path \"{string.Join('\\', path)}\" is invalid. It must be less than {maxDepth} levels deep.",
"Learn more about naming restrictions here: https://learn.microsoft.com/en-us/azure/devops/organizations/settings/about-areas-iterations?view=azure-devops#naming-restrictions"
});
}


// Path should always start with the project name; ADO expects an absolute path
if (!string.Equals(path.First(), root.Name, StringComparison.OrdinalIgnoreCase)) {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"{treeNodeTypeName} Path \"{string.Join('\\', path)}\" is invalid. It must start with the project name, \"{root.Name}\".",
$"Example: \"{root.Name}\\{path}\".",
});
}

// Validate that each part of the path is a valid child of the previous part
var current = root;
foreach (var part in path.Skip(1)) {
var child = current.Children?.FirstOrDefault(x => string.Equals(x.Name, part, StringComparison.OrdinalIgnoreCase));
if (child == null) {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"{pathType} Path \"{path}\" is invalid. \"{part}\" is not a valid child of \"{current.Name}\".",
$"Valid children of \"{current.Name}\" are: [{string.Join(',', current.Children?.Select(x => $"\"{x.Name}\"") ?? new List<string>())}].",
});
if (current.Children is null || !current.Children.Any()) {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"{treeNodeTypeName} Path \"{string.Join('\\', path)}\" is invalid. \"{current.Name}\" has no children.",
});
} else {
return OneFuzzResultVoid.Error(ErrorCode.ADO_VALIDATION_INVALID_PATH, new string[] {
$"{treeNodeTypeName} Path \"{string.Join('\\', path)}\" is invalid. \"{part}\" is not a valid child of \"{current.Name}\".",
$"Valid children of \"{current.Name}\" are: [{string.Join(',', current.Children?.Select(x => $"\"{x.Name}\"") ?? new List<string>())}].",
});
}
}

current = child;
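The checks `ValidateTreePath` performs above (project exists, segment length, character set, depth, project-name prefix, then a parent/child walk down the tree) condense into a sketch like the following; the nested-dict tree and the short error strings are stand-ins for ADO's `WorkItemClassificationNode` and the structured errors in the diff:

```python
MAX_NODE_LENGTH = 255
MAX_DEPTH = 13
INVALID_CHARS = set('/:*?"<>|;#${},+=[]')

def validate_tree_path(path, tree):
    """Return None when `path` (a list of segments) is valid against
    `tree` ({'name': ..., 'children': [...]}), else an error string."""
    if tree is None:
        return "the specified ADO project doesn't exist"
    if any(len(part) > MAX_NODE_LENGTH for part in path):
        return "a path segment is too long"
    if any(set(part) & INVALID_CHARS for part in path):
        return "a path segment contains an invalid character"
    if len(path) > MAX_DEPTH:
        return "the path is too deep"
    if path[0].lower() != tree["name"].lower():
        return "the path must start with the project name"
    current = tree
    for part in path[1:]:
        children = current.get("children") or []
        child = next((c for c in children
                      if c["name"].lower() == part.lower()), None)
        if child is None:
            return f"{part!r} is not a valid child of {current['name']!r}"
        current = child
    return None
```

The control-character check from the real implementation is omitted here for brevity; all comparisons are case-insensitive, matching the `StringComparison.OrdinalIgnoreCase` calls in the diff.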
@@ -195,14 +262,19 @@ await policy.ExecuteAsync(async () => {

try {
// Validate AreaPath and IterationPath exist
// This also validates that the config.Project exists
if (config.AdoFields.TryGetValue("System.AreaPath", out var areaPathString)) {
var validateAreaPath = await ValidatePath(config.Project, areaPathString, TreeStructureGroup.Areas, witClient);
var path = areaPathString.Split('\\');
var root = await witClient.GetClassificationNodeAsync(config.Project, TreeStructureGroup.Areas, depth: path.Length - 1);
var validateAreaPath = ValidateTreePath(path, root);
if (!validateAreaPath.IsOk) {
return validateAreaPath;
}
}
if (config.AdoFields.TryGetValue("System.IterationPath", out var iterationPathString)) {
var validateIterationPath = await ValidatePath(config.Project, iterationPathString, TreeStructureGroup.Iterations, witClient);
var path = iterationPathString.Split('\\');
var root = await witClient.GetClassificationNodeAsync(config.Project, TreeStructureGroup.Iterations, depth: path.Length - 1);
var validateIterationPath = ValidateTreePath(path, root);
if (!validateIterationPath.IsOk) {
return validateIterationPath;
}
@@ -291,6 +363,7 @@ public static RenderedAdoTemplate RenderAdoTemplate(ILogger logTracer, Renderer
original.UniqueFields,
adoFields,
onDuplicate,
original.AdoDuplicateFields,
original.Comment != null ? Render(renderer, original.Comment, instanceUrl, logTracer) : null
);
}
@@ -535,7 +608,7 @@ public async Async.Task Process(IList<(string, string)> notificationInfo) {
_logTracer.AddTags(new List<(string, string)> { ("MatchingWorkItemIds", $"{workItem.Id}") });
_logTracer.LogInformation("Found matching work item");
}
if (IsADODuplicateWorkItem(workItem)) {
if (IsADODuplicateWorkItem(workItem, _config.AdoDuplicateFields)) {
continue;
}

@@ -575,13 +648,17 @@ public async Async.Task Process(IList<(string, string)> notificationInfo) {
}
}

private static bool IsADODuplicateWorkItem(WorkItem wi) {
private static bool IsADODuplicateWorkItem(WorkItem wi, Dictionary<string, string>? duplicateFields) {
// A work item could have System.State == Resolve && System.Reason == Duplicate
// OR it could have System.State == Closed && System.Reason == Duplicate
// I haven't found any other combinations where System.Reason could be duplicate but just to be safe
// we're explicitly _not_ checking the state of the work item to determine if it's duplicate
return wi.Fields.ContainsKey("System.Reason") && string.Equals(wi.Fields["System.Reason"].ToString(), "Duplicate", StringComparison.OrdinalIgnoreCase)
|| wi.Fields.ContainsKey("Microsoft.VSTS.Common.ResolvedReason") && string.Equals(wi.Fields["Microsoft.VSTS.Common.ResolvedReason"].ToString(), "Duplicate", StringComparison.OrdinalIgnoreCase)
|| duplicateFields?.Any(fieldPair => {
var (field, value) = fieldPair;
return wi.Fields.ContainsKey(field) && string.Equals(wi.Fields[field].ToString(), value, StringComparison.OrdinalIgnoreCase);
}) == true
// Alternatively, the work item can also specify a 'relation' to another work item.
// This is typically used to create parent/child relationships between work items but can also
// Be used to mark duplicates so we should check this as well.
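The widened duplicate check above keeps the two built-in field checks (`System.Reason` and `Microsoft.VSTS.Common.ResolvedReason` equal to `Duplicate`) and ORs in any user-configured `ado_duplicate_fields` pairs, comparing case-insensitively. A sketch of that logic (the helper itself is illustrative, not the service's API):

```python
def is_duplicate(work_item_fields, duplicate_fields=None):
    """True if any default or user-configured field carries its
    'this work item is a duplicate' value, compared case-insensitively."""
    checks = [
        ("System.Reason", "Duplicate"),
        ("Microsoft.VSTS.Common.ResolvedReason", "Duplicate"),
    ]
    # User-configured pairs extend, never replace, the defaults.
    checks += list((duplicate_fields or {}).items())
    return any(
        field in work_item_fields
        and str(work_item_fields[field]).lower() == value.lower()
        for field, value in checks
    )
```

Note that the real method additionally inspects work-item relations for duplicate links, which this sketch leaves out.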
Original file line number Diff line number Diff line change
@@ -111,6 +111,7 @@ public async Async.Task OptionalFieldsAreSupported() {
},
"{{ if org }} blah {{ end }}"
),
null,
"{{ if org }} blah {{ end }}"
);

@@ -137,6 +138,7 @@ public async Async.Task All_ADO_Fields_Are_Migrated() {
},
"{% if org %} comment {% endif %}"
),
null,
"{% if org %} comment {% endif %}"
);

2 changes: 2 additions & 0 deletions src/ApiService/Tests/OrmModelsTest.cs
@@ -232,6 +232,7 @@ from authToken in Arb.Generate<SecretData<string>>()
from str in Arb.Generate<NonEmptyString>()
from fields in Arb.Generate<List<string>>()
from adoFields in Arb.Generate<Dictionary<string, string>>()
from adoDuplicateFields in Arb.Generate<Dictionary<string, string>>()
from dupeTemplate in Arb.Generate<ADODuplicateTemplate>()
select new AdoTemplate(
baseUrl,
@@ -241,6 +242,7 @@ from dupeTemplate in Arb.Generate<ADODuplicateTemplate>()
fields,
adoFields,
dupeTemplate,
adoDuplicateFields,
str.Get));

public static Arbitrary<TeamsTemplate> ArbTeamsTemplate()
148 changes: 148 additions & 0 deletions src/ApiService/Tests/TreePathTests.cs
@@ -0,0 +1,148 @@
using System.Collections.Generic;
using System.Linq;
using Microsoft.OneFuzz.Service;
using Microsoft.TeamFoundation.WorkItemTracking.WebApi.Models;
using Xunit;

namespace Tests;

// This might be a good candidate for property based testing
// https://fscheck.github.io/FsCheck//QuickStart.html
public class TreePathTests {
private static IEnumerable<string> SplitPath(string path) {
return path.Split('\\');
}

private static WorkItemClassificationNode MockTreeNode(IEnumerable<string> path, TreeNodeStructureType structureType) {
var root = new WorkItemClassificationNode() {
Name = path.First(),
StructureType = structureType
};

var current = root;
foreach (var segment in path.Skip(1)) {
var child = new WorkItemClassificationNode {
Name = segment
};
current.Children = new[] { child };
current = child;
}

return root;
}


[Fact]
public void TestValidPath() {
var path = SplitPath(@"project\foo\bar\baz");
var root = MockTreeNode(path, TreeNodeStructureType.Area);

var result = Ado.ValidateTreePath(path, root);

Assert.True(result.IsOk);
}

[Fact]
public void TestNullTreeNode() { // A null tree node indicates an invalid ADO project was used in the query
var path = SplitPath(@"project\foo\bar\baz");

var result = Ado.ValidateTreePath(path, null);

Assert.False(result.IsOk);
Assert.Equal(ErrorCode.ADO_VALIDATION_INVALID_PROJECT, result.ErrorV!.Code);
Assert.Contains("ADO project doesn't exist", result.ErrorV!.Errors![0]);
}

[Fact]
public void TestPathPartTooLong() {
var path = SplitPath(@"project\foo\barbazquxquuxcorgegraultgarplywaldofredplughxyzzythudbarbazquxquuxcorgegraultgarplywaldofredplughxyzzythudbarbazquxquuxcorgegraultgarplywaldofredplughxyzzythudbarbazquxquuxcorgegraultgarplywaldofredplughxyzzythudbarbazquxquuxcorgegraultgarplywaldofredplughxyzzythud\baz");
var root = MockTreeNode(path, TreeNodeStructureType.Iteration);

var result = Ado.ValidateTreePath(path, root);

Assert.False(result.IsOk);
Assert.Equal(ErrorCode.ADO_VALIDATION_INVALID_PATH, result.ErrorV!.Code);
Assert.Contains("too long", result.ErrorV!.Errors![0]);
}

[Theory]
[InlineData("project/foo/bar/baz")]
[InlineData("project\\foo:\\bar\\baz")]
public void TestPathContainsInvalidChar(string invalidPath) {
var path = SplitPath(invalidPath);
var treePath = SplitPath(@"project\foo\bar\baz");
var root = MockTreeNode(treePath, TreeNodeStructureType.Area);

var result = Ado.ValidateTreePath(path, root);

Assert.False(result.IsOk);
Assert.Equal(ErrorCode.ADO_VALIDATION_INVALID_PATH, result.ErrorV!.Code);
Assert.Contains("invalid character", result.ErrorV!.Errors![0]);
}

[Theory]
[InlineData("project\\foo\\ba\u0005r\\baz")]
[InlineData("project\\\nfoo\\bar\\baz")]
public void TestPathContainsUnicodeControlChar(string invalidPath) {
var path = SplitPath(invalidPath);
var treePath = SplitPath(@"project\foo\bar\baz");
var root = MockTreeNode(treePath, TreeNodeStructureType.Area);

var result = Ado.ValidateTreePath(path, root);

Assert.False(result.IsOk);
Assert.Equal(ErrorCode.ADO_VALIDATION_INVALID_PATH, result.ErrorV!.Code);
Assert.Contains("unicode control character", result.ErrorV!.Errors![0]);
}

[Fact]
public void TestPathTooDeep() {
var path = SplitPath(@"project\foo\bar\baz\qux\quux\corge\grault\garply\waldo\fred\plugh\xyzzy\thud");
var root = MockTreeNode(path, TreeNodeStructureType.Area);

var result = Ado.ValidateTreePath(path, root);

Assert.False(result.IsOk);
Assert.Equal(ErrorCode.ADO_VALIDATION_INVALID_PATH, result.ErrorV!.Code);
Assert.Contains("levels deep", result.ErrorV!.Errors![0]);
}

[Fact]
public void TestPathWithoutProjectName() {
var path = SplitPath(@"foo\bar\baz");
var treePath = SplitPath(@"project\foo\bar\baz");
var root = MockTreeNode(treePath, TreeNodeStructureType.Iteration);

var result = Ado.ValidateTreePath(path, root);

Assert.False(result.IsOk);
Assert.Equal(ErrorCode.ADO_VALIDATION_INVALID_PATH, result.ErrorV!.Code);
Assert.Contains("start with the project name", result.ErrorV!.Errors![0]);
}

[Fact]
public void TestPathWithInvalidChild() {
var path = SplitPath(@"project\foo\baz");
var treePath = SplitPath(@"project\foo\bar");
var root = MockTreeNode(treePath, TreeNodeStructureType.Iteration);

var result = Ado.ValidateTreePath(path, root);

Assert.False(result.IsOk);
Assert.Equal(ErrorCode.ADO_VALIDATION_INVALID_PATH, result.ErrorV!.Code);
Assert.Contains("not a valid child", result.ErrorV!.Errors![0]);
}

[Fact]
public void TestPathWithExtraChild() {
var path = SplitPath(@"project\foo\bar\baz");
var treePath = SplitPath(@"project\foo\bar");
var root = MockTreeNode(treePath, TreeNodeStructureType.Iteration);

var result = Ado.ValidateTreePath(path, root);

Assert.False(result.IsOk);
Assert.Equal(ErrorCode.ADO_VALIDATION_INVALID_PATH, result.ErrorV!.Code);
Assert.Contains("has no children", result.ErrorV!.Errors![0]);
}
}
143 changes: 73 additions & 70 deletions src/agent/Cargo.lock

Large diffs are not rendered by default.

9 changes: 5 additions & 4 deletions src/agent/coverage/Cargo.toml
@@ -20,23 +20,24 @@ symbolic = { version = "12.3", features = [
"symcache",
] }
thiserror = "1.0"
process_control = "4.0"

[target.'cfg(target_os = "windows")'.dependencies]
debugger = { path = "../debugger" }

[target.'cfg(target_os = "linux")'.dependencies]
nix = "0.26"
pete = "0.10"
pete = "0.12"
# For procfs, opt out of the `chrono` feature; it pulls in an old version
# of `time`. We do not use the methods that the `chrono` feature enables.
procfs = { version = "0.15.1", default-features = false, features = ["flate2"] }

[dev-dependencies]
clap = { version = "4.3", features = ["derive"] }
clap = { version = "4.4", features = ["derive"] }
env_logger = "0.10.0"
pretty_assertions = "1.4.0"
insta = { version = "1.31.0", features = ["glob"] }
insta = { version = "1.32.0", features = ["glob"] }
coverage = { path = "../coverage" }
cc = "1.0"
tempfile = "3.7.0"
tempfile = "3.8.0"
dunce = "1.0"
4 changes: 4 additions & 0 deletions src/agent/coverage/fuzz/.gitignore
@@ -0,0 +1,4 @@
target
corpus
artifacts
coverage
1,480 changes: 1,480 additions & 0 deletions src/agent/coverage/fuzz/Cargo.lock

Large diffs are not rendered by default.

30 changes: 30 additions & 0 deletions src/agent/coverage/fuzz/Cargo.toml
@@ -0,0 +1,30 @@
[package]
name = "coverage-fuzz"
version = "0.0.0"
publish = false
edition = "2021"

[package.metadata]
cargo-fuzz = true

[dependencies]
libfuzzer-sys = "0.4"
tempfile = "3.7"
debuggable-module = { path = "../../debuggable-module" }


[dependencies.coverage]
path = ".."

# Prevent this from interfering with workspaces
[workspace]
members = ["."]

[profile.release]
debug = 1

[[bin]]
name = "fuzz_target_record_coverage"
path = "fuzz_targets/fuzz_target_record_coverage.rs"
test = false
doc = false
@@ -0,0 +1,51 @@
#![no_main]

use libfuzzer_sys::fuzz_target;
use std::env;
use std::fs;
use std::io::Write;
use std::process::Command;
use std::sync::Arc;
use std::time::Duration;

use tempfile::NamedTempFile;

use coverage::allowlist::AllowList;
use coverage::binary::BinaryCoverage;
use coverage::record::CoverageRecorder;

use debuggable_module::loader::Loader;

const INPUT_MARKER: &str = "@@";

fuzz_target!(|data: &[u8]| {
if data.is_empty() {
return;
}

// Write mutated bytes to a file
let mut file = NamedTempFile::new_in(env::current_dir().unwrap()).unwrap();
file.write_all(data).unwrap();
let path = String::from(file.path().to_str().unwrap());

// Make sure the file is executable
let _ = Command::new("chmod").args(["+wrx", &path]).spawn().unwrap().wait();
file.keep().unwrap();

let timeout = Duration::from_secs(5);

let allowlist = AllowList::default();

let _coverage = BinaryCoverage::default();
let loader = Arc::new(Loader::new());

let cmd = Command::new(&path);

let _recorded = CoverageRecorder::new(cmd)
.module_allowlist(allowlist.clone())
.loader(loader)
.timeout(timeout)
.record();

let _ = fs::remove_file(path);
});
7 changes: 6 additions & 1 deletion src/agent/coverage/src/allowlist.rs
@@ -142,7 +142,12 @@ fn glob_to_regex(expr: &str) -> Result<Regex> {
let expr = expr.replace(r"\*", ".*");

// Anchor to line start and end.
let expr = format!("^{expr}$");
// On Windows we should also ignore case.
let expr = if cfg!(windows) {
format!("(?i)^{expr}$")
} else {
format!("^{expr}$")
};

Ok(Regex::new(&expr)?)
}
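The change above only alters the pattern string handed to the regex engine: on Windows the `(?i)` inline flag is prepended so allowlist matching becomes case-insensitive. A stdlib-only sketch of the string-building step (the escaping here is a simplified stand-in for `regex::escape`; the real code then compiles the result with `Regex::new`):

```rust
// Mirror of the anchoring logic in glob_to_regex: escape dots, turn '*'
// into '.*', then anchor, adding "(?i)" on Windows for case-insensitivity.
fn anchor_glob(expr: &str, windows: bool) -> String {
    let expr = expr.replace('.', r"\.").replace('*', ".*");
    if windows {
        format!("(?i)^{expr}$")
    } else {
        format!("^{expr}$")
    }
}

fn main() {
    assert_eq!(anchor_glob("vccrt*", false), "^vccrt.*$");
    assert_eq!(anchor_glob("vccrt*", true), "(?i)^vccrt.*$");
    println!("ok");
}
```

Using the inline flag keeps the change local to pattern construction; no call sites need to know whether matching is case-sensitive.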
18 changes: 18 additions & 0 deletions src/agent/coverage/src/allowlist/tests.rs
@@ -175,3 +175,21 @@ fn test_allowlist_escape() -> Result<()> {

Ok(())
}

#[cfg(target_os = "windows")]
#[test]
fn test_windows_allowlists_are_not_case_sensitive() -> Result<()> {
let allowlist = AllowList::parse("vccrt")?;
assert!(allowlist.is_allowed("VCCRT"));

Ok(())
}

#[cfg(not(target_os = "windows"))]
#[test]
fn test_linux_allowlists_are_case_sensitive() -> Result<()> {
let allowlist = AllowList::parse("vccrt")?;
assert!(!allowlist.is_allowed("VCCRT"));

Ok(())
}
81 changes: 60 additions & 21 deletions src/agent/coverage/src/record.rs
@@ -1,7 +1,7 @@
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT License.

use std::process::{Command, ExitStatus, Stdio};
use std::process::{Command, Stdio};
use std::sync::Arc;
use std::time::Duration;

@@ -120,32 +120,58 @@ impl CoverageRecorder {

#[cfg(target_os = "windows")]
pub fn record(self) -> Result<Recorded> {
use anyhow::bail;
use debugger::Debugger;
use process_control::{ChildExt, Control};
use windows::WindowsRecorder;

let child = Debugger::create_child(self.cmd)?;

// Spawn a thread to wait for the target process to exit.
let target_process = std::thread::spawn(move || {
let output = child
.controlled_with_output()
.time_limit(self.timeout)
.terminate_for_timeout()
.wait();
output
});

let loader = self.loader.clone();
let mut recorder =
WindowsRecorder::new(&loader, self.module_allowlist, self.cache.as_ref());

crate::timer::timed(self.timeout, move || {
let mut recorder =
WindowsRecorder::new(&loader, self.module_allowlist, self.cache.as_ref());
let (mut dbg, child) = Debugger::init(self.cmd, &mut recorder)?;
dbg.run(&mut recorder)?;

// If the debugger callbacks fail, this may return with a spurious clean exit.
let output = child.wait_with_output()?.into();

// Check if debugging was stopped due to a callback error.
//
// If so, the debugger terminated the target, and the recorded coverage and
// output are both invalid.
if let Some(err) = recorder.stop_error {
return Err(err);
// The debugger must be initialized on the same thread that created the target process in order to receive its debug events
let mut dbg = Debugger::init_debugger(&mut recorder)?;
dbg.run(&mut recorder)?;

// If the debugger callbacks fail, this may return with a spurious clean exit.
let output = match target_process.join() {
Err(err) => {
bail!("target process wait thread panicked: {:?}", err)
}
Ok(Err(err)) => {
bail!("failed to wait for target process: {:?}", err)
}
Ok(Ok(None)) => {
bail!(crate::timer::TimerError::Timeout(self.timeout))
}
Ok(Ok(Some(output))) => output,
};

let coverage = recorder.coverage;
// Check if debugging was stopped due to a callback error.
//
// If so, the debugger terminated the target, and the recorded coverage and
// output are both invalid.
if let Some(err) = recorder.stop_error {
return Err(err);
}

Ok(Recorded { coverage, output })
})?
let coverage = recorder.coverage;
Ok(Recorded {
coverage,
output: output.into(),
})
}
}
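The waiter thread above relies on `process_control` to bound the wait on the child. The general shape of the pattern (block on a worker thread, give the caller a bounded wait) can be sketched with stdlib channels alone, no real process involved:

```rust
use std::sync::mpsc;
use std::thread;
use std::time::Duration;

// Run a blocking operation on a worker thread and bound the caller's wait
// with recv_timeout, mirroring the structure of record() above.
fn wait_with_timeout<T: Send + 'static>(
    work: impl FnOnce() -> T + Send + 'static,
    timeout: Duration,
) -> Option<T> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        // If the receiver timed out and was dropped, send fails; ignore it.
        let _ = tx.send(work());
    });
    rx.recv_timeout(timeout).ok()
}

fn main() {
    // Completes well within the limit.
    assert_eq!(wait_with_timeout(|| 42, Duration::from_secs(1)), Some(42));

    // Exceeds the limit; treated as a timeout.
    let slow = wait_with_timeout(
        || {
            thread::sleep(Duration::from_millis(200));
            0
        },
        Duration::from_millis(10),
    );
    assert_eq!(slow, None);
    println!("ok");
}
```

`process_control` adds the important extra step of terminating the child on timeout, which a plain channel timeout cannot do.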

@@ -157,19 +183,32 @@ pub struct Recorded {

#[derive(Clone, Debug, Default)]
pub struct Output {
pub status: Option<ExitStatus>,
pub status: Option<process_control::ExitStatus>,
pub stderr: String,
pub stdout: String,
}

impl From<process_control::Output> for Output {
fn from(output: process_control::Output) -> Self {
let status = Some(output.status);
let stdout = String::from_utf8_lossy(&output.stdout).into_owned();
let stderr = String::from_utf8_lossy(&output.stderr).into_owned();
Self {
status,
stdout,
stderr,
}
}
}

impl From<std::process::Output> for Output {
fn from(output: std::process::Output) -> Self {
let status = Some(output.status);
let stdout = String::from_utf8_lossy(&output.stdout).into_owned();
let stderr = String::from_utf8_lossy(&output.stderr).into_owned();

Self {
status,
status: status.map(Into::into),
stdout,
stderr,
}
11 changes: 8 additions & 3 deletions src/agent/coverage/src/record/linux/debugger.rs
@@ -4,6 +4,7 @@
use std::collections::BTreeMap;
use std::io::Read;
use std::process::{Child, Command};
use std::time::Duration;

use anyhow::{bail, format_err, Result};
use debuggable_module::path::FilePath;
@@ -75,7 +76,11 @@ impl<'eh> Debugger<'eh> {
// These calls should also be unnecessary no-ops, but we really want to avoid any dangling
// or zombie child processes.
let _ = child.kill();
let _ = child.wait();

// We don't need to call child.wait() because of the following series of events:
// 1. pete, our ptracing library, spawns the child process with ptrace flags
// 2. the Rust stdlib sets SIG_IGN as the SIGCHLD handler: https://github.com/rust-lang/rust/issues/110317
// 3. the Linux kernel automatically reaps pids when the above two hold: https://github.com/torvalds/linux/blob/44149752e9987a9eac5ad78e6d3a20934b5e018d/kernel/signal.c#L2089-L2110

let output = Output {
status,
@@ -198,8 +203,8 @@ impl DebuggerContext {
pub fn new() -> Self {
let breakpoints = Breakpoints::default();
let images = None;
let tracer = Ptracer::new();

let mut tracer = Ptracer::new();
*tracer.poll_delay_mut() = Duration::from_millis(1);
Self {
breakpoints,
images,
34 changes: 19 additions & 15 deletions src/agent/coverage/src/record/windows.rs
@@ -4,7 +4,7 @@
use std::collections::BTreeMap;
use std::path::Path;

use anyhow::{anyhow, bail, Error, Result};
use anyhow::{anyhow, Error, Result};
use debuggable_module::debuginfo::{DebugInfo, Function};
use debuggable_module::load_module::LoadModule;
use debuggable_module::loader::Loader;
@@ -132,20 +132,24 @@ impl<'cache, 'data> WindowsRecorder<'cache, 'data> {
return Ok(());
}

let breakpoint = self.breakpoints.remove(id);

let Some(breakpoint) = breakpoint else {
let stack = dbg.get_current_stack()?;
bail!("stopped on dangling breakpoint, debuggee stack:\n{}", stack);
};

let coverage = self
.coverage
.modules
.get_mut(&breakpoint.module)
.ok_or_else(|| anyhow!("coverage not initialized for module: {}", breakpoint.module))?;

coverage.increment(breakpoint.offset);
match self.breakpoints.remove(id) {
Some(breakpoint) => {
let coverage = self
.coverage
.modules
.get_mut(&breakpoint.module)
.ok_or_else(|| {
anyhow!("coverage not initialized for module: {}", breakpoint.module)
})?;

coverage.increment(breakpoint.offset);
}
// ASAN can set breakpoints that we don't know about, meaning they're not in `self.breakpoints`
None => {
let stack = dbg.get_current_stack()?;
warn!("stopped on dangling breakpoint, debuggee stack:\n{}", stack);
}
}

Ok(())
}
26 changes: 26 additions & 0 deletions src/agent/coverage/src/source.rs
@@ -2,6 +2,7 @@
// Licensed under the MIT License.

use std::collections::{BTreeMap, BTreeSet};

use std::num::NonZeroU32;

use anyhow::{Context, Result};
@@ -11,6 +12,7 @@ use debuggable_module::load_module::LoadModule;
use debuggable_module::loader::Loader;
use debuggable_module::path::FilePath;
use debuggable_module::{Module, Offset};
use symbolic::symcache::transform::{SourceLocation, Transformer};

use crate::allowlist::AllowList;
use crate::binary::BinaryCoverage;
@@ -69,6 +71,30 @@ pub fn binary_to_source_coverage(
let mut symcache = vec![];
let mut converter = SymCacheConverter::new();

if cfg!(windows) {
use symbolic::symcache::transform::Function;
struct CaseInsensitive {}
impl Transformer for CaseInsensitive {
fn transform_function<'f>(&'f mut self, f: Function<'f>) -> Function<'f> {
f
}

fn transform_source_location<'f>(
&'f mut self,
mut sl: SourceLocation<'f>,
) -> SourceLocation<'f> {
sl.file.name = sl.file.name.to_ascii_lowercase().into();
sl.file.directory = sl.file.directory.map(|d| d.to_ascii_lowercase().into());
sl.file.comp_dir = sl.file.comp_dir.map(|d| d.to_ascii_lowercase().into());
sl
}
}

let case_insensitive_transformer = CaseInsensitive {};

converter.add_transformer(case_insensitive_transformer);
}

let exe = Object::parse(module.executable_data())?;
converter.process_object(&exe)?;

1 change: 1 addition & 0 deletions src/agent/coverage/src/timer.rs
@@ -7,6 +7,7 @@ use std::time::Duration;

use thiserror::Error;

#[allow(dead_code)]
pub fn timed<F, T>(timeout: Duration, function: F) -> Result<T, TimerError>
where
T: Send + 'static,
14 changes: 12 additions & 2 deletions src/agent/coverage/tests/snapshot.rs
@@ -43,7 +43,8 @@ fn windows_snapshot_tests() {
};

// filter to just the input test file:
let source_allowlist = AllowList::parse(&input_path.to_string_lossy()).unwrap();
let source_allowlist =
AllowList::parse(&input_path.to_string_lossy().to_ascii_lowercase()).unwrap();

let exe_cmd = std::process::Command::new(&exe_name);
let recorded = coverage::CoverageRecorder::new(exe_cmd)
@@ -57,9 +58,18 @@ fn windows_snapshot_tests() {
coverage::source::binary_to_source_coverage(&recorded.coverage, &source_allowlist)
.expect("binary_to_source_coverage");

println!("{:?}", source.files.keys());

// For Windows, the source coverage is tracked using case-insensitive paths.
// The conversion from case-sensitive to insensitive is done when converting from binary to source coverage.
// By naming our test file with a capital letter, we can ensure that the case-insensitive conversion is working.
source.files.keys().for_each(|k| {
assert_eq!(k.to_string().to_ascii_lowercase(), k.to_string());
});

let file_coverage = source
.files
.get(&FilePath::new(input_path.to_string_lossy()).unwrap())
.get(&FilePath::new(input_path.to_string_lossy().to_ascii_lowercase()).unwrap())
.expect("coverage for input");

let mut result = String::new();
@@ -1,7 +1,7 @@
---
source: coverage/tests/snapshot.rs
expression: result
input_file: coverage/tests/windows/inlinee.cpp
input_file: coverage/tests/windows/Inlinee.cpp
---
[ ] #include <iostream>
[ ]
6 changes: 3 additions & 3 deletions src/agent/debuggable-module/Cargo.toml
@@ -6,8 +6,8 @@ license = "MIT"

[dependencies]
anyhow = "1.0"
elsa = "1.8.1"
gimli = "0.27.2"
elsa = "1.9.0"
gimli = "0.28.0"
goblin = "0.6"
iced-x86 = "1.20"
log = "0.4.17"
@@ -21,4 +21,4 @@ symbolic = { version = "12.3", features = [
thiserror = "1.0"

[dev-dependencies]
clap = { version = "4.3", features = ["derive"] }
clap = { version = "4.4", features = ["derive"] }
2 changes: 1 addition & 1 deletion src/agent/debugger/Cargo.toml
@@ -11,7 +11,7 @@ fnv = "1.0"
goblin = "0.6"
iced-x86 = "1.20"
log = "0.4"
memmap2 = "0.7"
memmap2 = "0.8"
rand = "0.8"
serde = { version = "1.0", features = ["derive"] }
win-util = { path = "../win-util" }
27 changes: 17 additions & 10 deletions src/agent/debugger/src/debugger.rs
@@ -134,15 +134,7 @@ pub struct Debugger {
}

impl Debugger {
pub fn init(
mut command: Command,
callbacks: &mut impl DebugEventHandler,
) -> Result<(Self, Child)> {
let child = command
.creation_flags(DEBUG_ONLY_THIS_PROCESS.0)
.spawn()
.context("debugee failed to start")?;

pub fn init_debugger(callbacks: &mut impl DebugEventHandler) -> Result<Self> {
unsafe { DebugSetProcessKillOnExit(TRUE) }
.ok()
.context("Setting DebugSetProcessKillOnExit to TRUE")?;
@@ -186,12 +178,27 @@ impl Debugger {
return Err(last_os_error());
}

Ok((debugger, child))
Ok(debugger)
} else {
anyhow::bail!("Unexpected event: {}", de)
}
}

pub fn create_child(mut command: Command) -> Result<Child> {
let child = command
.creation_flags(DEBUG_ONLY_THIS_PROCESS.0)
.spawn()
.context("debugee failed to start")?;

Ok(child)
}

pub fn init(command: Command, callbacks: &mut impl DebugEventHandler) -> Result<(Self, Child)> {
let child = Self::create_child(command)?;
let debugger = Self::init_debugger(callbacks)?;
Ok((debugger, child))
}

pub fn target(&mut self) -> &mut Target {
&mut self.target
}
1 change: 0 additions & 1 deletion src/agent/debugger/src/module.rs
@@ -46,7 +46,6 @@ impl Module {
error!("Error getting path from file handle: {}", e);
"???".into()
});

let image_details = get_image_details(&path)?;

Ok(Module {
4 changes: 2 additions & 2 deletions src/agent/dynamic-library/Cargo.toml
@@ -6,14 +6,14 @@ license = "MIT"

[dependencies]
anyhow = "1.0"
clap = { version = "4.3.0", features = ["derive"] }
clap = { version = "4.4.2", features = ["derive"] }
lazy_static = "1.4"
regex = "1.9"
thiserror = "1.0"

[target.'cfg(windows)'.dependencies]
debugger = { path = "../debugger" }
winreg = "0.50"
winreg = "0.51"

[dependencies.windows]
version = "0.48"
2 changes: 1 addition & 1 deletion src/agent/input-tester/Cargo.toml
@@ -13,7 +13,7 @@ fnv = "1.0"
hex = "0.4"
log = "0.4"
num_cpus = "1.15"
rayon = "1.7"
rayon = "1.8"
sha2 = "0.10.2"
win-util = { path = "../win-util" }

8 changes: 4 additions & 4 deletions src/agent/onefuzz-agent/Cargo.toml
@@ -22,7 +22,7 @@ reqwest = { version = "0.11", features = [
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
storage-queue = { path = "../storage-queue" }
tokio = { version = "1.29", features = ["full"] }
tokio = { version = "1.32", features = ["full"] }
url = { version = "2.4", features = ["serde"] }
uuid = { version = "1.4", features = ["serde", "v4"] }
clap = { version = "4", features = ["derive", "cargo"] }
@@ -31,13 +31,13 @@ onefuzz-telemetry = { path = "../onefuzz-telemetry" }
backtrace = "0.3"
ipc-channel = { git = "https://github.com/servo/ipc-channel", rev = "7f432aa" }
dynamic-library = { path = "../dynamic-library" }
azure_core = { version = "0.13", default-features = false, features = [
azure_core = { version = "0.15", default-features = false, features = [
"enable_reqwest",
] }
azure_storage = { version = "0.13", default-features = false, features = [
azure_storage = { version = "0.15", default-features = false, features = [
"enable_reqwest",
] }
azure_storage_blobs = { version = "0.13", default-features = false, features = [
azure_storage_blobs = { version = "0.15", default-features = false, features = [
"enable_reqwest",
] }

19 changes: 14 additions & 5 deletions src/agent/onefuzz-task/Cargo.toml
@@ -6,6 +6,14 @@ edition = "2021"
publish = false
license = "MIT"

[lib]
path = "src/lib.rs"
name = "onefuzz_task_lib"

[[bin]]
path = "src/main.rs"
name = "onefuzz-task"

[features]
integration_test = []

@@ -46,9 +54,9 @@ strum = "0.25"
strum_macros = "0.25"
stacktrace-parser = { path = "../stacktrace-parser" }
storage-queue = { path = "../storage-queue" }
tempfile = "3.7.0"
tempfile = "3.8.0"
thiserror = "1.0"
tokio = { version = "1.29", features = ["full"] }
tokio = { version = "1.32", features = ["full"] }
tokio-util = { version = "0.7", features = ["full"] }
tokio-stream = "0.1"
tui = { package = "ratatui", version = "0.22.0", default-features = false, features = [
@@ -62,13 +70,13 @@ chrono = { version = "0.4", default-features = false, features = [
] }
ipc-channel = { git = "https://github.com/servo/ipc-channel", rev = "7f432aa" }

azure_core = { version = "0.13", default-features = false, features = [
azure_core = { version = "0.15", default-features = false, features = [
"enable_reqwest",
] }
azure_storage = { version = "0.13", default-features = false, features = [
azure_storage = { version = "0.15", default-features = false, features = [
"enable_reqwest",
] }
azure_storage_blobs = { version = "0.13", default-features = false, features = [
azure_storage_blobs = { version = "0.15", default-features = false, features = [
"enable_reqwest",
] }

@@ -77,3 +85,4 @@ schemars = { version = "0.8.12", features = ["uuid1"] }

[dev-dependencies]
pretty_assertions = "1.4"
tempfile = "3.8"
78 changes: 78 additions & 0 deletions src/agent/onefuzz-task/src/check_for_update.rs
@@ -0,0 +1,78 @@
use std::process::Stdio;

use anyhow::Result;
use serde_json::Value;

pub fn run(onefuzz_built_version: &str) -> Result<()> {
// Find onefuzz cli
let common_names = ["onefuzz", "onefuzz.exe", "onefuzz.cmd"];
let mut valid_commands: Vec<_> = common_names
.into_iter()
.map(|name| {
(
name,
std::process::Command::new(name)
.stderr(Stdio::null())
.stdout(Stdio::null())
.arg("-h")
.spawn(),
)
})
.filter_map(|(name, child)| child.ok().map(|c| (name, c)))
.collect();

if valid_commands.is_empty() {
bail!(
"Could not find any of the following common names for the onefuzz-cli: {:?}",
common_names
);
}

let (name, child) = valid_commands
.first_mut()
.expect("Expected valid_commands to not be empty");

info!("Found the onefuzz cli at: {}", name);

// We just used this to check that it exists; we'll invoke it again later
let _ = child.kill();

// Run onefuzz info get
let output = std::process::Command::new(&name)
.args(["info", "get"])
.output()?;

if !output.status.success() {
bail!(
"Failed to run command `{} info get`. stderr: {:?}, stdout: {:?}",
name,
String::from_utf8(output.stderr),
String::from_utf8(output.stdout)
)
}

let stdout = String::from_utf8(output.stdout)?;
let info: Value = serde_json::from_str(&stdout)?;

if let Some(onefuzz_service_version) = info["versions"]["onefuzz"]["version"].as_str() {
if onefuzz_service_version == onefuzz_built_version {
println!("You are up to date!");
} else {
println!(
"Version mismatch. onefuzz-task version: {} | onefuzz service version: {}",
onefuzz_built_version, onefuzz_service_version
);
println!(
"To update, please run the following command: {} tools get .",
name
);
println!("Then extract the onefuzz-task binary from the appropriate OS folder");
}
return Ok(());
}

bail!(
"Failed to get onefuzz service version from cli response: {}",
stdout
)
}
9 changes: 9 additions & 0 deletions src/agent/onefuzz-task/src/lib.rs
@@ -0,0 +1,9 @@
#[macro_use]
extern crate anyhow;
#[macro_use]
extern crate clap;
#[macro_use]
extern crate onefuzz_telemetry;

pub mod local;
pub mod tasks;
10 changes: 6 additions & 4 deletions src/agent/onefuzz-task/src/local/cmd.rs
@@ -1,26 +1,26 @@
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT License.

use super::{create_template, template};
#[cfg(any(target_os = "linux", target_os = "windows"))]
use crate::local::coverage;
use crate::local::{common::add_common_config, libfuzzer_fuzz, tui::TerminalUi};
use anyhow::{Context, Result};

use clap::{Arg, ArgAction, Command};
use std::time::Duration;
use std::{path::PathBuf, str::FromStr};
use strum::IntoEnumIterator;
use strum_macros::{EnumIter, EnumString, IntoStaticStr};
use tokio::{select, time::timeout};

use super::template;

#[derive(Debug, PartialEq, Eq, EnumString, IntoStaticStr, EnumIter)]
#[strum(serialize_all = "kebab-case")]
enum Commands {
#[cfg(any(target_os = "linux", target_os = "windows"))]
Coverage,
LibfuzzerFuzz,
Template,
CreateTemplate,
}

const TIMEOUT: &str = "timeout";
@@ -43,7 +43,7 @@ pub async fn run(args: clap::ArgMatches) -> Result<()> {

let sub_args = sub_args.clone();

let terminal = if start_ui {
let terminal = if start_ui && command != Commands::CreateTemplate {
Some(TerminalUi::init()?)
} else {
env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();
@@ -62,6 +62,7 @@ pub async fn run(args: clap::ArgMatches) -> Result<()> {

template::launch(config, event_sender).await
}
Commands::CreateTemplate => create_template::run(),
}
});

@@ -116,6 +117,7 @@ pub fn args(name: &'static str) -> Command {
.args(vec![Arg::new("config")
.value_parser(value_parser!(std::path::PathBuf))
.required(true)]),
Commands::CreateTemplate => create_template::args(subcommand.into()),
};

cmd = if add_common {
15 changes: 14 additions & 1 deletion src/agent/onefuzz-task/src/local/coverage.rs
Original file line number Diff line number Diff line change
@@ -148,7 +148,20 @@ pub struct Coverage {
}

#[async_trait]
impl Template for Coverage {
impl Template<Coverage> for Coverage {
fn example_values() -> Coverage {
Coverage {
target_exe: PathBuf::from("path_to_your_exe"),
target_env: HashMap::new(),
target_options: vec![],
target_timeout: None,
module_allowlist: None,
source_allowlist: None,
input_queue: Some(PathBuf::from("path_to_your_inputs")),
readonly_inputs: vec![PathBuf::from("path_to_readonly_inputs")],
coverage: PathBuf::from("path_to_where_you_want_coverage_to_be_output"),
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let ri: Result<Vec<SyncedDir>> = self
.readonly_inputs
285 changes: 285 additions & 0 deletions src/agent/onefuzz-task/src/local/create_template.rs
@@ -0,0 +1,285 @@
use crate::local::template::CommonProperties;

use super::template::{TaskConfig, TaskConfigDiscriminants, TaskGroup};
use anyhow::Result;
use clap::Command;
use std::str::FromStr;
use std::{
io,
path::{Path, PathBuf},
};

use strum::VariantNames;

use crate::local::{
coverage::Coverage, generic_analysis::Analysis, generic_crash_report::CrashReport,
generic_generator::Generator, libfuzzer::LibFuzzer,
libfuzzer_crash_report::LibfuzzerCrashReport, libfuzzer_merge::LibfuzzerMerge,
libfuzzer_regression::LibfuzzerRegression, libfuzzer_test_input::LibfuzzerTestInput,
template::Template, test_input::TestInput,
};

use crossterm::{
event::{self, DisableMouseCapture, EnableMouseCapture, Event, KeyCode, KeyEventKind},
execute,
terminal::{disable_raw_mode, enable_raw_mode, EnterAlternateScreen, LeaveAlternateScreen},
};
use tui::{prelude::*, widgets::*};

pub fn args(name: &'static str) -> Command {
Command::new(name).about("interactively create a template")
}

pub fn run() -> Result<()> {
// setup terminal
enable_raw_mode()?;
let mut stdout = io::stdout();
execute!(stdout, EnterAlternateScreen, EnableMouseCapture)?;
let backend = CrosstermBackend::new(stdout);
let mut terminal = Terminal::new(backend)?;

// create app and run it
let app = App::new();
let res = run_app(&mut terminal, app);

// restore terminal
disable_raw_mode()?;
execute!(
terminal.backend_mut(),
LeaveAlternateScreen,
DisableMouseCapture
)?;
terminal.show_cursor()?;

match res {
Ok(None) => { /* user quit, do nothing */ }
Ok(Some(path)) => match path.canonicalize() {
Ok(canonical_path) => println!("Wrote the template to: {:?}", canonical_path),
_ => println!("Wrote the template to: {:?}", path),
},
Err(e) => println!("Failed to write template due to {}", e),
}

Ok(())
}

fn run_app<B: Backend>(terminal: &mut Terminal<B>, mut app: App) -> Result<Option<PathBuf>> {
loop {
terminal.draw(|f| ui(f, &mut app))?;
if let Event::Key(key) = event::read()? {
if key.kind == KeyEventKind::Press {
match key.code {
KeyCode::Char('q') => return Ok(None),
KeyCode::Char(' ') => app.items.toggle(),
KeyCode::Down => app.items.next(),
KeyCode::Up => app.items.previous(),
KeyCode::Enter => {
return match generate_template(app.items.items) {
Ok(p) => Ok(Some(p)),
Err(e) => Err(e),
}
}
_ => {}
}
}
}
}
}

fn generate_template(items: Vec<ListElement>) -> Result<PathBuf> {
let tasks: Vec<TaskConfig> = items
.iter()
.filter(|item| item.is_included)
.filter_map(|list_element| {
match TaskConfigDiscriminants::from_str(list_element.task_type) {
Err(e) => {
error!(
"Failed to match task config {:?} - {}",
list_element.task_type, e
);
None
}
Ok(t) => match t {
TaskConfigDiscriminants::LibFuzzer => {
Some(TaskConfig::LibFuzzer(LibFuzzer::example_values()))
}
TaskConfigDiscriminants::Analysis => {
Some(TaskConfig::Analysis(Analysis::example_values()))
}
TaskConfigDiscriminants::Coverage => {
Some(TaskConfig::Coverage(Coverage::example_values()))
}
TaskConfigDiscriminants::CrashReport => {
Some(TaskConfig::CrashReport(CrashReport::example_values()))
}
TaskConfigDiscriminants::Generator => {
Some(TaskConfig::Generator(Generator::example_values()))
}
TaskConfigDiscriminants::LibfuzzerCrashReport => Some(
TaskConfig::LibfuzzerCrashReport(LibfuzzerCrashReport::example_values()),
),
TaskConfigDiscriminants::LibfuzzerMerge => {
Some(TaskConfig::LibfuzzerMerge(LibfuzzerMerge::example_values()))
}
TaskConfigDiscriminants::LibfuzzerRegression => Some(
TaskConfig::LibfuzzerRegression(LibfuzzerRegression::example_values()),
),
TaskConfigDiscriminants::LibfuzzerTestInput => Some(
TaskConfig::LibfuzzerTestInput(LibfuzzerTestInput::example_values()),
),
TaskConfigDiscriminants::TestInput => {
Some(TaskConfig::TestInput(TestInput::example_values()))
}
TaskConfigDiscriminants::Radamsa => Some(TaskConfig::Radamsa),
},
}
})
.collect();

let definition = TaskGroup {
common: CommonProperties {
setup_dir: None,
extra_setup_dir: None,
extra_dir: None,
create_job_dir: false,
},
tasks,
};

let filename = "template";
let mut filepath = format!("./{}.yaml", filename);
let mut output_file = Path::new(&filepath);
let mut counter = 0;
while output_file.exists() {
filepath = format!("./{}-{}.yaml", filename, counter);
output_file = Path::new(&filepath);
counter += 1;
}

std::fs::write(output_file, serde_yaml::to_string(&definition)?)?;

Ok(output_file.into())
}

fn ui<B: Backend>(f: &mut Frame<B>, app: &mut App) {
let areas = Layout::default()
.direction(Direction::Vertical)
.constraints([Constraint::Percentage(100)])
.split(f.size());
// Build a ListItem for each task type, marking included tasks with a check mark.
let items: Vec<ListItem> = app
.items
.items
.iter()
.map(|list_element| {
let title = if list_element.is_included {
format!("✅ {}", list_element.task_type)
} else {
list_element.task_type.to_string()
};
ListItem::new(title).style(Style::default().fg(Color::Black).bg(Color::White))
})
.collect();

// Create a List from all list items and highlight the currently selected one
let items = List::new(items)
.block(
Block::default()
.borders(Borders::ALL)
.title("Select which tasks you want to include in the template. Use ⬆/⬇ to navigate and <space> to select. Press <enter> when you're done."),
)
.highlight_style(
Style::default()
.bg(Color::LightGreen)
.add_modifier(Modifier::BOLD),
)
.highlight_symbol(">> ");

// We can now render the item list
f.render_stateful_widget(items, areas[0], &mut app.items.state);
}

struct ListElement<'a> {
pub task_type: &'a str,
pub is_included: bool,
}

pub trait Toggle {
fn toggle(&mut self) {}
}

impl<'a> Toggle for ListElement<'a> {
fn toggle(&mut self) {
self.is_included = !self.is_included
}
}

struct App<'a> {
items: StatefulList<ListElement<'a>>,
}

impl<'a> App<'a> {
fn new() -> App<'a> {
App {
items: StatefulList::with_items(
TaskConfig::VARIANTS
.iter()
.map(|name| ListElement {
task_type: name,
is_included: false,
})
.collect(),
),
}
}
}

struct StatefulList<ListElement> {
state: ListState,
items: Vec<ListElement>,
}

impl<T: Toggle> StatefulList<T> {
fn with_items(items: Vec<T>) -> StatefulList<T> {
StatefulList {
state: ListState::default(),
items,
}
}

fn next(&mut self) {
let i = match self.state.selected() {
Some(i) if !self.items.is_empty() => (i + 1) % self.items.len(),
_ => 0,
};
self.state.select(Some(i));
}

fn previous(&mut self) {
let i = match self.state.selected() {
Some(0) => self.items.len().saturating_sub(1),
Some(i) => i - 1,
None => 0,
};
self.state.select(Some(i));
}

fn toggle(&mut self) {
if let Some(index) = self.state.selected() {
if let Some(element) = self.items.get_mut(index) {
element.toggle()
}
}
}
}
18 changes: 17 additions & 1 deletion src/agent/onefuzz-task/src/local/generic_analysis.rs
@@ -27,7 +27,23 @@ pub struct Analysis {
}

#[async_trait]
impl Template for Analysis {
impl Template<Analysis> for Analysis {
fn example_values() -> Analysis {
Analysis {
analyzer_exe: String::new(),
analyzer_options: vec![],
analyzer_env: HashMap::new(),
target_exe: PathBuf::from("path_to_your_exe"),
target_options: vec![],
input_queue: Some(PathBuf::from("path_to_your_inputs")),
crashes: Some(PathBuf::from("path_where_crashes_written")),
analysis: PathBuf::new(),
tools: None,
reports: Some(PathBuf::from("path_where_reports_written")),
unique_reports: Some(PathBuf::from("path_where_reports_written")),
no_repro: Some(PathBuf::from("path_where_no_repro_reports_written")),
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let input_q = if let Some(w) = &self.input_queue {
Some(context.monitor_dir(w).await?)
20 changes: 19 additions & 1 deletion src/agent/onefuzz-task/src/local/generic_crash_report.rs
@@ -39,7 +39,25 @@ pub struct CrashReport {
minimized_stack_depth: Option<usize>,
}
#[async_trait]
impl Template for CrashReport {
impl Template<CrashReport> for CrashReport {
fn example_values() -> CrashReport {
CrashReport {
target_exe: PathBuf::from("path_to_your_exe"),
target_options: vec![],
target_env: HashMap::new(),
input_queue: Some(PathBuf::from("path_to_your_inputs")),
crashes: Some(PathBuf::from("path_where_crashes_written")),
reports: Some(PathBuf::from("path_where_reports_written")),
unique_reports: Some(PathBuf::from("path_where_reports_written")),
no_repro: Some(PathBuf::from("path_where_no_repro_reports_written")),
target_timeout: None,
check_asan_log: true,
check_debugger: true,
check_retry_count: 5,
check_queue: false,
minimized_stack_depth: None,
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let input_q_fut: OptionFuture<_> = self
.input_queue
21 changes: 20 additions & 1 deletion src/agent/onefuzz-task/src/local/generic_generator.rs
@@ -35,7 +35,26 @@ pub struct Generator {
}

#[async_trait]
impl Template for Generator {
impl Template<Generator> for Generator {
fn example_values() -> Generator {
Generator {
generator_exe: String::new(),
generator_env: HashMap::new(),
generator_options: vec![],
readonly_inputs: vec![PathBuf::from("path_to_readonly_inputs")],
crashes: PathBuf::new(),
tools: None,
target_exe: PathBuf::from("path_to_your_exe"),
target_env: HashMap::new(),
target_options: vec![],
target_timeout: None,
check_asan_log: true,
check_debugger: true,
check_retry_count: 5,
rename_output: false,
ensemble_sync_delay: None,
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let generator_config = crate::tasks::fuzz::generator::Config {
generator_exe: self.generator_exe.clone(),
17 changes: 16 additions & 1 deletion src/agent/onefuzz-task/src/local/libfuzzer.rs
@@ -32,7 +32,22 @@ pub struct LibFuzzer {
}

#[async_trait]
impl Template for LibFuzzer {
impl Template<LibFuzzer> for LibFuzzer {
fn example_values() -> LibFuzzer {
LibFuzzer {
inputs: PathBuf::new(),
readonly_inputs: vec![PathBuf::from("path_to_readonly_inputs")],
crashes: PathBuf::new(),
crashdumps: None,
target_exe: PathBuf::from("path_to_your_exe"),
target_env: HashMap::new(),
target_options: vec![],
target_workers: None,
ensemble_sync_delay: None,
check_fuzzer_help: true,
expect_crash_on_failure: true,
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let ri: Result<Vec<SyncedDir>> = self
.readonly_inputs
19 changes: 18 additions & 1 deletion src/agent/onefuzz-task/src/local/libfuzzer_crash_report.rs
@@ -36,7 +36,24 @@ pub struct LibfuzzerCrashReport {
}

#[async_trait]
impl Template for LibfuzzerCrashReport {
impl Template<LibfuzzerCrashReport> for LibfuzzerCrashReport {
fn example_values() -> LibfuzzerCrashReport {
LibfuzzerCrashReport {
target_exe: PathBuf::from("path_to_your_exe"),
target_env: HashMap::new(),
target_options: vec![],
target_timeout: None,
input_queue: Some(PathBuf::from("path_to_your_inputs")),
crashes: Some(PathBuf::from("path_where_crashes_written")),
reports: Some(PathBuf::from("path_where_reports_written")),
unique_reports: Some(PathBuf::from("path_where_reports_written")),
no_repro: Some(PathBuf::from("path_where_no_repro_reports_written")),
check_fuzzer_help: true,
check_retry_count: 5,
minimized_stack_depth: None,
check_queue: true,
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let input_q_fut: OptionFuture<_> = self
.input_queue
14 changes: 13 additions & 1 deletion src/agent/onefuzz-task/src/local/libfuzzer_merge.rs
@@ -27,7 +27,19 @@ pub struct LibfuzzerMerge {
}

#[async_trait]
impl Template for LibfuzzerMerge {
impl Template<LibfuzzerMerge> for LibfuzzerMerge {
fn example_values() -> LibfuzzerMerge {
LibfuzzerMerge {
target_exe: PathBuf::from("path_to_your_exe"),
target_env: HashMap::new(),
target_options: vec![],
input_queue: Some(PathBuf::from("path_to_your_inputs")),
inputs: vec![],
unique_inputs: PathBuf::new(),
preserve_existing_outputs: true,
check_fuzzer_help: true,
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let input_q_fut: OptionFuture<_> = self
.input_queue
20 changes: 19 additions & 1 deletion src/agent/onefuzz-task/src/local/libfuzzer_regression.rs
@@ -40,7 +40,25 @@ pub struct LibfuzzerRegression {
}

#[async_trait]
impl Template for LibfuzzerRegression {
impl Template<LibfuzzerRegression> for LibfuzzerRegression {
fn example_values() -> LibfuzzerRegression {
LibfuzzerRegression {
target_exe: PathBuf::from("path_to_your_exe"),
target_options: vec![],
target_env: HashMap::new(),
target_timeout: None,
crashes: PathBuf::new(),
regression_reports: PathBuf::new(),
report_list: None,
unique_reports: Some(PathBuf::from("path_where_reports_written")),
reports: Some(PathBuf::from("path_where_reports_written")),
no_repro: Some(PathBuf::from("path_where_no_repro_reports_written")),
readonly_inputs: None,
check_fuzzer_help: true,
check_retry_count: 5,
minimized_stack_depth: None,
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let libfuzzer_regression = crate::tasks::regression::libfuzzer::Config {
target_exe: self.target_exe.clone(),
16 changes: 15 additions & 1 deletion src/agent/onefuzz-task/src/local/libfuzzer_test_input.rs
@@ -24,7 +24,21 @@ pub struct LibfuzzerTestInput {
}

#[async_trait]
impl Template for LibfuzzerTestInput {
impl Template<LibfuzzerTestInput> for LibfuzzerTestInput {
fn example_values() -> LibfuzzerTestInput {
LibfuzzerTestInput {
input: PathBuf::new(),
target_exe: PathBuf::from("path_to_your_exe"),
target_options: vec![],
target_env: HashMap::new(),
setup_dir: PathBuf::new(),
extra_setup_dir: None,
extra_output_dir: None,
target_timeout: None,
check_retry_count: 5,
minimized_stack_depth: None,
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let c = self.clone();
let t = tokio::spawn(async move {
1 change: 1 addition & 0 deletions src/agent/onefuzz-task/src/local/mod.rs
@@ -5,6 +5,7 @@ pub mod cmd;
pub mod common;
#[cfg(any(target_os = "linux", target_os = "windows"))]
pub mod coverage;
pub mod create_template;
pub mod generic_analysis;
pub mod generic_crash_report;
pub mod generic_generator;
23 changes: 13 additions & 10 deletions src/agent/onefuzz-task/src/local/template.rs
@@ -5,6 +5,7 @@ use path_absolutize::Absolutize;
use serde::Deserialize;
use std::path::{Path, PathBuf};
use storage_queue::QueueClient;
use strum_macros::{EnumDiscriminants, EnumString, EnumVariantNames};
use tokio::{sync::Mutex, task::JoinHandle};
use url::Url;
use uuid::Uuid;
@@ -27,24 +28,25 @@ use schemars::JsonSchema;
#[derive(Debug, Serialize, Deserialize, Clone, JsonSchema)]
pub struct TaskGroup {
#[serde(flatten)]
common: CommonProperties,
pub common: CommonProperties,
/// The list of tasks
tasks: Vec<TaskConfig>,
pub tasks: Vec<TaskConfig>,
}

#[derive(Debug, Deserialize, Serialize, Clone, JsonSchema)]

struct CommonProperties {
pub struct CommonProperties {
pub setup_dir: Option<PathBuf>,
pub extra_setup_dir: Option<PathBuf>,
pub extra_dir: Option<PathBuf>,
#[serde(default)]
pub create_job_dir: bool,
}

#[derive(Debug, Serialize, Deserialize, Clone, JsonSchema)]
#[derive(Debug, Serialize, Deserialize, Clone, JsonSchema, EnumVariantNames, EnumDiscriminants)]
#[strum_discriminants(derive(EnumString))]
#[serde(tag = "type")]
enum TaskConfig {
pub enum TaskConfig {
LibFuzzer(LibFuzzer),
Analysis(Analysis),
Coverage(Coverage),
@@ -61,7 +63,8 @@ enum TaskConfig {
}

#[async_trait]
pub trait Template {
pub trait Template<T> {
fn example_values() -> T;
async fn run(&self, context: &RunContext) -> Result<()>;
}

@@ -136,16 +139,16 @@ impl RunContext {
name: impl AsRef<str>,
path: impl AsRef<Path>,
) -> Result<SyncedDir> {
if !path.as_ref().exists() {
std::fs::create_dir_all(&path)?;
}

self.to_sync_dir(name, path)?
.monitor_count(&self.event_sender)
}

pub fn to_sync_dir(&self, name: impl AsRef<str>, path: impl AsRef<Path>) -> Result<SyncedDir> {
let path = path.as_ref();
if !path.exists() {
std::fs::create_dir_all(path)?;
}

let name = name.as_ref();
let current_dir = std::env::current_dir()?;
if self.create_job_dir {
19 changes: 18 additions & 1 deletion src/agent/onefuzz-task/src/local/test_input.rs
@@ -28,7 +28,24 @@ pub struct TestInput {
}

#[async_trait]
impl Template for TestInput {
impl Template<TestInput> for TestInput {
fn example_values() -> TestInput {
TestInput {
input: PathBuf::new(),
target_exe: PathBuf::from("path_to_your_exe"),
target_options: vec![],
target_env: HashMap::new(),
setup_dir: PathBuf::new(),
extra_setup_dir: None,
task_id: Uuid::new_v4(),
job_id: Uuid::new_v4(),
target_timeout: None,
check_retry_count: 5,
check_asan_log: true,
check_debugger: true,
minimized_stack_depth: None,
}
}
async fn run(&self, context: &RunContext) -> Result<()> {
let c = self.clone();
let t = tokio::spawn(async move {
14 changes: 12 additions & 2 deletions src/agent/onefuzz-task/src/main.rs
@@ -11,29 +11,38 @@ extern crate onefuzz;

use anyhow::Result;
use clap::{ArgMatches, Command};

use std::io::{stdout, Write};

mod check_for_update;
mod local;
mod managed;
mod tasks;

const LICENSE_CMD: &str = "licenses";
const LOCAL_CMD: &str = "local";
const MANAGED_CMD: &str = "managed";
const CHECK_FOR_UPDATE: &str = "check_for_update";

const ONEFUZZ_BUILT_VERSION: &str = env!("ONEFUZZ_VERSION");

fn main() -> Result<()> {
let built_version = format!(
"{} onefuzz:{} git:{}",
crate_version!(),
env!("ONEFUZZ_VERSION"),
ONEFUZZ_BUILT_VERSION,
env!("GIT_VERSION")
);

let app = Command::new("onefuzz-task")
.version(built_version)
.subcommand(managed::cmd::args(MANAGED_CMD))
.subcommand(local::cmd::args(LOCAL_CMD))
.subcommand(Command::new(LICENSE_CMD).about("display third-party licenses"));
.subcommand(Command::new(LICENSE_CMD).about("display third-party licenses"))
.subcommand(
Command::new(CHECK_FOR_UPDATE)
.about("compares the version of onefuzz-task with the onefuzz service"),
);

let matches = app.get_matches();

@@ -55,6 +64,7 @@ async fn run(args: ArgMatches) -> Result<()> {
Some((LICENSE_CMD, _)) => licenses(),
Some((LOCAL_CMD, sub)) => local::cmd::run(sub.to_owned()).await,
Some((MANAGED_CMD, sub)) => managed::cmd::run(sub).await,
Some((CHECK_FOR_UPDATE, _)) => check_for_update::run(ONEFUZZ_BUILT_VERSION),
_ => anyhow::bail!("No command provided. Run with 'help' to see available commands."),
}
}
9 changes: 5 additions & 4 deletions src/agent/onefuzz-task/src/tasks/coverage/generic.rs
@@ -141,6 +141,9 @@ impl CoverageTask {

context.heartbeat.alive();

info!("report initial coverage");
context.report_coverage_stats().await;

for dir in &self.config.readonly_inputs {
debug!("recording coverage for {}", dir.local_path.display());

@@ -161,7 +164,6 @@ impl CoverageTask {
}

if seen_inputs {
context.report_coverage_stats().await?;
context.save_and_sync_coverage().await?;
}

@@ -454,7 +456,7 @@ impl<'a> TaskContext<'a> {
Ok(count)
}

pub async fn report_coverage_stats(&self) -> Result<()> {
pub async fn report_coverage_stats(&self) {
use EventData::*;

let coverage = RwLock::read(&self.coverage).await;
@@ -471,7 +473,6 @@ impl<'a> TaskContext<'a> {
]),
)
.await;
Ok(())
}

pub async fn save_coverage(
@@ -565,7 +566,7 @@ impl<'a> Processor for TaskContext<'a> {
self.heartbeat.alive();

self.record_input(input).await?;
self.report_coverage_stats().await?;
self.report_coverage_stats().await;
self.save_and_sync_coverage().await?;

Ok(())
@@ -1,5 +1,6 @@
# Required to avoid recording errors.
! *\llvm-project\compiler-rt\*
! *\llvm\compiler-rt\*
! *\vctools\crt\*
! *\Windows Kits\10\Include\*\ucrt\*
! *\ExternalAPIs\Windows\10\sdk\*
2 changes: 1 addition & 1 deletion src/agent/onefuzz-task/src/tasks/fuzz/libfuzzer/common.rs
@@ -272,7 +272,7 @@ where
info!("config is: {:?}", self.config);

let fuzzer = L::from_config(&self.config).await?;
let mut running = fuzzer.fuzz(crash_dir.path(), local_inputs, &inputs).await?;
let mut running = fuzzer.fuzz(crash_dir.path(), local_inputs, &inputs)?;

info!("child is: {:?}", running);

208 changes: 208 additions & 0 deletions src/agent/onefuzz-task/tests/template_integration.rs
@@ -0,0 +1,208 @@
use std::{
collections::HashSet,
ffi::OsStr,
path::{Path, PathBuf},
};

use path_absolutize::Absolutize;
use tokio::fs;

use anyhow::Result;
use log::info;
use onefuzz_task_lib::local::template;
use std::time::Duration;
use tokio::time::timeout;

macro_rules! libfuzzer_tests {
($($name:ident: $value:expr,)*) => {
$(
#[tokio::test(flavor = "multi_thread")]
#[cfg_attr(not(feature = "integration_test"), ignore)]
async fn $name() {
let _ = env_logger::builder().is_test(true).try_init();
let (config, libfuzzer_target) = $value;
test_libfuzzer_basic_template(PathBuf::from(config), PathBuf::from(libfuzzer_target)).await;
}
)*
}
}

// This is the format for adding other templates/targets for this macro
// $TEST_NAME: ($RELATIVE_PATH_TO_TEMPLATE, $RELATIVE_PATH_TO_TARGET),
// Make sure the target binary is available in CI
libfuzzer_tests! {
libfuzzer_basic: ("./tests/templates/libfuzzer_basic.yml", "./tests/targets/simple/fuzz.exe"),
}

async fn test_libfuzzer_basic_template(config: PathBuf, libfuzzer_target: PathBuf) {
assert_exists_and_is_file(&config).await;
assert_exists_and_is_file(&libfuzzer_target).await;

let test_layout = create_test_directory(&config, &libfuzzer_target)
.await
.expect("Failed to create test directory layout");

info!("Executed test from: {:?}", &test_layout.root);
info!("Running template for 1 minute...");
if let Ok(template_result) = timeout(
Duration::from_secs(60),
template::launch(&test_layout.config, None),
)
.await
{
// The template exited before the timeout; print the config that was used
// to help with debugging, then propagate any error it returned.
info!("Printing config as it was used in the test:");
info!("{:?}", fs::read_to_string(&test_layout.config).await);
template_result.unwrap();
}

verify_test_layout_structure_did_not_change(&test_layout).await;
assert_directory_is_not_empty(&test_layout.inputs).await;
assert_directory_is_not_empty(&test_layout.crashes).await;
verify_coverage_dir(&test_layout.coverage).await;

let _ = fs::remove_dir_all(&test_layout.root).await;
}

async fn verify_test_layout_structure_did_not_change(test_layout: &TestLayout) {
assert_exists_and_is_dir(&test_layout.root).await;
assert_exists_and_is_file(&test_layout.config).await;
assert_exists_and_is_file(&test_layout.target_exe).await;
assert_exists_and_is_dir(&test_layout.crashdumps).await;
assert_exists_and_is_dir(&test_layout.coverage).await;
assert_exists_and_is_dir(&test_layout.crashes).await;
assert_exists_and_is_dir(&test_layout.inputs).await;
assert_exists_and_is_dir(&test_layout.regression_reports).await;
}

async fn verify_coverage_dir(coverage: &Path) {
warn_if_empty(coverage).await;
}

async fn assert_exists_and_is_dir(dir: &Path) {
assert!(dir.exists(), "Expected directory to exist. dir = {:?}", dir);
assert!(
dir.is_dir(),
"Expected path to be a directory. dir = {:?}",
dir
);
}

async fn warn_if_empty(dir: &Path) {
if dir_is_empty(dir).await {
println!("Expected directory to not be empty: {:?}", dir);
}
}

async fn assert_exists_and_is_file(file: &Path) {
assert!(file.exists(), "Expected file to exist. file = {:?}", file);
assert!(
file.is_file(),
"Expected path to be a file. file = {:?}",
file
);
}

async fn dir_is_empty(dir: &Path) -> bool {
fs::read_dir(dir)
.await
.unwrap_or_else(|_| panic!("Failed to list files in directory. dir = {:?}", dir))
.next_entry()
.await
.unwrap_or_else(|_| {
panic!(
"Failed to get next file in directory listing. dir = {:?}",
dir
)
})
.is_none()
}

async fn assert_directory_is_not_empty(dir: &Path) {
assert!(
!dir_is_empty(dir).await,
"Expected directory to not be empty. dir = {:?}",
dir
);
}

async fn create_test_directory(config: &Path, target_exe: &Path) -> Result<TestLayout> {
let mut test_directory = PathBuf::from(".").join(uuid::Uuid::new_v4().to_string());
fs::create_dir_all(&test_directory).await?;
test_directory = test_directory.canonicalize()?;

let mut inputs_directory = PathBuf::from(&test_directory).join("inputs");
inputs_directory = inputs_directory.absolutize()?.into();

let mut crashes_directory = PathBuf::from(&test_directory).join("crashes");
crashes_directory = crashes_directory.absolutize()?.into();

let mut crashdumps_directory = PathBuf::from(&test_directory).join("crashdumps");
crashdumps_directory = crashdumps_directory.absolutize()?.into();

let mut coverage_directory = PathBuf::from(&test_directory).join("coverage");
coverage_directory = coverage_directory.absolutize()?.into();

let mut regression_reports_directory =
PathBuf::from(&test_directory).join("regression_reports");
regression_reports_directory = regression_reports_directory.absolutize()?.into();

let mut target_in_test = PathBuf::from(&test_directory).join("fuzz.exe");
fs::copy(target_exe, &target_in_test).await?;
target_in_test = target_in_test.canonicalize()?;

let mut interesting_extensions = HashSet::new();
interesting_extensions.insert(Some(OsStr::new("so")));
interesting_extensions.insert(Some(OsStr::new("pdb")));
let mut f = fs::read_dir(target_exe.parent().unwrap()).await?;
while let Ok(Some(f)) = f.next_entry().await {
if interesting_extensions.contains(&f.path().extension()) {
fs::copy(f.path(), PathBuf::from(&test_directory).join(f.file_name())).await?;
}
}

let mut config_data = fs::read_to_string(config).await?;

config_data = config_data
.replace("{TARGET_PATH}", target_in_test.to_str().unwrap())
.replace("{INPUTS_PATH}", inputs_directory.to_str().unwrap())
.replace("{CRASHES_PATH}", crashes_directory.to_str().unwrap())
.replace("{CRASHDUMPS_PATH}", crashdumps_directory.to_str().unwrap())
.replace("{COVERAGE_PATH}", coverage_directory.to_str().unwrap())
.replace(
"{REGRESSION_REPORTS_PATH}",
regression_reports_directory.to_str().unwrap(),
)
.replace("{TEST_DIRECTORY}", test_directory.to_str().unwrap());

let mut config_in_test =
PathBuf::from(&test_directory).join(config.file_name().unwrap_or_else(|| {
panic!("Failed to get file name for config. config = {:?}", config)
}));

fs::write(&config_in_test, &config_data).await?;
config_in_test = config_in_test.canonicalize()?;

Ok(TestLayout {
root: test_directory,
config: config_in_test,
target_exe: target_in_test,
inputs: inputs_directory,
crashes: crashes_directory,
crashdumps: crashdumps_directory,
coverage: coverage_directory,
regression_reports: regression_reports_directory,
})
}

#[derive(Debug)]
struct TestLayout {
root: PathBuf,
config: PathBuf,
target_exe: PathBuf,
inputs: PathBuf,
crashes: PathBuf,
crashdumps: PathBuf,
coverage: PathBuf,
regression_reports: PathBuf,
}
33 changes: 33 additions & 0 deletions src/agent/onefuzz-task/tests/templates/libfuzzer_basic.yml
@@ -0,0 +1,33 @@
# yaml-language-server: $schema=../../src/local/schema.json

required_args: &required_args
target_exe: '{TARGET_PATH}'
inputs: &inputs '{INPUTS_PATH}' # A folder containing your inputs
crashes: &crashes '{CRASHES_PATH}' # The folder where you want the crashing inputs to be output
crashdumps: '{CRASHDUMPS_PATH}' # The folder where you want the crash dumps to be output
coverage: '{COVERAGE_PATH}' # The folder where you want the code coverage to be output
regression_reports: '{REGRESSION_REPORTS_PATH}' # The folder where you want the regression reports to be output
target_env: {
'LD_LIBRARY_PATH': '{TEST_DIRECTORY}',
}
target_options: []
check_fuzzer_help: false

tasks:
- type: LibFuzzer
<<: *required_args
readonly_inputs: []

- type: LibfuzzerRegression
<<: *required_args

- type: "LibfuzzerCrashReport"
<<: *required_args
input_queue: *crashes

- type: "Coverage"
<<: *required_args
target_options:
- "{input}"
input_queue: *inputs
readonly_inputs: [*inputs]
2 changes: 1 addition & 1 deletion src/agent/onefuzz-telemetry/Cargo.toml
@@ -15,5 +15,5 @@ chrono = { version = "0.4", default-features = false, features = [
lazy_static = "1.4"
log = "0.4"
serde = { version = "1.0", features = ["derive"] }
tokio = { version = "1.29", features = ["full"] }
tokio = { version = "1.32", features = ["full"] }
uuid = { version = "1.4", features = ["serde", "v4"] }
14 changes: 7 additions & 7 deletions src/agent/onefuzz/Cargo.toml
@@ -10,15 +10,15 @@ license = "MIT"
anyhow = "1.0"
async-trait = "0.1"
base64 = "0.21"
bytes = "1.4"
bytes = "1.5"
dunce = "1.0"
dynamic-library = { path = "../dynamic-library" }
futures = "0.3"
futures-util = "0.3"
hex = "0.4"
lazy_static = "1.4"
log = "0.4"
notify = { version = "6.0.1", default-features = false }
notify = { version = "6.1.1", default-features = false }
regex = "1.9.1"
reqwest = { version = "0.11", features = [
"json",
@@ -31,7 +31,7 @@ serde = "1.0"
serde_json = "1.0"
rand = "0.8"
serde_derive = "1.0"
tokio = { version = "1.29", features = ["full"] }
tokio = { version = "1.32", features = ["full"] }
tokio-stream = { version = "0.1", features = ["fs", "time", "tokio-util"] }
tokio-util = { version = "0.7", features = ["full"] }
uuid = { version = "1.4", features = ["serde", "v4"] }
@@ -40,7 +40,7 @@ url-escape = "0.1.0"
storage-queue = { path = "../storage-queue" }
strum = "0.25"
strum_macros = "0.25"
tempfile = "3.7.0"
tempfile = "3.8.0"
process_control = "4.0"
reqwest-retry = { path = "../reqwest-retry" }
onefuzz-telemetry = { path = "../onefuzz-telemetry" }
@@ -49,7 +49,7 @@ stacktrace-parser = { path = "../stacktrace-parser" }
backoff = { version = "0.4", features = ["tokio"] }

[target.'cfg(target_family = "windows")'.dependencies]
winreg = "0.50"
winreg = "0.51"
input-tester = { path = "../input-tester" }
debugger = { path = "../debugger" }
windows = { version = "0.48", features = [
@@ -62,10 +62,10 @@ cpp_demangle = "0.4"
nix = "0.26"

[target.'cfg(target_os = "linux")'.dependencies]
pete = "0.10"
pete = "0.12"
rstack = "0.3"
proc-maps = { version = "0.3", default-features = false }

[dev-dependencies]
clap = { version = "4.3.0", features = ["derive"] }
clap = { version = "4.4.2", features = ["derive"] }
pretty_assertions = "1.4.0"
9 changes: 6 additions & 3 deletions src/agent/onefuzz/src/expand.rs
@@ -128,7 +128,8 @@ impl<'a> Expand<'a> {

fn input_file_sha256(&self) -> Result<ExpandedValue<'a>> {
let Some(val) = self.values.get(PlaceHolder::Input.get_string()) else {
bail!("no value found for {}, unable to evaluate {}",
bail!(
"no value found for {}, unable to evaluate {}",
PlaceHolder::Input.get_string(),
PlaceHolder::InputFileSha256.get_string(),
)
@@ -149,7 +150,8 @@ impl<'a> Expand<'a> {

fn extract_file_name_no_ext(&self) -> Result<ExpandedValue<'a>> {
let Some(val) = self.values.get(PlaceHolder::Input.get_string()) else {
bail!("no value found for {}, unable to evaluate {}",
bail!(
"no value found for {}, unable to evaluate {}",
PlaceHolder::Input.get_string(),
PlaceHolder::InputFileNameNoExt.get_string(),
)
@@ -173,7 +175,8 @@ impl<'a> Expand<'a> {

fn extract_file_name(&self) -> Result<ExpandedValue<'a>> {
let Some(val) = self.values.get(PlaceHolder::Input.get_string()) else {
bail!("no value found for {}, unable to evaluate {}",
bail!(
"no value found for {}, unable to evaluate {}",
PlaceHolder::Input.get_string(),
PlaceHolder::InputFileName.get_string(),
)
22 changes: 18 additions & 4 deletions src/agent/onefuzz/src/libfuzzer.rs
@@ -339,7 +339,7 @@ impl LibFuzzer {
Ok(missing)
}

pub async fn fuzz(
pub fn fuzz(
&self,
fault_dir: impl AsRef<Path>,
corpus_dir: impl AsRef<Path>,
@@ -352,8 +352,7 @@ impl LibFuzzer {
// specify that a new file `crash-<digest>` should be written to a
// _directory_ `<corpus_dir>`, we must ensure that the prefix includes a
// trailing path separator.
let artifact_prefix: OsString =
format!("-artifact_prefix={}/", fault_dir.as_ref().display()).into();
let artifact_prefix = artifact_prefix(fault_dir.as_ref());

let mut cmd = self.build_command(
Some(fault_dir.as_ref()),
@@ -363,10 +362,11 @@ impl LibFuzzer {
None,
)?;

debug!("Running command: {:?}", &cmd);

let child = cmd
.spawn()
.with_context(|| format_err!("libfuzzer failed to start: {}", self.exe.display()))?;

Ok(child)
}

@@ -441,6 +441,20 @@ impl LibFuzzer {
}
}

#[cfg(target_os = "windows")]
fn artifact_prefix(fault_dir: &Path) -> OsString {
if fault_dir.is_absolute() {
format!("-artifact_prefix={}\\", fault_dir.display()).into()
} else {
format!("-artifact_prefix={}/", fault_dir.display()).into()
}
}

#[cfg(not(target_os = "windows"))]
fn artifact_prefix(fault_dir: &Path) -> OsString {
format!("-artifact_prefix={}/", fault_dir.display()).into()
}

pub struct LibFuzzerLine {
_line: String,
iters: u64,
4 changes: 2 additions & 2 deletions src/agent/onefuzz/src/syncdir.rs
@@ -283,7 +283,7 @@ impl SyncedDir {
Event::new_coverage => {
jr_client
.send_direct(
JobResultData::CoverageData,
JobResultData::NewCoverage,
HashMap::from([("count".to_string(), 1.0)]),
)
.await;
@@ -351,7 +351,7 @@ impl SyncedDir {
Event::new_coverage => {
jr_client
.send_direct(
JobResultData::CoverageData,
JobResultData::NewCoverage,
HashMap::from([("count".to_string(), 1.0)]),
)
.await;
2 changes: 1 addition & 1 deletion src/agent/reqwest-retry/Cargo.toml
@@ -19,5 +19,5 @@ reqwest = { version = "0.11", features = [
thiserror = "1.0"

[dev-dependencies]
tokio = { version = "1.29", features = ["macros"] }
tokio = { version = "1.32", features = ["macros"] }
wiremock = "0.5"
2 changes: 1 addition & 1 deletion src/agent/stacktrace-parser/Cargo.toml
@@ -16,5 +16,5 @@ serde_json = "1.0"
libclusterfuzz = { path = "../libclusterfuzz" }

[dev-dependencies]
insta = { version = "1.31.0", features = ["glob", "json"] }
insta = { version = "1.32.0", features = ["glob", "json"] }
pretty_assertions = "1.4"
4 changes: 2 additions & 2 deletions src/agent/storage-queue/Cargo.toml
@@ -10,7 +10,7 @@ anyhow = "1.0"
async-trait = "0.1"
backoff = { version = "0.4", features = ["tokio"] }
base64 = "0.21"
bytes = { version = "1.4", features = ["serde"] }
bytes = { version = "1.5", features = ["serde"] }
derivative = "2.2"
flume = "0.10"
num_cpus = "1.15"
@@ -26,6 +26,6 @@ serde = { version = "1.0", features = ["derive"] }
serde_derive = "1.0"
serde_json = "1.0"
bincode = "1.3"
tokio = { version = "1.29", features = ["full"] }
tokio = { version = "1.32", features = ["full"] }
queue-file = "1.4"
uuid = { version = "1.4", features = ["serde", "v4"] }
4 changes: 2 additions & 2 deletions src/agent/win-util/Cargo.toml
@@ -12,7 +12,7 @@ log = "0.4"
os_pipe = "1.1"

[target.'cfg(windows)'.dependencies]
winreg = "0.50"
winreg = "0.51"

[dependencies.windows]
version = "0.48"
@@ -33,4 +33,4 @@ features = [
]

[dev-dependencies]
tempfile = "3.7.0"
tempfile = "3.8.0"
2 changes: 1 addition & 1 deletion src/ci/agent.sh
@@ -37,7 +37,7 @@ export RUST_BACKTRACE=full

# Run tests and collect coverage
# https://github.com/taiki-e/cargo-llvm-cov
cargo llvm-cov nextest --all-targets --features slow-tests --locked --workspace --lcov --output-path "$output_dir/lcov.info"
cargo llvm-cov nextest --all-targets --features slow-tests,integration_test --locked --workspace --lcov --output-path "$output_dir/lcov.info"

# TODO: re-enable integration tests.
# cargo test --release --manifest-path ./onefuzz-task/Cargo.toml --features integration_test -- --nocapture
6 changes: 5 additions & 1 deletion src/ci/set-versions.sh
@@ -10,8 +10,12 @@ GET_VERSION=${SCRIPT_DIR}/get-version.sh
VERSION=${1:-$(${GET_VERSION})}
cd ${SCRIPT_DIR}/../../

arrVer=(${VERSION//./ })
MAJOR=${arrVer[0]}
MINOR=${arrVer[1]}

SET_VERSIONS="src/pytypes/onefuzztypes/__version__.py src/cli/onefuzz/__version__.py"
SET_REQS="src/cli/requirements.txt"

sed -i "s/0.0.0/${VERSION}/" ${SET_VERSIONS}
sed -i "s/onefuzztypes==0.0.0/onefuzztypes==${VERSION}/" ${SET_REQS}
sed -i "s/onefuzztypes==0.0.0/onefuzztypes==${MAJOR}.${MINOR}.*/" ${SET_REQS}
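
The edited `set-versions.sh` relaxes the CLI's `onefuzztypes` pin from an exact `==X.Y.Z` match to the `==X.Y.*` wildcard, so patch releases of the types package still satisfy the requirement (this matches the earlier "accept >= onefuzztypes" requirements.txt change). An illustrative Rust sketch of that pinning rule, not part of the PR:

```rust
// Given a full `MAJOR.MINOR.PATCH` version string, produce the relaxed
// requirements.txt pin the script now writes: `onefuzztypes==MAJOR.MINOR.*`.
fn pin_requirement(version: &str) -> String {
    let mut parts = version.split('.');
    let major = parts.next().unwrap_or("0");
    let minor = parts.next().unwrap_or("0");
    format!("onefuzztypes=={major}.{minor}.*")
}

fn main() {
    // e.g. an 8.8.0 release pins to any 8.8.x onefuzztypes build.
    println!("{}", pin_requirement("8.8.0"));
}
```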
21 changes: 12 additions & 9 deletions src/cli/examples/domato.py
@@ -67,7 +67,7 @@ def upload_to_fuzzer_container(of: Onefuzz, fuzzer_name: str, fuzzer_url: str) -


def upload_to_setup_container(of: Onefuzz, helper: JobHelper, setup_dir: str) -> None:
setup_sas = of.containers.get(helper.containers[ContainerType.setup]).sas_url
setup_sas = of.containers.get(helper.container_name(ContainerType.setup)).sas_url
if AZCOPY_PATH is None:
raise Exception("missing azcopy")
command = [AZCOPY_PATH, "sync", setup_dir, setup_sas]
@@ -143,13 +143,16 @@ def main() -> None:
helper.create_containers()
helper.setup_notifications(notification_config)
upload_to_setup_container(of, helper, args.setup_dir)
add_setup_script(of, helper.containers[ContainerType.setup])
add_setup_script(of, helper.container_name(ContainerType.setup))

containers = [
(ContainerType.setup, helper.containers[ContainerType.setup]),
(ContainerType.crashes, helper.containers[ContainerType.crashes]),
(ContainerType.reports, helper.containers[ContainerType.reports]),
(ContainerType.unique_reports, helper.containers[ContainerType.unique_reports]),
(ContainerType.setup, helper.container_name(ContainerType.setup)),
(ContainerType.crashes, helper.container_name(ContainerType.crashes)),
(ContainerType.reports, helper.container_name(ContainerType.reports)),
(
ContainerType.unique_reports,
helper.container_name(ContainerType.unique_reports),
),
]

of.logger.info("Creating generic_crash_report task")
@@ -164,11 +167,11 @@ def main() -> None:

containers = [
(ContainerType.tools, Container(FUZZER_NAME)),
(ContainerType.setup, helper.containers[ContainerType.setup]),
(ContainerType.crashes, helper.containers[ContainerType.crashes]),
(ContainerType.setup, helper.container_name(ContainerType.setup)),
(ContainerType.crashes, helper.container_name(ContainerType.crashes)),
(
ContainerType.readonly_inputs,
helper.containers[ContainerType.readonly_inputs],
helper.container_name(ContainerType.readonly_inputs),
),
]

19 changes: 11 additions & 8 deletions src/cli/examples/honggfuzz.py
@@ -88,13 +88,16 @@ def main() -> None:
if args.inputs:
helper.upload_inputs(args.inputs)

add_setup_script(of, helper.containers[ContainerType.setup])
add_setup_script(of, helper.container_name(ContainerType.setup))

containers = [
(ContainerType.setup, helper.containers[ContainerType.setup]),
(ContainerType.crashes, helper.containers[ContainerType.crashes]),
(ContainerType.reports, helper.containers[ContainerType.reports]),
(ContainerType.unique_reports, helper.containers[ContainerType.unique_reports]),
(ContainerType.setup, helper.container_name(ContainerType.setup)),
(ContainerType.crashes, helper.container_name(ContainerType.crashes)),
(ContainerType.reports, helper.container_name(ContainerType.reports)),
(
ContainerType.unique_reports,
helper.container_name(ContainerType.unique_reports),
),
]

of.logger.info("Creating generic_crash_report task")
@@ -109,11 +112,11 @@ def main() -> None:

containers = [
(ContainerType.tools, Container("honggfuzz")),
(ContainerType.setup, helper.containers[ContainerType.setup]),
(ContainerType.crashes, helper.containers[ContainerType.crashes]),
(ContainerType.setup, helper.container_name(ContainerType.setup)),
(ContainerType.crashes, helper.container_name(ContainerType.crashes)),
(
ContainerType.inputs,
helper.containers[ContainerType.inputs],
helper.container_name(ContainerType.inputs),
),
]

@@ -74,15 +74,15 @@ def main() -> None:
helper.create_containers()

of.containers.files.upload_file(
helper.containers[ContainerType.tools], f"{args.tools}/source-coverage.sh"
helper.container_name(ContainerType.tools), f"{args.tools}/source-coverage.sh"
)

containers = [
(ContainerType.setup, helper.containers[ContainerType.setup]),
(ContainerType.analysis, helper.containers[ContainerType.analysis]),
(ContainerType.tools, helper.containers[ContainerType.tools]),
(ContainerType.setup, helper.container_name(ContainerType.setup)),
(ContainerType.analysis, helper.container_name(ContainerType.analysis)),
(ContainerType.tools, helper.container_name(ContainerType.tools)),
# note, analysis is typically for crashes, but this is analyzing inputs
(ContainerType.crashes, helper.containers[ContainerType.inputs]),
(ContainerType.crashes, helper.container_name(ContainerType.inputs)),
]

of.logger.info("Creating generic_analysis task")
10 changes: 5 additions & 5 deletions src/cli/examples/llvm-source-coverage/source-coverage.py
@@ -61,15 +61,15 @@ def main() -> None:
helper.upload_inputs(args.inputs)

of.containers.files.upload_file(
helper.containers[ContainerType.tools], f"{args.tools}/source-coverage.sh"
helper.container_name(ContainerType.tools), f"{args.tools}/source-coverage.sh"
)

containers = [
(ContainerType.setup, helper.containers[ContainerType.setup]),
(ContainerType.analysis, helper.containers[ContainerType.analysis]),
(ContainerType.tools, helper.containers[ContainerType.tools]),
(ContainerType.setup, helper.container_name(ContainerType.setup)),
(ContainerType.analysis, helper.container_name(ContainerType.analysis)),
(ContainerType.tools, helper.container_name(ContainerType.tools)),
# note, analysis is typically for crashes, but this is analyzing inputs
(ContainerType.crashes, helper.containers[ContainerType.inputs]),
(ContainerType.crashes, helper.container_name(ContainerType.inputs)),
]

of.logger.info("Creating generic_analysis task")