Bug 1290523: support generic-worker task descriptions; r?grenade
MozReview-Commit-ID: CHIGSrB1MIu
--- a/taskcluster/docs/transforms.rst
+++ b/taskcluster/docs/transforms.rst
@@ -79,19 +79,26 @@ Task-Generation Transforms
Every kind needs to create tasks, and all of those tasks have some things in
common. They all run on one of a small set of worker implementations, each
with their own idiosyncrasies. And they all report to TreeHerder in a similar
way.
The transforms in ``taskcluster/taskgraph/transforms/make_task.py`` implement
this common functionality. They expect a "task description", and produce a
task definition. The schema for a task description is defined at the top of
-``make_task.py``, with copious comments. The result is a dictionary with keys
-``label``, ``attributes``, ``task``, and ``dependencies``, with the latter
-having the same format as the input dependencies.
+``make_task.py``, with copious comments. The parts of the task description
+that are specific to a worker implementation are isolated in a ``worker``
+object with an ``implementation`` property naming that implementation. Thus
+the transforms that produce a task description must be aware of the worker
+implementation to be used, but need not be aware of the details of its
+payload format.
+
+The result is a dictionary with keys ``label``, ``attributes``, ``task``, and
+``dependencies``, with the latter having the same format as the input
+dependencies.
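To make the shape concrete, here is a minimal sketch of a task description for a generic-worker task, matching the schema this patch adds. The ``label``, ``attributes``, and field values are hypothetical, chosen only to illustrate the nesting; the real schema lives in ``make_task.py``.

```python
# Hypothetical task description; all concrete values below are
# illustrative, not taken from any real kind.
task_description = {
    'label': 'build-win64/opt',                 # hypothetical label
    'attributes': {'build_platform': 'win64'},  # hypothetical attributes
    'dependencies': {},
    # worker-specific configuration is isolated under 'worker', with
    # 'implementation' selecting the payload builder to use
    'worker': {
        'implementation': 'generic-worker',
        'command': ['mach build'],
        'artifacts': [
            {'type': 'directory', 'path': 'public/build'},
        ],
        'env': {'MOZ_BUILD_DATE': '20170101000000'},
        'max-run-time': 7200,
    },
}

# transforms dispatch on the nested 'implementation' key
assert task_description['worker']['implementation'] == 'generic-worker'
```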
These transforms assign names to treeherder groups using an internal list of
group names. Feel free to add additional groups to this list as necessary.
Test Transforms
---------------
The transforms configured for test kinds proceed as follows, based on
--- a/taskcluster/taskgraph/transforms/make_task.py
+++ b/taskcluster/taskgraph/transforms/make_task.py
@@ -122,16 +122,37 @@ task_description_schema = Schema({
'env': {basestring: taskref_or_string},
# the command to run
'command': [taskref_or_string],
# the maximum time to run, in seconds
'max-run-time': int,
}, {
+ 'implementation': 'generic-worker',
+
+ # command is a list of commands to run, sequentially
+ 'command': [basestring],
+
+ # artifacts to extract from the task image after completion; note that artifacts
+ # for the generic worker cannot have names
+ 'artifacts': [{
+ # type of artifact -- simple file, or recursive directory
+ 'type': Any('file', 'directory'),
+
+ # task image path from which to read artifact
+ 'path': basestring,
+ }],
+
+ # environment variables
+ 'env': {basestring: taskref_or_string},
+
+ # the maximum time to run, in seconds
+ 'max-run-time': int,
+ }, {
'implementation': 'buildbot-bridge',
# see https://github.com/mozilla/buildbot-bridge/blob/master/bbb/schemas/payload.yml
'buildername': basestring,
'sourcestamp': {
'branch': basestring,
Optional('revision'): basestring,
Optional('repository'): basestring,
@@ -236,16 +257,37 @@ def build_docker_worker_payload(config,
'maxRunTime': worker['max-run-time'],
}
if features:
payload['features'] = features
if capabilities:
payload['capabilities'] = capabilities
+@payload_builder('generic-worker')
+def build_generic_worker_payload(config, task, task_def):
+ worker = task['worker']
+
+ artifacts = []
+
+ for artifact in worker['artifacts']:
+ artifacts.append({
+ 'path': artifact['path'],
+ 'type': artifact['type'],
+ 'expires': task_def['expires'], # always expire with the task
+ })
+
+ task_def['payload'] = {
+ 'command': worker['command'],
+ 'artifacts': artifacts,
+ 'env': worker['env'],
+ 'maxRunTime': worker['max-run-time'],
+ }
+
+
transforms = TransformSequence()
@transforms.add
def validate(config, tasks):
for task in tasks:
yield validate_schema(
task_description_schema, task,
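The translation performed by ``build_generic_worker_payload`` can be sketched standalone as below. The helper name and the sample ``worker``/``expires`` values are hypothetical; the field mapping (including copying the task's ``expires`` onto every artifact) follows the builder added in this patch.

```python
# Standalone sketch of the generic-worker payload translation; the
# function name and sample values are hypothetical.
def build_payload(worker, task_def):
    # each artifact inherits the task's expiry, so artifacts never
    # outlive the task itself
    artifacts = [
        {
            'path': artifact['path'],
            'type': artifact['type'],
            'expires': task_def['expires'],
        }
        for artifact in worker['artifacts']
    ]
    # note the key rename: schema 'max-run-time' -> payload 'maxRunTime'
    task_def['payload'] = {
        'command': worker['command'],
        'artifacts': artifacts,
        'env': worker['env'],
        'maxRunTime': worker['max-run-time'],
    }
    return task_def

# hypothetical inputs
worker = {
    'command': ['echo hello'],
    'artifacts': [{'type': 'file', 'path': 'public/log.txt'}],
    'env': {},
    'max-run-time': 600,
}
task_def = build_payload(worker, {'expires': '2017-01-01T00:00:00Z'})
```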