Bug 1286075: replace almost all legacy tasks with new kinds [draft]
author      Dustin J. Mitchell <dustin@mozilla.com>
date        Wed, 31 Aug 2016 15:26:28 +0000
changeset   408090 0cdd662f6c3e3c81b82366c0233b4762c380b0bb
parent      408089 6818aeb6cbc0b98a453f4f8c779b8304adb5aa0d
child       530032 8ea9b36f15f5525d68defc4fa20f58729e8e0169
push id     28132
push user   dmitchell@mozilla.com
push date   Wed, 31 Aug 2016 16:52:08 +0000
bugs        1286075, 1286086
milestone   51.0a1
Bug 1286075: replace almost all legacy tasks with new kinds

** SNEAK PREVIEW COMMIT ** There is a lot in this commit, and I am still
hacking on it. I will attend to many of the "TODO" items in this commit, as
well as lint, documentation, and catching up on tests, during the review
period. Items marked "TODO(taskdiff)" would modify the resulting tasks in a
significant way, so they may either be pushed to a later bug or modified in
the legacy task definitions in an earlier commit. Time permitting, I will
also break this commit out into a number of smaller commits.

Major things to note:

* There is now a hierarchy of "descriptions": job descriptions are converted
  into task descriptions, which ultimately become task definitions. The job
  description uses a "run-using" dispatch approach to support the many and
  varied ways that tasks are implemented, without cluttering the job
  descriptions too much. A later bug will modify the test kinds to use job
  descriptions too.

* There is now a great degree of freedom in how to implement a kind: the
  simplest can probably be implemented using the "TransformTask"
  implementation and specifying raw job descriptions. The most complex may
  involve custom transforms, a custom task class, custom job implementations,
  and additional YAML files. I will include documentation outlining the
  options.

* I've broken the non-test tasks implemented in the legacy kind down into a
  number of kinds, based in part on task similarity and in part on who is
  responsible for them.

* I've added more hacks around platform-name oddities. Those will be
  addressed in bug 1286086.

* only-if-files-changed optimizations are supported.

* Task descriptions have a "run-on-projects" attribute specifying which
  projects (branches) a task should run on. This provides a natural way to
  run certain tasks only on project branches, integration branches, etc.,
  without listing task labels in the target task methods.

* This has been tested to produce virtually identical task-graphs to the
  original legacy kind, via some scripts in a previous commit. This testing
  spanned several branches, including try, although the try functionality may
  exhibit some subtle differences because its syntax was so intimately tied
  to the legacy task specifications.

MozReview-Commit-ID: GSXvyIlO31x
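The "run-using" dispatch described in the commit message can be sketched as a
simple registry mapping each `run.using` value to a job implementation. This
is an illustrative outline only, not the actual taskgraph API; the names
`register_run_using` and `job_to_task_description` are invented for the
sketch:

```python
# Registry mapping a run.using name to the function that knows how to
# translate that flavor of job description into a task description.
run_job_using = {}

def register_run_using(name):
    """Decorator: register a job implementation under a run.using name."""
    def wrap(func):
        run_job_using[name] = func
        return func
    return wrap

@register_run_using('mozharness')
def mozharness_run(job, taskdesc):
    # A mozharness-flavored job contributes a command built from its
    # script; a real implementation would also set env, caches, etc.
    taskdesc['command'] = ['run-mozharness', job['run']['script']]

def job_to_task_description(job):
    """Dispatch on run.using to turn a job description into a task
    description, without the job description needing to know how each
    implementation works."""
    taskdesc = {'label': job['label']}
    impl = run_job_using[job['run']['using']]
    impl(job, taskdesc)
    return taskdesc
```

The point of the dispatch is that adding a new way of running jobs means
registering one new function, not touching every job description.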
taskcluster/ci/android-test/kind.yml
taskcluster/ci/artifact-build/kind.yml
taskcluster/ci/build/android-partner.yml
taskcluster/ci/build/android.yml
taskcluster/ci/build/kind.yml
taskcluster/ci/build/linux.yml
taskcluster/ci/build/macosx.yml
taskcluster/ci/build/mulet.yml
taskcluster/ci/build/windows.yml
taskcluster/ci/desktop-test/kind.yml
taskcluster/ci/hazard/kind.yml
taskcluster/ci/l10n/kind.yml
taskcluster/ci/legacy/tasks/branches/base_job_flags.yml
taskcluster/ci/legacy/tasks/branches/base_jobs.yml
taskcluster/ci/legacy/tasks/branches/try/job_flags.yml
taskcluster/ci/source-check/doc.yml
taskcluster/ci/source-check/eslint.yml
taskcluster/ci/source-check/kind.yml
taskcluster/ci/source-check/python-lint.yml
taskcluster/ci/source-check/python-tests.yml
taskcluster/ci/spidermonkey/kind.yml
taskcluster/ci/static-analysis/kind.yml
taskcluster/ci/toolchain/kind.yml
taskcluster/ci/upload-symbols/job-template.yml
taskcluster/ci/upload-symbols/kind.yml
taskcluster/ci/valgrind/kind.yml
taskcluster/docs/attributes.rst
taskcluster/taskgraph/decision.py
taskcluster/taskgraph/files_changed.py
taskcluster/taskgraph/jobs/__init__.py
taskcluster/taskgraph/jobs/base.py
taskcluster/taskgraph/jobs/common.py
taskcluster/taskgraph/jobs/hazard.py
taskcluster/taskgraph/jobs/mach.py
taskcluster/taskgraph/jobs/mozharness.py
taskcluster/taskgraph/jobs/mulet.py
taskcluster/taskgraph/jobs/run_task.py
taskcluster/taskgraph/jobs/spidermonkey.py
taskcluster/taskgraph/jobs/toolchain.py
taskcluster/taskgraph/target_tasks.py
taskcluster/taskgraph/task/legacy.py
taskcluster/taskgraph/task/post_build.py
taskcluster/taskgraph/task/test.py
taskcluster/taskgraph/task/transform.py
taskcluster/taskgraph/transforms/base.py
taskcluster/taskgraph/transforms/build.py
taskcluster/taskgraph/transforms/job.py
taskcluster/taskgraph/transforms/mulet_simulator.py
taskcluster/taskgraph/transforms/task.py
taskcluster/taskgraph/transforms/tests/make_task_description.py
taskcluster/taskgraph/transforms/upload_symbols.py
taskcluster/taskgraph/try_option_syntax.py
--- a/taskcluster/ci/android-test/kind.yml
+++ b/taskcluster/ci/android-test/kind.yml
@@ -1,12 +1,12 @@
 implementation: taskgraph.task.test:TestTask
 
 kind-dependencies:
-    - legacy
+    - build
 
 transforms:
    - taskgraph.transforms.tests.test_description:validate
    - taskgraph.transforms.tests.android_test:transforms
    - taskgraph.transforms.tests.all_kinds:transforms
    - taskgraph.transforms.tests.test_description:validate
    - taskgraph.transforms.tests.make_task_description:transforms
    - taskgraph.transforms.task:transforms
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/artifact-build/kind.yml
@@ -0,0 +1,38 @@
+# Artifact builds produce "artifacts" that can be used by Firefox devs to quickly test
+# non-compiled changes without waiting for a slow compile run.
+implementation: taskgraph.task.transform:TransformTask
+
+jobs:
+    linux64-artifact/opt:
+        description: "Linux64 Opt"
+        attributes: # TODO(taskdiff): remove
+            build_platform: linux64-artifact
+            build_type: opt
+        index:
+            product: firefox
+            job-name: linux64-artifact-opt
+        treeherder:
+            platform: linux64/opt
+            kind: build
+            symbol: AB
+            tier: 2
+        worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+        worker:
+            implementation: docker-worker
+            docker-image: {in-tree: desktop-build}
+            max-run-time: 36000
+        run:
+            using: mozharness
+            actions: [get-secrets build generate-build-stats]
+            config:
+                - builds/releng_sub_linux_configs/64_artifact.py
+                - balrog/production.py
+            script: "mozharness/scripts/fx_desktop_build.py"
+            secrets: ['*']
+            tooltool-downloads: public
+            need-xvfb: true
+            keep-artifacts: false
+
+transforms:
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
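The `transforms` list at the end of a kind.yml like the one above names a
sequence of functions, each consuming the previous one's output. A rough
sketch of how such a pipeline might be driven; `apply_transforms` and
`set_default_tier` are invented names, not the real taskgraph internals:

```python
def apply_transforms(transforms, items, config=None):
    """Feed each generator-style transform the output of the previous
    one, so the kind's job dicts flow through the whole pipeline."""
    for transform in transforms:
        items = transform(config, items)
    return list(items)

def set_default_tier(config, jobs):
    """Example transform: fill in a default treeherder tier, much as
    job-defaults entries do in the kind.yml files in this commit."""
    for job in jobs:
        job.setdefault('treeherder', {}).setdefault('tier', 1)
        yield job
```

Because every transform has the same shape, kinds compose pipelines freely:
a simple kind uses only the shared `job` and `task` transforms, while a
complex one can splice custom transforms in between.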
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/build/android-partner.yml
@@ -0,0 +1,32 @@
+android-partner-sample1/opt:
+    description: "Android 4.0 API15+ Partner Sample1 Opt"
+    index:
+        product: mobile
+        job-name: android-api-15-partner-sample1-opt
+    treeherder:
+        platform: android-4-0-armv7-api15-partner1/opt
+        symbol: tc(B)
+        tier: 2
+    run-on-projects:
+        - try
+    worker-type: aws-provisioner-v1/android-api-15
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+        env:
+            EXTRA_CHECKOUT_REPOSITORIES: "PARTNER\n"
+            PARTNER_BASE_REPOSITORY: "https://github.com/mozilla/fennec-distribution-sample"
+            PARTNER_DEST_DIR: "/home/worker/workspace/build/partner"
+            PARTNER_HEAD_REPOSITORY: "https://github.com/mozilla/fennec-distribution-sample"
+            PARTNER_HEAD_REV: "master"
+    run:
+        using: mozharness
+        actions: [get-secrets build multi-l10n update]
+        config:
+            - builds/releng_base_android_64_builds.py 
+            - disable_signing.py 
+            - platform_supports_post_upload_to_latest.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        custom-build-variant-cfg: api-15-partner-sample1
+        tooltool-downloads: internal
+
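The `run-on-projects: [try]` setting above restricts this partner build to
the try branch. A hedged sketch of how such matching might work; the alias
sets here are assumptions for illustration, and the real implementation in
taskgraph may expand aliases differently:

```python
def match_run_on_projects(project, run_on_projects):
    """Return True if a task should run on the given project (branch).
    Aliases like 'integration' and 'release' stand for groups of
    projects; the membership below is illustrative only."""
    integration = {'mozilla-inbound', 'autoland'}
    release = {'mozilla-central', 'mozilla-beta', 'mozilla-release'}
    for entry in run_on_projects:
        if entry == 'all':
            return True
        if entry == 'integration' and project in integration:
            return True
        if entry == 'release' and project in release:
            return True
        if entry == project:
            return True
    return False
```

This is what lets a task declare "project branches only" once, instead of
each target-task method listing task labels by hand.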
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/build/android.yml
@@ -0,0 +1,99 @@
+android-api-15/debug:
+    description: "Android 4.0 API15+ Debug"
+    index:
+        product: mobile
+        job-name:
+            buildbot: android-api-15-debug
+            gecko-v1: android-api-15.debug
+            gecko-v2: android-api-15-debug
+    treeherder:
+        platform: android-4-0-armv7-api15/debug
+        symbol: tc(B)
+    worker-type: aws-provisioner-v1/android-api-15
+    worker:
+        implementation: docker-worker
+        max-run-time: 7200
+    run:
+        using: mozharness
+        actions: [get-secrets build multi-l10n update]
+        config:
+            - builds/releng_base_android_64_builds.py 
+            - disable_signing.py 
+            - platform_supports_post_upload_to_latest.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        custom-build-variant-cfg: api-15-debug
+        tooltool-downloads: internal
+
+android-x86/opt:
+    description: "Android 4.2 x86 Opt"
+    index:
+        product: mobile
+        job-name: android-x86-opt
+    treeherder:
+        platform: android-4-2-x86/opt
+        symbol: tc(B)
+        tier: 1
+    worker-type: aws-provisioner-v1/android-api-15
+    worker:
+        implementation: docker-worker
+        max-run-time: 7200
+    run:
+        using: mozharness
+        actions: [get-secrets build multi-l10n update]
+        config:
+            - builds/releng_base_android_64_builds.py 
+            - disable_signing.py 
+            - platform_supports_post_upload_to_latest.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        custom-build-variant-cfg: x86
+        tooltool-downloads: internal
+
+android-api-15/opt:
+    description: "Android 4.0 API15+ Opt"
+    index:
+        product: mobile
+        job-name: android-api-15-opt
+    treeherder:
+        platform: android-4-0-armv7-api15/opt
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/android-api-15
+    worker:
+        implementation: docker-worker
+        max-run-time: 7200
+    run:
+        using: mozharness
+        actions: [get-secrets build multi-l10n update]
+        config:
+            - builds/releng_base_android_64_builds.py 
+            - disable_signing.py 
+            - platform_supports_post_upload_to_latest.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        custom-build-variant-cfg: api-15
+        tooltool-downloads: internal
+
+android-api-15-gradle/opt:
+    description: "Android 4.0 API15+ (Gradle) Opt"
+    index:
+        product: mobile
+        job-name: android-api-15-gradle-opt
+    treeherder:
+        platform: android-4-0-armv7-api15/opt
+        symbol: tc(Bg)
+        tier: 2
+    worker-type: aws-provisioner-v1/android-api-15
+    worker:
+        implementation: docker-worker
+        max-run-time: 7200
+    run:
+        using: mozharness
+        actions: [get-secrets build multi-l10n update]
+        config:
+            - builds/releng_base_android_64_builds.py 
+            - disable_signing.py 
+            - platform_supports_post_upload_to_latest.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        custom-build-variant-cfg: api-15-gradle
+        tooltool-downloads: internal
+
+
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/build/kind.yml
@@ -0,0 +1,18 @@
+# Builds are tasks that produce an installer or other output that can be run by
+# users or automated tests.  This is more restrictive than most definitions of
+# "build" in a Mozilla context: it does not include tasks that run build-like
+# actions for static analysis or to produce instrumented artifacts.
+implementation: taskgraph.task.transform:TransformTask
+
+transforms:
+   - taskgraph.transforms.build:transforms
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
+
+jobs-from:
+    - linux.yml
+    - windows.yml
+    - macosx.yml
+    - mulet.yml
+    - android.yml
+    - android-partner.yml
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/build/linux.yml
@@ -0,0 +1,176 @@
+linux64/opt:
+    description: "Linux64 Opt"
+    index:
+        product: firefox
+        job-name: linux64-opt
+    treeherder:
+        platform: linux64/opt
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    run:
+        using: mozharness
+        actions: [get-secrets build check-test generate-build-stats update]
+        config:
+            - builds/releng_base_linux_64_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']
+        tooltool-downloads: public
+        need-xvfb: true
+
+linux64/pgo:
+    description: "Linux64 Opt PGO"
+    index:
+        product: firefox
+        job-name: linux64-pgo-opt
+    treeherder:
+        platform: linux64/pgo
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    coalesce-name: linux64-pgo
+    run:
+        using: mozharness
+        actions: [get-secrets build check-test generate-build-stats update]
+        options: [enable-pgo]
+        config:
+            - builds/releng_base_linux_64_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']
+        tooltool-downloads: public
+        need-xvfb: true
+
+linux64/debug:
+    description: "Linux64 Debug"
+    index:
+        product: firefox
+        job-name:
+            buildbot: linux64-debug
+            gecko-v1: linux64.debug
+            gecko-v2: linux64-debug
+    treeherder:
+        platform: linux64/debug
+        symbol: tc(B)
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    run:
+        using: mozharness
+        actions: [get-secrets build check-test generate-build-stats update]
+        config:
+            - builds/releng_base_linux_64_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']
+        custom-build-variant-cfg: debug
+        tooltool-downloads: public
+        need-xvfb: true
+
+linux/opt:
+    description: "Linux32 Opt"
+    index:
+        product: firefox
+        job-name: linux-opt
+    treeherder:
+        platform: linux32/opt
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    coalesce-name: opt_linux32 
+    run:
+        using: mozharness
+        actions: [get-secrets build check-test generate-build-stats update]
+        config:
+            - builds/releng_base_linux_32_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']
+        tooltool-downloads: public
+        need-xvfb: true
+
+linux/debug:
+    description: "Linux32 Debug"
+    index:
+        product: firefox
+        job-name: linux-debug
+    treeherder:
+        platform: linux32/debug
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    coalesce-name: dbg_linux32 
+    run:
+        using: mozharness
+        actions: [get-secrets build check-test generate-build-stats update]
+        config:
+            - builds/releng_base_linux_32_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']
+        custom-build-variant-cfg: debug
+        tooltool-downloads: public
+        need-xvfb: true
+
+linux64-asan/opt:
+    description: "Linux64 Opt ASAN"
+    index:
+        product: firefox
+        job-name: linux64-asan-opt
+    treeherder:
+        platform: linux64/asan
+        symbol: tc(Bo)
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    run:
+        using: mozharness
+        actions: [get-secrets build check-test generate-build-stats update]
+        config:
+            - builds/releng_base_linux_64_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']
+        custom-build-variant-cfg: asan-tc
+        tooltool-downloads: public
+        need-xvfb: true
+
+linux64-asan/debug:
+    description: "Linux64 Debug ASAN"
+    index:
+        product: firefox
+        job-name: linux64-asan-debug
+    treeherder:
+        platform: linux64/asan
+        symbol: tc(Bd)
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    run:
+        using: mozharness
+        actions: [get-secrets build check-test generate-build-stats update]
+        config:
+            - builds/releng_base_linux_64_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']
+        custom-build-variant-cfg: asan-tc-and-debug
+        tooltool-downloads: public
+        need-xvfb: true
+
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/build/macosx.yml
@@ -0,0 +1,46 @@
+macosx64/debug:
+    description: "MacOS X x64 Cross-compile"
+    index:
+        product: firefox
+        job-name: macosx64-debug
+    treeherder:
+        platform: osx-10-7/debug
+        symbol: tc(B)
+        tier: 2  # TODO: xform
+    worker-type: aws-provisioner-v1/dbg-macosx64
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    run:
+        using: mozharness
+        actions: [get-secrets build generate-build-stats update]
+        config:
+            - builds/releng_base_mac_64_cross_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']  # TODO: xform (default)
+        custom-build-variant-cfg: cross-debug
+        tooltool-downloads: internal
+
+macosx64/opt:
+    description: "MacOS X x64 Cross-compile"
+    index:
+        product: firefox
+        job-name: macosx64-opt
+    treeherder:
+        platform: osx-10-7/opt
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/opt-macosx64
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+    run:
+        using: mozharness
+        actions: [get-secrets build generate-build-stats update]
+        config:
+            - builds/releng_base_mac_64_cross_builds.py
+            - balrog/production.py
+        script: "mozharness/scripts/fx_desktop_build.py"
+        secrets: ['*']
+        tooltool-downloads: internal
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/build/mulet.yml
@@ -0,0 +1,45 @@
+linux64-mulet/debug:
+    description: "Linux64 Mulet Debug"
+    index:
+        product: b2g
+        job-name:
+            buildbot: linux64-mulet
+            gecko-v1: mulet.dbg
+            gecko-v2: mulet-dbg
+    treeherder:
+        platform: mulet-linux64/debug # ?!?%
+        symbol: B
+        tier: 3
+    worker-type: aws-provisioner-v1/mulet-debug
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: builder}
+        max-run-time: 3600
+    run:
+        using: mach-via-build-mulet-linux.sh
+        mozconfig: b2g/dev/config/mozconfigs/linux64/mulet_dbg
+        tooltool-manifest: b2g/dev/config/tooltool-manifests/linux64/releng.manifest
+
+linux64-mulet/opt:
+    description: "Linux64 Mulet Opt"
+    index:
+        product: b2g
+        job-name:
+            buildbot: linux64-mulet
+            gecko-v1: mulet.opt
+            gecko-v2: mulet-opt
+    treeherder:
+        platform: mulet-linux64/opt # ?!?%
+        symbol: B
+        tier: 3
+    worker-type: aws-provisioner-v1/mulet-opt
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: builder}
+        max-run-time: 3600
+    run:
+        using: mach-via-build-mulet-linux.sh
+        mozconfig: b2g/dev/config/mozconfigs/linux64/mulet
+        tooltool-manifest: b2g/dev/config/tooltool-manifests/linux64/releng.manifest
+
+
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/build/windows.yml
@@ -0,0 +1,80 @@
+win32/debug:
+    description: "Win32 Debug"
+    index:
+        product: firefox
+        job-name:
+            gecko-v2: win32-debug
+    treeherder:
+        platform: windowsxp/debug
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-win2012
+    worker:
+        implementation: generic-worker
+        max-run-time: 7200
+    run:
+        using: mozharness
+        script: fx_desktop_build.py
+        config:
+            - builds/taskcluster_firefox_win32_debug.py
+
+win32/opt:
+    description: "Win32 Opt"
+    index:
+        product: firefox
+        job-name:
+            gecko-v2: win32-opt
+    treeherder:
+        platform: windowsxp/opt
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-win2012
+    worker:
+        implementation: generic-worker
+        max-run-time: 7200
+    run:
+        using: mozharness
+        script: fx_desktop_build.py
+        config:
+            - builds/taskcluster_firefox_win32_opt.py
+
+win64/debug:
+    description: "Win64 Debug"
+    index:
+        product: firefox
+        job-name:
+            gecko-v2: win64-debug
+    treeherder:
+        platform: windows8-64/debug
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-win2012
+    worker:
+        implementation: generic-worker
+        max-run-time: 7200
+    run:
+        using: mozharness
+        script: fx_desktop_build.py
+        config:
+            - builds/taskcluster_firefox_win64_debug.py
+
+win64/opt:
+    description: "Win64 Opt"
+    index:
+        product: firefox
+        job-name:
+            gecko-v2: win64-opt
+    treeherder:
+        platform: windows8-64/opt
+        symbol: tc(B)
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-win2012
+    worker:
+        implementation: generic-worker
+        max-run-time: 7200
+    run:
+        using: mozharness
+        script: fx_desktop_build.py
+        config:
+            - builds/taskcluster_firefox_win64_opt.py
+
--- a/taskcluster/ci/desktop-test/kind.yml
+++ b/taskcluster/ci/desktop-test/kind.yml
@@ -1,12 +1,12 @@
 implementation: taskgraph.task.test:TestTask
 
 kind-dependencies:
-    - legacy
+    - build
 
 transforms:
    - taskgraph.transforms.tests.test_description:validate
    - taskgraph.transforms.tests.desktop_test:transforms
    - taskgraph.transforms.tests.all_kinds:transforms
    - taskgraph.transforms.tests.test_description:validate
    - taskgraph.transforms.tests.make_task_description:transforms
    - taskgraph.transforms.task:transforms
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/hazard/kind.yml
@@ -0,0 +1,97 @@
+implementation: taskgraph.task.transform:TransformTask
+
+transforms:
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
+
+job-defaults:
+    treeherder:
+        kind: build
+        tier: 1
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+        docker-image: {in-tree: desktop-build}
+    run:
+        uses-secrets: true  # TODO(taskdiff): don't use secrets for these jobs
+
+jobs:
+    linux64-shell-haz/debug:
+        description: "JS Shell Hazard Analysis Linux"
+        attributes:
+            build_platform: linux64-shell-haz
+            build_type: debug
+        index:
+            product: firefox
+            job-name:
+                gecko-v1: shell-haz.debug
+                gecko-v2: shell-haz-debug
+        treeherder:
+            platform: linux64/debug
+            symbol: SM-tc(H)
+        run:
+            using: hazard
+            tooltool-manifest: "browser/config/tooltool-manifests/linux64/hazard.manifest"
+            command: >
+                tc-vcs checkout workspace/gecko "$GECKO_BASE_REPOSITORY" "$GECKO_HEAD_REPOSITORY" "$GECKO_HEAD_REV" "$GECKO_HEAD_REF"
+                && cd ./workspace/gecko/taskcluster/scripts/builder
+                && ./build-haz-linux.sh --project shell $HOME/workspace
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    linux64-haz/debug:
+        description: "Browser Hazard Analysis Linux"
+        attributes:
+            build_platform: linux64-haz
+            build_type: debug
+        index:
+            product: firefox
+            job-name:
+                gecko-v1: browser-haz.debug
+                gecko-v2: browser-haz-debug
+        treeherder:
+            platform: linux64/debug
+            symbol: tc(H)
+        run:
+            using: hazard
+            tooltool-manifest: "browser/config/tooltool-manifests/linux64/hazard.manifest"
+            mozconfig: "browser/config/mozconfigs/linux64/hazards"
+            command: >
+                tc-vcs checkout workspace/gecko "$GECKO_BASE_REPOSITORY" "$GECKO_HEAD_REPOSITORY" "$GECKO_HEAD_REV" "$GECKO_HEAD_REF"
+                && cd ./workspace/gecko/taskcluster/scripts/builder
+                && ./build-haz-linux.sh --project browser $HOME/workspace
+
+    linux64-mulet-haz/debug:
+        description: "Mulet Hazard Analysis Linux"
+        attributes:
+            build_platform: linux64-mulet-haz
+            build_type: debug
+        index:
+            product: b2g
+            job-name:
+                buildbot: linux64-haz-mulet
+                gecko-v1: mulet-haz.debug
+                gecko-v2: mulet-haz-debug
+        treeherder:
+            platform: mulet-linux64/opt
+            symbol: tc(H)
+            tier: 3
+        run-on-projects:
+            - try
+        worker-type: aws-provisioner-v1/mulet-debug  # TODO (taskdiff): change to default
+        worker:
+            docker-image: {in-tree: builder}
+        run:
+            using: hazard
+            tooltool-manifest: "gecko/b2g/dev/config/tooltool-manifests/linux64/hazard.manifest"
+            mozconfig: "b2g/dev/config/mozconfigs/linux64/mulet-hazards"
+            uses-secrets: false  # TODO(taskdiff): don't use secrets for these jobs
+            command: >
+                checkout-gecko workspace && cd ./workspace/gecko/taskcluster/scripts/builder
+                && buildbot_step 'Build' ./build-mulet-haz-linux.sh $HOME/workspace
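The `only-if-files-changed` optimization used in this kind can be sketched
as a pattern match against the files touched by a push. This sketch uses
stdlib `fnmatch` as a stand-in; the real matching semantics (e.g. how `**`
is treated) may differ:

```python
from fnmatch import fnmatch

def should_run(task, changed_files):
    """Return True unless the task declares only-if-files-changed
    patterns and none of them match a changed file, in which case the
    task can be optimized away."""
    patterns = task.get('optimizations', {}).get('only-if-files-changed')
    if not patterns:
        return True  # no file-based optimization: always run
    return any(fnmatch(path, pattern)
               for path in changed_files
               for pattern in patterns)
```

So a push touching only `browser/` files would skip the shell hazard
analysis above, while any change under `js/` keeps it in the graph.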
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/l10n/kind.yml
@@ -0,0 +1,82 @@
+implementation: taskgraph.task.transform:TransformTask
+
+transforms:
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
+
+job-defaults:
+    index:
+        product: firefox
+    treeherder:
+        kind: build
+        tier: 2
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: desktop-test}
+        max-run-time: 36000
+        env:
+            # TODO: this is really a different "using" since it's using build-l10n.sh instead of build-linux.sh
+            JOB_SCRIPT: "taskcluster/scripts/builder/build-l10n.sh"
+    optimizations:
+        only-if-files-changed:
+            - browser/locales/all-locales
+            - python/compare-locales/**
+            - testing/mozharness/configs/single_locale/**
+            - testing/mozharness/mozharness/mozilla/l10n/locales.py
+            - testing/mozharness/scripts/desktop_l10n.py
+            - toolkit/locales/**
+            - toolkit/mozapps/installer/**
+
+jobs:
+    linux-l10n/opt:
+        description: "Localization"
+        attributes:
+            build_platform: linux-l10n
+            build_type: opt
+        index:
+            job-name:
+                gecko-v2: linux32-l10n-opt
+        treeherder:
+            platform: linux32/opt
+            symbol: tc(L10n)
+        run:
+            using: mozharness
+            script: mozharness/scripts/desktop_l10n.py
+            actions: [clone-locales list-locales setup repack summary]
+            config:
+                - single_locale/tc_linux32.py
+            options:
+                - environment-config=single_locale/production.py
+                - branch-config=single_locale/try.py # TODO: {{project}}.py
+                - platform-config=single_locale/linux32.py  # TODO: {{platform}}.py
+                - total-chunks=1
+                - this-chunk=1
+            tooltool-downloads: public
+            need-xvfb: true
+
+    linux64-l10n/opt:
+        description: "Localization"
+        attributes:
+            build_platform: linux64-l10n
+            build_type: opt
+        index:
+            job-name:
+                gecko-v2: linux64-l10n-opt
+        treeherder:
+            platform: linux64/opt
+            symbol: tc(L10n)
+        run:
+            using: mozharness
+            script: mozharness/scripts/desktop_l10n.py
+            actions: [clone-locales list-locales setup repack summary]
+            config:
+                - single_locale/tc_linux64.py
+            options:
+                - environment-config=single_locale/production.py
+                - branch-config=single_locale/try.py
+                - platform-config=single_locale/linux64.py
+                - total-chunks=1
+                - this-chunk=1
+            tooltool-downloads: public
+            need-xvfb: true
--- a/taskcluster/ci/legacy/tasks/branches/base_job_flags.yml
+++ b/taskcluster/ci/legacy/tasks/branches/base_job_flags.yml
@@ -1,41 +1,17 @@
 ---
 # List of all possible flags for each category of tests used in the case where
 # "all" is specified.
 flags:
   builds:
-    - linux32_gecko  # b2g desktop linux 32 bit
-    - linux64_gecko  # b2g desktop linux 64 bit
-    - linux64-mulet  # Firefox desktop - b2g gecko linux 64 bit
-    - linux64-haz    # Firefox desktop browser, rooting hazard analysis
-    - linux64-shell-haz  # JS shell, rooting hazard analysis
-    - linux64-mulet-haz  # Firefox desktop - b2g gecko linux 64 bit, rooting hazard analysis
-    - macosx64_gecko # b2g desktop osx 64 bit
-    - win32_gecko    # b2g desktop win 32 bit
     - nexus-5l-eng
     - aries-eng
-    - android-api-15
-    - android-api-15-gradle
     - android-api-15-frontend
     - android-partner-sample1
-    - android-x86
-    - linux
-    - linux-l10n    # Desktop l10n
-    - linux64
-    - linux64-l10n  # Desktop l10n
-    - linux64-st-an
-    - linux64-artifact
-    - linux64-asan
-    - linux64-pgo
-    - linux64-valgrind
-    - macosx64
-    - macosx64-st-an
-    - win32
-    - win64
 
   tests:
     - cppunit
     - crashtest
     - crashtest-e10s
     - external-media-tests
     - firefox-ui-functional-local
     - firefox-ui-functional-local-e10s
--- a/taskcluster/ci/legacy/tasks/branches/base_jobs.yml
+++ b/taskcluster/ci/legacy/tasks/branches/base_jobs.yml
@@ -1,30 +1,16 @@
 ---
 # For complete sample of all build and test jobs,
 # see <gecko>/testing/taskcluster/tasks/branches/base_job_flags.yml
 
 $inherits:
   from: tasks/branches/base_job_flags.yml
 
 builds:
-  android-api-15:
-    platforms:
-      - Android
-    types:
-      opt:
-        task: tasks/builds/android_api_15.yml
-      debug:
-        task: tasks/builds/android_api_15_debug.yml
-  android-x86:
-    platforms:
-      - Android
-    types:
-      opt:
-        task: tasks/builds/android_x86.yml
   android-api-15-gradle:
     platforms:
       - Android
     types:
       opt:
         task: tasks/builds/android_api_15_gradle.yml
   linux64-mulet:
     platforms:
@@ -267,70 +253,16 @@ builds:
     types:
       opt:
         task: tasks/builds/opt_win64.yml
       debug:
         task: tasks/builds/dbg_win64.yml
 
 # Miscellaneous tasks.
 tasks:
-  eslint-gecko:
-    task: tasks/tests/eslint-gecko.yml
-    root: true
-    when:
-      file_patterns:
-        # Files that are likely audited.
-        - '**/*.js'
-        - '**/*.jsm'
-        - '**/*.jsx'
-        - '**/*.html'
-        - '**/*.xml'
-        # Run when eslint policies change.
-        - '**/.eslintignore'
-        - '**/*eslintrc*'
-        # The plugin implementing custom checks.
-        - 'tools/lint/eslint/eslint-plugin-mozilla/**'
-        # Other misc lint related files.
-        - 'tools/lint/**'
-        - 'testing/docker/lint/**'
-  flake8-gecko:
-    task: tasks/tests/mozlint-flake8.yml
-    root: true
-    when:
-      file_patterns:
-        - '**/*.py'
-        - '**/.flake8'
-        - 'python/mozlint/**'
-        - 'tools/lint/**'
-        - 'testing/docker/lint/**'
-  wptlint-gecko:
-    task: tasks/tests/mozlint-wpt.yml
-    root: true
-    when:
-      file_patterns:
-        - 'testing/web-platform/tests/**'
-        - 'python/mozlint/**'
-        - 'tools/lint/**'
-        - 'testing/docker/lint/**'
-  sphinx:
-    task: tasks/tests/sphinx.yml
-    root: true
-    when:
-      file_patterns:
-        - '**/*.py'
-        - '**/*.rst'
-        - 'tools/docs/**'
-  taskgraph-tests:
-    task: tasks/tests/taskgraph-tests.yml
-    root: true
-    when:
-      file_patterns:
-        - 'taskcluster/**/*.py'
-        - 'config/mozunit.py'
-        - 'python/mach/**/*.py'
   android-api-15-gradle-dependencies:
     task: tasks/builds/android_api_15_gradle_dependencies.yml
     root: true
     when:
       file_patterns:
         - 'mobile/android/config/**'
         - 'testing/docker/android-gradle-build/**'
         - 'testing/mozharness/configs/builds/releng_sub_android_configs/*gradle_dependencies.py'
@@ -357,36 +289,16 @@ tasks:
       file_patterns:
         - 'mobile/android/**/*.java'
         - 'mobile/android/**/*.jpeg'
         - 'mobile/android/**/*.jpg'
         - 'mobile/android/**/*.png'
         - 'mobile/android/**/*.svg'
         - 'mobile/android/**/*.xml' # Manifest & android resources
         - 'mobile/android/**/build.gradle'
-  mozharness:
-    task: tasks/tests/mozharness-gecko.yml
-    root: true
-    when:
-      file_patterns:
-        - 'testing/mozharness/**'
   marionette-harness:
     task: tasks/tests/harness_marionette.yml
     root: true
     when:
         file_patterns:
           - 'testing/marionette/harness/**'
           - 'testing/mozharness/scripts/marionette_harness_tests.py'
           - 'testing/config/marionette_harness_test_requirements.txt'
-  linux64-gcc:
-    task: tasks/builds/linux64_gcc.yml
-    root: true
-    when:
-        file_patterns:
-          - 'build/unix/build-gcc/**'
-          - 'taskcluster/scripts/misc/build-gcc-linux.sh'
-  linux64-clang:
-    task: tasks/builds/linux64_clang.yml
-    root: true
-    when:
-        file_patterns:
-          - 'build/build-clang/**'
-          - 'taskcluster/scripts/misc/build-clang-linux.sh'
--- a/taskcluster/ci/legacy/tasks/branches/try/job_flags.yml
+++ b/taskcluster/ci/legacy/tasks/branches/try/job_flags.yml
@@ -6,22 +6,16 @@
   from: tasks/branches/base_jobs.yml
 
 # Flags specific to this branch
 flags:
   post-build:
     - upload-symbols
 
 builds:
-  android-partner-sample1:
-    platforms:
-      - Android
-    types:
-      opt:
-        task: tasks/builds/android_api_15_partner_sample1.yml
   linux:
     platforms:
       - Linux
     types:
       opt:
         task: tasks/builds/opt_linux32.yml
       debug:
         task: tasks/builds/dbg_linux32.yml
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/source-check/doc.yml
@@ -0,0 +1,35 @@
+sphinx:
+    description: Generate the Sphinx documentation
+    attributes:
+        build_platform: sphinx
+        build_type: opt
+    treeherder:
+        symbol: tc(Doc)
+        kind: test
+        tier: 1
+        platform: lint/opt
+    worker-type: aws-provisioner-v1/b2gtest
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: "lint"}
+        max-run-time: 1800
+        artifacts:
+            - type: file
+              name: public/docs.tar.gz
+              path: /home/worker/checkouts/gecko/docs.tar.gz
+    run:
+        using: run-task
+        command: >
+            cd /home/worker/checkouts/gecko &&
+            ./mach doc --outdir docs-out --no-open &&
+            rm -rf docs-out/html/Mozilla_Source_Tree_Docs/_venv &&
+            mv docs-out/html/Mozilla_Source_Tree_Docs docs &&
+            tar -czf docs.tar.gz docs
+    run-on-projects:
+        - integration
+        - release
+    optimizations:
+        only-if-files-changed:
+            - '**/*.py'
+            - '**/*.rst'
+            - 'tools/docs/**'
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/source-check/eslint.yml
@@ -0,0 +1,44 @@
+eslint-gecko:
+    description: ESLint run over the Gecko codebase
+    attributes:
+        build_platform: eslint-gecko
+        build_type: opt
+    treeherder:
+        symbol: ES
+        kind: test
+        tier: 1
+        platform: lint/opt
+    worker-type: aws-provisioner-v1/b2gtest
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: "lint"}
+        max-run-time: 1800
+    run:
+        using: run-task
+        command: >
+            cd /home/worker/checkouts/gecko/tools/lint/eslint &&
+            /build/tooltool.py fetch -m manifest.tt &&
+            tar xvfz eslint.tar.gz &&
+            rm eslint.tar.gz &&
+            ln -s ../eslint-plugin-mozilla node_modules &&
+            cd ../../.. &&
+            tools/lint/eslint/node_modules/.bin/eslint --quiet --plugin html --ext [.js,.jsm,.jsx,.xml,.html,.xhtml] -f tools/lint/eslint-formatter .
+    run-on-projects:
+        - integration
+        - release
+    optimizations:
+        only-if-files-changed:
+            # Files that are likely audited.
+            - '**/*.js'
+            - '**/*.jsm'
+            - '**/*.jsx'
+            - '**/*.html'
+            - '**/*.xml'
+            # Run when eslint policies change.
+            - '**/.eslintignore'
+            - '**/*eslintrc*'
+            # The plugin implementing custom checks.
+            - 'tools/lint/eslint/eslint-plugin-mozilla/**'
+            # Other misc lint related files.
+            - 'tools/lint/**'
+            - 'testing/docker/lint/**'
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/source-check/kind.yml
@@ -0,0 +1,14 @@
+# Source-checks are tasks that look at the Gecko source directly to check
+# correctness.  This can include linting, Python unit tests, source-code
+# analysis, or measurement work.
+implementation: taskgraph.task.transform:TransformTask
+
+jobs-from:
+    - python-tests.yml
+    - python-lint.yml
+    - eslint.yml
+    - doc.yml
+
+transforms:
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/source-check/python-lint.yml
@@ -0,0 +1,61 @@
+flake8-gecko:
+    description: flake8 run over the gecko codebase
+    attributes:
+        build_platform: flake8-gecko
+        build_type: opt
+    treeherder:
+        symbol: f8
+        kind: test
+        tier: 2
+        platform: lint/opt
+    worker-type: aws-provisioner-v1/b2gtest
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: "lint"}
+        max-run-time: 1800
+    run:
+        using: run-task
+        command: >
+            cd /home/worker/checkouts/gecko &&
+            ./mach lint -l flake8 -f treeherder
+    run-on-projects:
+        - integration
+        - release
+    optimizations:
+        only-if-files-changed:
+            - '**/*.py'
+            - '**/.flake8'
+            - 'python/mozlint/**'
+            - 'tools/lint/**'
+            - 'testing/docker/lint/**'
+
+wptlint-gecko:
+    description: web-platform-tests linter
+    # TODO (taskdiff): remove these from anything but builds
+    attributes:
+        build_platform: wptlint-gecko
+        build_type: opt
+    treeherder:
+        symbol: W
+        kind: test
+        tier: 2
+        platform: lint/opt
+    worker-type: aws-provisioner-v1/b2gtest
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: "lint"}
+        max-run-time: 1800
+    run:
+        using: run-task
+        command: >
+            cd /home/worker/checkouts/gecko &&
+            ./mach lint -l wpt -f treeherder
+    run-on-projects:
+        - integration
+        - release
+    optimizations:
+        only-if-files-changed:
+            - 'testing/web-platform/tests/**'
+            - 'python/mozlint/**'
+            - 'tools/lint/**'
+            - 'testing/docker/lint/**'
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/source-check/python-tests.yml
@@ -0,0 +1,55 @@
+taskgraph-tests:
+    description: taskcluster/taskgraph unit tests
+    attributes:
+        build_platform: taskgraph-tests
+        build_type: opt
+    treeherder:
+        symbol: tg
+        kind: test
+        tier: 2
+        platform: linux64/opt
+    worker-type: aws-provisioner-v1/b2gtest
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: "lint"}
+        max-run-time: 1800
+    run:
+        using: mach
+        mach: taskgraph python-tests
+    run-on-projects:
+        - integration
+        - release
+    optimizations:
+        only-if-files-changed:
+            - 'taskcluster/**/*.py'
+            - 'config/mozunit.py'
+            - 'python/mach/**/*.py'
+
+mozharness:
+    description: mozharness integration tests
+    attributes:
+        build_platform: mozharness
+        build_type: opt
+    treeherder:
+        symbol: MH
+        kind: test
+        tier: 2
+        platform: lint/opt
+    worker-type: aws-provisioner-v1/b2gtest
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: "lint"}
+        max-run-time: 1800
+    run:
+        using: run-task
+        cache-dotcache: true
+        command: >
+            cd /home/worker/checkouts/gecko/testing/mozharness &&
+            /usr/bin/pip2 install tox &&
+            /home/worker/.local/bin/tox -e py27-hg3.7
+    run-on-projects:
+        - integration
+        - release
+    optimizations:
+        only-if-files-changed:
+            - 'testing/mozharness/**'
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/spidermonkey/kind.yml
@@ -0,0 +1,291 @@
+implementation: taskgraph.task.transform:TransformTask
+
+transforms:
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
+
+job-defaults:
+    treeherder:
+        platform: linux64/opt
+        kind: build
+        tier: 1
+    index:
+        product: firefox
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        max-run-time: 36000
+        docker-image: {in-tree: desktop-build}
+
+jobs:
+    sm-package/opt:
+        description: "Spidermonkey Plain"
+        # TODO (taskdiff): remove these attributes
+        attributes:
+            build_platform: sm-package
+            build_type: opt
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-package.opt
+                gecko-v2: sm-package-opt
+        treeherder:
+            symbol: SM-tc(pkg)
+        run:
+            using: spidermonkey-package
+            spidermonkey-variant: plain
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - build/**
+                - configure.py
+                - dom/bindings/**
+                - intl/icu/**
+                - js/moz.configure
+                - js/public/**
+                - js/src/**
+                - layout/tools/reftest/reftest/**
+                - media/webrtc/trunk/tools/gyp/**
+                - memory/**
+                - mfbt/**
+                - modules/fdlibm/**
+                - modules/zlib/src/**
+                - mozglue/**
+                - moz.configure
+                - nsprpub/**
+                - python/**
+                - taskcluster/moz.build
+                - testing/mozbase/**
+                - test.mozbuild
+                - toolkit/mozapps/installer/package-name.mk
+                - toolkit/mozapps/installer/upload-files.mk
+
+    sm-plain/debug:
+        description: "Spidermonkey Plain"
+        attributes:
+            build_platform: sm-plain
+            build_type: debug
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-plaindebug.debug
+                gecko-v2: sm-plaindebug-debug
+        treeherder:
+            platform: linux64/debug
+            symbol: SM-tc(p)
+        run:
+            using: spidermonkey
+            spidermonkey-variant: plaindebug
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    sm-plain/opt:
+        description: "Spidermonkey Plain"
+        attributes:
+            build_platform: sm-plain
+            build_type: opt
+        index:
+            job-name: sm-plain-opt
+        treeherder:
+            symbol: SM-tc(p)
+        run:
+            using: spidermonkey
+            spidermonkey-variant: plain
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    sm-arm-sim/debug:
+        description: "Spidermonkey ARM sim"
+        attributes:
+            build_platform: sm-arm-sim
+            build_type: debug
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-arm-sim.debug
+                gecko-v2: sm-arm-sim-debug
+        treeherder:
+            symbol: SM-tc(arm)
+        run:
+            using: spidermonkey
+            spidermonkey-variant: arm-sim
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    sm-arm64-sim/debug:
+        description: "Spidermonkey ARM64 sim"
+        attributes:
+            build_platform: sm-arm64-sim
+            build_type: debug
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-arm64-sim.debug
+                gecko-v2: sm-arm64-sim-debug
+        treeherder:
+            symbol: SM-tc(arm64)
+        run:
+            using: spidermonkey
+            spidermonkey-variant: arm64-sim
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    sm-asan/opt:
+        description: "Spidermonkey Address Sanitizer"
+        attributes:
+            build_platform: sm-asan
+            build_type: opt
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-asan.opt
+                gecko-v2: sm-asan-opt
+        treeherder:
+            symbol: SM-tc(asan)
+        run:
+            using: spidermonkey
+            spidermonkey-variant: asan
+            tooltool-manifest: browser/config/tooltool-manifests/linux64/asan.manifest
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    sm-compacting/debug:
+        description: "Spidermonkey Compacting"
+        attributes:
+            build_platform: sm-compacting
+            build_type: debug
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-compacting.debug
+                gecko-v2: sm-compacting-debug
+        treeherder:
+            symbol: SM-tc(cgc)
+        run:
+            using: spidermonkey
+            spidermonkey-variant: compacting
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    sm-msan/opt:
+        description: "Spidermonkey Memory Sanitizer"
+        attributes:
+            build_platform: sm-msan
+            build_type: opt
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-msan.opt
+                gecko-v2: sm-msan-opt
+        treeherder:
+            symbol: SM-tc(msan)
+            tier: 1
+        run:
+            using: spidermonkey
+            spidermonkey-variant: msan
+            tooltool-manifest: browser/config/tooltool-manifests/linux64/msan.manifest
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    sm-tsan/opt:
+        description: "Spidermonkey Thread Sanitizer"
+        attributes:
+            build_platform: sm-tsan
+            build_type: opt
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-tsan.opt
+                gecko-v2: sm-tsan-opt
+        treeherder:
+            symbol: SM-tc(tsan)
+            tier: 3
+        run-on-projects: []
+        run:
+            using: spidermonkey
+            spidermonkey-variant: tsan
+            tooltool-manifest: browser/config/tooltool-manifests/linux64/tsan.manifest
+
+    sm-rootanalysis/debug:
+        description: "Spidermonkey Root Analysis"
+        attributes:
+            build_platform: sm-rootanalysis
+            build_type: debug
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-rootanalysis.debug
+                gecko-v2: sm-rootanalysis-debug
+        treeherder:
+            symbol: SM-tc(r)
+        run:
+            using: spidermonkey
+            spidermonkey-variant: rootanalysis
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
+
+    sm-nonunified/debug:
+        description: "Spidermonkey Non-Unified Debug"
+        attributes:
+            build_platform: sm-nonunified
+            build_type: debug
+        index:
+            job-name:
+                buildbot: sm-plain
+                gecko-v1: sm-nonunified.debug
+                gecko-v2: sm-nonunified-debug
+        treeherder:
+            platform: linux64/debug
+            symbol: SM-tc(nu)
+        run:
+            using: spidermonkey
+            spidermonkey-variant: nonunified
+        run-on-projects:
+            - integration
+            - release
+        optimizations:
+            only-if-files-changed:
+                - js/public/**
+                - js/src/**
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/static-analysis/kind.yml
@@ -0,0 +1,62 @@
+implementation: taskgraph.task.transform:TransformTask
+
+transforms:
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
+
+job-defaults:
+    index:
+        product: firefox
+    treeherder:
+        symbol: S
+        kind: build
+        tier: 1
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: desktop-build}
+        max-run-time: 36000
+
+jobs:
+    macosx64-st-an/opt:
+        description: "MacOS X x64 Cross-compile Static Analysis"
+        attributes:
+            build_platform: macosx64-st-an
+            build_type: opt
+        index:
+            job-name: macosx64-st-an-opt
+        treeherder:
+            platform: osx-10-7/opt
+        worker-type: aws-provisioner-v1/opt-macosx64
+        run:
+            using: mozharness
+            actions: [get-secrets build generate-build-stats update]
+            config:
+                - builds/releng_base_mac_64_cross_builds.py
+                - balrog/production.py
+            custom-build-variant-cfg: cross-opt
+            script: "mozharness/scripts/fx_desktop_build.py"
+            secrets: ['*']
+            tooltool-downloads: internal
+            keep-artifacts: false
+
+    linux64-st-an/opt:
+        description: "Linux64 Opt Static Analysis"
+        attributes:
+            build_platform: linux64-st-an
+            build_type: opt
+        index:
+            job-name: linux64-st-an-opt
+        treeherder:
+            platform: linux64/opt
+        worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+        run:
+            using: mozharness
+            actions: [get-secrets build generate-build-stats]
+            config:
+                - builds/releng_sub_linux_configs/64_stat_and_opt.py
+                - balrog/production.py
+            script: "mozharness/scripts/fx_desktop_build.py"
+            secrets: ['*']
+            tooltool-downloads: public
+            need-xvfb: true
+            keep-artifacts: false
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/toolchain/kind.yml
@@ -0,0 +1,51 @@
+# Toolchain tasks build the compilers and other tools (such as Clang and GCC)
+# used by the build tasks themselves.
+implementation: taskgraph.task.transform:TransformTask
+
+job-defaults:
+    description: toolchain build
+    treeherder:
+        kind: build
+        tier: 2
+        platform: linux64/opt
+    worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+    worker:
+        implementation: docker-worker
+        docker-image: {in-tree: desktop-build}
+        max-run-time: 36000
+
+jobs:
+    clang:
+        # TODO(taskdiff): don't require these for toolchain builds
+        attributes:
+            build_platform: linux64-clang
+            build_type: opt
+        treeherder:
+            symbol: Cc(Clang)
+        run:
+            using: toolchain-script
+            script: build-clang-linux.sh
+        optimizations:
+            only-if-files-changed:
+                - 'build/build-clang/**'
+                - 'taskcluster/scripts/misc/build-clang-linux.sh'
+
+    gcc:
+        # TODO(taskdiff): don't require these for toolchain builds
+        attributes:
+            build_platform: linux64-gcc
+            build_type: opt
+        treeherder:
+            symbol: Cc(GCC)
+        run:
+            using: toolchain-script
+            script: build-gcc-linux.sh
+        optimizations:
+            only-if-files-changed:
+                - 'build/unix/build-gcc/**'
+                - 'taskcluster/scripts/misc/build-gcc-linux.sh'
+
+transforms:
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/upload-symbols/job-template.yml
@@ -0,0 +1,21 @@
+label: # see transforms
+description: Upload Symbols
+attributes: # TODO: necessary?
+    kind: upload-symbols
+dependencies: # see transforms
+expires-after: 7 days
+deadline-after: 24 hours
+run-on-projects:
+    - try
+worker-type: aws-provisioner-v1/symbol-upload
+worker:
+    implementation: docker-worker
+    max-run-time: 600
+    command: ["/bin/bash", "bin/upload.sh"]
+    docker-image: taskclusterprivate/upload_symbols:0.0.3
+    env:
+        GECKO_HEAD_REPOSITORY: # see transforms
+        GECKO_HEAD_REV: # see transforms
+        ARTIFACT_TASKID: {"task-reference": "<build>"}
+scopes:
+    - docker-worker:image:taskclusterprivate/upload_symbols:0.0.3
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/upload-symbols/kind.yml
@@ -0,0 +1,17 @@
+# Upload-symbols jobs are post-build tasks that upload the symbols files
+# generated by build tasks to Socorro for later use in crash analysis.
+implementation: taskgraph.task.post_build:PostBuildTask
+
+kind-dependencies:
+    - build
+
+job-template: job-template.yml
+
+only-for-build-platforms:
+    - linux64/opt
+    - linux64/debug
+    - android-api-15/opt
+
+transforms:
+   - taskgraph.transforms.upload_symbols:transforms
+   - taskgraph.transforms.task:transforms
new file mode 100644
--- /dev/null
+++ b/taskcluster/ci/valgrind/kind.yml
@@ -0,0 +1,39 @@
+# Valgrind tasks produce builds instrumented by valgrind.
+implementation: taskgraph.task.transform:TransformTask
+
+transforms:
+   - taskgraph.transforms.job:transforms
+   - taskgraph.transforms.task:transforms
+
+jobs:
+    linux64-valgrind/opt:
+        description: "Linux64 Valgrind Opt"
+        attributes: # TODO(taskdiff): remove
+            build_platform: linux64-valgrind
+            build_type: opt
+        index:
+            product: firefox
+            job-name: linux64-valgrind-opt
+        treeherder:
+            platform: linux64/opt
+            symbol: tc(V)
+            kind: build
+            tier: 1
+        worker-type: aws-provisioner-v1/gecko-{level}-b-linux
+        worker:
+            implementation: docker-worker
+            docker-image: {in-tree: desktop-build}
+            max-run-time: 72000
+        run:
+            using: mozharness
+            actions: [get-secrets build valgrind-test generate-build-stats]
+            custom-build-variant-cfg: valgrind
+            config:
+                - builds/releng_base_linux_64_builds.py
+                - balrog/production.py
+            script: "mozharness/scripts/fx_desktop_build.py"
+            secrets: ['*']
+            tooltool-downloads: public
+            need-xvfb: true
--- a/taskcluster/docs/attributes.rst
+++ b/taskcluster/docs/attributes.rst
@@ -14,16 +14,28 @@ The attributes, and acceptable values, a
 names and values are the short, lower-case form, with underscores.
 
 kind
 ====
 
 A task's ``kind`` attribute gives the name of the kind that generated it, e.g.,
 ``build`` or ``legacy``.
 
+
+run_on_projects
+===============
+
+The projects where this task should be in the target task set.  This is how
+requirements like "only run this on inbound" get implemented.  These are
+either project names or one of the following aliases:
+
+ * `integration` -- integration branches
+ * `release` -- release branches including mozilla-central
+ * `all` -- everywhere (the default)
+
 task_duplicates
 ===============
 
 This is used to indicate that we want multiple copies of the task created.
 This feature is used to track down intermittent job failures.
 
 If this value is set to N, the task-creation machinery will create a total of N
 copies of the task.  Only the first copy will be included in the taskgraph
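The `run_on_projects` matching described above could be resolved roughly as follows. This is a hedged sketch: the helper name and the membership of the alias sets are assumptions for illustration, not the actual taskgraph implementation.

```python
# Sketch of run-on-projects alias resolution (assumed names, not the real API).
INTEGRATION = {'mozilla-inbound', 'autoland'}
RELEASE = {'mozilla-central', 'mozilla-aurora', 'mozilla-beta', 'mozilla-release'}


def match_run_on_projects(project, run_on_projects):
    """Return True if a task with the given run-on-projects list should be
    in the target task set for `project`."""
    for entry in run_on_projects:
        if entry == 'all':
            # 'all' matches every project and is the default.
            return True
        elif entry == 'integration':
            if project in INTEGRATION:
                return True
        elif entry == 'release':
            if project in RELEASE:
                return True
        elif entry == project:
            # A bare project name matches only that project.
            return True
    return False
```

For example, a task with `run-on-projects: [integration, release]` would be targeted on autoland but not on try.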
--- a/taskcluster/taskgraph/decision.py
+++ b/taskcluster/taskgraph/decision.py
@@ -27,29 +27,30 @@ logger = logging.getLogger(__name__)
 ARTIFACTS_DIR = 'artifacts'
 GECKO = os.path.realpath(os.path.join(__file__, '..', '..', '..'))
 
 # For each project, this gives a set of parameters specific to the project.
 # See `taskcluster/docs/parameters.rst` for information on parameters.
 PER_PROJECT_PARAMETERS = {
     'try': {
         'target_tasks_method': 'try_option_syntax',
-        # for try, if a task was specified as a target, it should
-        # not be optimized away
-        'optimize_target_tasks': False,
+        # Always perform optimization.  This makes it difficult to use try
+        # pushes to run a task that would otherwise be optimized, but is a
+        # compromise to avoid essentially disabling optimization in try.
+        'optimize_target_tasks': True,
     },
 
     'ash': {
-        'target_tasks_method': 'ash_tasks',
+        'target_tasks_method': 'ash_tasks',
         'optimize_target_tasks': True,
     },
 
     # the default parameters are used for projects that do not match above.
     'default': {
-        'target_tasks_method': 'all_builds_and_tests',
+        'target_tasks_method': 'default',
         'optimize_target_tasks': True,
     }
 }
 
 
 def taskgraph_decision(options):
     """
     Run the decision task.  This function implements `mach taskgraph decision`,
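The per-project lookup falls back to the `default` entry for projects without a specific configuration; a minimal sketch of that lookup (function name assumed):

```python
# Trimmed copy of the parameter table above, for illustration only.
PER_PROJECT_PARAMETERS = {
    'try': {
        'target_tasks_method': 'try_option_syntax',
        'optimize_target_tasks': True,
    },
    'default': {
        'target_tasks_method': 'default',
        'optimize_target_tasks': True,
    },
}


def parameters_for_project(project):
    """Return the decision parameters for `project`, using the
    'default' entry when no project-specific entry exists."""
    if project in PER_PROJECT_PARAMETERS:
        return PER_PROJECT_PARAMETERS[project]
    return PER_PROJECT_PARAMETERS['default']
```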
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/files_changed.py
@@ -0,0 +1,65 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+
+"""
+Support for optimizing tasks based on the set of files that have changed.
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import logging
+import requests
+from redo import retry
+from mozpack.path import match as mozpackmatch
+
+logger = logging.getLogger(__name__)
+
+
+# TODO: tests!
+
+
+def get_changed_files(repository, revision, _cache={}):
+    """
+    Get the set of files changed in the push headed by the given revision.
+    Responses are cached, so multiple calls with the same arguments are OK.
+    """
+    if (repository, revision) not in _cache:
+        url = '%s/json-automationrelevance/%s' % (repository.rstrip('/'), revision)
+        logger.debug("Querying version control for metadata: %s", url)
+        response = retry(requests.get, attempts=2, sleeptime=10,
+                         args=(url,), kwargs={'timeout': 5})
+        contents = response.json()
+
+        logger.debug('{} commits influencing task scheduling:'
+                     .format(len(contents['changesets'])))
+        changed_files = set()
+        for c in contents['changesets']:
+            logger.debug(" {cset} {desc}".format(
+                cset=c['node'][0:12],
+                desc=c['desc'].splitlines()[0].encode('ascii', 'ignore')))
+            changed_files |= set(c['files'])
+
+        _cache[repository, revision] = changed_files
+    return _cache[repository, revision]
+
+
+def check(params, file_patterns):
+    """
+    Determine whether any of the files changed in the indicated push to
+    https://hg.mozilla.org match any of the given file patterns.
+    """
+    repository = params.get('head_repository')
+    revision = params.get('head_rev')
+    if not repository or not revision:
+        logger.warning("Missing `head_repository` or `head_rev` parameters; assuming all files have changed")
+        return True
+
+    changed_files = get_changed_files(repository, revision)
+
+    for pattern in file_patterns:
+        for path in changed_files:
+            if mozpackmatch(path, pattern):
+                return True
+
+    return False
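The memoization in `get_changed_files` keys a cache on `(repository, revision)` so that repeated only-if-files-changed checks within one decision-task run fetch the push metadata only once. A standalone sketch of the same pattern, with a stubbed fetch function since the real code queries hg.mozilla.org:

```python
# Cache keyed on (repository, revision), mirroring get_changed_files above.
_cache = {}


def cached_changed_files(repository, revision, fetch):
    """Return the set of changed files for a push, fetching at most once
    per (repository, revision) pair."""
    key = (repository, revision)
    if key not in _cache:
        _cache[key] = set(fetch(repository, revision))
    return _cache[key]


# Stub fetch that records how many times it is invoked.
calls = []


def fake_fetch(repo, rev):
    calls.append((repo, rev))
    return ['js/src/jit/foo.cpp', 'taskcluster/ci/valgrind/kind.yml']


files = cached_changed_files('https://hg.mozilla.org/try', 'abc123', fake_fetch)
again = cached_changed_files('https://hg.mozilla.org/try', 'abc123', fake_fetch)
# The second call is served from the cache; fake_fetch runs only once.
```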
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/base.py
@@ -0,0 +1,70 @@
+# TODO: rename to registry? move to transforms.job?
+
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Registry for job implementations referenced from job descriptions'
+`job.run.using`.  See taskgraph.transforms.job.
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import os
+
+from taskgraph.transforms.base import validate_schema
+
+registry = {}
+
+
+def run_job_using(worker_implementation, run_using, schema=None):
+    """Register the decorated function to set up a task description this task
+    using for the given kind of run (`run_using`) on the given worker
+    implementation (`worker_implementation`).  If given, the job's run field
+    will be verified to match the `schema`.
+
+    The decorated function should have the signature `using_foo(config, job,
+    taskdesc)` and should modify the task description in-place.  The skeleton of
+    the task description is already set up, but without a payload."""
+    def wrap(func):
+        for_run_using = registry.setdefault(run_using, {})
+        if worker_implementation in for_run_using:
+            raise Exception("run_job_using({!r}, {!r}) already exists: {!r}".format(
+                run_using, worker_implementation, for_run_using[worker_implementation]))
+        for_run_using[worker_implementation] = (func, schema)
+        return func
+    return wrap
+
+
+def configure_taskdesc_for_run(config, job, taskdesc):
+    """Run the appropriate functino for this job against the given task description.
+
+    This will raise an appropriate error if no function exists, or if the job's
+    run is not valid according to the schema."""
+    run_using = job['run']['using']
+    if run_using not in registry:
+        raise Exception("no functions for run.using {!r}".format(run_using))
+
+    worker_implementation = job['worker']['implementation']
+    if worker_implementation not in registry[run_using]:
+        raise Exception("no functions for run.using {!r} on {!r}".format(
+            run_using, worker_implementation))
+
+    func, schema = registry[run_using][worker_implementation]
+    if schema:
+        job['run'] = validate_schema(
+                schema, job['run'],
+                "In job.run using {!r} for job {!r}:".format(
+                    job['run']['using'], job['label']))
+
+    func(config, job, taskdesc)
+
+
+def import_all():
+    """Import all modules that are siblings of this one, triggering the decorator
+    above in the process."""
+    # TODO: enumerate these from some regex over the docs, to make sure everything
+    # is documented?
+    for f in os.listdir(os.path.dirname(__file__)):
+        if f.endswith('.py') and f != 'base.py':
+            __import__('taskgraph.jobs.' + f[:-3])
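The `run-using` dispatch that `base.py` implements can be sketched stand-alone. This is an illustration only, not the actual taskgraph API: the real registry also carries a voluptuous schema per entry, and the `mach` handler here is a toy.

```python
# Minimal sketch of the run-using dispatch: functions register themselves
# per (run_using, worker_implementation) pair, and jobs are routed to the
# matching function, which fills in the task description in place.

registry = {}

def run_job_using(worker_implementation, run_using):
    """Register the decorated function for this (run, worker) pair."""
    def wrap(func):
        for_run_using = registry.setdefault(run_using, {})
        if worker_implementation in for_run_using:
            raise Exception("duplicate registration for {!r} on {!r}".format(
                run_using, worker_implementation))
        for_run_using[worker_implementation] = func
        return func
    return wrap

@run_job_using("docker-worker", "mach")
def docker_worker_mach(config, job, taskdesc):
    # modify the task description in place, as the real implementations do
    taskdesc['command'] = './mach ' + job['run']['mach']

def configure_taskdesc_for_run(config, job, taskdesc):
    func = registry[job['run']['using']][job['worker']['implementation']]
    func(config, job, taskdesc)

job = {'run': {'using': 'mach', 'mach': 'build'},
       'worker': {'implementation': 'docker-worker'}}
taskdesc = {}
configure_taskdesc_for_run(None, job, taskdesc)
# taskdesc['command'] is now './mach build'
```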
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/common.py
@@ -0,0 +1,21 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Common support for various job types.
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+
+def add_workspace_cache(config, taskdesc, worker):
+    if config.params['project'] != 'try':
+        worker['caches'].append({
+            'type': 'persistent',
+            'name': 'level-{}-{}-build-{}-{}-workspace'.format(
+                config.params['level'], config.params['project'],
+                taskdesc['attributes']['build_platform'],
+                taskdesc['attributes']['build_type'],
+                ),
+            'mount-point': "/home/worker/workspace",
+        })
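As a hedged illustration, the cache name `add_workspace_cache` generates for a hypothetical level-3 mozilla-central linux64/opt build would be:

```python
# Illustrative only: the parameter values (level 3, mozilla-central,
# linux64, opt) are hypothetical, chosen to show the naming scheme.
name = 'level-{}-{}-build-{}-{}-workspace'.format(
    3, 'mozilla-central', 'linux64', 'opt')
assert name == 'level-3-mozilla-central-build-linux64-opt-workspace'
```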
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/hazard.py
@@ -0,0 +1,93 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Support for running hazard jobs via dedicated scripts
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import time
+from voluptuous import Schema, Required, Optional
+
+from taskgraph.jobs.base import run_job_using
+from taskgraph.jobs.common import add_workspace_cache
+
+haz_run_schema = Schema({
+    Required('using'): 'hazard',
+
+    # The command to run within the task image (passed through to the worker)
+    Required('command'): basestring,
+
+    # The tooltool manifest to use; default in the script is used if omitted
+    Optional('tooltool-manifest'): basestring,
+
+    # The mozconfig to use; default in the script is used if omitted
+    Optional('mozconfig'): basestring,
+
+    # If true, this job needs secret access (TODO (taskdiff): remove)
+    Optional('uses-secrets'): bool,
+})
+
+
+@run_job_using("docker-worker", "hazard", schema=haz_run_schema)
+def docker_worker_hazard(config, job, taskdesc):
+    run = job['run']
+
+    worker = taskdesc['worker']
+    worker['artifacts'] = []
+    worker['caches'] = []
+
+    worker['artifacts'].append({
+        'name': 'public/build',
+        'path': '/home/worker/artifacts/',
+        'type': 'directory',
+    })
+
+    worker['caches'].append({
+        'type': 'persistent',
+        'name': 'level-{}-{}-tc-vcs'.format(
+            config.params['level'], config.params['project']),
+        'mount-point': "/home/worker/.tc-vcs",
+    })
+
+    add_workspace_cache(config, taskdesc, worker)
+
+    # include secrets access
+    # TODO (taskdiff): this is unused? remove it
+    if run.get('uses-secrets'):
+        worker['taskcluster-proxy'] = True
+        taskdesc['scopes'].append(
+            'secrets:get:project/releng/gecko/build/level-{}/*'.format(config.params['level']))
+
+    env = worker['env']
+    env.update({
+        'GECKO_BASE_REPOSITORY': config.params['base_repository'],
+        'GECKO_HEAD_REF': config.params['head_rev'],
+        'GECKO_HEAD_REPOSITORY': config.params['head_repository'],
+        'GECKO_HEAD_REV': config.params['head_rev'],
+        'MOZ_BUILD_DATE': time.strftime("%Y%m%d%H%M%S", time.gmtime(config.params['pushdate'])),
+        'MOZ_SCM_LEVEL': config.params['level'],
+    })
+
+    # script parameters
+    if run.get('tooltool-manifest'):
+        env['TOOLTOOL_MANIFEST'] = run['tooltool-manifest']
+    if run.get('mozconfig'):
+        env['MOZCONFIG'] = run['mozconfig']
+
+    # tooltool downloads
+    worker['caches'].append({
+        'type': 'persistent',
+        'name': 'tooltool-cache',
+        'mount-point': '/home/worker/tooltool-cache',
+    })
+    worker['relengapi-proxy'] = True
+    taskdesc['scopes'].extend([
+        'docker-worker:relengapi-proxy:tooltool.download.public',
+    ])
+    env['TOOLTOOL_CACHE'] = '/home/worker/tooltool-cache'
+    env['TOOLTOOL_REPO'] = 'https://github.com/mozilla/build-tooltool'
+    env['TOOLTOOL_REV'] = 'master'
+
+    worker['command'] = ["/bin/bash", "-c", run['command']]
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/mach.py
@@ -0,0 +1,29 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Support for running mach tasks (via run-task)
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+from taskgraph.jobs.base import run_job_using
+from taskgraph.jobs.run_task import docker_worker_run_task
+from voluptuous import Schema, Required
+
+mach_schema = Schema({
+    Required('using'): 'mach',
+
+    # The mach command (omitting `./mach`) to run
+    Required('mach'): basestring,
+})
+
+
+@run_job_using("docker-worker", "mach", schema=mach_schema)
+def docker_worker_mach(config, job, taskdesc):
+    run = job['run']
+
+    # defer to the run_task implementation
+    run['command'] = 'cd /home/worker/checkouts/gecko && ./mach ' + run['mach']
+    del run['mach']
+    docker_worker_run_task(config, job, taskdesc)
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/mozharness.py
@@ -0,0 +1,205 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+
+Support for running jobs via mozharness.  Ideally, most stuff gets run this
+way, and certainly anything using mozharness should use this approach.
+
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import time
+from voluptuous import Schema, Required, Optional, Any
+
+from taskgraph.jobs.base import run_job_using
+from taskgraph.jobs.common import add_workspace_cache
+
+SECRET_SCOPE = 'secrets:get:project/releng/gecko/{}/level-{}/{}'
+COALESCE_KEY = 'builds.{project}.{name}'
+
+mozharness_run_schema = Schema({
+    Required('using'): 'mozharness',
+
+    # the mozharness script used to run this task
+    Required('script'): basestring,
+
+    # the config files required for the task
+    Required('config'): [basestring],
+
+    # any additional actions to pass to the mozharness command
+    Optional('actions'): [basestring],
+
+    # any additional options (without leading --) to be passed to mozharness
+    Optional('options'): [basestring],
+
+    # --custom-build-variant-cfg value
+    Optional('custom-build-variant-cfg'): basestring,
+
+    # feature flags -- not available on all worker implementations!
+
+    # If not false, tooltool downloads will be enabled via relengAPIProxy
+    # for either just public files, or all files.
+    Required('tooltool-downloads', default=False): Any(
+        False,
+        'public',
+        'internal',
+    ),
+
+    # The set of secret names to which the task has access; these are prefixed
+    # with `project/releng/gecko/{treeherder.kind}/level-{level}/`.   Setting
+    # this will enable any worker features required and set the task's scopes
+    # appropriately.  Often this is just ['*'].
+    Optional('secrets'): [basestring],
+
+    # If true, taskcluster proxy will be enabled; note that it may also be enabled
+    # automatically e.g., for secrets support
+    Required('taskcluster-proxy', default=False): bool,
+
+    # If true, the build scripts will start Xvfb
+    Required('need-xvfb', default=False): bool,
+
+    # If false, indicate that builds should skip producing artifacts
+    Required('keep-artifacts', default=True): bool,
+})
+
+# can probably use the same transform for docker-engine, too:
+# @run_job_using("docker-engine", "mozharness-via-build.sh")
+
+
+@run_job_using("docker-worker", "mozharness", schema=mozharness_run_schema)
+def mozharness_on_docker_worker_setup(config, job, taskdesc):
+    run = job['run']
+
+    worker = taskdesc['worker']
+    worker['implementation'] = job['worker']['implementation']
+
+    # mozharness-via-build.sh assumes desktop-build (which contains build.sh)
+    taskdesc['dependencies']['docker-image'] = 'build-docker-image-desktop-build'
+
+    worker['relengapi-proxy'] = False  # but maybe enabled for tooltool below
+    worker['taskcluster-proxy'] = run.get('taskcluster-proxy')
+
+    worker['artifacts'] = [{
+        'name': 'public/build',
+        'path': '/home/worker/artifacts/',
+        'type': 'directory',
+    }]
+
+    worker['caches'] = [{
+        'type': 'persistent',
+        'name': 'level-{}-{}-tc-vcs'.format(
+            config.params['level'], config.params['project']),
+        'mount-point': "/home/worker/.tc-vcs",
+    }]
+
+    add_workspace_cache(config, taskdesc, worker)
+
+    env = worker.setdefault('env', {})
+    env.update({
+        'GECKO_BASE_REPOSITORY': config.params['base_repository'],
+        'GECKO_HEAD_REF': config.params['head_rev'],
+        'GECKO_HEAD_REPOSITORY': config.params['head_repository'],
+        'GECKO_HEAD_REV': config.params['head_rev'],
+        'MOZHARNESS_CONFIG': ' '.join(run['config']),
+        'MOZHARNESS_SCRIPT': run['script'],
+        'MH_BRANCH': config.params['project'],
+        'MH_BUILD_POOL': 'taskcluster',
+        'MOZ_BUILD_DATE': time.strftime("%Y%m%d%H%M%S", time.gmtime(config.params['pushdate'])),
+        'MOZ_SCM_LEVEL': config.params['level'],
+    })
+
+    if 'actions' in run:
+        env['MOZHARNESS_ACTIONS'] = ' '.join(run['actions'])
+
+    # TODO: turn these into `extra-options` like for tests
+    if 'options' in run:
+        env['MOZHARNESS_OPTIONS'] = ' '.join(run['options'])
+
+    if 'custom-build-variant-cfg' in run:
+        env['MH_CUSTOM_BUILD_VARIANT_CFG'] = run['custom-build-variant-cfg']
+
+    # if we're not keeping artifacts, set some env variables to empty values
+    # that will cause the build process to skip copying the results to the
+    # artifacts directory.  This will have no effect for operations that are
+    # not builds.
+    if not run['keep-artifacts']:
+        env['DIST_TARGET_UPLOADS'] = ''
+        env['DIST_UPLOADS'] = ''
+
+    # Xvfb
+    if run['need-xvfb']:
+        env['NEED_XVFB'] = 'true'
+
+    # coalesce / superseding
+    if 'coalesce-name' in job and int(config.params['level']) > 1:
+        key = COALESCE_KEY.format(
+            project=config.params['project'],
+            name=job['coalesce-name'])
+        worker["superseder-url"] = "https://coalesce.mozilla-releng.net/v1/list/" + key
+
+    # tooltool downloads
+    if run['tooltool-downloads']:
+        worker['relengapi-proxy'] = True
+        worker['caches'].append({
+            'type': 'persistent',
+            'name': 'tooltool-cache',
+            'mount-point': '/home/worker/tooltool-cache',
+        })
+        taskdesc['scopes'].extend([
+            'docker-worker:relengapi-proxy:tooltool.download.public',
+        ])
+        if run['tooltool-downloads'] == 'internal':
+            taskdesc['scopes'].append(
+                'docker-worker:relengapi-proxy:tooltool.download.internal')
+        env['TOOLTOOL_CACHE'] = '/home/worker/tooltool-cache'
+        env['TOOLTOOL_REPO'] = 'https://github.com/mozilla/build-tooltool'
+        env['TOOLTOOL_REV'] = 'master'
+
+    if run.get('secrets'):
+        worker['taskcluster-proxy'] = True
+        for sec in run['secrets']:
+            taskdesc['scopes'].append(SECRET_SCOPE.format(
+                job['treeherder']['kind'], config.params['level'], sec))
+
+    worker['command'] = ["/bin/bash", "bin/build.sh"]
+
+
+@run_job_using("generic-worker", "mozharness", schema=mozharness_run_schema)
+def mozharness_on_generic_worker_setup(config, job, taskdesc):
+    run = job['run']
+
+    # TODO: fail if invalid run options are included
+
+    worker = taskdesc['worker']
+
+    worker['artifacts'] = [{
+        'path': r'public\build',
+        'type': 'directory',
+    }]
+
+    env = worker['env']
+    env.update({
+        'GECKO_BASE_REPOSITORY': config.params['base_repository'],
+        'GECKO_HEAD_REF': config.params['head_rev'],
+        'GECKO_HEAD_REPOSITORY': config.params['head_repository'],
+        'GECKO_HEAD_REV': config.params['head_rev'],
+        'MOZ_BUILD_DATE': time.strftime("%Y%m%d%H%M%S", time.gmtime(config.params['pushdate'])),
+        'MOZ_SCM_LEVEL': config.params['level'],
+        'TOOLTOOL_REPO': 'https://github.com/mozilla/build-tooltool',
+        'TOOLTOOL_REV': 'master',
+    })
+
+    mh_command = [r"c:\mozilla-build\python\python.exe"]
+    mh_command.append(r".\build\src\testing\mozharness\scripts\fx_desktop_build.py")
+    for cfg in run['config']:
+        mh_command.append(r"--config " + cfg.replace('/', '\\'))
+    mh_command.append(r"--branch " + config.params['project'])
+    mh_command.append(r"--skip-buildbot-actions --work-dir %cd:Z:=z:%\build")
+    worker['command'] = [
+        r"mkdir .\build\src",
+        r"hg share c:\builds\hg-shared\mozilla-central .\build\src",
+        r"hg pull -u -R .\build\src --rev %GECKO_HEAD_REV% %GECKO_HEAD_REPOSITORY%",
+        " ".join(mh_command),
+    ]
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/mulet.py
@@ -0,0 +1,150 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Support for running mulet tasks via build-mulet-linux.sh
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import time
+from voluptuous import Schema, Required
+
+from taskgraph.jobs.base import run_job_using
+from taskgraph.jobs.common import add_workspace_cache
+
+COALESCE_KEY = 'builds.{project}.{name}'
+
+build_mulet_linux_schema = Schema({
+    Required('using'): 'mach-via-build-mulet-linux.sh',
+
+    # The pathname of the mozconfig to use
+    Required('mozconfig'): basestring,
+
+    # The tooltool manifest to use
+    Required('tooltool-manifest'): basestring,
+})
+
+
+@run_job_using("docker-worker", "mach-via-build-mulet-linux.sh", schema=build_mulet_linux_schema)
+def docker_worker_make_via_build_mulet_linux_sh(config, job, taskdesc):
+    run = job['run']
+    worker = taskdesc.get('worker')
+
+    # assumes desktop-build (which contains the gecko checkout command)
+    taskdesc['dependencies']['docker-image'] = 'build-docker-image-desktop-build'
+
+    worker['taskcluster-proxy'] = False
+
+    worker['artifacts'] = [{
+        'name': 'public/build',
+        'path': '/home/worker/artifacts/',
+        'type': 'directory',
+    }]
+
+    # TODO: factor out common code here
+
+    worker['caches'] = [{
+        'type': 'persistent',
+        'name': 'level-{}-{}-tc-vcs'.format(
+            config.params['level'], config.params['project']),
+        'mount-point': "/home/worker/.tc-vcs",
+    }]
+
+    add_workspace_cache(config, taskdesc, worker)
+
+    env = worker.setdefault('env', {})
+    env.update({
+        'GECKO_BASE_REPOSITORY': config.params['base_repository'],
+        'GECKO_HEAD_REF': config.params['head_rev'],
+        'GECKO_HEAD_REPOSITORY': config.params['head_repository'],
+        'GECKO_HEAD_REV': config.params['head_rev'],
+        'MOZ_BUILD_DATE': time.strftime("%Y%m%d%H%M%S", time.gmtime(config.params['pushdate'])),
+        'MOZ_SCM_LEVEL': config.params['level'],
+    })
+
+    env['MOZCONFIG'] = run['mozconfig']
+    env['TOOLTOOL_MANIFEST'] = run['tooltool-manifest']
+
+    # coalesce / superseding
+    if 'coalesce-name' in job and int(config.params['level']) > 1:
+        key = COALESCE_KEY.format(
+            project=config.params['project'],
+            name=job['coalesce-name'])
+        worker["superseder-url"] = "https://coalesce.mozilla-releng.net/v1/list/" + key
+
+    # tooltool downloads
+    worker['relengapi-proxy'] = True
+    worker['caches'].append({
+        'type': 'persistent',
+        'name': 'tooltool-cache',
+        # N.B. different from build.sh
+        # TODO: grepping suggests this isn't used..
+        'mount-point': '/home/worker/tools/tooltool-cache',
+    })
+    taskdesc['scopes'].extend([
+        'docker-worker:relengapi-proxy:tooltool.download.public',
+    ])
+    env['TOOLTOOL_REPO'] = 'https://github.com/mozilla/build-tooltool'
+    env['TOOLTOOL_REV'] = 'master'
+
+    worker['command'] = [
+        "/bin/bash",
+        "-c",
+        "checkout-gecko workspace"
+        " && cd ./workspace/gecko/taskcluster/scripts/builder"
+        " && buildbot_step 'Build' ./build-mulet-linux.sh $HOME/workspace",
+    ]
+
+mulet_simulator_schema = Schema({
+    Required('using'): 'mulet-simulator',
+
+    # The shell command to run with `bash -exc`.  This will have parameters
+    # substituted for {..} and will be enclosed in a {task-reference: ..} block
+    # so it can refer to the parent task as <build>
+    Required('shell-command'): basestring,
+})
+
+
+@run_job_using("docker-worker", "mulet-simulator", schema=mulet_simulator_schema)
+def docker_worker_mulet_simulator(config, job, taskdesc):
+    run = job['run']
+    worker = taskdesc.get('worker')
+
+    # assumes desktop-build (which contains the gecko checkout command)
+    taskdesc['dependencies']['docker-image'] = 'build-docker-image-desktop-build'
+
+    worker['taskcluster-proxy'] = False
+
+    worker['artifacts'] = [{
+        'name': 'public/build',
+        'path': '/home/worker/artifacts/',
+        'type': 'directory',
+    }]
+
+    # TODO: factor out common code here
+
+    env = worker.setdefault('env', {})
+    env.update({
+        'GECKO_BASE_REPOSITORY': config.params['base_repository'],
+        'GECKO_HEAD_REF': config.params['head_rev'],
+        'GECKO_HEAD_REPOSITORY': config.params['head_repository'],
+        'GECKO_HEAD_REV': config.params['head_rev'],
+    })
+
+    taskdesc.setdefault('routes', []).extend([
+        'index.gecko.v1.{project}.latest.simulator.opt'.format(**config.params),
+    ])
+
+    # TODO(taskdiff): has the scope for this cache, but not the cache
+    taskdesc.setdefault('scopes', []).extend([
+        'docker-worker:cache:level-{level}-{project}-tc-vcs'.format(**config.params),
+    ])
+
+    shell_command = run['shell-command'].format(**config.params)
+
+    worker['command'] = [
+        "/bin/bash",
+        "-exc",
+        {'task-reference': shell_command},
+    ]
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/run_task.py
@@ -0,0 +1,66 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Support for running jobs that are invoked via the `run-task` script.
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import copy
+
+from taskgraph.jobs.base import run_job_using
+from voluptuous import Schema, Required, Any
+
+run_task_schema = Schema({
+    Required('using'): 'run-task',
+
+    # if true, add a cache at ~worker/.cache, which is where things like pip
+    # tend to hide their caches.  This cache is never added for level-1 jobs.
+    Required('cache-dotcache', default=False): bool,
+
+    # The command arguments to pass to the `run-task` script, after the
+    # checkout arguments.  If a list, it will be passed directly; otherwise
+    # it will be included in a single argument to `bash -cx`.
+    Required('command'): Any([basestring], basestring),
+})
+
+
+@run_job_using("docker-worker", "run-task", schema=run_task_schema)
+def docker_worker_run_task(config, job, taskdesc):
+    run = job['run']
+
+    worker = taskdesc['worker'] = copy.deepcopy(job['worker'])
+
+    worker['caches'] = [{
+        'type': 'persistent',
+        'name': 'level-{}-hg-shared'.format(config.params['level']),
+        'mount-point': "/home/worker/hg-shared",
+    }, {
+        'type': 'persistent',
+        'name': 'level-{}-checkouts'.format(config.params['level']),
+        'mount-point': "/home/worker/checkouts",
+    }]
+
+    if run.get('cache-dotcache') and int(config.params['level']) > 1:
+        worker['caches'].append({
+            'type': 'persistent',
+            'name': 'level-{level}-{project}-dotcache'.format(**config.params),
+            'mount-point': '/home/worker/.cache',
+        })
+
+    env = worker['env'] = {}
+    env.update({
+        'GECKO_BASE_REPOSITORY': config.params['base_repository'],
+        'GECKO_HEAD_REPOSITORY': config.params['head_repository'],
+        'GECKO_HEAD_REV': config.params['head_rev'],
+    })
+
+    run_command = run['command']
+    if isinstance(run_command, basestring):
+        run_command = ['bash', '-cx', run_command]
+    worker['command'] = [
+        "/home/worker/bin/run-task",
+        "--vcs-checkout=/home/worker/checkouts/gecko",
+        "--",
+    ] + run_command
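The command normalization in the run-task support above can be sketched as follows. `normalize` is an illustrative helper name, not part of taskgraph, and `str` stands in for the Python 2 `basestring` used in the original:

```python
# A plain string command is wrapped in `bash -cx`; a list is passed
# through unchanged.  Either way it is appended after the run-task
# checkout arguments and the `--` separator.

def normalize(run_command):
    if isinstance(run_command, str):
        run_command = ['bash', '-cx', run_command]
    return [
        '/home/worker/bin/run-task',
        '--vcs-checkout=/home/worker/checkouts/gecko',
        '--',
    ] + run_command

assert normalize('./mach build')[3:] == ['bash', '-cx', './mach build']
assert normalize(['./mach', 'build'])[3:] == ['./mach', 'build']
```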
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/spidermonkey.py
@@ -0,0 +1,99 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Support for running spidermonkey jobs via dedicated scripts
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import time
+from voluptuous import Schema, Required, Optional, Any
+
+from taskgraph.jobs.base import run_job_using
+
+sm_run_schema = Schema({
+    Required('using'): Any('spidermonkey', 'spidermonkey-package'),
+
+    # The SPIDERMONKEY_VARIANT
+    Required('spidermonkey-variant'): basestring,
+
+    # The tooltool manifest to use; default from sm-tooltool-config.sh is used
+    # if omitted
+    Optional('tooltool-manifest'): basestring,
+})
+
+
+@run_job_using("docker-worker", "spidermonkey")
+@run_job_using("docker-worker", "spidermonkey-package")
+def docker_worker_spidermonkey(config, job, taskdesc, schema=sm_run_schema):
+    run = job['run']
+
+    worker = taskdesc['worker']
+    worker['artifacts'] = []
+    worker['caches'] = []
+
+    worker['artifacts'].append({
+        'name': 'public/build',
+        'path': '/home/worker/artifacts/',
+        'type': 'directory',
+    })
+
+    worker['caches'].append({
+        'type': 'persistent',
+        'name': 'level-{}-{}-tc-vcs'.format(
+            config.params['level'], config.params['project']),
+        'mount-point': "/home/worker/.tc-vcs",
+    })
+
+    if int(config.params['level']) > 1:
+        worker['caches'].append({
+            'type': 'persistent',
+            'name': 'level-{}-{}-build-spidermonkey-workspace'.format(
+                config.params['level'], config.params['project']),
+            'mount-point': "/home/worker/workspace",
+        })
+
+    # include secrets access
+    # TODO (taskdiff): this is unused? remove it
+    worker['taskcluster-proxy'] = True
+    taskdesc['scopes'].append(
+        'secrets:get:project/releng/gecko/build/level-{}/*'.format(config.params['level']))
+
+    env = worker['env']
+    env.update({
+        'GECKO_BASE_REPOSITORY': config.params['base_repository'],
+        'GECKO_HEAD_REF': config.params['head_rev'],
+        'GECKO_HEAD_REPOSITORY': config.params['head_repository'],
+        'GECKO_HEAD_REV': config.params['head_rev'],
+        'MOZHARNESS_DISABLE': 'true',
+        'TOOLS_DISABLE': 'true',
+        'SPIDERMONKEY_VARIANT': run['spidermonkey-variant'],
+        'MOZ_BUILD_DATE': time.strftime("%Y%m%d%H%M%S", time.gmtime(config.params['pushdate'])),
+        'MOZ_SCM_LEVEL': config.params['level'],
+    })
+
+    # tooltool downloads; note that this script downloads using the API
+    # endpoint directly, rather than via relengapi-proxy
+    worker['caches'].append({
+        'type': 'persistent',
+        'name': 'tooltool-cache',
+        'mount-point': '/home/worker/tooltool-cache',
+    })
+    env['TOOLTOOL_CACHE'] = '/home/worker/tooltool-cache'
+    env['TOOLTOOL_REPO'] = 'https://github.com/mozilla/build-tooltool'
+    env['TOOLTOOL_REV'] = 'master'
+    if run.get('tooltool-manifest'):
+        env['TOOLTOOL_MANIFEST'] = run['tooltool-manifest']
+
+    script = "build-sm.sh"
+    if run['using'] == 'spidermonkey-package':
+        script = "build-sm-package.sh"
+
+    worker['command'] = [
+        "/bin/bash",
+        "-c",
+        "cd /home/worker/ "
+        "&& ./bin/checkout-sources.sh "
+        "&& ./workspace/build/src/taskcluster/scripts/builder/" + script
+    ]
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/jobs/toolchain.py
@@ -0,0 +1,70 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Support for running toolchain-building jobs via dedicated scripts
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import time
+from voluptuous import Schema, Required
+
+from taskgraph.jobs.base import run_job_using
+
+toolchain_run_schema = Schema({
+    Required('using'): 'toolchain-script',
+
+    # the script (in taskcluster/scripts/misc) to run
+    Required('script'): basestring,
+})
+
+
+@run_job_using("docker-worker", "toolchain-script", schema=toolchain_run_schema)
+def docker_worker_toolchain(config, job, taskdesc):
+    run = job['run']
+
+    worker = taskdesc['worker']
+    worker['artifacts'] = []
+    worker['caches'] = []
+
+    worker['artifacts'].append({
+        'name': 'public',
+        'path': '/home/worker/workspace/artifacts/',
+        'type': 'directory',
+    })
+    worker['caches'].append({
+        'type': 'persistent',
+        'name': 'level-{}-{}-tc-vcs'.format(
+            config.params['level'], config.params['project']),
+        'mount-point': "/home/worker/.tc-vcs",
+    })
+
+    env = worker['env']
+    env.update({
+        'GECKO_BASE_REPOSITORY': config.params['base_repository'],
+        'GECKO_HEAD_REF': config.params['head_rev'],
+        'GECKO_HEAD_REPOSITORY': config.params['head_repository'],
+        'GECKO_HEAD_REV': config.params['head_rev'],
+        'MOZ_BUILD_DATE': time.strftime("%Y%m%d%H%M%S", time.gmtime(config.params['pushdate'])),
+        'MOZ_SCM_LEVEL': config.params['level'],
+        'TOOLS_DISABLE': 'true',
+    })
+
+    # tooltool downloads; note that this downloads using the API endpoint directly,
+    # rather than via relengapi-proxy
+    worker['caches'].append({
+        'type': 'persistent',
+        'name': 'tooltool-cache',
+        'mount-point': '/home/worker/tooltool-cache',
+    })
+    env['TOOLTOOL_CACHE'] = '/home/worker/tooltool-cache'
+    env['TOOLTOOL_REPO'] = 'https://github.com/mozilla/build-tooltool'
+    env['TOOLTOOL_REV'] = 'master'
+
+    command = ' && '.join([
+        "cd /home/worker/",
+        "./bin/checkout-sources.sh",
+        "./workspace/build/src/taskcluster/scripts/misc/" + run['script'],
+    ])
+    worker['command'] = ["/bin/bash", "-c", command]
--- a/taskcluster/taskgraph/target_tasks.py
+++ b/taskcluster/taskgraph/target_tasks.py
@@ -1,22 +1,27 @@
 # -*- coding: utf-8 -*-
 
 # This Source Code Form is subject to the terms of the Mozilla Public
 # License, v. 2.0. If a copy of the MPL was not distributed with this
 # file, You can obtain one at http://mozilla.org/MPL/2.0/.
 
 from __future__ import absolute_import, print_function, unicode_literals
 from taskgraph import try_option_syntax
-from taskgraph.util.attributes import attrmatch
+
+INTEGRATION_PROJECTS = set([
+    'mozilla-inbound',
+    'autoland',
+])
 
-BUILD_AND_TEST_KINDS = set([
-    'legacy',  # builds
-    'desktop-test',
-    'android-test',
+RELEASE_PROJECTS = set([
+    'mozilla-central',
+    'mozilla-aurora',
+    'mozilla-beta',
+    'mozilla-release',
 ])
 
 _target_task_methods = {}
 
 
 def _target_task(name):
     def wrap(func):
         _target_task_methods[name] = func
@@ -50,37 +55,47 @@ def target_tasks_try_option_syntax(full_
         for l in target_tasks_labels:
             task = full_task_graph[l]
             if 'unittest_suite' in task.attributes:
                 task.attributes['task_duplicates'] = options.trigger_tests
 
     return target_tasks_labels
 
 
-@_target_task('all_builds_and_tests')
-def target_tasks_all_builds_and_tests(full_task_graph, parameters):
-    """Trivially target all build and test tasks.  This is used for
-    branches where we want to build "everyting", but "everything"
-    does not include uninteresting things like docker images"""
+@_target_task('all_builds_and_tests')  # TODO: remove (old name)
+@_target_task('default')
+def target_tasks_default(full_task_graph, parameters):
+    """Target the tasks which have indicated they should be run on this project
+    via the `run_on_projects` attributes."""
     def filter(task):
-        return t.attributes.get('kind') in BUILD_AND_TEST_KINDS
+        run_on_projects = set(task.attributes.get('run_on_projects', []))
+        if 'all' in run_on_projects:
+            return True
+        project = parameters['project']
+        if 'integration' in run_on_projects:
+            if project in INTEGRATION_PROJECTS:
+                return True
+        if 'release' in run_on_projects:
+            if project in RELEASE_PROJECTS:
+                return True
+        return project in run_on_projects
     return [l for l, t in full_task_graph.tasks.iteritems() if filter(t)]
 
-
 @_target_task('ash_tasks')
-def target_tasks_ash_tasks(full_task_graph, parameters):
-    """Special case for builds on ash."""
+def target_tasks_ash(full_task_graph, parameters):
+    """Target tasks that only run on the ash branch."""
     def filter(task):
-        # NOTE: on the ash branch, update taskcluster/ci/desktop-test/tests.yml to
-        # run the M-dt-e10s tasks
-        attrs = t.attributes
-        if attrs.get('kind') not in BUILD_AND_TEST_KINDS:
+        platform = task.attributes.get('build_platform')
+        # only selected linux64 platforms
+        if platform not in ('linux64', 'linux64-asan', 'linux64-pgo'):
             return False
-        if not attrmatch(attrs, build_platform=set([
-            'linux64',
-            'linux64-asan',
-            'linux64-pgo',
-        ])):
+        # and none of this linux64-asan/debug stuff
+        if platform == 'linux64-asan' and task.attributes['build_type'] == 'debug':
             return False
-        if not attrmatch(attrs, e10s=True):
+        # no non-e10s tests
+        if task.attributes.get('unittest_suite') or task.attributes.get('talos_suite'):
+            if not task.attributes.get('e10s'):
+                return False
+        # don't upload symbols
+        if task.attributes['kind'] == 'upload-symbols':
             return False
         return True
     return [l for l, t in full_task_graph.tasks.iteritems() if filter(t)]
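The `run_on_projects` dispatch in `target_tasks_default` above can be sketched standalone (Python 3; the two project sets here are illustrative stand-ins for the `INTEGRATION_PROJECTS`/`RELEASE_PROJECTS` constants defined at the top of the real module):

```python
# Hypothetical stand-ins for the INTEGRATION_PROJECTS / RELEASE_PROJECTS
# constants in target_tasks.py.
INTEGRATION_PROJECTS = {'mozilla-inbound', 'autoland'}
RELEASE_PROJECTS = {'mozilla-central', 'mozilla-beta', 'mozilla-release'}


def should_run(run_on_projects, project):
    """Return True if a task carrying the given run_on_projects attribute
    should be targeted on `project`."""
    run_on_projects = set(run_on_projects)
    if 'all' in run_on_projects:
        return True
    if 'integration' in run_on_projects and project in INTEGRATION_PROJECTS:
        return True
    if 'release' in run_on_projects and project in RELEASE_PROJECTS:
        return True
    # otherwise the project must be listed explicitly
    return project in run_on_projects
```

This is what lets a task say `run-on-projects: [integration]` instead of the target-task methods enumerating task labels per branch.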
--- a/taskcluster/taskgraph/task/legacy.py
+++ b/taskcluster/taskgraph/task/legacy.py
@@ -517,17 +517,21 @@ class LegacyTask(base.Task):
                                             build_parameters['pushlog_id'])
             if not build.get('is_job'):
                 decorate_task_json_routes(build_task['task'],
                                           json_routes,
                                           build_parameters)
 
             # Ensure each build graph is valid after construction.
             validate_build_task(build_task)
-            attributes = build_task['attributes'] = {'kind': 'legacy', 'legacy_kind': 'build'}
+            attributes = build_task['attributes'] = {
+                'kind': 'legacy',
+                'legacy_kind': 'build',
+                'run_on_projects': ['all'],
+            }
             if 'build_name' in build:
                 attributes['build_platform'] = build['build_name']
             if 'build_type' in task_extra:
                 attributes['build_type'] = {'dbg': 'debug'}.get(task_extra['build_type'],
                                                                 task_extra['build_type'])
             if build.get('is_job'):
                 attributes['job'] = build['build_name']
                 attributes['legacy_kind'] = 'job'
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/task/post_build.py
@@ -0,0 +1,62 @@
+# This Source Code Form is subject to the terms of the Mozilla Public License,
+# v. 2.0. If a copy of the MPL was not distributed with this file, You can
+# obtain one at http://mozilla.org/MPL/2.0/.
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import copy
+import logging
+import os
+import yaml
+
+from . import transform
+
+logger = logging.getLogger(__name__)
+
+
+class PostBuildTask(transform.TransformTask):
+    """
+    A task implementing post-build jobs.  These depend on build tasks and
+    perform various follow-up tasks after a build has completed.
+
+    The `only-for-build-platforms` kind configuration, if specified, will limit
+    the build platforms for which a post-build task will be created.
+
+    The `job-template` kind configuration points to a YAML file which will
+    be used to create the input to the transforms.  It will have added to it
+    keys `build-label`, the label for the build task, and `build-platform`, its
+    platform.
+    """
+
+    @classmethod
+    def get_inputs(cls, kind, path, config, params, loaded_tasks):
+        if config.get('kind-dependencies', []) != ["build"]:
+            raise Exception("PostBuildTask kinds must depend on builds")
+
+        only_platforms = config.get('only-for-build-platforms')
+        prototype = load_yaml(path, config.get('job-template'))
+
+        for task in loaded_tasks:
+            if task.kind != 'build':
+                continue
+
+            build_platform = task.attributes.get('build_platform')
+            build_type = task.attributes.get('build_type')
+            if not build_platform or not build_type:
+                continue
+            platform = "{}/{}".format(build_platform, build_type)
+            if only_platforms and platform not in only_platforms:
+                continue
+
+            post_task = copy.deepcopy(prototype)
+            post_task['build-label'] = task.label
+            post_task['build-platform'] = platform
+            yield post_task
+
+
+# TODO: move this somewhere useful; it appears in other kind impls too
+def load_yaml(path, name):
+    """Convenience method to load a YAML file in the kind directory"""
+    filename = os.path.join(path, name)
+    with open(filename, "rb") as f:
+        return yaml.load(f)
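The core of `PostBuildTask.get_inputs` above — fan a job template out across the matching builds — can be sketched as follows (Python 3; the `(label, platform, type)` tuples are a simplification of the real loaded task objects):

```python
import copy


def post_build_inputs(prototype, builds, only_platforms=None):
    """For each build (label, build_platform, build_type), yield a deep copy
    of the job-template prototype annotated with build-label and
    build-platform, honoring only-for-build-platforms."""
    for label, build_platform, build_type in builds:
        platform = "{}/{}".format(build_platform, build_type)
        if only_platforms and platform not in only_platforms:
            continue
        post_task = copy.deepcopy(prototype)
        post_task['build-label'] = label
        post_task['build-platform'] = platform
        yield post_task
```

The deep copy matters: each post-build job mutates its own copy of the template, so builds can't leak state into one another.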
--- a/taskcluster/taskgraph/task/test.py
+++ b/taskcluster/taskgraph/task/test.py
@@ -58,19 +58,16 @@ class TestTask(transform.TransformTask):
     def get_builds_by_platform(cls, dep_kind, loaded_tasks):
         """Find the build tasks on which tests will depend, keyed by
         platform/type.  Returns a dictionary mapping build platform to task
         label."""
         builds_by_platform = {}
         for task in loaded_tasks:
             if task.kind != dep_kind:
                 continue
-            # remove this check when builds are no longer legacy
-            if task.attributes['legacy_kind'] != 'build':
-                continue
 
             build_platform = task.attributes.get('build_platform')
             build_type = task.attributes.get('build_type')
             if not build_platform or not build_type:
                 continue
             platform = "{}/{}".format(build_platform, build_type)
             if platform in builds_by_platform:
                 raise Exception("multiple build jobs for " + platform)
@@ -111,13 +108,14 @@ class TestTask(transform.TransformTask):
                     "Test set '{}' for test platform {} is not defined".format(
                         test_set, test_platform))
             test_names = test_sets_cfg[test_set]
             rv[test_platform] = cfg.copy()
             rv[test_platform]['test-names'] = test_names
         return rv
 
 
+# TODO: move this somewhere useful; it appears in other kind impls too
 def load_yaml(path, name):
     """Convenience method to load a YAML file in the kind directory"""
     filename = os.path.join(path, name)
     with open(filename, "rb") as f:
         return yaml.load(f)
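With the legacy-kind check removed, `get_builds_by_platform` reduces to the mapping step below (Python 3 sketch over simplified `(label, platform, type)` tuples):

```python
def builds_by_platform(builds):
    """Key build labels by 'platform/type', rejecting duplicate builds for
    the same platform, as get_builds_by_platform does above."""
    by_platform = {}
    for label, build_platform, build_type in builds:
        platform = "{}/{}".format(build_platform, build_type)
        if platform in by_platform:
            raise Exception("multiple build jobs for " + platform)
        by_platform[platform] = label
    return by_platform
```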
--- a/taskcluster/taskgraph/task/transform.py
+++ b/taskcluster/taskgraph/task/transform.py
@@ -2,51 +2,59 @@
 # License, v. 2.0. If a copy of the MPL was not distributed with this
 # file, You can obtain one at http://mozilla.org/MPL/2.0/.
 
 from __future__ import absolute_import, print_function, unicode_literals
 
 import logging
 import os
 import yaml
+import itertools
 
 from . import base
+from .. import files_changed
 from ..util.python_path import find_object
+from ..util.templates import merge
 from ..transforms.base import TransformSequence, TransformConfig
 
 logger = logging.getLogger(__name__)
 
 
 class TransformTask(base.Task):
     """
     Tasks of this class are generated by applying transformations to a sequence
     of input entities.  By default, it gets those inputs from YAML data in the
-    kind directory, but subclasses may override `get_inputs` to produce them
-    in some other way.
+    kind directory, but subclasses may override `get_inputs` to produce them in
+    some other way.
     """
 
     @classmethod
     def get_inputs(cls, kind, path, config, params, loaded_tasks):
         """
         Get the input elements that will be transformed into tasks.  The
         elements themselves are free-form, and become the input to the first
         transform.
 
         By default, this reads jobs from the `jobs` key, or from yaml files
-        named by `jobs-from`, but can be overridden in subclasses.  The
-        entities are read from mappings, and the keys to those mappings are
-        added in the `name` key of each entity.
+        named by `jobs-from`.  The entities are read from mappings, and the
+        keys to those mappings are added in the `name` key of each entity.
+
+        This method can be overridden in subclasses that need to perform more
+        complex calculations to generate the list of inputs.
         """
         def jobs():
-            for name, job in config.get('jobs', {}).iteritems():
+            defaults = config.get('job-defaults')
+            jobs = config.get('jobs', {}).iteritems()
+            jobs_from = itertools.chain.from_iterable(
+                load_yaml(path, filename).iteritems()
+                for filename in config.get('jobs-from', {}))
+            for name, job in itertools.chain(jobs, jobs_from):
+                if defaults:
+                    job = merge(defaults, job)
                 yield name, job
-            for filename in config.get('jobs-from', {}):
-                jobs = load_yaml(path, filename)
-                for name, job in jobs.iteritems():
-                    yield name, job
 
         for name, job in jobs():
             job['name'] = name
             logger.debug("Generating tasks for {} {}".format(kind, name))
             yield job
 
     @classmethod
     def load_tasks(cls, kind, path, config, params, loaded_tasks):
@@ -54,28 +62,36 @@ class TransformTask(base.Task):
 
         transforms = TransformSequence()
         for xform_path in config['transforms']:
             transform = find_object(xform_path)
             transforms.add(transform)
 
         # perform the transformations
         trans_config = TransformConfig(kind, path, config, params)
-        tasks = [cls(kind, t) for t in transforms(trans_config, inputs)]
+        tasks = [cls(kind, t, params) for t in transforms(trans_config, inputs)]
         return tasks
 
-    def __init__(self, kind, task):
+    def __init__(self, kind, task, params):
         self.dependencies = task['dependencies']
+        self.optimizations = task['optimizations']
+        # TODO: this will probably make JSON/un-JSON not work
+        self.params = params
         super(TransformTask, self).__init__(kind, task['label'],
                                             task['attributes'], task['task'])
 
     def get_dependencies(self, taskgraph):
         return [(label, name) for name, label in self.dependencies.items()]
 
     def optimize(self):
+        if 'only-if-files-changed' in self.optimizations:
+            changed = files_changed.check(
+                    self.params, self.optimizations['only-if-files-changed'])
+            if not changed:
+                return True, None
         return False, None
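The `only-if-files-changed` optimization amounts to: drop the task when no changed file matches any pattern. A minimal sketch, using `fnmatch` as a stand-in for the mozpack match function that `files_changed.check` actually uses (mozpack patterns are richer, e.g. `**`):

```python
import fnmatch


def should_optimize_away(changed_files, patterns):
    """Return True if the task can be optimized away, i.e. none of the
    changed files match any of the only-if-files-changed patterns."""
    changed = any(
        fnmatch.fnmatch(path, pattern)
        for path in changed_files
        for pattern in patterns)
    return not changed
```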
 
 
 def load_yaml(path, name):
     """Convenience method to load a YAML file in the kind directory"""
     filename = os.path.join(path, name)
     with open(filename, "rb") as f:
         return yaml.load(f)
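The `job-defaults` handling added to `get_inputs` above can be sketched standalone (Python 3). The nested `merge` here is a simplified stand-in for `taskgraph.util.templates.merge`, assumed to be a recursive dict merge preferring the second argument:

```python
import itertools


def merge(defaults, job):
    """Recursively merge two dicts, values from `job` winning (a simplified
    stand-in for taskgraph.util.templates.merge)."""
    out = dict(defaults)
    for key, value in job.items():
        if isinstance(value, dict) and isinstance(out.get(key), dict):
            out[key] = merge(out[key], value)
        else:
            out[key] = value
    return out


def iter_jobs(config, load_yaml):
    """Chain inline `jobs` with entries from `jobs-from` files, applying
    `job-defaults` and recording each mapping key as the job's name."""
    defaults = config.get('job-defaults')
    inline = config.get('jobs', {}).items()
    from_files = itertools.chain.from_iterable(
        load_yaml(filename).items() for filename in config.get('jobs-from', []))
    for name, job in itertools.chain(inline, from_files):
        if defaults:
            job = merge(defaults, job)
        job['name'] = name
        yield job
```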
--- a/taskcluster/taskgraph/transforms/base.py
+++ b/taskcluster/taskgraph/transforms/base.py
@@ -1,15 +1,16 @@
 # This Source Code Form is subject to the terms of the Mozilla Public
 # License, v. 2.0. If a copy of the MPL was not distributed with this
 # file, You can obtain one at http://mozilla.org/MPL/2.0/.
 
 from __future__ import absolute_import, print_function, unicode_literals
 
 import re
+import pprint
 import voluptuous
 
 
 class TransformConfig(object):
     """A container for configuration affecting transforms.  The `config`
     argument to transforms is an instance of this class, possibly with
     additional kind-specific attributes beyond those set here."""
     def __init__(self, kind, path, config, params):
@@ -64,17 +65,17 @@ def validate_schema(schema, obj, msg_pre
     beginning with msg_prefix.
     """
     try:
         return schema(obj)
     except voluptuous.MultipleInvalid as exc:
         msg = [msg_prefix]
         for error in exc.errors:
             msg.append(str(error))
-        raise Exception('\n'.join(msg))
+        raise Exception('\n'.join(msg) + '\n' + pprint.pformat(obj))
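The effect of this change — append a pretty-printed dump of the offending object to the collected schema errors — can be shown without a voluptuous dependency; the check functions below stand in for a schema:

```python
import pprint


def validate_obj(checks, obj, msg_prefix):
    """Collect all failing checks, then raise one exception whose message
    starts with msg_prefix and ends with a pformat dump of the object,
    mirroring the error-reporting pattern in validate_schema above."""
    errors = [message for check, message in checks if not check(obj)]
    if errors:
        raise Exception('\n'.join([msg_prefix] + errors) +
                        '\n' + pprint.pformat(obj))
    return obj
```

Having the whole object in the error message is what makes multi-transform pipelines debuggable: the failing input is visible, not just the schema complaint.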
 
 
 def get_keyed_by(item, field, item_name, subfield=None):
     """
     For values which can either accept a literal value, or be keyed by some
     other attribute of the item, perform that lookup.  For example, this supports
 
         chunks:
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/transforms/build.py
@@ -0,0 +1,46 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+
+Apply some defaults and minor modifications to the jobs defined in the build
+kind.
+
+"""
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+from taskgraph.transforms.base import TransformSequence
+
+transforms = TransformSequence()
+
+
+@transforms.add
+def set_defaults(config, jobs):
+    """Set defaults, including those that differ per worker implementation"""
+    for job in jobs:
+        job['treeherder'].setdefault('kind', 'build')
+        job['treeherder'].setdefault('tier', 1)
+        if job['worker']['implementation'] in ('docker-worker', 'docker-engine'):
+            job['worker'].setdefault('docker-image', {'in-tree': 'desktop-build'})
+        yield job
+
+
+@transforms.add
+def set_build_attributes(config, jobs):
+    """Set the build_platform and build_type attributes based on the job name"""
+    for job in jobs:
+        build_platform, build_type = job['name'].split('/')
+
+        # pgo builds are represented as a different platform, type opt
+        if build_type == 'pgo':
+            build_platform = build_platform + '-pgo'
+            build_type = 'opt'
+
+        attributes = job.setdefault('attributes', {})
+        attributes.update({
+            'build_platform': build_platform,
+            'build_type': build_type,
+        })
+
+        yield job
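The name-parsing in `set_build_attributes` can be sketched as a pure function (Python 3):

```python
def build_attributes(job_name):
    """Derive build_platform/build_type attributes from a job name such as
    'linux64/pgo', treating pgo as a separate platform with type 'opt'."""
    build_platform, build_type = job_name.split('/')
    if build_type == 'pgo':
        build_platform += '-pgo'
        build_type = 'opt'
    return {'build_platform': build_platform, 'build_type': build_type}
```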
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/transforms/job.py
@@ -0,0 +1,130 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Convert a job description into a task description.
+
+Job descriptions are similar to task descriptions, but they specify how to run
+the job at a higher level, using a "run" field that can be interpreted by
+plugins in taskcluster/taskgraph/jobs.
+
+"""
+
+# TODO: more docstring
+
+from __future__ import absolute_import, print_function, unicode_literals
+
+import copy
+import logging
+
+from taskgraph.transforms.base import validate_schema, TransformSequence
+from taskgraph.transforms.task import task_description_schema
+from taskgraph.jobs.base import configure_taskdesc_for_run, import_all
+from voluptuous import (
+    Optional,
+    Required,
+    Schema,
+    Extra,
+)
+
+SECRET_SCOPE = 'secrets:get:project/releng/gecko/build/level-{}/{}'
+COALESCE_KEY = 'builds.{project}.{name}'
+
+logger = logging.getLogger(__name__)
+
+# Voluptuous uses marker objects as dictionary *keys*, but they are not
+# comparable, so we cast all of the keys back to regular strings
+task_description_schema = {str(k): v for k, v in task_description_schema.schema.iteritems()}
+
+# Schema for a job description
+job_description_schema = Schema({
+    # The name of the job and the job's label.  At least one must be specified,
+    # and the label will be generated from the name if necessary, by prepending
+    # the kind.
+    Optional('name'): basestring,
+    Optional('label'): basestring,
+
+    # the following fields are passed directly through to the task description,
+    # possibly modified by the run implementation.  See
+    # taskcluster/taskgraph/transforms/task.py for the schema details.
+    Required('description'): task_description_schema['description'],
+    Optional('attributes'): task_description_schema['attributes'],
+    Optional('dependencies'): task_description_schema['dependencies'],
+    Optional('expires-after'): task_description_schema['expires-after'],
+    Optional('routes'): task_description_schema['routes'],
+    Optional('scopes'): task_description_schema['scopes'],
+    Optional('extra'): task_description_schema['extra'],
+    Optional('treeherder'): task_description_schema['treeherder'],
+    Optional('index'): task_description_schema['index'],
+    Optional('run-on-projects'): task_description_schema['run-on-projects'],
+    Optional('worker-type'): task_description_schema['worker-type'],
+    Required('worker'): task_description_schema['worker'],
+    Optional('optimizations'): task_description_schema['optimizations'],
+
+    # If the job can be coalesced, this is the name used in the coalesce key;
+    # the project, etc. will be added automatically.  Note that try (level 1)
+    # jobs are never coalesced.
+    Optional('coalesce-name'): basestring,
+
+    # A description of how to run this job.
+    'run': {
+        # The key to a job implementation in taskcluster/taskgraph/jobs
+        'using': basestring,
+
+        # Any remaining content is verified against that job implementation's
+        # own schema.
+        Extra: object,
+    },
+})
+
+transforms = TransformSequence()
+
+
+@transforms.add
+def validate(config, jobs):
+    for job in jobs:
+        yield validate_schema(job_description_schema, job,
+                              "In job {!r}:".format(job['name']))
+
+
+@transforms.add
+def make_task_description(config, jobs):
+    """Given a build description, create a task description"""
+
+    for job in jobs:
+        if 'label' not in job:
+            assert 'name' in job, "job has neither a name nor a label"
+            job['label'] = '{}-{}'.format(config.kind, job['name'])
+        if 'name' in job:
+            del job['name']
+
+        taskdesc = copy.deepcopy(job)
+
+        # fill in some empty defaults to make run implementations easier
+        taskdesc.setdefault('attributes', {})
+        taskdesc.setdefault('dependencies', {})
+        taskdesc.setdefault('routes', [])
+        taskdesc.setdefault('scopes', [])
+        taskdesc.setdefault('extra', {})
+
+        # give the function for job.run.using on this worker implementation a
+        # chance to set up the task description.
+        configure_taskdesc_for_run(config, job, taskdesc)
+        del taskdesc['run']
+
+        # coalesce route
+        # TODO: move to make-task.py (moved already?!)
+        if 'coalesce-name' in job:
+            if int(config.params['level']) > 1:
+                key = COALESCE_KEY.format(
+                    project=config.params['project'],
+                    name=job['coalesce-name'])
+                taskdesc['routes'].append('coalesce.v1.' + key)
+            del taskdesc['coalesce-name']
+
+        # yield only the task description, discarding the job description
+        yield taskdesc
+
+# import all of the job types
+# TODO: this is ugly.. another pattern?
+import_all()
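The coalesce handling in `make_task_description` reduces to this route computation (Python 3 sketch; level-1 pushes, i.e. try, never coalesce):

```python
COALESCE_KEY = 'builds.{project}.{name}'


def coalesce_routes(job, params):
    """Return the coalesce.v1 routes for a job, if any: jobs carrying a
    coalesce-name get one route, except at SCM level 1 (try)."""
    routes = []
    if 'coalesce-name' in job and int(params['level']) > 1:
        key = COALESCE_KEY.format(project=params['project'],
                                  name=job['coalesce-name'])
        routes.append('coalesce.v1.' + key)
    return routes
```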
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/transforms/mulet_simulator.py
@@ -0,0 +1,32 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Transform the mulet-simulator job template,
+  taskcluster/ci/mulet-simulator/job-template.yml
+into an actual job description.
+"""
+
+from taskgraph.transforms.base import TransformSequence
+
+
+transforms = TransformSequence()
+
+
+@transforms.add
+def fill_template(config, tasks):
+    for task in tasks:
+        # Fill out the dynamic fields in the job description
+        task['name'] = task['build-label']
+        task['dependencies'] = {'build': task['build-label']}
+
+        build_platform, build_type = task['build-platform'].split('/')
+        task['attributes']['build_platform'] = build_platform
+        task['attributes']['build_type'] = build_type
+
+        # clear out the stuff that's not part of a job description
+        del task['build-label']
+        del task['build-platform']
+
+        yield task
+
--- a/taskcluster/taskgraph/transforms/task.py
+++ b/taskcluster/taskgraph/transforms/task.py
@@ -74,28 +74,62 @@ task_description_schema = Schema({
         # treeherder.machine.platform and treeherder.collection or
         # treeherder.labels
         'platform': basestring,
 
         # treeherder environments (defaults to both staging and production)
         Required('environments', default=['production', 'staging']): ['production', 'staging'],
     },
 
-    # the provisioner-id/worker-type for the task
+    # information for indexing this build so its artifacts can be discovered;
+    # if omitted, the build will not be indexed.
+    Optional('index'): {
+        # the name of the product this build produces
+        'product': Any('firefox', 'mobile', 'b2g'),
+
+        # the names to use for this job in the TaskCluster index
+        'job-name': Any(
+            # Assuming the job is named "normally", this is the v2 job name,
+            # and the v1 and buildbot routes will be determined appropriately.
+            basestring,
+
+            # otherwise, give separate names for each of the legacy index
+            # routes; if a name is omitted, no corresponding route will be
+            # created.
+            {
+                # the name as it appears in buildbot routes
+                Optional('buildbot'): basestring,
+                Optional('gecko-v1'): basestring,
+                Required('gecko-v2'): basestring,
+            }
+        ),
+    },
+
+    # The `run_on_projects` attribute, defaulting to "all".
+    Optional('run-on-projects'): [basestring],
+
+    # the provisioner-id/worker-type for the task.  The following parameters will
+    # be substituted in this string:
+    #  {level} -- the scm level of this push
     'worker-type': basestring,
 
     # information specific to the worker implementation that will run this task
     'worker': Any({
         Required('implementation'): Any('docker-worker', 'docker-engine'),
 
-        # the docker image (in docker's `host/repo/image:tag` format) in which
-        # to run the task; if omitted, this will be a reference to the image
-        # generated by the 'docker-image' dependency, which must be defined in
-        # 'dependencies'
-        Optional('docker-image'): basestring,
+        # For tasks that will run in docker-worker or docker-engine, this is the
+        # name of the docker image or in-tree docker image to run the task in.  If
+        # in-tree, then a dependency will be created automatically.  This is
+        # generally `desktop-test`, or an image that acts an awful lot like it.
+        Required('docker-image'): Any(
+            # a raw Docker image path (repo/image:tag)
+            basestring,
+            # an in-tree generated docker image (from `testing/docker/<name>`)
+            {'in-tree': basestring}
+        ),
 
         # worker features that should be enabled
         Required('relengapi-proxy', default=False): bool,
         Required('taskcluster-proxy', default=False): bool,
         Required('allow-ptrace', default=False): bool,
         Required('loopback-video', default=False): bool,
         Required('loopback-audio', default=False): bool,
         Optional('superseder-url'): basestring,
@@ -168,37 +202,71 @@ task_description_schema = Schema({
             Optional('project'): basestring,
         },
         'properties': {
             'product': basestring,
             Extra: basestring,  # additional properties are allowed
         },
     }),
 
+    # The optimizations section contains descriptions of the circumstances
+    # under which this task can be "optimized", that is, left out of the
+    # task graph because it is unnecessary.
+    Optional('optimizations'): {
+        # This task only needs to be run if a file matching one of the given
+        # patterns has changed in the push.  The patterns use the mozpack
+        # match function (python/mozbuild/mozpack/path.py).
+        Optional('only-if-files-changed'): [basestring],
+    },
 })
 
 GROUP_NAMES = {
+    '?': 'No group',
     'tc': 'Executed by TaskCluster',
     'tc-e10s': 'Executed by TaskCluster with e10s',
     'tc-Fxfn-l': 'Firefox functional tests (local) executed by TaskCluster',
     'tc-Fxfn-l-e10s': 'Firefox functional tests (local) executed by TaskCluster with e10s',
     'tc-Fxfn-r': 'Firefox functional tests (remote) executed by TaskCluster',
     'tc-Fxfn-r-e10s': 'Firefox functional tests (remote) executed by TaskCluster with e10s',
     'tc-M': 'Mochitests executed by TaskCluster',
     'tc-M-e10s': 'Mochitests executed by TaskCluster with e10s',
     'tc-R': 'Reftests executed by TaskCluster',
     'tc-R-e10s': 'Reftests executed by TaskCluster with e10s',
     'tc-VP': 'VideoPuppeteer tests executed by TaskCluster',
     'tc-W': 'Web platform tests executed by TaskCluster',
     'tc-W-e10s': 'Web platform tests executed by TaskCluster with e10s',
     'tc-X': 'Xpcshell tests executed by TaskCluster',
     'tc-X-e10s': 'Xpcshell tests executed by TaskCluster with e10s',
+    'tc-Sim': 'Mulet simulator runs',
+    'Cc': 'Toolchain builds',
+    'SM-tc': 'Spidermonkey builds',
 }
 UNKNOWN_GROUP_NAME = "Treeherder group {} has no name; add it to " + __file__
 
+BUILDBOT_ROUTE_TEMPLATES = [
+    "index.buildbot.branches.{project}.{job-name-buildbot}",
+    "index.buildbot.revisions.{head_rev}.{project}.{job-name-buildbot}",
+]
+
+V1_ROUTE_TEMPLATES = [
+    "index.gecko.v1.{project}.latest.linux.{job-name-gecko-v1}",
+    "index.gecko.v1.{project}.revision.linux.{head_rev}.{job-name-gecko-v1}",
+]
+
+V2_ROUTE_TEMPLATES = [
+    "index.gecko.v2.{project}.latest.{product}.{job-name-gecko-v2}",
+    "index.gecko.v2.{project}.pushdate.{pushdate_long}.{product}.{job-name-gecko-v2}",
+    "index.gecko.v2.{project}.revision.{head_rev}.{product}.{job-name-gecko-v2}",
+]
+
+# the roots of the treeherder routes, keyed by treeherder environment
+TREEHERDER_ROUTE_ROOTS = {
+    'production': 'tc-treeherder',
+    'staging': 'tc-treeherder-stage',
+}
 
 # define a collection of payload builders, depending on the worker
 # implementation
 payload_builders = {}
 
 
 def payload_builder(name):
     def wrap(func):
@@ -206,78 +274,82 @@ def payload_builder(name):
         return func
     return wrap
 
 
 @payload_builder('docker-worker')
 def build_docker_worker_payload(config, task, task_def):
     worker = task['worker']
 
-    if 'docker-image' in worker:
-        # a literal image name
-        image = {
-            'type': 'docker-image',
-            'name': worker['docker-image'],
-        }
-    else:
-        assert 'docker-image' in task[
-            'dependencies'], 'no docker-worker dependency'
+    image = worker['docker-image']
+    if isinstance(image, dict):
+        docker_image_task = 'build-docker-image-' + image['in-tree']
+        task.setdefault('dependencies', {})['docker-image'] = docker_image_task
         image = {
             "path": "public/image.tar",
             "taskId": {"task-reference": "<docker-image>"},
             "type": "task-image",
         }
 
     features = {}
 
     if worker.get('relengapi-proxy'):
         features['relengAPIProxy'] = True
 
+    if worker.get('taskcluster-proxy'):
+        features['taskclusterProxy'] = True
+
     if worker.get('allow-ptrace'):
         features['allowPtrace'] = True
         task_def['scopes'].append('docker-worker:feature:allowPtrace')
 
     capabilities = {}
 
     for lo in 'audio', 'video':
         if worker.get('loopback-' + lo):
             capitalized = 'loopback' + lo.capitalize()
             devices = capabilities.setdefault('devices', {})
             devices[capitalized] = True
             task_def['scopes'].append(
                 'docker-worker:capability:device:' + capitalized)
 
-    caches = {}
-
-    for cache in worker['caches']:
-        caches[cache['name']] = cache['mount-point']
-        task_def['scopes'].append('docker-worker:cache:' + cache['name'])
-
-    artifacts = {}
-
-    for artifact in worker['artifacts']:
-        artifacts[artifact['name']] = {
-            'path': artifact['path'],
-            'type': artifact['type'],
-            'expires': task_def['expires'],  # always expire with the task
-        }
-
     task_def['payload'] = payload = {
         'command': worker['command'],
-        'cache': caches,
-        'artifacts': artifacts,
         'image': image,
         'env': worker['env'],
-        'maxRunTime': worker['max-run-time'],
     }
+
+    if 'max-run-time' in worker:
+        payload['maxRunTime'] = worker['max-run-time']
+
+    if 'artifacts' in worker:
+        artifacts = {}
+        for artifact in worker['artifacts']:
+            artifacts[artifact['name']] = {
+                'path': artifact['path'],
+                'type': artifact['type'],
+                'expires': task_def['expires'],  # always expire with the task
+            }
+        payload['artifacts'] = artifacts
+
+    if 'caches' in worker:
+        caches = {}
+        for cache in worker['caches']:
+            caches[cache['name']] = cache['mount-point']
+            task_def['scopes'].append('docker-worker:cache:' + cache['name'])
+        payload['cache'] = caches
+
     if features:
         payload['features'] = features
     if capabilities:
         payload['capabilities'] = capabilities
 
+    if worker.get('superseder-url'):
+        payload['supersederUrl'] = worker['superseder-url']
+
 
 @payload_builder('generic-worker')
 def build_generic_worker_payload(config, task, task_def):
     worker = task['worker']
 
     artifacts = []
 
     for artifact in worker['artifacts']:
@@ -302,19 +374,66 @@ transforms = TransformSequence()
 def validate(config, tasks):
     for task in tasks:
         yield validate_schema(
             task_description_schema, task,
             "In task {!r}:".format(task.get('label', '?no-label?')))
 
 
 @transforms.add
+def add_index_routes(config, tasks):
+    for task in tasks:
+        index = task.get('index')
+        if index:
+            job_name = index['job-name']
+            # unpack the v2 name to v1 and buildbot names
+            if isinstance(job_name, basestring):
+                base_name, type_name = job_name.rsplit('-', 1)
+                job_name = {
+                    'buildbot': base_name,
+                    'gecko-v1': '{}.{}'.format(base_name, type_name),
+                    'gecko-v2': '{}-{}'.format(base_name, type_name),
+                }
+            vars = config.params.copy()
+            for n in job_name:
+                vars['job-name-' + n] = job_name[n]
+            vars['pushdate_long'] = time.strftime(
+                "%Y.%m.%d.%Y%m%d%H%M%S",
+                time.gmtime(config.params['pushdate']))
+            vars['product'] = index['product']
+
+            if 'buildbot' in job_name:
+                for tpl in BUILDBOT_ROUTE_TEMPLATES:
+                    task['routes'].append(tpl.format(**vars))
+            if 'gecko-v1' in job_name:
+                for tpl in V1_ROUTE_TEMPLATES:
+                    task['routes'].append(tpl.format(**vars))
+            if 'gecko-v2' in job_name:
+                for tpl in V2_ROUTE_TEMPLATES:
+                    task['routes'].append(tpl.format(**vars))
+
+            # rank is zero for non-tier-1 tasks and based on pushdate for others;
+            # this sorts tier-{2,3} builds below tier-1 in the index
+            try:
+                tier = task['treeherder']['tier']
+            except KeyError:
+                tier = 3  # default
+            task['extra']['index'] = {
+                'rank': 0 if tier > 1 else int(config.params['pushdate'])
+            }
+            del task['index']
+        yield task
+
+
+@transforms.add
 def build_task(config, tasks):
     for task in tasks:
-        provisioner_id, worker_type = task['worker-type'].split('/', 1)
+        worker_type = task['worker-type'].format(level=str(config.params['level']))
+        provisioner_id, worker_type = worker_type.split('/', 1)
+
         routes = task.get('routes', [])
         scopes = task.get('scopes', [])
 
         # set up extra
         extra = task.get('extra', {})
         task_th = task.get('treeherder')
         if task_th:
             extra['treeherderEnv'] = task_th['environments']
@@ -330,23 +449,29 @@ def build_task(config, tasks):
             if groupSymbol not in GROUP_NAMES:
                 raise Exception(UNKNOWN_GROUP_NAME.format(groupSymbol))
             treeherder['groupName'] = GROUP_NAMES[groupSymbol]
             treeherder['symbol'] = symbol
             treeherder['jobKind'] = task_th['kind']
             treeherder['tier'] = task_th['tier']
 
             routes.extend([
-                '{}.v2.{}.{}.{}'.format(root,
+                '{}.v2.{}.{}.{}'.format(TREEHERDER_ROUTE_ROOTS[env],
                                         config.params['project'],
                                         config.params['head_rev'],
                                         config.params['pushlog_id'])
-                for root in 'tc-treeherder', 'tc-treeherder-stage'
+                for env in task_th['environments']
             ])
 
+        if 'expires-after' not in task:
+            task['expires-after'] = '14 days' if config.params['project'] == 'try' else '1 year'
+
+        if 'deadline-after' not in task:
+            task['deadline-after'] = '1 day'
+
         task_def = {
             'provisionerId': provisioner_id,
             'workerType': worker_type,
             'routes': routes,
             'created': {'relative-datestamp': '0 seconds'},
             'deadline': {'relative-datestamp': task['deadline-after']},
             'expires': {'relative-datestamp': task['expires-after']},
             'scopes': scopes,
@@ -362,14 +487,19 @@ def build_task(config, tasks):
             'extra': extra,
             'tags': {'createdForUser': config.params['owner']},
         }
 
         # add the payload and adjust anything else as required (e.g., scopes)
         payload_builders[task['worker']['implementation']](
             config, task, task_def)
 
+        attributes = task.get('attributes', {})
+        attributes['run_on_projects'] = task.get('run-on-projects', ['all'])
+
         yield {
             'label': task['label'],
             'task': task_def,
-            'dependencies': task['dependencies'],
-            'attributes': task['attributes'],
+            'dependencies': task.get('dependencies', {}),
+            'attributes': attributes,
+            'optimizations': task.get('optimizations', {}),
         }
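The v2 job-name unpacking in `add_index_routes` above is easy to get wrong, since it must split on the *last* dash. Here is a standalone sketch of just that step (the function name is illustrative; the real transform inlines this and then formats the route templates):

```python
def unpack_job_name(job_name):
    """Expand a v2 job name like 'linux64-opt' into per-index spellings.

    Mirrors the rsplit logic in add_index_routes: everything before the
    last '-' is the buildbot name; gecko-v1 joins base and type with '.',
    gecko-v2 keeps the '-'.  (The original checks basestring; this sketch
    is written against Python 3's str.)
    """
    if isinstance(job_name, str):
        base_name, type_name = job_name.rsplit('-', 1)
        job_name = {
            'buildbot': base_name,
            'gecko-v1': '{}.{}'.format(base_name, type_name),
            'gecko-v2': '{}-{}'.format(base_name, type_name),
        }
    # an explicit dict passes through untouched
    return job_name
```

Using `rsplit` rather than `split` matters for names containing several dashes: `'android-api-15-debug'` must yield the buildbot name `'android-api-15'`, not `'android'`.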
--- a/taskcluster/taskgraph/transforms/tests/make_task_description.py
+++ b/taskcluster/taskgraph/transforms/tests/make_task_description.py
@@ -128,24 +128,17 @@ def docker_worker_setup(config, test, ta
         'default': 'aws-provisioner-v1/desktop-test-large',
         'large': 'aws-provisioner-v1/desktop-test-large',
         'xlarge': 'aws-provisioner-v1/desktop-test-xlarge',
         'legacy': 'aws-provisioner-v1/desktop-test',
     }[test['instance-size']]
 
     worker = taskdesc['worker'] = {}
     worker['implementation'] = test['worker-implementation']
-
-    docker_image = test.get('docker-image')
-    assert docker_image, "no docker image defined for a docker-worker/docker-engine task"
-    if isinstance(docker_image, dict):
-        taskdesc['dependencies']['docker-image'] = 'build-docker-image-' + docker_image['in-tree']
-    else:
-        # just a raw docker-image string
-        worker['docker-image'] = test['docker-image']
+    worker['docker-image'] = test['docker-image']
 
     worker['allow-ptrace'] = True  # required for all tests, for crashreporter
     worker['relengapi-proxy'] = False  # but maybe enabled for tooltool below
     worker['loopback-video'] = test['loopback-video']
     worker['loopback-audio'] = test['loopback-audio']
     worker['max-run-time'] = test['max-run-time']
 
     worker['artifacts'] = [{
new file mode 100644
--- /dev/null
+++ b/taskcluster/taskgraph/transforms/upload_symbols.py
@@ -0,0 +1,33 @@
+# This Source Code Form is subject to the terms of the Mozilla Public
+# License, v. 2.0. If a copy of the MPL was not distributed with this
+# file, You can obtain one at http://mozilla.org/MPL/2.0/.
+"""
+Transform the upload-symbols task description template,
+  taskcluster/ci/upload-symbols/job-template.yml
+into an actual task description.
+"""
+
+from taskgraph.transforms.base import TransformSequence
+
+
+transforms = TransformSequence()
+
+
+@transforms.add
+def fill_template(config, tasks):
+    for task in tasks:
+        # Fill out the dynamic fields in the task description
+        task['label'] = task['build-label'] + '-upload-symbols'
+        task['dependencies'] = {'build': task['build-label']}
+        task['worker']['env']['GECKO_HEAD_REPOSITORY'] = config.params['head_repository']
+        task['worker']['env']['GECKO_HEAD_REV'] = config.params['head_rev']
+
+        build_platform, build_type = task['build-platform'].split('/')
+        task['attributes']['build_platform'] = build_platform
+        task['attributes']['build_type'] = build_type
+
+        # clear out the stuff that's not part of a task description
+        del task['build-label']
+        del task['build-platform']
+
+        yield task
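The `fill_template` transform above is a straight dict rewrite; a minimal sketch of what it does to one entry from the job template (input field values here are made up for illustration, and the helper name is not part of the patch):

```python
def fill_template_one(task, head_repository, head_rev):
    # Derive the label and build dependency from the build task's label.
    task['label'] = task['build-label'] + '-upload-symbols'
    task['dependencies'] = {'build': task['build-label']}

    # Point the worker environment at the same revision as the push.
    env = task.setdefault('worker', {}).setdefault('env', {})
    env['GECKO_HEAD_REPOSITORY'] = head_repository
    env['GECKO_HEAD_REV'] = head_rev

    # 'linux64/opt' -> build_platform 'linux64', build_type 'opt'
    build_platform, build_type = task['build-platform'].split('/')
    attrs = task.setdefault('attributes', {})
    attrs['build_platform'] = build_platform
    attrs['build_type'] = build_type

    # the template-only fields are not valid task-description keys
    del task['build-label']
    del task['build-platform']
    return task
```

The deletions at the end are what keep the downstream task-description schema validation happy: any key the template introduced for its own use must be gone before the task description is validated.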
--- a/taskcluster/taskgraph/try_option_syntax.py
+++ b/taskcluster/taskgraph/try_option_syntax.py
@@ -1,31 +1,43 @@
 # This Source Code Form is subject to the terms of the Mozilla Public
 # License, v. 2.0. If a copy of the MPL was not distributed with this
 # file, You can obtain one at http://mozilla.org/MPL/2.0/.
 
 from __future__ import absolute_import, print_function, unicode_literals
 
 import argparse
 import copy
+import itertools
 import logging
 import re
 import shlex
 
 logger = logging.getLogger(__name__)
 
 TRY_DELIMITER = 'try:'
 
 # The build type aliases are very cryptic and only used in try flags; these
 # are mappings from the single-char alias to a longer, more recognizable form.
 BUILD_TYPE_ALIASES = {
     'o': 'opt',
     'd': 'debug'
 }
 
+# consider anything in this whitelist of kinds to be governed by -b/-p
+BUILD_KINDS = set([
+    'build',
+    'spidermonkey',
+    'hazard',
+    'upload-symbols',
+    'artifact-build',
+    'valgrind',
+    'source-check',
+    'static-analysis',
+])
 
 # mapping from shortcut name (usable with -u) to a boolean function identifying
 # matching test names
 def alias_prefix(prefix):
     return lambda name: name.startswith(prefix)
 
 
 def alias_contains(infix):
@@ -119,33 +131,34 @@ UNITTEST_PLATFORM_PRETTY_NAMES = {
     # 'Windows 8':  [..TODO..],
     # 'Windows XP': [..TODO..],
     # 'win32': [..TODO..],
     # 'win64': [..TODO..],
 }
 
 # We have a few platforms for which we want to do some "extra" builds, or at
 # least build-ish things.  Sort of.  Anyway, these other things are implemented
-# as different "platforms".
+# as different "platforms".  These do *not* automatically ride along with
+# "-p all".
 RIDEALONG_BUILDS = {
     'linux': [
         'linux-l10n',
     ],
     'linux64': [
         'linux64-l10n',
         'sm-plain',
         'sm-nonunified',
         'sm-arm-sim',
         'sm-arm64-sim',
         'sm-compacting',
         'sm-rootanalysis',
         'sm-package',
         'sm-tsan',
+        'sm-msan',
         'sm-asan',
-        'sm-msan',
     ],
 }
 
 TEST_CHUNK_SUFFIX = re.compile('(.*)-([0-9]+)$')
 
 
 class TryOptionSyntax(object):
 
@@ -500,16 +513,27 @@ class TryOptionSyntax(object):
                 return True
             elif attr('legacy_kind') == 'unittest':
                 return match_test(self.unittests, 'unittest_try_name')
             elif attr('legacy_kind') == 'talos':
                 return match_test(self.talos, 'talos_try_name')
             return False
         elif attr('kind') in ('desktop-test', 'android-test'):
             return match_test(self.unittests, 'unittest_try_name')
+        elif attr('kind') in BUILD_KINDS:
+            if attr('build_type') not in self.build_types:
+                return False
+            elif self.platforms is None:
+                # for "-p all", look for 'try' in the 'run_on_projects' attribute
+                # TODO: common code for this
+                return bool(
+                    set(['try', 'all']) & set(attr('run_on_projects', [])))
+            else:
+                if attr('build_platform') not in self.platforms:
+                    return False
+            return True
         else:
             return False
 
     def __str__(self):
         def none_for_all(list):
             if list is None:
                 return '<all>'
             return ', '.join(str(e) for e in list)
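The new `BUILD_KINDS` branch of the matcher can be sketched as a standalone predicate (helper name and argument shapes are illustrative; the real code reads these off `self` and a task's attribute getter):

```python
BUILD_KINDS = {
    'build', 'spidermonkey', 'hazard', 'upload-symbols',
    'artifact-build', 'valgrind', 'source-check', 'static-analysis',
}

def build_task_matches(attrs, build_types, platforms):
    """Decide whether a build-ish task matches the parsed try flags.

    -b filters on build_type; -p filters on build_platform, except that
    "-p all" (platforms is None) defers to the task's run_on_projects
    attribute, matching tasks tagged for 'try' or 'all'.
    """
    if attrs.get('kind') not in BUILD_KINDS:
        return False
    if attrs.get('build_type') not in build_types:
        return False
    if platforms is None:
        return bool({'try', 'all'} & set(attrs.get('run_on_projects', [])))
    return attrs.get('build_platform') in platforms
```

This is how "-p all" stops meaning "literally every platform-ish task": ridealong builds that should not run by default simply omit `try`/`all` from `run-on-projects` and are skipped here.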