Today we will talk about pipelines.
Well, to be specific, how we write pipelines nowadays.
A workflow file is what happens when bash scripts and yaml files decide to bear a child.
Those files live in your GitHub project, inside a folder called .github/workflows, and are able to run shell commands.
If you have a certain set of shell commands (bash by default, but you can set any supported shell, see here) to test, build and publish your application, you can turn it into a workflow. You can also tie the execution of this workflow to a certain event, say on every push, giving you proper continuous integration.
Your script is declared as one or more jobs, each containing one or more steps, able to check previous results to decide what to do next.
Jobs run in parallel by default.
The simplest workflow possible is an echo "hello world", just like in bash:
```yaml
name: 00 - Hello world!

on:
  workflow_dispatch:

jobs:
  hello-world:
    runs-on: ubuntu-latest
    steps:
      - run: echo "hello world!"
```
We can already spot the three key elements that figure in every workflow: a name, the on block listing the events that trigger it, and the jobs to run.
You can get somewhat creative when authoring pipelines. You can reuse scripts, chain them, even condition execution based on previous steps.
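As a taste of that last point, here is a minimal sketch of conditioning execution on previous results. It is not one of the numbered samples; the job and step names are made up for illustration:

```yaml
name: Conditional execution sketch

on:
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # continue-on-error lets the workflow keep going even if this step fails
      - id: flaky
        run: exit 1
        continue-on-error: true
      # only runs when the previous step did not succeed
      - if: ${{ steps.flaky.outcome != 'success' }}
        run: echo "recovering from a failed step"
  report:
    runs-on: ubuntu-latest
    needs: [build]
    # always() makes this job run even when the build job fails
    if: ${{ always() }}
    steps:
      - run: echo "build finished with result ${{ needs.build.result }}"
```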
This is when you want a job to do everything another workflow does:
```yaml
name: 04 - Use another workflow

on:
  workflow_dispatch:

jobs:
  use-sequence-workflow:
    uses: ./.github/workflows/03-sequence-jobs.yml
```
In the example above, the single job calls every job from the reused workflow.
It is possible to pass some context to those jobs.
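For instance, here is a sketch of how passing that context could look, assuming the reused workflow declares a who input under workflow_call; the file name and input name are made up for illustration:

```yaml
# caller workflow: passes context to the reused workflow via `with`
name: Call with inputs sketch

on:
  workflow_dispatch:

jobs:
  call-greeter:
    uses: ./.github/workflows/greeter.yml
    with:
      who: 'pipeline readers'
```

```yaml
# greeter.yml: the reused workflow declares which inputs it accepts
name: Greeter sketch

on:
  workflow_call:
    inputs:
      who:
        type: string
        required: true

jobs:
  greet:
    runs-on: ubuntu-latest
    steps:
      - run: echo "hello, ${{ inputs.who }}!"
```

The inputs block under workflow_call mirrors the workflow_dispatch inputs shown a couple of examples below.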
Use needs to chain jobs. If a job needs another, they run in sequence, which is very similar to putting all the steps into a single job:
```yaml
name: 03 - Sequence jobs

on:
  workflow_dispatch:
  workflow_call:

jobs:
  job-one:
    runs-on: ubuntu-latest
    steps:
      - run: echo "hello first!"
  job-two:
    runs-on: ubuntu-latest
    needs: [job-one]
    steps:
      - run: echo "hello second!"
```
You can provide needs with either a single job or a list of jobs.
Jobs can interact with each other. To do so, they can get inputs from events and emit outputs.
The events that can take inputs are workflow_dispatch, which fires when you trigger the workflow by hand, and workflow_call, which fires when one workflow calls another, as we saw in the previous examples.
This is an example of a workflow consuming inputs:
```yaml
name: 05 - Job Inputs

on:
  workflow_dispatch:
    inputs:
      anything:
        type: string
        required: false
        default: 'is possible'

jobs:
  using-input-from-dispatch:
    runs-on: ubuntu-latest
    steps:
      - run: echo 'If you believe, anything ${{inputs.anything}}!'
```
And this is a job producing outputs:
```yaml
name: 06 - Job outputs

on:
  workflow_dispatch:

jobs:
  simple-output:
    runs-on: ubuntu-latest
    outputs:
      my-output: 'general kenobi!'
    steps:
      - run: echo "hello there!"
  use-simple-output:
    runs-on: ubuntu-latest
    needs: simple-output
    steps:
      - run: echo "${{needs.simple-output.outputs.my-output}}"
  output-from-step:
    runs-on: ubuntu-latest
    outputs:
      from-step-2: ${{ steps.step2.outputs.custom_result }}
    steps:
      - id: step2
        run: echo "custom_result=$(date +'%Y')" >> $GITHUB_OUTPUT
  use-output-from-step:
    runs-on: ubuntu-latest
    needs: [output-from-step]
    steps:
      - run: echo "i got a dynamic value from ${{needs.output-from-step.outputs.from-step-2}}"
```
You can also map workflow outputs at the workflow_call event level, in a similar way to how step outputs are mapped to job outputs, so other workflows using this one can access the produced outputs with no trouble.
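A minimal sketch of that mapping, assuming the called workflow exposes a year value produced by one of its jobs (the names here are made up for illustration):

```yaml
name: Workflow outputs sketch

on:
  workflow_call:
    outputs:
      year:
        description: 'The year computed by the produce job'
        # map a job output up to the workflow_call event level
        value: ${{ jobs.produce.outputs.the-year }}

jobs:
  produce:
    runs-on: ubuntu-latest
    outputs:
      the-year: ${{ steps.now.outputs.year }}
    steps:
      - id: now
        run: echo "year=$(date +'%Y')" >> $GITHUB_OUTPUT
```

A workflow that uses this file can then read the value through needs.<calling-job>.outputs.year, just like any other job output.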
Variables are a way to tweak workflow behavior by setting values external to the event context:
```yaml
name: 07 - Environment variables

on:
  workflow_dispatch:

env:
  A: 'a reasonable value'

jobs:
  use-one-variable:
    runs-on: ubuntu-latest
    steps:
      - run: echo "${{vars.X}}" # gh variable set X or .vars file for act
  use-another-variable:
    runs-on: ubuntu-latest
    steps:
      - run: echo "${{env.A}}"
  use-another-one:
    runs-on: ubuntu-latest
    env:
      B: 'my other value'
    steps:
      - run: echo "${{env.B}}"
  and-another:
    runs-on: ubuntu-latest
    steps:
      - run: echo "${{env.C}}"
        env:
          C: 'other'
      - run: echo "I have access to X and A, ${{vars.X}} ${{env.A}}"
      - run: echo "I don't have access to C or B, ${{env.B}} ${{env.C}}"
  about-redefines:
    runs-on: ubuntu-latest
    env:
      A: 'redefined!'
    steps:
      - run: echo "I am using a redefined A, ${{env.A}}"
```
You can create these variables in the repository settings, either through the web interface or the GitHub CLI.
Another cool tool to play with actions is act-cli. It's not 100% compatible with the real thing, but it's good enough to spare you several commits while creating a pipeline. Your coworkers' mailboxes will be glad!
It is possible to define environment variables at workflow level, job level and step level.
Secrets are like vars and env, but for sensitive values. The neat touch is that they are masked in the workflow execution logs:
```yaml
name: 08 - Using secrets

on:
  workflow_dispatch:

jobs:
  ill-tell-u-a-secret:
    runs-on: ubuntu-latest
    steps:
      - run: echo "my secret is ${{secrets.MY_SECRET}}"
        # gh secret set MY_SECRET
        # or create a .env file if using act
      - run: |
          echo "secret doesn't echo, but go ahead and use them!"
          echo $(( 10 + ${{secrets.X}}))
```
These samples are quite simple by design, but combine them and you can create neat pipelines: run your tests, package libraries and app images, publish them. See the marketplace for more interesting examples and ways to build pipelines.
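As a closing sketch of how the pieces combine, here is a hypothetical push-triggered workflow for a Node project; the project layout and npm scripts are assumptions, not part of the samples above:

```yaml
name: CI sketch

on:
  push:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      # grab the repository contents so the runner can see the code
      - uses: actions/checkout@v4
      # install a Node toolchain; the version is an assumption for this sketch
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      - run: npm test
  package:
    runs-on: ubuntu-latest
    needs: [test]
    steps:
      - uses: actions/checkout@v4
      # placeholder for a real packaging/publish step
      - run: echo "package and publish here"
```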
You can find the sample actions in this repo.