TIL: Pipelines on BitBucket

Al Hinds
Aug 1, 2017

I’ve been using Atlassian’s git hosting service Bitbucket for quite some time, but I’ve only recently started experimenting with its Pipelines feature, and it’s pretty damn impressive.

tl;dr Pipelines aren’t just a novelty: they’re free*, and they unlock some pretty cool functionality around your software’s build status. You should give ’em a try.

Long version

If you’re a git power user, you might have already experimented with git hooks. For those who don’t know, git hooks let you insert scripts into the base git command workflow. In practice, this could mean having a complicated test suite run before a push is executed, or something as simple as forcing a commit message to be above a certain length and of a certain style (see the sketch below).
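For example, a commit-msg hook is nothing more than an executable script at .git/hooks/commit-msg. Here’s a minimal sketch of that length check; the 20-character minimum is an arbitrary number I’ve picked for illustration:

#!/bin/sh
# .git/hooks/commit-msg -- git passes the path of the commit message file as $1.
MSG_FILE="$1"
MIN_LEN=20
# Ignore comment lines before measuring the message length.
LEN=$(grep -v '^#' "$MSG_FILE" | wc -c)
if [ "$LEN" -lt "$MIN_LEN" ]; then
    echo "Commit message too short ($LEN chars, need $MIN_LEN); aborting." >&2
    exit 1   # a non-zero exit makes git abort the commit
fi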

These are nice features to shape your workflow, but you definitely ‘feel’ them as you develop. If you’re simply doing a quick commit and bugfix, which you’ve already tested, you probably don’t want to go through that process again. What’s more, it becomes even more jarring if you’re doing multiple updates a day, or making changes that don’t even affect your core software.

This is where pipelines come in.

Pipelines

What’s a pipeline you say?

They’re essentially just a mini remote server that you use to run your current build. This is *super* useful, as it takes the local machine time and test toiling off your hands and runs it all in the background while you get on with your work. If something does go wrong, you’ll get a notification. If it doesn’t, no worries.

At their heart they’re designed for continuous integration, and they’re neatly packed in with Bitbucket’s existing ecosystem.

Pipelines aren’t for everyone, and they make far more sense for more complex software, but they’re free to use (1,000 minutes of server time are gifted to you per month as a base user) and super easy to get started with. I’d strongly encourage developers who haven’t used them to at least experiment with the feature. You’ve got nothing to lose.

Getting started

It’s as simple as committing a file called bitbucket-pipelines.yml to the base directory of your repository, with contents similar to this:

# This is a sample build configuration for all languages.
# Check our guides at https://confluence.atlassian.com/x/5Q4SMw for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
# image: docker-image:tag
pipelines:
  default:
    - step:
        script:
          - echo "Everything is awesome!"

The syntax is pretty easy to get your head around and will work for all sorts of build systems. I’ve been using it with gcc 6.1 to play with my uni C projects, for no other reason than I can.
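For the curious, here’s a rough sketch of what that sort of config might look like; the gcc:6.1 Docker Hub image tag and the build.sh script name are my own illustrative choices, not something prescribed by Bitbucket:

# hypothetical bitbucket-pipelines.yml for a C project
image: gcc:6.1                # official gcc image from Docker Hub (illustrative tag)
pipelines:
  default:
    - step:
        script:
          - gcc --version     # sanity-check the toolchain
          - ./build.sh        # whatever build script the repo provides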

But you can run all sorts of builds with no major tweaking. Below is a sample build script (it just runs make and makes sure everything compiles successfully, without errors) that I run as part of my first ‘play’ pipeline.

#!/bin/sh -e
# nothing special going on here.
DIR="$HOME/usr/local/bin"
mkdir -p $DIR
PATH="$PATH:$DIR"
ln -s `which gcc` "$DIR/dcc"
cd labs
for f in `ls`
do
    cd $f
    echo "$f now compiling"
    make clean
    make
    cd ..
done

I could add a whole bunch of other scripts to this pipeline to increase the coverage or inject more tests (a rough sketch of that is below), but for now I’m at least guaranteed that my pushes will compile cleanly.
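As an illustration of what that extension could look like (these extra script names are hypothetical, not part of my actual config), more checks are just more entries under script:

pipelines:
  default:
    - step:
        script:
          - ./build.sh             # compile everything, as before
          - ./run_unit_tests.sh    # hypothetical unit-test script
          - ./check_style.sh       # hypothetical linter / style check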

I’ve used this at work (where real devs have put together much more complex pipelines), and it’s super impressive when you’ve got multiple teams working together. But even on a small project it’s been fun and effortless to have running in the background. It doesn’t affect my workflow, but it does let me know when things go awry.

Anyway. I’m going to look at extending it further and using it with some of my other existing projects.
