Larry Price

And The Endless Cup Of Coffee

Clean Package Building With Pbuilder


Whether I’m adding dependencies, updating package names, or creating new package spins, I always have issues testing my Debian packages. Something will work locally, only to fail on Jenkins in a clean environment. Fortunately, there’s a nifty tool called pbuilder that exists to help out in these situations. pbuilder uses a chroot to set up a clean environment for building packages, and it can even be used to build packages for systems with architectures different from your own.

Note: All code samples were originally written from a machine running Ubuntu 16.10 64-bit. Your mileage may vary.

Clean builds for current distro

Given a typical Debian-packaged project with a debian/ directory (control, rules, .install), you can use debuild to build a package from your local environment:

$ cd my-project
$ debuild
...
$ ls ../*.deb
my-project.deb

This works pretty well for sanity checks, but sometimes knowing you’re sane just isn’t quite enough. My development environment is filled with libraries and files installed in all kinds of weird ways and in all kinds of strange places, so there’s a good chance packages built successfully on my machine may not work on everyone else’s. To solve this, I can install pbuilder and set up my first chroot:

$ # install pbuilder and its dependencies
$ sudo apt-get install pbuilder debootstrap devscripts
$ # create a chroot for your current distro with build-essential pre-installed
$ sudo pbuilder create --debootstrapopts --variant=buildd

Since I use debuild pretty frequently, I also rely on pdebuild, which performs debuild inside of the clean chroot environment, temporarily installing the build dependencies listed in the control file.

$ cd my-project
$ pdebuild
$ ls /var/cache/pbuilder/result/*.deb
my-project.deb

Alternatively, I could create the .dsc file and then use pbuilder to create the package from there:

$ # generate a dsc file however you like
$ cd my-project
$ bzr-builddeb -- -us -uc
$ cd ..
$ # use pbuilder to create package
$ sudo pbuilder build my-project.dsc
$ ls /var/cache/pbuilder/result/*.deb
my-project.deb

Clean cross builds

Let’s say that you need to build for an older distribution of Ubuntu on a weird architecture - for this example, vivid with armhf. We can use pbuilder-dist to verify and build our packages for other distros and architectures:

$ # create the chroot, once again with build-essential pre-installed
$ pbuilder-dist vivid armhf create --debootstrapopts --variant=buildd
$ # the above command could take a while, but once it's finished
$ # we can attempt to build our package using a .dsc file
$ pbuilder-dist vivid armhf build my-project.dsc
$ ls ~/pbuilder/vivid-armhf_result/*.deb
my-project.deb

Custom, persistent chroot changes

In some cases, you may need to enable other archives or install custom software in your chroot. In the case of our vivid-armhf chroot, let’s add the stable overlay PPA, which updates the aging vivid archive with more modern versions of some packages.

$ # login to our vivid-armhf chroot, and save state when we're finished
$ # if --save-after-login is omitted, a throwaway chroot will be used
$ pbuilder-dist vivid armhf login --save-after-login
(chroot) $ # install the package containing add-apt-repository for convenience
(chroot) $ apt install software-properties-common
(chroot) $ add-apt-repository ppa:ci-train-ppa-service/stable-phone-overlay
(chroot) $ exit
$ # update packages in the chroot
$ pbuilder-dist vivid armhf update

pbuilder and chroots are powerful tools in the world of packaging and beyond. There are scripting utilities, as well as pre- and post-build hooks, which can customize your builds. There are ways to speed up clean builds using local caches or other “cheats”. You could use the throwaway login shells to create and destroy tiny worlds as you please. All of this is very similar to the utility which comes from using Docker and LXC, though the underlying “container” is quite a bit different. pbuilder has a much lower setup threshold, so I prefer it over Docker for clean build environments, but I believe Docker/LXC to be the better tools for managing the creation of consistent virtual environments.
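Those hooks and caches are configured in ~/.pbuilderrc. As a minimal sketch - the paths, the CCACHEDIR option, and the stock B90lintian example hook are what I’d expect from pbuilder’s documentation and shipped examples, so double-check them on your system - it might look like this:

$ # sketch of a ~/.pbuilderrc enabling ccache and build hooks (paths are examples)
$ cat ~/.pbuilderrc
# reuse compiler output between clean builds
CCACHEDIR=/var/cache/pbuilder/ccache
# run any executable hook scripts placed here (B* after a successful build, C* after a failure)
HOOKDIR=/var/cache/pbuilder/hooks
$ # e.g. copy the stock lintian hook so every clean build also gets lint-checked
$ sudo mkdir -p /var/cache/pbuilder/hooks
$ sudo cp /usr/share/doc/pbuilder/examples/B90lintian /var/cache/pbuilder/hooks/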

Further reading:

Pbuilder HowTo on the Ubuntu wiki
Pbuilder tricks from the Debian wiki

Getting Started With Python Mocking and Patching


I currently write a lot of Python and C++. Although I religiously unit test my C++ code, I’m a bit ashamed to say that I haven’t had much experience with Python unit testing until recently. You know how it is - Python is one of those interpreted languages, you mostly use it to do quick hacks, it doesn’t need tests. Until you’ve written your entire D-Bus service in Python, and every time you make a code change a literal python appears on the screen to crash your computer. So I’ve started writing a bunch of tests and found (as expected) a tangled mess of dependencies and system calls.

In many C-like languages, you can fix most of your dependency problems with The Big Three: mocks, fakes, and stubs. A fake is an actual implementation of an interface used for non-production environments, a stub is an implementation of an interface returning a preconceived result, and a mock is a wrapper around an interface allowing a programmer to accurately track what actions were performed on the object. In these languages, you use dependency injection to give your classes fakes, mocks, or stubs instead of real objects during testing.

The good news is that we can also use dependency injection in Python! However, I found that relying solely on dependency injection would pile on more dependencies than I wanted and was not going to cover all my system calls. But Python is a dynamic language. In Python, you can literally change the definition of a class from inside another module. We call this operation patching, and you can use it extensively in testing to do some pretty cool stuff.

Code Under Test

Let’s define some code to test. For all of these examples, I’ll be using Python 3.5.2 with the unittest and unittest.mock libs on Ubuntu 16.10. You can find the final versions of these code samples on GitHub.

from random import randint

class WorkerStrikeException(Exception):
    pass

class Worker(object):
    """
    A Worker will work a full 40 hour week and then go on strike. Each time
    a Worker works, they work a random amount of time between 1 and 40.
    """
    def __init__(self):
        self.hours_worked = 0

    def work(self):
        timesheet = randint(1, 40)
        self.hours_worked += timesheet
        if self.hours_worked > 40:
            raise WorkerStrikeException("This worker is picketing")

        return timesheet

class Boss(object):
    """
    A Boss makes profit using workers. Bosses squeeze 1000 monies out of a
    Worker for each hour worked. Workers on strike are instantly replaced.
    """
    def __init__(self, worker):
        self.worker = worker
        self.profit = 0

    def make_profit(self):
        try:
            self.profit += self.worker.work()*1000
        except WorkerStrikeException as e:
            print("%s" % e)
            self.worker = Worker()
            self.profit += self.worker.work()*1000
        finally:
            return self.profit

These are two simple classes (and a custom Exception) that we’ll use to demonstrate unit testing in Python. The first class, Worker, will work a maximum of 40 hours per week before picketing its corporation. Each time work is called, the Worker will work a random number of hours. The Boss class takes in a Worker object, which it uses as it performs make_profit. The profit is determined by the number of hours worked multiplied by 1000. When the worker starts picketing, the Boss will hire a new Worker to take their place. So it goes.

Mocking the Worker Class

Our goal is to fully test the Boss class. We’ve left ourselves a dependency to inject in the __init__ method, so we could start there. We’ll mock the Worker and pass it into the Boss initializer. We’ll then set up the Worker.work method to always return a known number so we can test the functionality of make_profit.

import unittest.mock
from unittest import TestCase

from corp import work  # your impl file

class BossTest(TestCase):
    def test_profit_adds_up(self):
        worker = unittest.mock.create_autospec(work.Worker)
        worker.work.return_value = 8
        boss = work.Boss(worker)
        self.assertEqual(boss.make_profit(), 8000)
        self.assertEqual(boss.make_profit(), 16000)
        worker.work.return_value = 10
        self.assertEqual(boss.make_profit(), 26000)

        worker.work.assert_has_calls([
            unittest.mock.call(),
            unittest.mock.call(),
            unittest.mock.call()
        ])

if __name__ == '__main__':
    unittest.main()

To run this test, use the command python3 -m testtools.run test, where test is the name of your test file without the .py.
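If testtools isn’t your thing (or isn’t installed), the stock unittest runner handles this file just as well thanks to the __main__ block at the bottom. Assuming the file is named test.py:

$ # using testtools
$ python3 -m testtools.run test
$ # or using the built-in unittest runner
$ python3 -m unittest test
$ # or simply, via the __main__ block
$ python3 test.py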

One curiosity here is unittest.mock.create_autospec. Python will also let you directly create a Mock, which will absorb all attribute calls regardless of whether they are defined, and a MagicMock, which is like Mock except it also mocks magic methods. create_autospec will create a mock with all of the defined attributes of the given class (in our case work.Worker), and raise an AttributeError when an attribute is not defined on the specced class. This is really handy, and eliminates the possibility of tests “accidentally passing” because they are calling default attributes defined by the generic Mock or MagicMock initializers.

We set the return value of the work function with return_value, and we can change it on a whim if we so desire. We then use assertEqual to verify the numbers are crunching as expected. One further thing I’ve shown here is assert_has_calls, a mock assertion verifying that work was called 3 times on our mock object.

You may also note that we subclassed TestCase to enable running this class as part of our unit testing framework, with the special __main__ block at the bottom of the file.

Patching the Worker Class

Although our first test demonstrates how to make_profit with a happy worker, we also need to verify how the Boss handles workers on strike. Unfortunately, the Boss class creates its own Worker internally after learning it can’t trust the Worker we gave it in the initializer. We want to create consistent tests, so we can’t rely on the random numbers generated by randint in Worker.work. This means we can’t just depend on dependency injection to make these tests pass!

At this point we have two options: we can patch the Worker class or we can patch the randint function. Why not both! As luck would have it, there are a few ways to use patch, and we can explore a couple of these ways in our two example tests.

We’ll patch the randint function using a method decorator. Our intent is to make randint return a static number every time, and then verify that profits keep booming even as we push workers past their limit.

@unittest.mock.patch('corp.work.randint', return_value=20)
def test_profit_adds_up_despite_turnover(self, randint):
    boss = work.Boss(work.Worker())
    self.assertEqual(boss.make_profit(), 20000)
    self.assertEqual(boss.make_profit(), 40000)
    self.assertEqual(boss.make_profit(), 60000)
    self.assertEqual(boss.make_profit(), 80000)

    randint.assert_has_calls([
        unittest.mock.call(1, 40), unittest.mock.call(1, 40),
        unittest.mock.call(1, 40), unittest.mock.call(1, 40)
    ])

When calling patch, you must describe the namespace where the name is actually looked up - in our case, we’re using randint in the corp.work module, so we patch corp.work.randint rather than random.randint. We define the return_value of randint to simply be 20. A fine number of hours per day to work an employee, according to the Boss. patch injects an extra parameter into the test representing the automatically created mock used in the patch, and we use that to assert that our calls were all made the way we expected.

Since we know the inner workings of the Worker class, we know that this test exercised our code by surpassing a 40-hour work week for our poor Worker and causing the WorkerStrikeException to be raised. In doing so, we’re depending on the Worker/Boss implementation to stay in-sync, which is a dangerous assumption. Let’s explore patching the Worker class instead.

To spice things up, we’ll use the context-manager syntax when we patch the Worker class. We’ll create one mock Worker outside of the context to use for dependency injection, and we’ll use this mock to raise the WorkerStrikeException as a side effect of work being called too many times. Then we’ll patch the Worker class so that newly created instances return a known timesheet.

def test_profit_adds_up_despite_strikes(self):
    worker = unittest.mock.create_autospec(work.Worker)
    worker.work.return_value = 12
    boss = work.Boss(worker)

    with unittest.mock.patch('corp.work.Worker') as MockWorker:
        scrub = MockWorker.return_value
        scrub.work.return_value = 4

        self.assertEqual(boss.make_profit(), 12000)
        self.assertEqual(boss.make_profit(), 24000)

        worker.work.side_effect = work.WorkerStrikeException('Faking a strike!')
        self.assertEqual(boss.make_profit(), 28000)
        self.assertEqual(boss.make_profit(), 32000)

        worker.work.assert_has_calls([
            unittest.mock.call(), unittest.mock.call(), unittest.mock.call()
        ])
        scrub.work.assert_has_calls([
            unittest.mock.call(), unittest.mock.call()
        ])

After the first Worker throws a WorkerStrikeException, the second Worker (scrub) comes in to replace them. In patching the Worker, we are able to more accurately describe the behavior of Boss regardless of the implementation details behind Worker.

A Non-Political Conclusion

I’m not saying this is the best way to go about unit testing in Python, but it is an option that should help you get started unit testing legacy code. There are certainly those who see this level of micromanaging mocks and objects as tedious, but there is benefit to defining the way a class acts under exact circumstances. This was a contrived example, and your code may be a little bit harder to wrap with tests.

Now you can go get Hooked on Pythonics!

Maintaining X Applications in Unity 8


The release of Ubuntu 16.10 Yakkety Yak in the coming months will bring about the public release of Unity 8 as a pre-installed desktop session (though not as the default session). It’s been a long time coming, and there are a lot of new features which will break older applications. Canonical has unveiled snappy as the preferred packaging system for Unity 8, but what about all those old deb packages?

There have been a few other good posts about X applications on Unity 8, including this one on dogfooding, this one on Ubuntu Touch, and this one on how it works under the covers. This blog post is explicitly about Unity 8 on desktop using the Libertine CLI, though it can be applied to most devices running Ubuntu Touch.

Disclaimer: I work for Canonical on one of the teams making all of this fancy stuff work.

A (Very) Brief Explanation

The toolchain we’ll be relying on is called libertine, and it’s essentially a wrapper around unprivileged LXC and chroot-based containers. We prefer to use LXC containers on newer OSes, but we must continue supporting chroot containers on many devices due to kernel limitations.

What You’ll Need

For desktop Unity 8, you’ll need the packages for libertine, libertine-tools, and lxc to get started. This will install a CLI and GUI for maintaining Libertine containers and applications.

If you’re running Wily or newer, you can just run the following in your terminal:

$ sudo apt install libertine

Otherwise, you’ll need to add the stable overlay PPA first:

$ sudo add-apt-repository ppa:ci-train-ppa-service/stable-phone-overlay
$ sudo apt-get update
$ sudo apt-get install libertine

The GUI

At this point, if you’re on desktop you can open up the GUI which will guide you through creating a new container and installing applications. Search the Dash (or Apps scope) for libertine and, given that we haven’t pushed a buggy version recently, you’ll be presented with a Qt application for maintaining containers. I highly recommend using the GUI, because then you are guaranteed not to be following out-of-date console commands.

…But maybe you prefer the terminal. Or maybe you’re secretly SSH’d into the target machine or Ubuntu Touch device and need to use the terminal. If so…

The CLI

The CLI we’ll be using is libertine-container-manager. It has a manpage, a --help option, and autocomplete to help you out in a jam.
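If you get stuck, the built-in help is the quickest way out. The manpage and --help option are mentioned above; the per-subcommand --help here is my assumption, so check your installed version:

$ man libertine-container-manager
$ libertine-container-manager --help
$ # most subcommands describe their own flags as well
$ libertine-container-manager create --help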

The first thing you’ll want to do is create a container. There are a lot of options, but to create an optimal container for your current machine you only need to specify the id and name parameters:

$ libertine-container-manager create --id desktopapps --name "Desktop Applications"

A couple of things to note here: Your id must be unique and conform to the simple click name regex - this is what will identify your container on a system level. The name should be human-readable so you can easily identify what might be inside your container. If you don’t specify a name, your id will be used. The CLI will likely ask you for a password to use in the container in case you ever need it. You can leave this blank if you’re not concerned with that kind of thing.

At this point, a bunch of things should be happening in your terminal. This will pull a container image for your current distro and install all the requirements to get started maintaining and running X apps. This could take anywhere from a few minutes to the next hour depending on your network and disk speeds. Once you’re done, you can use the list subcommand to list all installed containers (note you probably just have one at this point). If you ever want to delete your container, you can run libertine-container-manager destroy -i desktopapps.
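For reference, the list and destroy commands mentioned above look like this:

$ # list all installed containers
$ libertine-container-manager list
$ # delete a container you no longer need
$ libertine-container-manager destroy -i desktopapps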

Once that’s finished, we can start installing apps. To find apps available, you can use the search-cache subcommand:

$ libertine-container-manager search-cache --id desktopapps --search-string "office"

This will return a few strings from the apt-cache of the container with id “desktopapps” that match “office”. Now, if you want to install “libreoffice”:

$ libertine-container-manager install-package --id desktopapps --package libreoffice

This will install the full libreoffice suite. Nice! Similarly, you can use the remove-package subcommand to remove applications. Don’t remember what apps you’ve installed? Use the list-apps command:

$ libertine-container-manager list-apps --id desktopapps
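Removal mirrors installation. Assuming remove-package takes the same flags as install-package, undoing the LibreOffice install would look something like:

$ libertine-container-manager remove-package --id desktopapps --package libreoffice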

Maybe you’re an avid Steam for Linux gamer and want to try to get some games working. Since Steam still only comes in a 32-bit binary, you’ll need to enable the multiarch repos, and then you can just install Steam like any other app:

$ libertine-container-manager configure --id desktopapps --multiarch enable
...
$ libertine-container-manager install-package --id desktopapps --package steam

Steam will ask you to agree to their user agreement from the command line, which you should be able to do easily. If you need to use the readline frontend for dpkg, you can append --readline to the install-package command to enable it.
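In other words, if the license prompt gives you trouble, the same install can be run as:

$ libertine-container-manager install-package --id desktopapps --package steam --readline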

There are many other commands to explore to maintain your container, but for now I’ll let you check the manpage or open the GUI to explore further.

Running Apps

Now that you’ve installed some apps, you probably want to run them. You can install the Libertine Scope, which will allow you to peruse your installed apps in a Unity 8 session. You can either install it from the App Store on a device (search for “Desktop Apps Scope”) or through apt on desktop with:

$ sudo apt install libertine-scope

In a Unity 8 session, you can now find the scope and click on apps to run them. Note that there are many apps which still don’t work, such as those requiring a terminal or sudo; consider these a work in progress.

The Future

I’ve been toiling away the past few weeks getting a scope ready which can be used explicitly to install/remove X apps in Unity 8, like the current Ubuntu Software Center (or app store on Touch devices). This scope should be available anywhere the libertine scope is available, meaning that it will alleviate a lot of the pain associated with installing/removing apps for a large chunk of users. Using the Libertine GUI or Libertine CLI will still allow for much more customization, but those tools are largely designed with power users in mind.

Are you able to get libertine working on your system? Can you launch X applications to your heart’s content? Let me know in the comments!

Going Remote: 4 Months in, Aka Remote Life 4-eva


This is the third in a series of blog posts detailing my experience acclimating to a fully remote work experience. You may also enjoy my original posts detailing my first week and my first month.

Has it really been 4 months since I started riding the raging river of remote work? After my first month, I felt pretty good about my daily schedule and work/life balance. Since the last post, I’ve met many of my co-workers IRL, learned to find my own work, and figured out how to shake things up when I get in a rut.

But First, Did I Handle My Action Items?

During my first month, I struggled to figure out what to work on after finishing a task. At this point, it’s extremely rare that I don’t have a dozen things in the queue like a pile of papers in an In Box about to topple over. I am happily busy all the time, either pulling things off the top of the stack, plucking things from the middle, or coming up with some new feature that I need to add.

I feel a lot more comfortable on chat. Our IRC channels are a bit daunting at first, especially when there’s a lot of action going on. I’ve learned some good ways to interject or reach out to the people that I need to talk to.

Oh, and I still haven’t gotten a new floormat. Yes, this one’s still broken. For some reason, it just doesn’t annoy me as much as it used to. It’s almost endearing: like a three-legged puppy that I roll my chair across and stand on top of for 8 hours a day.

Meeting IRL

Why would you guys actually meet IRL? - Elliot, Mr Robot

I know a bunch of my coworkers by IRC nicks, and I see a few of their faces in a Google Hangout for our daily standup. This has been sufficient for me, but we did something truly magical in June. We met IRL.

The team I’m on (~10 people) and our sibling team (~10 people) met in Montréal, Québec, Canada, for about a week, and it was unlike any meeting-of-the-minds I’ve ever been to. The engineers were largely from the US and Europe. We all stayed in the hotel downtown and used a small conference room to hang out and work all day, every day. We woke up and ate breakfast together, met in the conference room at 9, had snacks and coffee, ate lunch together, stopped working precisely at 6, and met back in the lobby a few minutes later to go out on the town until 11 or midnight. It’s a truly intense social experience, especially for a group of people who spend most of their days only interacting with other humans through IRC, especially for a group of people who only meet twice a year or so.

This coming together allowed us to hash out a lot of our plans for the coming months, but I believe the true victory in this type of event is the camaraderie it creates. It’s nice to be able to put faces to nicks, think about the inflection in a person’s voice coming out in the way they type, and know exactly who I need to ping on IRC to accomplish certain tasks. It’s fun to hang out with people from all over the world, and it’s fun to go drinking with your coworkers, a thing I had temporarily forgotten.

I’ll note that we also happened to be in Montréal during the 23rd annual Mondial de la Bière, a huge international beer festival that lasted several days. I’ll also note that it was fun to try to speak little bits of French in Montréal, and I’m really looking forward to wherever the next sprint abroad may take us (most likely Europe in October).

Finding Work

With a decentralized company, the issue of finding things to work on for new employees can be tough. Do you give them something small and easy and possibly belittling? Do you give them something massive that will leave them scratching their heads for weeks and completely out of touch with the rest of the company? How can you find a good middle ground here?

I’d say I was “eased in” to my current project with smaller tasks. After the first few tasks were completed, it was often unclear to me where I should go next. There was always a list of bugs - was I the right person to work on them? Was there any other impending feature or unlisted bug that I should start looking into instead? These are hard questions for someone who hasn’t been around for very long.

Over time, I gained more and more responsibilities. I needed a bug fixed pronto in another project, so I did it myself and submitted an MP. Oh, you understand this codebase? You can be a maintainer now! Oh, I think I remember from your resume that you have some golang experience. We need this fixed ASAP, using SD cards is completely broken without it!

It’s all a slippery slope to having a bottomless bucket of super-interesting things to choose from: work to do, code reviews to perform, and community questions to answer. Yet somehow it’s all happened in a way that is not overwhelming, but freeing. When I get stuck on a problem or need a short break from my current project, there’s plenty of other work to go around. Which leads me to…

Shaking Things Up

Routine can be helpful, but it can also be terrible. For the parts of May and June that I wasn’t traveling, I was largely waking up at the same time, making the same lunch, listening to the same music every day, petting the same cats, and picking up the same kinds of tasks from the backlog. None of this was bad per se, but I found myself getting a tad sluggish. It would be harder to work on menial tasks, and easier to come up with elaborate solutions to simple problems. So what did I do?

I started varying my sleep schedule. Some days I get out of bed at 7, other days I get out of bed at 8:45. Because I work from home, I don’t have to worry about fighting traffic or skipping breakfast.

I started varying my lunches. I was making some fun risotto recipes, but since it’s summer I’ve been mixing up a bunch of different vegetables into a menagerie of salads: sauteed green tomatoes, onions, and zucchini; cukes and onions; pineapple, succotash, and eggs.

As I’ve gained more responsibilities for various projects, I’ve been able to branch into different kinds of tasks. After finishing up a big rework of one codebase, I can start jumping into bugs somewhere else. I can dig deep somewhere, and pivot when I’m ready to go back. There’s always something interesting to work on, and I can see the way different tasks help us towards our end goal. Not to mention I can always do bug hunts.

Remote Life 4-eva

It’s not just working remotely that makes this possible - it’s the people, the culture, and the fun and interesting products we’re creating. Maybe I’ll start blogging more about those things in the future. I don’t have much in this area that I’m looking to improve on, so this could be the last post in the series. Maybe I’ll do one at the 1-year mark to note any new tricks I’ve learned. Until then, keep looking out for other non-diary-entry blog posts.

Looking for advice on working remotely? Not sure if you’d like it? Do you have a strong disagreement with me as a person or my lifestyle choices? Hit me up and we can chat!

Writing Go/QML Convergent Ubuntu Apps


Thinking about writing an Ubuntu application that will work in Unity 8? You’ll be writing a “convergent” app, which is an application that can respond to touch or mouse, and will adapt appropriately to a phone, tablet, or desktop screen. It will even be able to update its display on-the-fly if, say, you plug your phone into a monitor or bluetooth keyboard.

There are lots of ways to write a convergent application with the Ubuntu SDK. You can build apps in pure QML, C++ with QML, QtQuick, pure HTML5/Javascript, or even Golang with QML. We’ll be focusing on go apps for the rest of this post.

Go is a young language, and the recent release of go 1.6 has introduced some nasty changes particularly involving cgo. It has also introduced vendoring by default, which is a welcome change.

I’m using go 1.6.2. For me, the current project template provided by the Ubuntu SDK feels quite broken. I can’t get things to build locally, and furthermore I see no options for pushing an ARM build to my device.

Fortunately, I found this ubuntu-go-qml-template, which is a template that enables running/building a go/qml application locally while also supporting building/installing onto an arm device. The kicker? This tool was designed to work with go 1.3.3. Sigh! Since I’m unwilling to compromise and use old technology, I forked the project and updated it to fit my more modern needs.

Because our go template will depend on QML, we depend on the go-qml project to create the necessary C bindings to allow us to use QML. However, with the update to go 1.6, the current version (revision 2ee7e5f) of go-qml will give a runtime error from cgo with the wonderfully helpful panic: runtime error: cgo argument has Go pointer to Go pointer. Another thorn in our side. Fortunately, this issue is in previously tread-upon ground and there is a fork of go-qml with enough of the cgo issues fixed to run our application with no problems. In the ubuntu-go-qml-template I forked above, I’ve gone ahead and vendored this fork of go-qml. It all works because of the default vendoring available in go 1.6.

With that background out of the way, let’s run through getting a project started:

$ sudo apt-get install golang g++ qtdeclarative5-dev qtbase5-private-dev \
                       qtdeclarative5-private-dev libqt5opengl5-dev \
                       qtdeclarative5-qtquick2-plugin
# installing dependencies...
$ git clone https://github.com/larryprice/ubuntu-go-qml-template.git your-project-name
# cloning repo...
$ cd your-project-name
$ chroot-scripts/setup-chroot.sh
# building a chroot for ubuntu-sdk-15.04
# distro=vivid, arch=arm
# with go 1.6.2 with armhf compilation support

You may also need other dependencies, such as click or phablet-tools. The above commands installed dependencies, cloned the repo, and built and/or updated a chroot for building our source for arm.
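If either of those is missing on your machine, it installs like anything else (assuming the archive package names match the tool names):

$ sudo apt-get install click phablet-tools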

Next you’ll want to set up the project for your own needs. The original template creator included a nifty setup script to get this done:

$ ruby setup.rb -v -n your-project-name -a "Your Name" -e "[email protected]" \
                   -d "your-developer-namespace"

This will do some fancy gsub and file renaming to make the template your own.

If you check src/, you’ll find a main.go file ready for your use. You’ll also find a main.qml file in share/qml/. You can vendor all of your dependencies in vendor/, where you’ll already find the qml package.

As far as getting your application to work, there are more scripts available:

$ ./build.sh
# this will build your project locally
$ ./run.sh
# this will build your project locally and then run it on the desktop

The best part about this template is the ability to build for arm and load your applications onto a device:

$ ./build-in-chroot.sh
# builds the package using the vivid+armhf chroot we set up previously
$ ./install-on-device.sh
# builds for vivid+armhf and installs the click directly on
# the first USB-connected Ubuntu Touch device

Now you can run and install go/qml applications on desktop or on devices. Time to go build something cool!

Disclaimer: In the future, this template will likely need to be updated for new go versions or new default versions of the Ubuntu SDK. Don’t be afraid to make a comment below or submit a PR.