Planet Ubuntu California

October 06, 2015

Elizabeth Krumbach

Ending my 6 year tenure on the Ubuntu Community Council

On September 16th, Michael Hall sent out a call for nominations for the Ubuntu Community Council. I will not be seeking re-election this time around.

My journey with Ubuntu has been a long one. I can actually pinpoint the day it began, because it was also the day I created my account: March 12th, 2005. That day I installed Ubuntu on one of my old laptops to play with this crazy new Debian derivative and was delighted to learn that the PCMCIA card I had for WiFi actually worked out of the box. No kidding. In 2006 I submitted my first package to Debian and following earlier involvement with Debian Women, I sent my first message to the Ubuntu-Women mailing list offering to help with consolidating team resources. In 2007 a LoCo in my area (Pennsylvania) started up, and my message was the third one in the archives!

As the years went by, Ubuntu empowered me to help people and build my career.

In 2007 I worked with the Pennsylvania LoCo to provide 10 Ubuntu computers to girls in Philadelphia without access to computers (details). In 2010 I joined the board of Partimus, a non-profit which uses Ubuntu (and the flavors) to provide schools and other education-focused programs in the San Francisco Bay Area with donated computers (work continues, details on the Partimus blog). In 2012 I took a short sabbatical from work and joined other volunteers from Computer Reach to deploy computers in Ghana (details). Today I maintain a series of articles for the Xubuntu team called Xubuntu at… where we profile organizations using Ubuntu, many of which do so in a way that serves their local community. Most people also know me as the curator for the Ubuntu Weekly Newsletter, a project I started contributing to in 2010.

Throughout this time, I have worked as a Linux Systems Administrator, a role that’s allowed me to build up my expertise around Linux and continue to spend volunteer time on the projects I love. I’ve also been fortunate to have employers who not only allow me to continue my work on open source, but actively encourage and celebrate it. In 2014 I had the honor of working with Matthew Helmke and others on the 8th edition of The Official Ubuntu Book. Today I’m working on my second open source book for the same publisher.

I share all of this to demonstrate that I have made a serious investment in Ubuntu. Ubuntu has long been deeply intertwined in both my personal and professional goals.

Unfortunately this year has been a difficult one for me. As I find success growing in my day job (working as a systems administrator on the OpenStack project infrastructure for HP), I’ve been witness to numerous struggles within the Ubuntu community and those struggles have really hit home for me. Many discussions on community mailing lists have felt increasingly strained and I don’t feel like my responses have been effective or helpful. They’ve also come home to me in the form of a pile of emails harshly accusing me of not doing enough for the community and in breaches of trust during important conversations that have caused me serious personal pain.

I’ve also struggled to come to terms with Canonical’s position on Intellectual Property (Jono Bacon’s post here echoes my feelings and struggle). I am not a lawyer, and considering both sides I still don’t know where I stand. People on both sides have accused me of not caring about or understanding the issue because I sympathize with everyone involved and have taken their concerns and motivations to heart.

It’s also very difficult to be a volunteer community advocate in a project that’s controlled by a company. Not only that, but we continually have to teach some of its employees how to properly engage with an open source community. I have met many exceptional Canonical employees; I work with them regularly, and I had a blast at UbuCon Latin America this year with several others. In nearly every interaction with Canonical and every discussion with Mark about community issues, we’ve eventually had positive results and found a successful path forward. But I’m exhausted by it. It sometimes feels like a game of Whac-A-Mole where we are continually confronted with the same problems, but with different people, and it’s our job to explain to the Marketing/Development/Design/Web/whatever team at Canonical that they’ve made a mistake with regard to the community and help them move forward effectively.

We had some really great conversations when a few members of the Community Council met with the Community Team at Canonical at the Community Leadership Summit back in July (I wrote about it here). But I was already feeling tired then and I had trouble feeling hopeful. I realized during a recent call with an incredibly helpful and engaged Canonical employee that I’d actually given up. He was making assurances to us about improvements that could be made and really listening to our concerns, and I could tell that he honestly cared. I should have been happy, hopeful and encouraged, but inside I was full of sarcasm, bitterness and snark. This is very out of character for me. I don’t want to be that person. I can no longer effectively be an advocate for the community while feeling this way.

It’s time for me to step down and step back. I will continue to be involved with Xubuntu, the Ubuntu News Team and Ubuntu California, but I need to spend time away from leadership and community building roles before I actually burn out.

I strongly encourage people who care about Ubuntu and the community to apply for a position on the Ubuntu Community Council. We need people who care. I need people who care. While it’s sometimes not the easiest council to be on, it’s been rewarding in so many ways. Mark seriously listens to feedback from the Community Council, and I’m incredibly thankful for his leadership and guidance over the years. Deep down I do continue to have hope and encouragement and I still love Ubuntu. Some day I hope to come back.

I also love you all. Please come talk to me at any time (IRC: pleia2). If you’re interested in a role on the Ubuntu Community Council, I’m happy to chat about duties, expectations and goals. But know that I don’t need gripe buddies; sympathy is fine, but anger and negativity are what brought me here and I can’t handle more. I also don’t have the energy to fix anything else right now. Bring discussions about how to fix things to the ubuntu-community-team mailing list, and see my Community Leadership post from July mentioned earlier to learn more about some of the issues the community and the Community Council are working on.

by pleia2 at October 06, 2015 04:36 PM

October 05, 2015


Bjarne on C++11


I saw this keynote quite a while ago, and I still refer to it sometimes, even though it's almost 3 years old now. It's a good whirlwind tour of the advances in C++11.

by Kevin at October 05, 2015 01:20 PM

October 04, 2015

Akkana Peck

Aligning images to make an animation (or an image stack)

For the animations I made from the lunar eclipse last week, the hard part was aligning all the images so the moon (or, in the case of the moonrise image, the hillside) was in the same position in every frame.

This is a problem that comes up a lot with astrophotography, where multiple images are stacked for a variety of reasons: to increase contrast, to increase detail, or to take an average of a series of images, as well as for animations like the ones I was making this time. And of course animations can be fun in any context, not just astrophotography.

In the tutorial that follows, clicking on the images will show a full sized screenshot with more detail.

Load all the images as layers in a single GIMP image

The first thing I did was load up all the images as layers in a single image: File->Open as Layers..., then navigate to where the images are and use shift-click to select all the filenames I wanted.
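
If you'd rather script this step, the same thing can be done from GIMP 2.8's Python-Fu console. Here's a rough, untested sketch -- the filenames are just placeholders:

from gimpfu import *   # provides gimp, pdb and the usual constants

# Placeholder filenames -- substitute the real image files.
filenames = ["/path/to/eclipse-01.jpg", "/path/to/eclipse-02.jpg",
             "/path/to/eclipse-03.jpg"]

# Load the first file as the base image, then pull the rest in as layers.
img = pdb.gimp_file_load(filenames[0], filenames[0])
for f in filenames[1:]:
    layer = pdb.gimp_file_load_layer(img, f)
    pdb.gimp_image_insert_layer(img, layer, None, 0)   # 0 = top of the stack

gimp.Display(img)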

[Upper layer 50% opaque to align two layers]

Work on two layers at once

By clicking on the "eyeball" icon in the Layers dialog, I could adjust which layers were visible. For each pair of layers, I made the top layer about 50% opaque by dragging the opacity slider (it's not important that it be exactly at 50%, as long as you can see both images).

Then use the Move tool to drag the top image on top of the bottom image.

But it's hard to tell when they're exactly aligned

"Drag the top image on top of the bottom image": easy to say, hard to do. When the images are dim and red like that, and half of the image is nearly invisible, it's very hard to tell when they're exactly aligned.


Use a Contrast display filter

What helped was a Contrast filter. View->Display Filters... and in the dialog that pops up, click on Contrast, and click on the right arrow to move it to Active Filters.

The Contrast filter changes the colors so that the dim red moon is fully visible, and it's much easier to tell when the layers are approximately on top of each other.


Use Difference mode for the final fine-tuning

Even with the Contrast filter, though, it's hard to see when the images are exactly on top of each other. When you have them within a few pixels, get rid of the contrast filter (you can keep the dialog up but disable the filter by un-checking its checkbox in Active Filters). Then, in the Layers dialog, slide the top layer's Opacity back to 100%, go to the Mode selector and set the layer's mode to Difference.

In Difference mode, you only see differences between the two layers. So if your alignment is off by a few pixels, it'll be much easier to see. Even in a case like an eclipse where the moon's appearance is changing from frame to frame as the earth's shadow moves across it, you can still get the best alignment by making the Difference between the two layers as small as you can.

Use the Move tool and the keyboard: left, right, up and down arrows move your layer by one pixel at a time. Pick a direction, hit the arrow key a couple of times and see how the difference changes. If it got bigger, use the opposite arrow key to go back the other way.

When you get to where there's almost no difference between the two layers, you're done. Change Mode back to Normal, make sure Opacity is at 100%, then move on to the next layer in the stack.

It's still a lot of work. I'd love to find a program that looks for circular or partially-circular shapes in successive images and does the alignment automatically. Someone on GIMP suggested I might be able to write something using OpenCV, which has circle-finding primitives (I've written briefly before about SimpleCV, a wrapper that makes OpenCV easy to use from Python). But doing the alignment by hand in GIMP, while somewhat tedious, didn't take as long as I expected once I got the hang of using the Contrast display filter along with Opacity and Difference mode.
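
For what it's worth, here's a rough sketch of what that OpenCV idea might look like. It's untested: the filenames are made up, it assumes OpenCV 3's Python bindings, and the Hough-circle parameters would certainly need tuning for real eclipse images:

import glob
import cv2
import numpy as np

frames = sorted(glob.glob("eclipse-*.jpg"))   # made-up filenames

def moon_center(filename):
    gray = cv2.imread(filename, cv2.IMREAD_GRAYSCALE)
    gray = cv2.medianBlur(gray, 5)
    # Parameters are guesses; they'd need tuning for real images.
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=2, minDist=500,
                               param1=100, param2=30,
                               minRadius=50, maxRadius=400)
    x, y, r = circles[0][0]       # assume the strongest circle is the moon
    return x, y

ref_x, ref_y = moon_center(frames[0])
for f in frames[1:]:
    x, y = moon_center(f)
    img = cv2.imread(f)
    rows, cols = img.shape[:2]
    # Shift this frame so its moon lands where the first frame's moon is.
    M = np.float32([[1, 0, ref_x - x], [0, 1, ref_y - y]])
    cv2.imwrite("aligned-" + f, cv2.warpAffine(img, M, (cols, rows)))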

Creating the animation

Once you have your layers, how do you turn them into an animation?

The obvious solution, which I originally intended to use, is to save as GIF and check the "animated" box. I tried that -- and discovered that the color errors you get when converting an image to indexed make a beautiful red lunar eclipse look absolutely awful.

So I threw together a Javascript script to animate images by loading a series of JPEGs. That meant that I needed to export all the layers from my GIMP image to separate JPG files.

GIMP doesn't have a built-in way to export all of an image's layers to separate new images. But that's an easy plug-in to write, and a web search found lots of plug-ins already written to do that job.

The one I ended up using was Lie Ryan's Python script in How to save different layers of a design in separate files; though a couple of others looked promising (I didn't try them), such as gimp-plugin-export-layers and save_all_layers.scm.
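
For the curious, here's roughly what such a plug-in boils down to, as a quick, untested sketch for GIMP 2.8's Python-Fu console (the output directory is a placeholder, and the scripts linked above are more complete):

from gimpfu import *   # provides gimp, pdb and constants like RGB
import os

img = gimp.image_list()[0]            # the image holding all the layers
outdir = "/tmp/eclipse-frames"        # placeholder output directory

for i, layer in enumerate(img.layers):
    # Copy the layer into its own image, flatten it and save it as JPEG.
    tmp = pdb.gimp_image_new(img.width, img.height, RGB)
    copy = pdb.gimp_layer_new_from_drawable(layer, tmp)
    pdb.gimp_image_insert_layer(tmp, copy, None, 0)
    pdb.gimp_image_flatten(tmp)
    fname = os.path.join(outdir, "layer-%02d.jpg" % i)
    pdb.gimp_file_save(tmp, tmp.active_layer, fname, fname)
    pdb.gimp_image_delete(tmp)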

You can see the final animation here: Lunar eclipse of September 27, 2015: Animations.

October 04, 2015 03:44 PM

October 01, 2015

Nathan Haines

Beginning Ubuntu for Windows and Mac Users

Where do I begin? That’s the challenge ahead of anyone who tries something new. And the first step of any new experience. Sometimes this can be exciting, like when you sit down to try food at a new restaurant. Other times the question is paralyzing. Taking the first step is difficult when the path is unclear or unmarked.

Ubuntu is the world’s third most popular operating system. It powers twenty million desktop computers and untold servers. But for even more people who grew up using Windows or OS X, their operating system is the computer. Ubuntu’s Linux and Unix heritage is no longer its greatest strength, but its biggest drawback. But it doesn’t have to be.

For new Ubuntu users, the first challenge to surmount is familiarity. Ubuntu thinks and behaves in different ways from the computing experience they’ve gained over the years. And those years of experience are an enemy at first. But using a new operating system is much like visiting a foreign country. Everything’s different, but after a chance to acclimate, it’s not that different. The trick is finding your way around until you know what’s the same. The differences aren’t that vast and soon everything is manageable.

book cover

My new book, Beginning Ubuntu for Windows and Mac Users, was written to help speed that process along. Ubuntu is the perfect operating system for everyday business, casual, and entertainment use. The book explains key concepts and helps users adapt to their new operating system. It’s a reference guide to the best software in Ubuntu that can get tasks done. And it teaches how to use Ubuntu so that any computer user can get started and learn from there.

Beginning Ubuntu for Windows and Mac Users expects readers to want to use Ubuntu graphically, and prefers this over command line shortcuts. When the command line is introduced in Chapter 5, it’s from the perspective of a window into an older period of computing history, and after a short overview, it walks the user through specific tasks that demonstrate exactly why one would use the command line over the graphical tools. Simple information lookup, text-based browsing, and even games give the command line a practical purpose and make the chapter a handy reference.

The book finishes up with power user advice that shows simple yet powerful ways to make an Ubuntu system even more powerful, from enabling multiple workspaces to installing VirtualBox and working with virtual machines.

If you’ve been wanting to try Ubuntu but don’t know where to begin, this book is for you. It explains the origins of Ubuntu and walks you through the install process step by step. It talks about dual-booting and installing graphics drivers. It even helps you find the right “translation” as you learn the Ubuntu desktop. Looking for the Start Menu or Spotlight? The Dash icon provides the same functionality.

If you’re already an Ubuntu user, you may benefit from the clear instructions and format of the book. But you can also buy the book for friends. It’s a friendly, gentle introduction to Ubuntu that any Windows or Mac user will enjoy, and the perfect gift for anyone who could benefit from using Ubuntu.

Beginning Ubuntu for Windows and Mac Users is available today from Amazon, Barnes & Noble, and other fine booksellers around the world. Best of all, the companion ebook is only $5 through Apress when you buy the print version (even if you didn't buy it from the publisher), and the ebook is available DRM-free in PDF, EPUB, and MOBI (Kindle) formats. Not only is that an incredible bargain that offers all 150+ screenshots in full color, but the DRM-free files respect you and your investment.

Whether you’ve already taken the first steps into experiencing Ubuntu for yourself, or you’ve hesitated because you don’t know where to begin, this book is for you. We’ll walk through the first steps together, and your existing Windows and Mac experience will help you take the next steps as you explore the endless possibilities offered by Ubuntu.

October 01, 2015 08:51 PM

Akkana Peck

Lunar eclipse animations

[Eclipsed moon rising] The lunar eclipse on Sunday was gorgeous. The moon rose already in eclipse, and was high in the sky by the time totality turned the moon a nice satisfying deep red.

I took my usual slipshod approach to astrophotography. I had my 90mm f/5.6 Maksutov lens set up on the patio with the camera attached, and I made a shot whenever it seemed like things had changed significantly, adjusting the exposure if the review image looked like it might be under- or overexposed, occasionally attempting to refocus. The rest of the time I spent socializing with friends, trading views through other telescopes and binoculars, and enjoying an apple tart a la mode.

So the images I ended up with aren't all they could be -- not as sharply focused as I'd like (I never have figured out a good way of focusing the Rebel on astronomy images) and rather grainy.

Still, I took enough images to be able to put together a couple of animations: one of the lovely moonrise over the mountains, and one of the sequence of the eclipse through totality.

Since the 90mm Mak was on a fixed tripod, the moon drifted through the field and I had to adjust it periodically as it drifted out. So the main trick to making animations was aligning all the moon images. I haven't found an automated way of doing that, alas, but I did come up with some useful GIMP techniques, which I'm in the process of writing up as a tutorial.

Once I got the images all aligned as layers in a GIMP image, I saved them as an animated GIF -- and immediately discovered that the color error you get when converting to an indexed GIF image loses all the beauty of those red colors. Ick!

So instead, I wrote a little Javascript animation function that loads images one by one at fixed intervals. That worked a lot better than the GIF animation, plus it lets me add a Start/Stop button.

You can view the animations (or the source for the javascript animation function) here: Lunar eclipse animations

October 01, 2015 06:55 PM

September 30, 2015

Jono Bacon

Free Beer, Prizes, and Bad Voltage in Fulda Tonight!

Tonight, Wed 30th September 2015 at 7pm there are five important reasons why you should be in Fulda in Germany:

  1. A live Bad Voltage show that will feature technology discussion, competitions, and plenty of fun.
  2. Free beer.
  3. The chance to win an awesome Samsung Galaxy Tab S2.
  4. Free entry (including the beer!).
  5. A chance to meet some awesome people.

It is going to be a blast and we hope you can make it out here tonight.

Just remember, you might leave with one of these:

Doors open tonight at 7pm, show starts at 7.30pm at:

Hall 8
University of Applied Science Fulda,
Leipziger Str. 123, 36037
Fulda, Germany

We hope to see you there!

by jono at September 30, 2015 08:16 AM

September 27, 2015

Akkana Peck

Make a series of contrasting colors with Python

[PyTopo with contrasting color track logs] Every now and then I need to create a series of contrasting colors. For instance, in my mapping app PyTopo, when displaying several track logs at once, I want them to be different colors so it's easy to tell which track is which.

Of course, I could make a list of five or ten different colors and cycle through the list. But I hate doing work that a computer could do for me.

Choosing random RGB (red, green and blue) values for the colors, though, doesn't work so well. Sometimes you end up getting two similar colors together. Other times, you get colors that just don't work well, because they're so light they look white, or so dark they look black, or so unsaturated they look like shades of grey.

What does work well is converting to the HSV color space: hue, saturation and value. Hue is a measure of the color -- that it's red, or blue, or yellow green, or orangeish, or a reddish purple. Saturation measures how intense the color is: is it a bright, vivid red or a washed-out red? Value tells you how light or dark it is: is it so pale it's almost white, so dark it's almost black, or somewhere in between? (A related model, called HSL, substitutes Lightness for Value, but the concept is much the same.)

[GIMP color chooser] If you're not familiar with HSV, you can get a good feel for it by playing with GIMP's color chooser (which pops up when you click the black Foreground or white Background color swatch in GIMP's toolbox). The vertical rainbow bar selects Hue. Once you have a hue, dragging up or down in the square changes Saturation; dragging right or left changes Value. You can also change one at a time by dragging the H, S or V sliders at the upper right of the dialog.

Why does this matter? Because once you've chosen a saturation and value, or at least ensured that saturation is fairly high and value is somewhere in the middle of its range, you can cycle through hues and be assured that you'll get colors that are fairly different each time. If you had a red last time, this time it'll be a green, or yellow, or blue, depending on how much you change the hue.

How does this work programmatically?

PyTopo uses Python-GTK, so I need a function that takes a gtk.gdk.Color and chooses a new, contrasting Color. Fortunately, gtk.gdk.Color already has hue, saturation and value built in. Color.hue is a floating-point number between 0 and 1, so I just have to choose how much to jump. Like this:

def contrasting_color(color):
    '''Returns a gtk.gdk.Color of similar saturation and value
       to the color passed in, but a contrasting hue.
       gtk.gdk.Color objects have a hue between 0 and 1.
    '''
    if not color:
        return self.first_track_color

    # How much to jump in hue:
    jump = .37

    return gtk.gdk.color_from_hsv(color.hue + jump,
                                  color.saturation,
                                  color.value)

What if you're not using Python-GTK?

No problem. The first time I used this technique, I was generating Javascript code for a company's analytics web page. Python's colorsys module works fine for converting red, green, blue triples to HSV (or a variety of other colorspaces) which you can then use in whatever graphics package you prefer.
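
For example, here's a rough sketch of the same hue-cycling trick using nothing but colorsys; the starting color and jump size are arbitrary, and colorsys expects r, g, b and h, s, v values between 0 and 1:

import colorsys

def contrasting_colors(start_rgb=(1.0, 0.0, 0.0), jump=0.37, count=8):
    h, s, v = colorsys.rgb_to_hsv(*start_rgb)
    colors = []
    for i in range(count):
        # Step the hue, wrapping past 1.0, but keep saturation and value
        # fixed so every color stays equally vivid.
        colors.append(colorsys.hsv_to_rgb((h + i * jump) % 1.0, s, v))
    return colors

for r, g, b in contrasting_colors():
    print("#%02x%02x%02x" % (int(r * 255), int(g * 255), int(b * 255)))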

September 27, 2015 07:27 PM

September 23, 2015

Nathan Haines

Writing and Publishing a Book with Free Software

I’ve been a technology enthusiast since I was very little. I’ve always been fascinated by electronics and computers, and from the time I got my first computer when I was 10, I’ve loved computers for their own sake. That’s served me very well as a computer technician, but it can lead to narrow-sightedness, too. The one thing that doing computer support at my college campus drove home is that for most computer users, the computer is simply a tool.

Over the last year, I’ve been thinking a lot about Ubuntu in terms of getting specific tasks done. Not only because I was writing a book that would help Windows and Mac users get started with Ubuntu quickly, but also because Ubuntu development and documentation work best when they address clear user stories.

Ubuntu is exciting for many reasons. What stands out for me is how Ubuntu excels at providing the tools needed for so many different roles. For any hobbyist or professional, Ubuntu can be the foundation of a workflow that creates amazing results.

Ubuntu integrates seamlessly into my routine as an author, from planning, to writing, to revision and editing, to layout and design, all the way to the final step of publishing. Ubuntu gives me the tools I need whether my book is traditionally or self-published.

In this presentation, I talk about the process of writing and publishing a book, and although the presentation focuses on the steps involved in publishing, it also illustrates where the Free Software available in Ubuntu can be utilized along the way.

book cover

For a more comprehensive look at how Ubuntu can work for you as you come from Windows or OS X, take a look at my book, Beginning Ubuntu for Windows and Mac Users, available today on Amazon or from your local book retailer.

September 23, 2015 11:39 PM

Elizabeth Krumbach

Simcoe’s September 2015 Checkup

A few weeks ago I wrote about Simcoe’s lab work from July and some other medical issues that cropped up. I’m happy to report that the scabbing around her eyes has cleared up and we were able to get the ultrasound done last Thursday.

The bad news is that her kidneys are very small and deformed. Her vet seemed surprised that they were working at all. Fortunately she doesn’t seem to have anything else going on, no sign of infections from the tests they ran (UTIs are common at this stage). Her calcium levels have also remained low thanks to a weekly pill we’ve been giving her.

Her CRE levels do continue to creep up into a worrying range, which the vet warned could also lead to more vomiting:


But her BUN levels have dropped slightly since last time:


Her weight also continues to be lower than where it was trending for the past couple of years:


All of this means it’s time to escalate her care beyond the subcutaneous fluids and calcium lowering pills. We have a few options, but the first step is making an appointment with the hospital veterinarian who has provided wise counsel in the past.

Simcoe melts

Otherwise, Simcoe has been joining us in melting during our typical late onset of summer here in San Francisco. Heat aside, her energy levels, appetite and general behavior have been normal. It’s pretty clear she’s not at all happy about our travel schedules, though; I think we’ll all be relieved when I conclude my travel for the year in November.

by pleia2 at September 23, 2015 12:25 AM

September 21, 2015

Akkana Peck

The meaning of "fetid"; Albireo; and musings on variations in sensory perception

[Fetid marigold, which actually smells wonderfully minty] The street for a substantial radius around my mailbox has a wonderful, strong minty smell. The smell is coming from a clump of modest little yellow flowers.

They're apparently Dyssodia papposa, whose common name is "fetid marigold". It's in the sunflower family, Asteraceae, not related to Lamiaceae, the mints.

"Fetid", of course, means "Having an offensive smell; stinking". When I google for fetid marigold, I find quotes like "This plant is so abundant, and exhales an odor so unpleasant as to sicken the traveler over the western prairies of Illinois, in autumn." And nobody says it smells like mint -- at least, googling for the plant and "mint" or "minty" gets nothing.

But Dave and I both find the smell very minty and pleasant, and so do most of the other local people I queried. What's going on?

[Fetid goosefoot] Another local plant which turns strikingly red in autumn has an even worse name: fetid goosefoot. On a recent hike, several of us made a point of smelling it. Sure enough: everybody except one found it minty and pleasant. But one person on the hike said "Eeeeew!"

It's amazing how people's sensory perception can vary. Everybody knows how people's taste varies: some people perceive broccoli and cabbage as bitter while others love the taste. Some people can't taste lobster and crab at all and find Parmesan cheese unpleasant.

And then there's color vision. Every amateur astronomer who's worked public star parties knows about Albireo. Also known as beta Cygni, Albireo is the head of the constellation of the swan, or the foot of the Northern Cross. In a telescope, it's a double star, and a special type of double: what's known as a "color double", two stars which are very different colors from each other.

Most non-astronomers probably don't think of stars having colors. Mostly, color isn't obvious when you're looking at things at night: you're using your rods, the cells in your retina that are sensitive to dim light, not your cones, which provide color vision but need a fair amount of light to work right.

But when you have two things right next to each other that are different colors, the contrast becomes more obvious. Sort of.

[Albireo, from Jefffisher10 on Wikimedia Commons] Point a telescope at Albireo at a public star party and ask the next ten people what two colors they see. You'll get at least six, more likely eight, different answers. I've heard blue and red, blue and gold, red and gold, red and white, pink and blue ... and white and white (some people can't see the colors at all).

Officially, the bright component is actually a close binary, too close to resolve as separate stars. The components are Aa (magnitude 3.18, spectral type K2II) and Ac (magnitude 5.82, spectral type B8). (There doesn't seem to be an Albireo Ab.) Officially that makes Albireo A's combined color yellow or amber. The dimmer component, Albireo B, is magnitude 5.09 and spectral type B8Ve: officially it's blue.

But that doesn't make the rest of the observers wrong. Color vision is a funny thing, and it's a lot more individual than most people think. Especially in dim light, at the limits of perception. I'm sure I'll continue to ask that question when I show Albireo in my telescope, fascinated with the range of answers.

In case you're wondering, I see Albireo's components as salmon-pink and pale blue. I enjoy broccoli and lobster but find bell peppers bitter. And I love the minty smell of plants that a few people, apparently, find "fetid".

September 21, 2015 10:09 PM

September 18, 2015

Elizabeth Krumbach

The Migration of OpenStack Translations to Zanata

The OpenStack infrastructure team that I’m part of provides tooling for OpenStack developers, translators, documentation writers and more. One of the commitments the OpenStack Infrastructure team has to the project, as outlined in our scope, is:

All of the software that we run is open source, and its configuration is public.

Like the rest of the project, we’ve committed ourselves to being Open. As a result, the infrastructure has become a mature open source project itself that we hope to see replicated by other projects.

With this in mind, the decision by Transifex to cease development on their open source platform meant that we needed to find a different solution that would meet the needs of our community and still be open source.

We were aware of the popular Pootle software, so we started there with evaluations. At the OpenStack Summit in Atlanta the i18n team first met up with Carlos Munoz and were given a demo of Zanata. As our need for a new solution increased in urgency, we worked with Pootle developers (thank you Dwayne Bailey!) and Zanata developers to find what was right for our community, setting up development servers for testing both and hosting demos through 2014. At the summit in Paris I had a great meeting with Andreas Jaeger of the OpenStack i18n team (and so much more!) and Carlos about Zanata.

Me, Carlos and Andreas in Paris

That summit was where we firmed up our plans to move forward with Zanata and wrote up a spec so we could get to work.

Ying Chun Guo (Daisy) and I began by working closely with the Zanata team to identify requirements and file bugs that the team then made a priority. I worked closely with Stephanie Miller on our Puppet module for Zanata using Wildfly (an open source JBoss Application Server) and then later Steve Kowalik who worked on migrating our scripts from Transifex to Zanata. It was no small task, as we explored the behavior of the Zanata client that our scripts needed to use and worked to replicate what we had been doing previously.

As we worked on the scripts and the rest of the infrastructure to support the team, the translators spent this summer on a formal trial of our final version of Zanata in preparation for the Liberty translations work. Final issues were worked out through this trial, and the ever-responsive team from Zanata was able to work with us to fix a few more. I was thoroughly thankful for my infrastructure colleague Clark Boylan’s work keeping infrastructure things chugging along as I had some end of summer travel come up.


On September 10th Daisy announced that we had gone into production for Liberty translations in her email Liberty translation, go! In the past week the rest of us have worked to support all the moving parts that make our translations system work in the infrastructure side of production, with Wednesday being the day we switched to Zanata proposing all changes to Gerrit. Huge thanks to Alex Eng, Sean Flanigan and everyone else on the Zanata team who helped Steve, Andreas and me during the key parts of this switch.

I’m just now finishing up work on the documentation to call our project complete and Andreas has done a great job updating the documentation on the wiki.

Huge thanks to everyone who participated in this project, I’m really proud of the work we got done and so far the i18n team seems satisfied with the change. At the summit in Tokyo I will be leading the Translation tool support: What do we need to improve? session on Tuesday at 4:40pm where we’ll talk about the move to Zanata and other improvements that can be made to translations tooling. If you can’t attend the summit, please provide feedback on the openstack-i18n mailing list so it can be collected and considered for the session.

by pleia2 at September 18, 2015 09:43 PM

September 17, 2015

Elizabeth Krumbach

The OpenStack Ops mid-cycle, PLUG and Ubuntu & Debian gatherings

In the tail end of August I made my way down to Palo Alto for a day to attend the OpenStack Operators Mid-cycle event. I missed the first day because I wasn’t feeling well post-travel, but the second day gave me plenty of time to attend a few discussions and sync up with colleagues. My reason for going was largely to support the OpenStack Infrastructure work on running our own instance of OpenStack, the infra-cloud.

The event had about 200 people, and sessions were structured so they would have a moderator but were actually discussions to share knowledge between operators. It was also valuable to see several OpenStack project leads there trying to gain insight into how people are using their projects and to make themselves available for feedback. The day began with a large session covering the popularity and usage of configuration management databases (CMDBs) in order to track resources, notes here: PAO-ops-cmdb. Then there was a session covering OpenStack deployment tips, which included a nice chunk about preferred networking models (the room was about split when it came to OVS vs. LinuxBridge), notes from this session: PAO-ops-deployment-tips.

After lunch I attended a tools and monitoring session, and learned that they have a working group and an IRC meeting every other week. The session was meant to build upon a previous session from the summit, but the amount of overlap between that session and this seemed to be quite low and it ended up being a general session about sharing common tools. Notes from the session here: PAO-ops-tools-mon.

In all, an enjoyable event and I was impressed with how well-organized it all felt as an event with such a loose agenda going in. Participants seemed really engaged, not just expecting presentations, and it was great to see them all collaborating so openly.

My next event took me across the country, but only incidentally. Our recent trip back east happened to coincide with a PLUG meeting in downtown Philadelphia. The meetings are a great way for me to visit a bunch of my Philadelphia friends at once and I always have a good time. The presentation itself was by Debian Maintainer Guo Yixuan on “Debian: The community and the package management system” where he outlined the fundamentals regarding Debian community structure and organization and then did several demos of working with .deb packages, including unpacking, patching and rebuilding. After the meeting we adjourned to a local pizzeria where I got my ceremonial buffalo chicken cheese steak (fun fact: you can actually find a solid Philly cheese steak in San Francisco, but not one with chicken!).

Guo Yixuan prepares for his presentation, as Eric Lucas and CJ Fearnley host Q&A with attendees

Back home in San Francisco I hosted a couple events back to back last week. First up was the Ubuntu California Ubuntu Hour at a Starbucks downtown. One of the attendees was able to fill us in on his plans to ask his employer for space for a Wily Werewolf (15.10) release party in October. Unfortunately I’ll be out of town for this release, so I can’t really participate, but I’ll do what I can to support them from afar. After the Ubuntu Hour we all walked down the street to Henry’s Hunan in SOMA for a Bay Area Debian Dinner. There, talk continued about our work, upgrades and various bits of tech about Debian and not. We wrapped up the meeting with a GPG keysigning, which we hadn’t done in quite some time. I was also reminded post-meeting to upload my latest UID to a key server.

Next week rounds out my local-ish event schedule for the month with the CloudNOW Top Women in Cloud Awards in Menlo Park, where my colleague Allison Randal is giving a keynote. Looking forward to it!

by pleia2 at September 17, 2015 10:42 PM

End of Summer Trip Back East

MJ and I spent the first week of September in New Jersey and Pennsylvania. During our trip we visited with an ailing close relative and spent additional time with other family. I’m really thankful for the support of friends, family and colleagues, even if I’ve been cagey about details. It made the best of what was a difficult trip and what continues to be a tough month.

It was also hot. A heat wave hit the northeast for the entire time we were there, each day the temperatures soaring into the 90s. Fortunately we spent our days ducking from the air-conditioned car to various air-conditioned buildings. Disappointingly there was also no rain, which is one of the things I miss the most, particularly now as California is suffering from such a severe drought.

We made time for a couple enjoyable evenings with friends. Our friend Danita met us downtown at The Continental in Philadelphia before we spent some time chatting and walking around Penn’s Landing. Later in the week we had dinner with our friend Tim at another of our favorites, CinCin in Chestnut Hill. On the New Jersey side we were able to have lunch with our friends Mike and Jess and their young one, David, at a typical local pizzeria near where we were staying. These are pretty common stops on our trips back east, but when you can only make it into town a couple times a year, you want to visit your favorites! Plus, any random pizzeria in New Jersey is often better and cheaper than what you find here in California. Sorry California, you just don’t do pizza right.

Much like our trip in the spring we also had a lot of work to do with storage to sort, consolidate and determine what we’ll be bringing out west. It’s a tedious and exhausting process, but we made good progress, all things considered. And there were moments where it was fun, like when we found MJ’s NES and all his games, then got to play our real world version of Tetris as we documented and packed it up into a plastic tote. We also got to assemble one of those two-wheeled hand trucks that we had delivered to the hotel (you should have seen their faces!). No one died in the process of building the hand truck. We also made a trip to the local scrap metal yard to get rid of an ancient, ridiculously heavy trash compactor that’s been taking up space in storage for years. We got a whopping $5.75 for it. Truthfully, I’m just glad we didn’t need to pay someone to haul it away. We also managed to get rid of some 1990s era x86 machines (sans harddrives) by bringing them to Best Buy for recycling, a service that I learned they offer nationwide for various computers and electronics.

Our trip also landed during the week of Force Friday, the official kickoff of the Star Wars Episode 7 merchandise blitz. Coming home late one evening anyway, we made it out to Toys”R”Us at midnight on Friday the 4th to check out the latest goodies. I picked up three Chewbacca toys, including the Chewbacca Furby, Furbacca. Upon returning to our hotel MJ managed to place an order for a BB-8 by Sphero for me, which I’m having a lot of fun with (and so have the cats!).

The midnight line at Toys”R”Us on Force Friday

And I also worked. One of my big projects at work this past year had deadlines coming up quickly and so I did what I could to squeeze in time to send emails and sync up with my team mates as needed to make sure everything was prepared for the launch into production that happened upon my return. I’m happy to report that it all worked out.

We flew home on Sunday, just before Labor Day. Unfortunately, we seemed to have brought the heat along with us, with San Francisco plunging into a heat wave upon our return!

Some more photos from the trip:

by pleia2 at September 17, 2015 04:01 AM

September 16, 2015

Akkana Peck

Hacking / Customizing a Kobo Touch ebook reader: Part II, Python

I wrote last week about tweaking a Kobo e-reader's sqlite database by hand.

But who wants to remember all the table names and type out those queries? I sure don't. So I wrote a Python wrapper that makes it much easier to interact with the Kobo databases.

Happily, Python already has a module called sqlite3. So all I had to do was come up with an API that included the calls I typically wanted -- list all the books, list all the shelves, figure out which books are on which shelves, and so forth.

The result was, which includes a main function that can list books, shelves, or shelf contents.

You can initialize kobo_utils like this:

import kobo_utils

koboDB = kobo_utils.KoboDB("/path/where/your/kobo/is/mounted")
koboDB.connect()

connect() throws an exception if it can't find the .sqlite file.

Then you can list books, list shelf names, or use print_shelf() to see which books are on which shelves. For example, to print the name of every shelf:

shelves = koboDB.get_dlist("Shelf", selectors=[ "Name" ])
for shelf in shelves:
    print shelf["Name"]

What I really wanted, though, was a way to organize my library, taking the tags in each of my epub books and assigning them to an appropriate shelf on the Kobo, creating new shelves as needed. Using plus the Python epub library I'd already written, that ended up being quite straightforward: shelves_by_tag.

September 16, 2015 02:38 AM

September 15, 2015

Jono Bacon

Bad Voltage Live in Germany: 30th Sep 2015

Some of you may know that I do a podcast called Bad Voltage with some friends: Stuart Langridge, Bryan Lunduke, and Jeremy Garcia.

The show covers Open Source, technology, politics, and more, and features interviews, reviews, and plenty of loose, fun, and at times argumentative discussion.

On Wed 30th Sep 2015, the Bad Voltage team will be doing a live show as part of the OpenNMS Users Conference. The show will be packed with discussion, surprises, contests, and give-aways.

The show takes place at the University Of Applied Sciences in Fulda, Germany. The address:

University of Applied Science Fulda, Leipziger Str. 123, 36037 Fulda, Germany Tel: +49 661 96400

For travel details of how to get there see this page.

Everyone is welcome to join and you don’t have to be joining the OpenNMS Users Conference to see the live Bad Voltage show. There will be a bunch of Ubuntu folks, SuSE folks, Linux folks, and more joining us. Also, after the show we plan on keeping the party going – it is going to be a huge amount of fun.

To watch the show, we have a small registration fee of €5. You can register here. While this is a nominal fee, we will also have some free beer and giveaways, so you will get your five euros worth.

So, be sure to come on join us. You can watch a fun show and meet some great people.

REGISTER FOR THE SHOW NOW; space is limited, so register ASAP.

by jono at September 15, 2015 07:00 PM

Nathan Haines

Last call for Free Culture Showcase submissions!

In just a few hours, the Ubuntu Free Culture Showcase submission period will wrap up, and we'll begin the judging process.  You have until 23:59 UTC tonight, the 15th, to submit to the Flickr group, the Vimeo group, or the SoundCloud group and have a chance to see your Creative Commons-licensed media included in the Ubuntu 15.10 release which will be enjoyed worldwide!

So if you've been waiting until the last second, it's arrived!  View the wiki page at the link above for more information about the rules for submission and links to the submission pools.

September 15, 2015 09:44 AM

September 12, 2015


More Usable Code By Avoiding Two Step Objects


Two step initialization is harmful to the objects that you write because it obfuscates the dependencies of the object, and makes the object harder to use.

Harder to use

Consider a header and some usage code:

struct Monkey
{
    Monkey();
    void set_banana(std::shared_ptr<Banana> const& banana);
    void munch_banana();
    std::shared_ptr<Banana> banana;
};
int main(int argc, char** argv)
{
    Monkey jim;
}

Now jim.munch_banana(); could be a valid line to call, but the reader of the interface isn’t really assured that it is if the writer wrote the object with two step initialization. If the implementation is:

Monkey::Monkey() : banana(nullptr) {}
void Monkey::set_banana(std::shared_ptr<Banana> const& b) { banana = b; }
void Monkey::munch_banana()
{
    banana->munch();  // munch() on Banana is assumed here; the point is that banana gets dereferenced
}

Then calling jim.munch_banana(); would segfault! A more careful coder might have written:

void Monkey::munch_banana()
{
    if (banana) banana->munch();  // silently does nothing when there is no banana
}

This is still a problem though, as calling munch_banana() silently does nothing, and the caller can’t know that. If you tried to fix it by writing:

void Monkey::munch_banana()
{
    if (!banana)
        throw std::logic_error("monkey doesn't have a banana");
    banana->munch();
}

We’re at least to the point where we haven’t segfaulted and we’ve notified the caller that something has gone wrong. But we’re still at the point where we’ve thrown an exception that the user has to recover from.

Obfuscated Dependencies

With the two-step object, you need more lines of code to initialize it, and you leave the object “vulnerable”.

auto monkey = std::make_unique<Monkey>();
monkey->set_banana(banana);

If you notice, between lines 1 and 2, monkey isn’t really a constructed object. It’s in an indeterminate state! If monkey has to be passed around to an object that has a Banana to share, that’s a recipe for a problem. Other objects don’t have a good way to know if this is a Monkey object, or if it’s a meta-Monkey object that can’t be used yet.

Can we do better?

Yes! By thinking about our object’s dependencies, we can avoid the situation altogether. The truth is, Monkey really does depend on Banana. If the class expresses this in its constructor, like so:

struct Monkey
{
    Monkey(std::shared_ptr<Banana> const& banana);
    void set_banana(std::shared_ptr<Banana> const& banana);
    void munch_banana();
    std::shared_ptr<Banana> banana;
};

We make it clear when constructing that the Monkey needs a Banana. The coder interested in calling Monkey::munch_banana() is guaranteed that it’ll work. The code implementing Monkey::munch_banana() becomes the original, and simple:

void Monkey::munch_banana() { banana->munch(); }  // banana is guaranteed to be set

Furthermore, if we update the banana later via Monkey::set_banana(), we’re still in the clear. The only way the coder’s going to run into problems is if they explicitly set a nullptr as the argument, which is a pretty easy error to avoid, as you have to actively do something silly, instead of doing something reasonable, and getting a silly error.

Getting the dependencies of the object right sorts out a lot of interface problems and makes the object easier to use.

by Kevin at September 12, 2015 07:31 PM

September 11, 2015

Elizabeth Krumbach

“The Year Without Pants” and OpenStack work

As I’ve talked about before, the team I work on at HP is a collection of folks scattered all over the world, working from home and hacking on OpenStack together. We’re joined by hundreds of other people from dozens of companies doing the same, or similar.

This year our team at HP kicked off an internal book club, each month or two we’d read the same book that focused on some kind of knowledge that we felt would be helpful or valuable to the team. So far on our schedule:

  • Daring Greatly: How the Courage to Be Vulnerable Transforms the Way We Live, Love, Parent, and Lead by Brené Brown
  • The Year Without Pants: and the Future of Work by Scott Berkun
  • Crucial Conversations: Tools for Talking When Stakes Are High by Joseph Grenny, Kerry Patterson, and Ron McMillan

This month’s book was The Year Without Pants. I had previously read Scott Berkun’s Confessions of a Public Speaker, which is my favorite book on public speaking; I recommend it to everyone. That, combined with the fact that our team is in some ways very similar to how the teams at Automattic (makers of WordPress) work, made me very interested in reading this other book of his.

Stepping back for a high level view of how we do work, it’s probably easiest to begin with how we differ from Automattic as a team, rather than how we’re similar. There are certainly several notable things:

  • They have a contract to hire model, partially to weed out folks who can’t handle the work model. We and most companies who work on OpenStack instead either hire experienced open source people directly for an upstream OpenStack job or ease people into the position, making accommodations and changes if work from home and geographic distribution of the team isn’t working out for them (it happens).
  • All of the discussions about my OpenStack work are fully public, I don’t really have “inside the company”-only discussions directly related to my day to day project work.
  • I work with individuals from large, major companies all over the world for our project work on a day to day basis, not just one company and a broader community.

These differences mattered when reading the book, especially when it comes to the public-facing nature of our work. We don’t just entertain feedback and collaboration about our day to day discussions and work from people in our group or company, but from anyone who cares enough to take the time to find us on our public mailing list, IRC channel or meeting. As a member of the Infrastructure team I don’t believe we’ve suffered from this openness. Some people certainly have opinions about what our team “should” be working on, but we have pretty good filters for these things and I like to think that as a team we’re open to accepting efforts from anyone who comes to us with a good idea and people-power to help implement it.

The things we had in common were what interested me most so I could compare our experiences. In spite of working on open source software for many years, this is the first time I’ve been paid full time to do it and worked with such large companies. It’s been fascinating to see how the OpenStack community has evolved and how HP has met the challenges. Hiring the right people is certainly one of those challenges. Just like in the book, we’ve found that we need to find people who are technically talented and who also have good online communication skills and can let their personality show through in text. OpenStack is very IRC-focused, particularly the team I’m on. Additionally, it’s very important that we steer clear of people whose behavior may be toxic to the team and community, regardless of their technical skills. This is good advice in any company, but it becomes increasingly important on a self-motivated, remote team where it’s more difficult to casually notice or check in with people about how they’re doing. Someone feeling downtrodden or discouraged because of the behavior of a co-worker can be much harder to notice from afar and often difficult and awkward to talk about.

I think what struck me most about both the experience in the book and what I’ve seen in OpenStack is the need for in-person interactions. I love working from home, and in my head it’s something I believe I can just do forever because our team works well online. But if I’m completely honest about my experience over the past 3 years, I feel inspired, energized and empowered by our in-person time together as a team, even if it’s only 2-3 times a year. It also helps our team feel like a team, particularly as we’re growing in staff and scope, and our projects are becoming more segregated day to day (I’m working on Zanata, Jim is working on Zuulv3, Colleen is working on infra-cloud, etc). Reflecting upon my experience with the Ubuntu community these past couple years, I’ve seen first hand the damage done to a community and project when the in-person meetings cease (I went into this topic some following the Community Leadership Summit in July).

Now, the every-six-months developer and user summits (based on what Ubuntu used to do) have been a part of OpenStack all along. It’s been clear from the beginning that project leaders understood the value of getting people together in person twice a year to kick off the next release cycle. But as the OpenStack community has evolved, most teams have gotten in the habit of also having team-specific sprints each cycle, where team members come together face to face to work on specific projects between the summits. These sprints grew organically and without top-down direction from anyone. They satisfied a social need to retain team cohesion and the desire for high bandwidth collaboration. In the book this seemed very similar to the annual company meetings being supplemented by team sprints.

I think I’m going to call this “The year of realizing that in person interaction is vital to the health of a project and team.” Even if my introvert self doesn’t like it and still believes deep down I should just live far away in a cabin in the woods with my cats and computers.

It’s pretty obvious given my happiness with working from home and the teams I’m working on that I fully bought in to the premise of this book from the beginning, so it didn’t need to convince me of anything. And there was a lot more to this book, particularly for people who are seeking to manage a geographically distributed, remote team. I highly recommend it to anyone doing remote work, managing remote teams or looking for a different perspective than “tech workers need to be in the office to be productive.” Thanks, Scott!

by pleia2 at September 11, 2015 07:40 PM

Akkana Peck

The blooms of summer, and weeds that aren't weeds

[Wildflowers on the Quemazon trail] One of the adjustments we've had to make in moving to New Mexico is getting used to the backward (compared to California) weather. Like, rain in summer!

Not only is rain much more pleasant in summer, as a dramatic thundershower that cools you off on a hot day instead of a constant cold drizzle in winter (yes, I know that by now Californians need a lot more of that cold drizzle! But it's still not very pleasant being out in it). Summer rain has another unexpected effect: flowers all summer, a constantly changing series of them.

Right now the purple asters are just starting up, while skyrocket gilia and the last of the red penstemons add a note of scarlet to a huge array of yellow flowers of all shapes and sizes. Here's the vista that greeted us on a hike last weekend on the Quemazon trail.

Down in the piñon-juniper where we live, things aren't usually quite so colorful; we lack many red blooms, though we have just as many purple asters as they do up on the hill, plus lots of pale trumpets (a lovely pale violet gilia) and Cowpen daisy, a type of yellow sunflower.

But the real surprise is a plant with a modest name: snakeweed. It has other names, but they're no better: matchbrush, broomweed. It grows everywhere, and most of the year it just looks like a clump of bunchgrass.

[Snakeweed in bloom] Then come September, especially in a rainy year like this one, and all that snakeweed suddenly bursts into a glorious carpet of gold.

We have plenty of other weeds -- learning how to identify Russian thistle (tumbleweed), kochia and amaranth when they're young, so we can pull them up before they go to seed and spread farther, has launched me on a project of an Invasive Plants page for the nature center (we should be ready to make that public soon).

But snakeweed, despite the name, is a welcome guest in our yard, and it lifts my spirits to walk through it on a September evening.

By the way, if anyone in Los Alamos reads this blog, Dave and I are giving our first planetarium show at the nature center tomorrow (that's Friday) afternoon. Unlike most PEEC planetarium shows, it's free! Which is probably just as well since it's our debut. If you want to come see us, the info is here: Night Sky Fiesta Planetarium Show.

September 11, 2015 03:24 AM

September 04, 2015

Akkana Peck

Hacking / Customizing a Kobo Touch ebook reader: Part I, sqlite

I've been enjoying reading my new Kobo Touch quite a lot. The screen is crisp, clear and quite a bit whiter than my old Nook; the form factor is great, it's reasonably responsive (though there are a few places on the screen where I have to tap harder than other places to get it to turn the page), and I'm happy with the choice of fonts.

But as I mentioned in my previous Kobo article, there were a few tweaks I wanted to make; and I was very happy with how easy it was to tweak, compared to the Nook. Here's how.

Mount the Kobo

When you plug the Kobo in to USB, it automatically shows up as a USB-Storage device once you tap "Connect" on the Kobo -- or as two storage devices, if you have an SD card inserted.

Like the Nook, the Kobo's storage devices show up without partitions. For instance, on Linux, they might be /dev/sdb and /dev/sdc, rather than /dev/sdb1 and /dev/sdc1. That means they also don't present UUIDs until after they're already mounted, so it's hard to make an entry for them in /etc/fstab if you're the sort of dinosaur (like I am) who prefers that to automounters.

Instead, you can use the entry in /dev/disk/by-id. So fstab entries, if you're inclined to make them, might look like:

/dev/disk/by-id/usb-Kobo_eReader-3.16.0_N905K138254971:0 /kobo   vfat user,noauto,exec,fmask=133,shortname=lower 0 0
/dev/disk/by-id/usb-Kobo_eReader-3.16.0_N905K138254971:1 /kobosd vfat user,noauto,exec,fmask=133,shortname=lower 0 0
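
If you're not sure what the device calls itself under /dev/disk/by-id (the serial number in the entries above will differ for your own reader), something like this, run after plugging the Kobo in, should show the names to copy into fstab:

ls -l /dev/disk/by-id/ | grep -i kobo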

One other complication, for me, was that the Kobo is one of a few devices that don't work through my USB2 powered hub. Initially I thought the Kobo wasn't working, until I tried a cable plugged directly into my computer. I have no idea what controls which devices work through the hub and which ones don't. (The Kobo also doesn't give any indication when it's plugged in to a wall charger.)

The sqlite database

Once the Kobo is mounted, ls -a will show a directory named .kobo. That's where all the good stuff is: in particular, KoboReader.sqlite, the device's database, and Kobo/Kobo eReader.conf, a human-readable configuration file.

Browse through Kobo/Kobo eReader.conf for your own amusement, but the remainder of this article will be about KoboReader.sqlite.

I hadn't used sqlite before, and I'm certainly no SQL expert. But a little web searching and experimentation taught me what I needed to know.

First, make a local copy of KoboReader.sqlite, so you don't risk overwriting something important during your experimentation. The Kobo is apparently good at regenerating data it needs, but you might lose information on books you're reading.

To explore the database manually, run: sqlite3 KoboReader.sqlite

Some useful queries

Here are some useful sqlite commands, which you can generalize to whatever you want to search for on your own Kobo. Every query (everything except the .tables command) must end with a semicolon.

Show all tables in the database:
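
.tables
(That's sqlite3's built-in dot command; it's the one line here that doesn't need a semicolon.)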

The most important ones, at least to me, are content (all your books), Shelf (a list of your shelves/collections), and ShelfContent (the table that assigns books to shelves).

Show all column names in a table:

PRAGMA table_info(content);
There are a lot of columns in content, so try PRAGMA table_info(Shelf); to see a much simpler table.

Show the names of all your shelves/collections:
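
SELECT Name FROM Shelf;
(This assumes the shelf's display name lives in a column called Name; PRAGMA table_info(Shelf); will confirm the actual column names on your device.)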


Show everything in a table:
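
SELECT * FROM Shelf;
(Substitute whatever table you're curious about; content in particular has a lot of columns and a lot of rows, so brace yourself.)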


Show all books assigned to shelves, and which shelves they're on:

SELECT ShelfName,ContentId FROM ShelfContent;
ContentId can be a URL to a sideloaded book, like file:///mnt/sd/TheWitchesOfKarres.epub, or a UUID like de98dbf6-e798-4de2-91fc-4be2723d952f for books from the Kobo store.

Show all books you have installed:

SELECT Title,Attribution,ContentID FROM content WHERE BookTitle is null ORDER BY Title;
One peculiarity of Kobo's database: each book has lots of entries, apparently one for each chapter. The entries for chapters have the chapter name as Title, and the book title as BookTitle. The entry for the book as a whole has BookTitle empty, and the book title as Title. For example, here's a sideloaded copy of Hamlet:
sqlite> SELECT Title,BookTitle from content WHERE ContentID LIKE "%hamlet%";
ACT I.|Hamlet
Scene II. Elsinore. A room of state in the Castle.|Hamlet
Scene III. A room in Polonius's house.|Hamlet
Scene IV. The platform.|Hamlet
Scene V. A more remote part of the Castle.|Hamlet
Act II.|Hamlet
  [ ... and so on ... ]
ACT V.|Hamlet
Scene II. A hall in the Castle.|Hamlet
Each of these entries has Title set to the name of a chapter (an act or scene of the play) and BookTitle set to Hamlet, except for the entry for the book as a whole, which has Title set to Hamlet and BookTitle set to nothing. That's why you need that WHERE BookTitle is null clause if you just want a list of your books.

Show all books by an author:

SELECT Title,Attribution,ContentID FROM content WHERE BookTitle is null
AND Attribution LIKE "%twain%" ORDER BY Title;
Attribution is where the author's name goes. LIKE %% searches are case insensitive.

Of course, it's a lot handier to have a program that knows these queries so you don't have to type them in every time (especially since the sqlite3 app has no history or proper command-line editing). But this has gotten long enough, so I'll write about that separately.

September 04, 2015 01:11 AM

August 30, 2015

Jono Bacon

Go and back the Mycroft Kickstarter campaign

Disclaimer: I am not a member of the Mycroft team, but I think this is neat and an important example of open innovation that needs support.

Mycroft is an Open Source, Open Hardware, Open APIs product that you talk to and it provides information and services. It is a wonderful example of open innovation at work.

They are running a kickstarter campaign that is pretty close to the goal, but it needs further backers to nail it.

I recorded a short video about why I think this is important. You can watch it here.

I encourage you to go and back the campaign. This kind of open innovation across technology, software, hardware, and APIs is how we make the world a better and more hackable place.

by jono at August 30, 2015 09:42 PM

Elizabeth Krumbach

Simcoe’s July 2015 Checkup and Beyond

Simcoe, our Siamese, was diagnosed with Chronic Renal Failure (CRF) in December of 2011. Since then, we’ve kept her going with quarterly vet visits and subcutaneous fluid injections every other day to keep her properly hydrated. Her previous checkup was in mid March, so working around our travel schedules, we brought her in on July 2nd for her latest checkup.

Unfortunately her blood urea nitrogen (BUN) and creatinine (CRE) levels continue to increase past healthy levels.

This visit showed a drop in weight as well.

On the bright side, the weekly Alendronate tablets that were prescribed in May have been effective in getting her calcium levels down after being high for some time. Our hope is that this trend will continue and prolong the life of her kidneys.

However, the ever-increasing BUN and CRE levels, combined with the weight loss, are a concern. She’s due for another urine analysis and ultrasound to get a closer view into what’s going on internally.

We had this all scheduled for the end of July when something came up. She sometimes gets sniffly, so it’s not uncommon to see crusted “eye goo” build up around her eyes. One day at the end of July I noticed it had gotten quite bad and grabbed her to wash it off. It’s when I got close to her eyes that I noticed it wasn’t “eye goo” that had crusted, she had sores around her eyes that had scabbed over! With no appointments at her regular vet on the horizon, we whisked her off to the emergency vet to see what was going on.

After several hours of waiting, the vet was able to look at the scabbing under the microscope and do a quick culture to confirm a bacterial infection. They also had a dermatologist take a quick look and decided to give her an antibiotic shot to try to clear it up. The next week we swapped out her ultrasound appointment for a follow-up visit with her vet. The sores had begun to heal by then and we were just given a topical gel to help them continue to heal. By early August she was looking much better and I left for my trip to Peru, with MJ following a few days later.

A few scabs around her eyes

When we came home in mid August Simcoe still looked alright, but within a few days we noticed the sores coming back. We were able to make an appointment for Saturday, August 22nd with her regular vet to see if we could get to the bottom of it. The result was another topical gel and a twice-a-day dose of the antibiotic Clavamox. The topical gel seemed effective, but the Clavamox seemed to make her vomit. On Monday, with the guidance of her vet, we stopped administering the Clavamox. On Wednesday I noticed that she hadn’t really been eating, sigh! Another call to the vet and I went over to pick up an appetite stimulant. She finally ate, but there was more vomiting. Thankfully our every-other-day fluid injections ensured that she didn’t become dehydrated through all of this. We brought her in for the final follow up just a couple days ago, on Friday. Her sores around her eyes are once again looking better and she seemed to be eating normally when I left for our latest trip on Friday evening.

Not happy (at the vet!) but sores are clearing up, again

I do feel bad leaving on another trip as she’s going through this, but she’s with a trusted pet sitter and I’m really hoping this is finally clearing up. I have a full month at home after this trip so if not we will have time at home to treat her. The strangest thing about all of this is that we have no idea how this happened. She’s an indoor cat, we live in a high rise condo building, and Caligula shows no symptoms, in spite of their proximity and their snuggle and groom-each-other habits. How did she get exposed to something? Why is Caligula fine?

“I am cute, don’t leave!”

Whatever the reason for all of this, here’s to Simcoe feeling better! Once she is, we’ll finally pick up getting the ultrasound and anything else done.

by pleia2 at August 30, 2015 03:20 PM

August 27, 2015

Jono Bacon

Ubuntu, Canonical, and IP

Recently there has been a flurry of concerns relating to the IP policy at Canonical. I have not wanted to throw my hat into the ring, but I figured I would share a few simple thoughts.

Firstly, the caveat. I am not a lawyer. Far from it. So, take all of this with a pinch of salt.

The core issue here seems to be whether the act of compiling binaries provides copyright over those binaries. Some believe it does, some believe it doesn’t. My opinion: I just don’t know.

The issue here though is with intent.

In Canonical’s defense, and specifically Mark Shuttleworth’s defense, they set out with a promise at the inception of the Ubuntu project that Ubuntu would always be free. The promise was that there would not be a hampered community edition and a full-flavor enterprise edition. There would be one Ubuntu, available freely to all.

Canonical, and Mark Shuttleworth as a primary investor, have stuck to their word. They have not gone down the road of the community and enterprise editions, of per-seat licensing, or some other compromise in software freedom. Canonical has entered multiple markets where having separate enterprise and community editions could have made life easier from a business perspective, but they haven’t. I think we sometimes forget this.

Now, from a revenue side, this has caused challenges. Canonical has invested a lot of money in engineering/design/marketing and some companies have used Ubuntu without contributing even nominally to its development. Thus, Canonical has at times struggled to find the right balance between a free product for the Open Source community and revenue. We have seen efforts such as training services, Ubuntu One, etc., some of which have failed and some of which have succeeded.

Again though, Canonical has made their own life more complex with this commitment to freedom. When I was at Canonical I saw Mark very specifically reject notions of compromising on these ethics.

Now, I get the notional concept of this IP issue from Canonical’s perspective. Canonical invests in staff and infrastructure to build binaries that are part of a free platform and that other free platforms can use. If someone else takes those binaries and builds a commercial product from them, I can understand Canonical being a bit miffed about that and asking the company to pay it forward and cover some of the costs.

But here is the rub. While I understand this, it goes against the grain of the Free Software movement and the culture of Open Source collaboration.

Putting the legal question of copyrightable binaries aside for one second, the current Canonical IP policy is just culturally awkward. I think most of us expect that Free Software code will result in Free Software binaries, and making the claim that those binaries are limited or restricted in some way seems unusual and the antithesis of the wider movement. It feels frankly like an attempt to find a loophole in a collaborative culture where the connective tissue is freedom.

Thus, I see this whole thing from both angles. Firstly, Canonical is trying to find the right balance of revenue and software freedom, but I also sympathize with the critics that this IP approach feels like a pretty weak way to accomplish that balance.

So, I ask my humble readers this question: if Canonical reverts this IP policy and binaries are free to all, what do you feel is the best way for Canonical to derive revenue from their products and services while also committing to software freedom? Thoughts and ideas welcome!

by jono at August 27, 2015 11:59 PM

Elizabeth Krumbach

Travels in Peru: Machu Picchu

Our trip to Peru first took us to the cities of Lima and Cusco. We had a wonderful time in both, seeing the local sites and dining at some of their best restaurants. But if I’m honest, we left the most anticipated part of our journey for last: visiting Machu Picchu.

Before I talk about our trip to Machu Picchu, there are a few things worthy of note:

  1. I love history and ruins
  2. I’ve been fascinated by Peru since I was a kid
  3. Going to Machu Picchu has been a dream since I learned it existed

So, even being the world traveler that I am (I’d already been to Asia and Europe this year before going to South America), this was an exceptional trip for me. Growing up, our landlord was from Peru; as a friend of his daughters, I regularly got to see their home, which was full of Peruvian knickknacks and artifacts. As I dove into history during high school I learned about ancient ruins all over the world, from Egypt to Mexico and of course Machu Picchu in Peru. The mysterious city perched upon a mountaintop always held a special fascination for me. When the opportunity to go to Peru for a conference came up earlier this year, I agreed immediately and began planning. I had originally planned to go alone, but MJ decided to join me once I found a tour I wanted to book with. I’m so glad he did. Getting to share this experience with him meant the world to me.

Our trip from Cusco began very early on Friday morning in order to catch the 6:40AM train to Aguas Calientes, the village below Machu Picchu. Our tickets were for Peru Rail’s Vistadome train, and I was really looking forward to the ride. On the disappointing side, the Cusco half of the trip had foggy windows, and the glare on them generally made it difficult to take pictures. But as we lowered in elevation my altitude headache went away and so did the condensation on the windows. The glare was still an issue, but as I settled in I just enjoyed the sights and didn’t end up taking many photos. It was probably the most enjoyable train journey I’ve ever been on. At 3 hours it was long enough to feel settled in and relaxed watching the countryside, rivers and mountains go by, but not so long that I got bored. I brought along my Nook but didn’t end up reading at all.

Of course I did take some pictures, here:

Once at Aguas Calientes our overnight bags (big suitcases were left at the hotel in Cusco, as is common) were collected and taken to the hotel. We followed the tour guide who met us with several others to take a bus up to Machu Picchu!

Our guide gave us a three hour tour of the site. At a medium pace, he took us to some of the key structures and took time for photo opportunities all around. Of particular interest to him was the Temple of the Sun (“J” shaped building, center of the photo below), which we saw from above and then explored around and below.

The hike up for these amazing views wasn’t very hard, but I was thankful for the stops along the way as he talked about the exploration and scientific discovery of the site in the early 20th century.

And then there were the llamas. Llamas were brought to Machu Picchu in modern times, some say to trim the grass and others say for the tourists. It seems to be a mix of the two, and there is still a full staff of groundskeepers to keep tidy what the llamas don’t manage. I managed to get this nice people-free photo of a llama nursing.

There seem to be all kinds of jokes about “selfies with llamas” and I was totally in for that. I didn’t get right next to a llama like some of my fellow selfie-takers, but I did get my lovely distance selfie with llamas.

Walking through what’s left of Machu Picchu is quite the experience: the tall stone walls and stepped terraces that make up the whole site, and lots of climbing and walking at various elevations across the mountaintop. Even going through the ruins in Mexico didn’t quite prepare me for what it’s like to be on top of a mountain like this. Amazing place.

We really lucked out with the weather, much of the day was clear and sunny, and quite warm (in the 70s). It made for good walking weather as well as fantastic photos. When the afternoon showers did come in, it was just in time for our tour to end and for us to have lunch just outside the gates. When lunch was complete the sun came out again and we were able to go back in to explore a bit more and take more pictures!

I feel like I should write more about Machu Picchu, it being such an epic event for me, but it was more of a visual experience, much better shared via photos. I uploaded over 200 more photos from our walk through Machu Picchu here:

My photos were taken with a nice compact digital camera, but MJ brought along his DSLR camera. I’m really looking forward to seeing what he ended up with.

The park closes at 5PM, so close to that time we caught one of the buses back down to Aguas Calientes. I did a little shopping (went to Machu Picchu, got the t-shirt). We were able to check into our hotel, the Casa Andina Classic, which ended up being my favorite hotel of the trip; it was a shame we were only there for one night! Hot, high-pressure shower, comfortable bed, and a lovely view of the river that runs along the village:

I was actually so tired from all our early mornings and late evenings the rest of the trip that after taking a shower at the hotel that evening I collapsed onto the bed and instead of reading, zombied out to some documentaries on the History channel, after figuring out the magical incantation on the remote to switch to English. So much for being selective about the TV I watch! We also decided to take advantage of the dinner that was included with our booking and had a really low key, but enjoyable and satisfying meal there at the hotel.

The next morning we took things slow and did some walking around the village before lunch. Aguas Calientes is very small, so it’s quite possible that we saw almost all of it. I took the opportunity to also buy some post cards to send to my mother and sisters, plus find stamps for them. Finding stamps is always an interesting adventure. Our hotel couldn’t post them for me (or sell me stamps) and being a Saturday we struck out at the actual post office, but we found a corner tourist goodie shop that sold them and a mailbox nearby so I could send them off.

For lunch we made our way past all the restaurants that were trying to get us in their doors by telling us about their deals and pushing menus our way, until we found what we were looking for: a strange little place called Indio Feliz. I found it first in the tour book I’d been lugging around, typical tourist that I am, and followed up with some online recommendations. The decor is straight up Caribbean pirate themed (what?) and, with a French owner, they specialize in Franco-Peruvian cuisine. We did the fixed menu where you pick an appetizer, entree and dessert, though it was probably too much for lunch! They also had the best beer menu I had yet seen in Peru; finally far from the altitude headache of Cusco, I had a Duvel and MJ went with a Chimay Red. Food-wise I began with an amazing avocado and papaya in lemon sauce. My entree was an exceptional skewer of beef with an orange sauce, and my meal concluded with coffee and apple pie that came with both custard and ice cream. While there we got to chat with some fellow diners from the US who had just concluded the 4 day Inca Trail hike; they regaled us with stories of rain and exhaustion as we swapped small talk about the work we do.

More photos from Aguas Calientes here:

After our leisurely lunch, it was off to the train station. We were back on the wonderful Vistadome train, and on the way back to Cusco there was some culturally-tuned entertainment as well as a “fashion show” featuring local clothing they were selling, mostly of alpaca wool. It was a fun touch, as the ride back was longer (going up the mountains) and, it being wintertime, the last hour or so of the ride was in the dark.

We had our final night in Cusco, and Sunday was all travel. A quick flight from Cusco to Lima, where we had 7 hours before our next flight and took the opportunity to have one last meal in Lima. Unfortunately the timing of our stay meant that most restaurants were in their “closed between lunch and dinner” time, so we ended up at Larcomar, a shopping complex built into an oceanside cliff in Miraflores. We ate at Tanta, where we had a satisfying lunch with a wonderful ocean view!

Our late lunch concluded our trip, from there we went back to Lima airport and began our journey back home via Miami. I was truly sad to see the trip come to an end. Often times I am eager to get home after such an adventurey vacation (particularly when it’s attached to a conference!), but I will miss Peru. The sights, the foods, the llamas and alpacas! It’s a beautiful country that I hope to visit again.

by pleia2 at August 27, 2015 02:50 AM

August 26, 2015

Akkana Peck

Switching to a Kobo e-reader

For several years I've kept a rooted Nook Touch for reading ebooks. But recently it's become tough to use. Newer epub books no longer work on any version of FBReader still available for the Nook's ancient Android 2.1, and the Nook's built-in reader has some fatal flaws: most notably that there's no way to browse books by subject tag, and it's painfully slow to navigate a library of 250 books when you have to start from the As and page slowly forward, six books at a time, to get to T.

The Kobo Touch

But with my Nook unusable, I borrowed Dave's Kobo Touch to see how it compared. I like the hardware: same screen size as the Nook, but a little brighter and sharper, with a smaller bezel around it, and a spring-loaded power button in a place where it won't get pressed accidentally when it's packed in a suitcase -- the Nook was always coming on while in its case, and I didn't find out until I pulled it out to read before bed and discovered the battery was too low.

The Kobo worked quite nicely as a reader, though it had a few of the same problems as the Nook. They both insist on justifying both left and right margins (Kobo has a preference for that, but it doesn't work in any book I tried). More important is the lack of subject tags. The Kobo has a "Shelves" option, called "Collections" in some versions, but adding books to shelves manually is tedious if you have a lot of books. (But see below.)

It also shared another Nook problem: it shows overall progress in the book, but not how far you are from the next chapter break. There's a choice to show either book progress or chapter progress, but not both; and chapter progress only works for books in Kobo's special "kepub" format (I'll write separately about that). I miss FBReader's progress bar that shows both book and chapter progress, and I can't fathom why that's not considered a necessary feature for any e-reader.

But mostly, Kobo's reader was better than the Nook's. Bookmarks weren't perfect, but they basically worked, and I didn't even have to spend half an hour reading the manual to use them (like I did with the Nook). The font selection was great, and the library navigation had one great advantage over the Nook: a slider so you could go from A to T quickly.

I liked the Kobo a lot, and promptly ordered one of my own.

It's not all perfect

There were a few disadvantages. Although the Kobo had a lot more granularity in its line spacing and margin settings, the smallest settings were still a lot less tight than I wanted. The Nook only offered a few settings but the smallest setting was pretty good.

Also, the Kobo can only see books at the top level of its microSD card. No subdirectories, which means that I can't use a program like rsync to keep the Kobo in sync with my ebooks directory on my computer. Not that big a deal, just a minor annoyance.

More important was the subject tagging, which is really needed in a big library. It was pretty clear Shelves/Collections were what I needed; but how could I get all my books into shelves without laboriously adding them all one by one on a slow e-ink screen?

It turns out Kobo's architecture makes it pretty easy to fix these problems.

Customizing Kobo

While the rooted Nook community has been stagnant for years -- it was a cute proof of concept that, in the end, no one cared about enough to try to maintain it -- Kobo readers are a lot easier to hack, and there's a thriving Kobo community on MobileReads which has been trading tips and patches over the years -- apparently with Kobo's blessing.

The biggest key to Kobo's customizability is that you can mount it as a USB storage device, and one of the files it exposes is the device's database (an sqlite file). That means that well supported programs like Calibre can update shelves/collections on a Kobo, access its book list, and do other nifty tricks; and if you want more, you can write your own scripts, or even access the database by hand.

I'll write separately about some Python scripts I've written to display the database and add books to shelves, and I'll just say here that the process was remarkably straightforward and much easier than I usually expect when learning to access a new device.

There's lots of other customizing you can do. There are ways of installing alternative readers on the Kobo, or installing Python so you can write your own reader. I expected to want that, but so far the built-in reader seems good enough.

You can also patch the OS. Kobo updates are distributed as tarballs of binaries, and there's a very well designed, documented and supported (by users, not by Kobo) patching script distributed on MobileReads for each new Kobo release. I applied a few patches and was impressed by how easy it was. And now I have tight line spacing and margins, a slightly changed page number display at the bottom of the screen (still only chapter or book, not both), and a search that defaults to my local book collection rather than the Kobo store.

Stores and DRM

Oh, about the Kobo store. I haven't tried it yet, so I can't report on that. From what I read, it's pretty good as e-bookstores go, and a lot of Nook and Sony users apparently prefer to buy from Kobo. But like most e-bookstores, the Kobo store uses DRM, which makes it a pain (and is why I probably won't be using it much).

They use Adobe's DRM, and at least Adobe's Digital Editions app works in Wine under Linux. Amazon's app no longer does, and in case you're wondering why I didn't consider a Kindle, that's part of it. Amazon has a bad reputation for removing rights to previously purchased ebooks (as well as for spying on their customers' reading habits), and I've experienced it personally more than once.

Not only can I no longer use the Kindle app under Wine, but Amazon no longer lets me re-download the few Kindle books I've purchased in the past. I remember when my mother used to use the Kindle app on Android regularly; every few weeks all her books would disappear and she'd have to get on the phone again to Amazon to beg to have them back. It just isn't worth the hassle. Besides, Kindles can't read public library books (those are mostly EPUBs with Adobe DRM); and a Kindle would require converting my whole EPUB library to MOBI. I don't see any up side, and a lot of down side.

The Adobe scheme used by Kobo and Nook is better, but I still plan to avoid books with DRM as much as possible. It's not the stores' fault, and I hope Kobo does well, because they look like a good company. It's the publishers who insist on DRM. We can only hope that some day they come to their senses, like music publishers finally did with MP3 versus DRMed music. A few publishers have dropped DRM already, and if we readers avoid buying DRMed ebooks, maybe the message will eventually get through.

August 26, 2015 11:04 PM

Elizabeth Krumbach

Travels in Peru: Cusco

We started our Peruvian adventures in Lima. On Wednesday morning we took a very early flight to Cusco. The tour company had recommended an early flight so we could take a nap upon arrival to help adjust to the altitude; indeed, with Cusco over 2 miles high in elevation, I did find myself with a slight headache during our visit there. After our nap we met up with our fellow travelers for our city tour of Cusco.

The tour began by going up for a view of all of Cusco from the hillside, where I got my first selfie with an alpaca. We also visited San Pedro’s Market, a large market complex that had everything from tourist goodies to everyday produce, meats, cheeses and breads.

From there we made our way to Qurikancha, said to be the most important temple in the Inca Empire. When the Spanish arrived they built their Church of Santo Domingo on top of it, so only the foundation and some of the rooms remain. I was happy that the tour focused on the Inca aspects and largely ignored the Church, aside from some of the famous religious paintings contained within.

More photos from Qurikancha here:

We then went to the Plaza de Armas, where the Cusco Cathedral lords over the square. No photos were allowed inside, but the Cathedral is notable for the Señor de los Temblores, a Jesus statue that is believed to have halted an earthquake in 1650, and for a huge, captivating painting by Marcos Zapata of a localized Last Supper in which the participants are dining on guinea pig and chicha morada!

That evening we had the most exceptional dinner in Cusco, at MAP Café. It’s located inside Museo Arqueologico Peruano (MAP) which is run in association with the fantastic Museo Larco that we visited in Lima. Since this museum also had late hours, we had a wonderful time browsing their collection before dinner. Dinner itself was concluded with some amazing desserts, including a deconstructed lemon meringue pie accompanied by caramel ice cream.

More photos from the museum and dinner here:

Thursday started off bright and early with a tour of a series of ruins outside of Cusco, in Saksaywaman. This was the first collection of ruins in Cusco we really got to properly climb, so with our tiny group of just four we were able to explore the citadel of Saksaywaman with a guide and then for a half hour on our own. In addition to the easy incline we took with the tour guide to walk on the main part of the ruins, which afforded our best view of Cusco, we walked up a multi-story staircase on the other side to get great panoramic views of the ruins. Plus, there were alpacas.

Beyond the main Saksaywaman sites, we visited other sites inside the park, seeing the fountains featured at Tambomachay, the amazing views from a quick stop at Puka Pukara, and a near-natural formation at Q’enqo that had been carved for sacrifices. The tour concluded by stopping at a local factory shop specializing in alpaca clothing.

More photos from throughout the morning here:

We were on our own for the afternoon, so we began by finally visiting a Chifa (Peruvian-inspired Chinese) restaurant. I enjoyed their take on Sweet and Sour Chicken. We then did some browsing at local shops before finally ending up at the Center for Traditional Textiles. They featured a small museum sharing details about the types and procedures for creating traditional Peruvian textiles, as well as live demonstrations of the techniques involved by master craftswomen and young trainees. While there we fell in love with a pair of pieces that we took home with us: a finely woven tapestry and a small blanket that we’ll need to get framed soon.

Our time in Cusco concluded with a meal at Senzo, which had been really hyped but didn’t quite live up to our expectations, especially after the previous night’s meal at MAP Café, though it was still an enjoyable evening. We’d have one last night in Cusco following our trip to Machu Picchu, where we dined at Marcelo Batata; the altitude sickness had hit me upon our return and I could only really enjoy the chicken soup, but as a ginger, mint and lemongrass soup it was the perfect match for my queasy stomach (even if it didn’t manage to cure me of it).

More photos from Cusco here:

The next day brought an early morning train to Aguas Calientes and Machu Picchu!

by pleia2 at August 26, 2015 03:01 AM

August 24, 2015

Elizabeth Krumbach

Travels in Peru: Lima

After the UbuCon Latin America conference that I wrote about here I had a day of work and personal project catch up with a dash of relaxation at my hotel before MJ arrived that night. Monday morning we were picked up by the folks at Viajes Pacifico who I had booked a tour of Lima and Cusco with.

It was the first time I had used a group tour company; the price of the tour included all the hotels (selected by them) as well as transportation and entrance fees for the sites our tour visited. I definitely prefer the private driver we had in Mexico for our honeymoon, and we’re putting together our own itinerary for our trip to Japan in October, but given my schedule this year I simply didn’t have the time or energy to put together a schedule for Peru. The selected hotels were fine, but we likely would have gone to nicer ones if we had booked ourselves. The tours were kept small, with the largest group being one in Cusco that was maybe 14 of us and the smallest being only 4. I wasn’t a fan of the schedule execution: we had a loose schedule each day, but they wouldn’t contact us until the evening before with exact pickup times, and it was unclear how long the tours would last or which trains we’d be taking, which made dinner reservations and the like a bit dicey. Still, it all worked out and it was great to have someone else worry about the logistical details.

On Monday we were picked up from our hotel in the afternoon for the scheduled Lima city tour, which began at El Parque del Amor (Love Park), a beautiful seaside park in Miraflores with lots of flowers, a giant sculpture of a couple and a lovely view of the Pacific Ocean. From there the tour bus did a quick drive around the ruins of Huaca Pucllana, which I had really hoped to see beyond just the windows of a bus – alas! And then on to the rest of our tour, which took us to the main square in Lima where we got a tour of the Basilica Cathedral of Lima, notable not only for being the main cathedral but also for holding the tomb of the famous Spanish conqueror Francisco Pizarro. I learned that during excavations they discovered that his head was buried in a box separate from his body. The cathedral itself is beautiful.

More photos from the cathedral here:

Our next stop was the Convent of Santo Domingo. Its claims to fame are the tombs and related accoutrements of both Saint Rose of Lima and Saint Martin de Porres. They had an impressive library that spanned not just religious books, but various topics in Spanish and Latin. The convent also had some nice gardens, and the history of these places is always interesting to learn about. I think we may have gotten more out of them if we were Catholic (or even Christian).

More photos from the convent here:

That evening we met up with a friend of mine from high school who has lived in Lima for several years. It was fun to catch up over a nice Peruvian meal that included more ceviche and my first round of Pisco Sours.

Tuesday was our non-tour day in Lima, so I got up early for a walk down by the ocean and then up to the Artisan Markets of Miraflores (the “Inka Market”). I was able to pick up some tourist goodies, and on my way to the market I walked through Kennedy Park. We had been told about this park on the tour the previous day: it’s full of cats! Cats in the flowers, cats on the lawn, cats on the benches. Given my love for cats, it was quite the enjoyable experience. I took a bunch of pictures:

I made it back to our hotel shortly after noon, in time to meet up with MJ to go to our lunch reservation at the famous Astrid y Gaston. This was definitely one of the highlights of our trip to Lima. We partook of their tasting menu, which was made up of over a dozen small plates, each its own little work of art. It was easily one of the best meals I’ve ever had.

After lunch, which was a multiple hour affair, we made it to the ruins of Huaca Huallamarca just before closing. They have a small, single room museum that contains a mummy that was found on the site and some artifacts. They let you climb the mud brick “pyramid” that seems to have active archaeological digs going on (though no one was there when we visited). Definitely worth the stop as we rounded out our afternoon.

More photos of the site here:

Our early evening plans were determined partly by what was still open after 5PM, which is how we found ourselves at the gem that is Museo Larco. Beautifully manicured grounds with lots of flowering plants, a stunning array of Peruvian artifacts dating back several thousand years with descriptions in multiple languages, and a generally pleasant place to be. I particularly liked the exhibits with the cat themes, as the cat was an ancient symbol of the earth (with the bird representing the heavens and the snake the world below). Highly recommended, and they’re open until 10PM! We didn’t stay that late though; we had dinner reservations at Brujas de Cachiche back down in Miraflores. With a focus on seafood, the menu was massive and the food was good.

More photos from Museo Larco here:

That meal wrapped up the Lima portion of our trip; we were up before the sun the next day for our flight to Cusco!

And more photos more generally around Lima are here:

by pleia2 at August 24, 2015 06:00 AM

Nathan Haines

Ubuntu Free Culture Showcase submissions are now open!

Ubuntu 15.10 is coming up soon, and what better way to celebrate a new release than with beautiful new content to go with it? The Ubuntu Free Culture Showcase is a way to celebrate the Free Culture movement, where talented artists across the globe create media and release it under licenses that encourage sharing and adaptation. We're looking for content which shows off the skill and talent of these amazing artists and will greet Ubuntu 15.10 users. We announced the showcase last week, and now we are accepting submissions; for more information, including where to submit, please visit the Ubuntu Free Culture Showcase page on the Ubuntu wiki.

August 24, 2015 03:40 AM

Making Hulu videos play in Ubuntu

A couple of weeks ago, Hulu made some changes to their video playback system to incorporate Adobe Flash DRM technology. Unfortunately, this meant that Hulu no longer functioned on Ubuntu: Adobe stopped supporting Flash on Linux several years ago, and Adobe’s DRM requires HAL, which was likewise obsoleted about 4 years ago and dropped from Ubuntu in 13.10.

While Hulu began detecting Linux systems and displaying a link to Adobe’s support page when playback failed, and the Adobe site correctly identifies the lack of HAL support as the problem, the instructions given no longer function because HAL is no longer provided by Ubuntu.

Fortunately, Michael Blennerhassett has maintained a Personal Package Archive which rebuilds HAL so that it can be installed on Ubuntu. Adding this PPA and then installing the “hal” package will allow you to play Hulu content once again.

To do this, first open a Terminal window by searching for it in the Dash or by pressing Ctrl+Alt+T.

Next, type the following command at the command line and press Enter:

sudo add-apt-repository ppa:mjblenner/ppa-hal

You will be prompted for your password and then you will see a message from the PPA maintainer. Press Enter, and the PPA will be added to Ubuntu’s list of software sources. Next, have Ubuntu refresh its list of available software, which will now include this PPA, by typing the following and pressing Enter:

sudo apt update

Once this update finishes, you can then install HAL support on your computer by searching for “hal” in the Ubuntu Software Center and installing the “Hardware Abstraction Layer” software, or by typing the following command and pressing Enter:

sudo apt install hal

and confirming the installation when prompted by pressing Enter.

I explain more about how to install software on the command line in Chapter 5 and how to use PPAs in Chapter 6 of my upcoming book, Beginning Ubuntu for Windows and Mac Users, coming this October from Apress. This book was designed to help Windows and Mac users quickly and easily become productive on Ubuntu so they can get work done immediately, while providing a foundation for further learning and exploring once they are comfortable.

August 24, 2015 02:13 AM

August 21, 2015

Akkana Peck

Python module for reading EPUB e-book metadata

Three years ago I wanted a way to manage tags on e-books in a lightweight way, without having to maintain a Calibre database and fire up the Calibre GUI app every time I wanted to check a book's tags. I couldn't find anything, nor did I find any relevant Python libraries, so I reverse engineered the (simple, XML-based) EPUB format and wrote a Python script to show or modify epub tags.

I've been using that script ever since. It's great for Project Gutenberg books, which tend to be overloaded with tags that I don't find very useful for categorizing books ("United States -- Social life and customs -- 20th century -- Fiction") but lacking in tags that I would find useful ("History", "Science Fiction", "Mystery").

But it wasn't easy to include it in other programs. For the last week or so I've been fiddling with a Kobo ebook reader, and I wanted to write programs that could read epub and also speak Kobo-ese. (I'll write separately about the joys of Kobo hacking. It's really a neat little e-reader.)

So I've factored my epubtag script into a usable Python module: as well as being a standalone program for viewing epub book data, it's now easy to use from other programs. It's available on GitHub: parse EPUB metadata and view or change subject tags.
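
In case you're curious just how simple "simple, XML-based" is in practice, here's a minimal sketch of reading the subject tags with nothing but the Python standard library. (This is just an illustration, not the epubtag module's API, and book.epub is a placeholder path.)

import zipfile
import xml.etree.ElementTree as ET

def epub_subjects(path):
    """Return the dc:subject strings from an EPUB file.

    An EPUB is a zip archive; META-INF/container.xml points at the
    OPF package file, whose metadata block holds the Dublin Core
    elements, including <dc:subject>.
    """
    container_ns = {'c': 'urn:oasis:names:tc:opendocument:xmlns:container'}
    dc = '{http://purl.org/dc/elements/1.1/}'
    with zipfile.ZipFile(path) as z:
        container = ET.fromstring(z.read('META-INF/container.xml'))
        opf_path = container.find('.//c:rootfile', container_ns).get('full-path')
        opf = ET.fromstring(z.read(opf_path))
        return [el.text for el in opf.iter(dc + 'subject')]

print(epub_subjects('book.epub'))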

August 21, 2015 02:27 AM

August 20, 2015

Jono Bacon

Talking with a Mythbuster and a Maker

Recently I started writing a column for Forbes.

My latest column covers the rise of the maker movement and in it I interviewed Jamie Hyneman from Mythbusters and Dale Dougherty from Make Magazine.

Go and give it a read right here.

by jono at August 20, 2015 04:48 PM

August 15, 2015

Nathan Haines

Reintroducing the Ubuntu Free Culture Showcase

In the past, we’ve had the opportunity to showcase some really fun, really incredible media in Ubuntu. Content creators who value free culture have offered beautiful photography for the desktop and entertaining videos and music.

Not only does this highlight the fantastic work that comes out of free culture on Ubuntu desktops worldwide, but the music and video selections also help show off Ubuntu’s fantastic multimedia support by providing content for the Ubuntu live images.

The wallpaper contest has continued from cycle to cycle, but the audio and video contests have fallen by the wayside. But Ubuntu is more popular than ever, and can now power phones and tablets as well as desktops and laptops. So as we move closer towards a goal of convergence, we’d like to bring back this opportunity for artists to share their work with millions of Ubuntu users around the world.

All content must be released under a Creative Commons Attribution-Sharealike or Creative Commons Attribution license. (The Creative Commons Zero waiver is okay, too!) Each entrant must only submit content they have created themselves, and all submissions must adhere to the Ubuntu Code of Conduct.

The winners will be featured in the Ubuntu 15.10 release in October!

We’re looking for work in one of three categories:

  • Wallpapers – we’ll choose 11 stunning photographs, but also one illustrative wallpaper that focuses on Ubuntu 15.10’s codename: wily werewolf.
  • Videos – we need to keep it small, so we’re looking for a 1280x720 video of about 30 seconds.
  • Audio – everybody loves music, so a song between 2-5 minutes will rock speakers around the world!

You’re limited to a total of two submissions in any single category.

We’re still cleaning out the cobwebs, and selecting a panel of judges from the Canonical and Ubuntu design teams and the Ubuntu community. So in the next week or two we’ll announce Flickr, Vimeo, and SoundCloud groups where you can submit your entries. But you’ll have plenty of time to prepare before the final submission deadline: September 15th, 2015 at 23:59 UTC.

There are more technical details for each category, so please see the Ubuntu Free Culture Showcase wiki page for the latest information and updates!

August 15, 2015 10:19 AM

August 10, 2015


Bad Metaphysics Costs Maintainability

header image

I find myself doing a lot of metaphysical thinking in my day to day work as a coder. Objects that are cohesive and are valid metaphysical analogues to common experiences make it much easier to read, understand, and fix existing code.

Taking an example:

struct ServiceQueue
{
    void place_customer_at_back();
    void service_front_customer();
};

This class maps well to a physical problem we encounter frequently in our day-to-day lives: customer service at a bank teller window, perhaps, or ordering a hamburger at a fast food chain. Moving to the realm of computers, ServiceQueue also maps to many computing problems pretty well: a packet arrives over the network, and we want to service the packet in a FIFO manner.

This class is cohesive and easily understood by someone new to the code because it maps well to a well-understood concept: that of a queue. By making use of the shared metaphysical concept of physical queueing, we introduce the new coder to the abstract queue that’s implemented in software. The new coder can understand and verify the interface and implementation quickly.

Now let’s spice the interface up with a metaphysical error. Say that we have the above class, and we encounter an error condition that happens when the network becomes disconnected. One might be tempted to add a function like:

struct ServiceQueue
{
    void place_customer_at_back();
    void service_front_customer();
    void handle_network_disconnected();
};

The problem with adding this function is twofold: it makes the code harder to understand, and it makes writing a new object that implements the interface more difficult.

Rapid Understandability

The first issue is that the analogue to a real-life concept is diluted, and with enough dilution, it will eventually be lost. This makes it more difficult to rapidly understand the role of an object implementing the interface.

I bet the coder who added “handle_network_disconnected” saw that a lot of the implementation where the error could be handled was “conveniently” in the class implementing the interface, and punched “handle_network_disconnected” in. But did you catch the metaphysical error? ServiceQueue is no longer named properly; it has become a different object. It’s a ServiceQueueThatCanBeDisconnected, and the analogy to the physical queue is weakened. It takes a bit more explaining to tell a new coder what sort of interface ServiceQueue is, and that additional explanation makes it much more difficult to understand the object and the problem being solved. Consequently, the code is harder to maintain and takes longer to debug, because of the added cost of understanding.

Alternative Implementations

With each error like this, it becomes a bit harder to write an implementation of the interface that solves a similar problem. ServiceQueue with “handle_network_disconnected” fits the network-packet problem, but it’s now more difficult to use this interface for the myriad other problems (like the bank teller problem).

Now, in the practical world of software, we’re used to seeing this all the time. We can mentally handle one metaphysical error per interface quite easily. The actual problem comes in much worse scenarios, where there are multiple holes punched through the interface. Eventually, it can get to the point where the object really has no physical manifestation and the interface gets renamed to something ambiguous, like “ServiceManager”. At this point, the object is sluggish to understand and effectively irreplaceable. We’ve found ourselves with some difficult-to-maintain software!

It might take a bit of refactoring to get things right, but in the end it’s worth it, both practically and metaphysically.
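
As a rough sketch of what that refactoring might look like (the names below are illustrative, not from any particular codebase), the network-specific concern can get its own small interface, leaving ServiceQueue a pure queue:

struct ServiceQueue
{
    void place_customer_at_back();
    void service_front_customer();
};

// The network-specific concern lives behind its own interface...
struct ConnectionListener
{
    virtual ~ConnectionListener() = default;
    virtual void connection_lost() = 0;
};

// ...and a network-facing object composes the two, so ServiceQueue still
// fits the bank-teller problem as easily as the packet problem.
struct NetworkReceiver : ConnectionListener
{
    explicit NetworkReceiver(ServiceQueue& q) : queue(q) {}
    void packet_arrived() { queue.place_customer_at_back(); }
    void connection_lost() override { /* deal with the disconnect here */ }
private:
    ServiceQueue& queue;
};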

This post originally appeared on, and is (c) Kevin DuBois 2015

by Kevin at August 10, 2015 10:27 AM

Akkana Peck

Bat Ballet above the Amaranths

This evening Dave and I spent quite a while clearing out amaranth (pigweed) that's been growing up near the house.

[Palmer's amaranth, pigweed] We'd been wondering about it for quite some time. It's quite an attractive plant when small, with pretty patterns on its leaves that remind me of some of the decorative houseplants we used to try to grow when I was a kid.

I've been working on an Invasive Plants page for the nature center, partly as a way to figure out myself which plants we need to pull and which are okay. For instance, Russian thistle (tumbleweed) -- everybody knows what it looks like when it's a dried-up tumbleweed, but by then it's too late, scattering its seeds all over. Besides, it's covered with spikes by then. The trick is to recognize and pull it when it's young, and the same is true of a lot of invasives, especially the ones with spiky seeds that stick to you, like stickseed and caltrops (goatheads).

A couple of the nature center experts have been sending me lists of invasive plants I should be sure to include, and one of them was a plant called redroot pigweed. I'd never heard of it, so I looked it up -- and it looked an awful lot like our mystery plant. A little more web searching on Amaranthus images eventually led me to Palmer's amaranth, which turns out to be aggressive and highly competitive, with sticky seeds.

Unfortunately the pretty little plants had had a month to grow by the time we realized the problem, and some of them had trunks an inch and a half across, so we had to go after them with a machete and a hand axe. But we got most of them cleared.

As we returned from dumping the last load of pigweed, a little after 8 pm, the light was fading, and we were greeted by a bat making rounds between our patio and the area outside the den. I stopped what I was doing and watched, entranced, as the bat darted into the dark den area then back out, followed a slalom course through the junipers, buzzed past my head and then out to make a sweep across the patio ... then back, around the tight corner and back to the den, over and over.

I stood watching for twenty minutes, with the bat sometimes passing within a foot of my head. (yay, bat -- eat some of these little gnats that keep whining by my ears and eyes!) It flew with spectacular maneuverability and grace, unsurpassed by anything save perhaps a hummingbird, changing direction constantly but always smoothly. I was reminded of the way a sea lion darts around underwater while it's hunting, except the bat is so much smaller, able to turn in so little space ... and of course maneuvering in the air, and in the dark, makes it all the more impressive.

I couldn't hear the bat's calls at all. Years ago, waiting for dusk at star parties on Fremont Peak, I used to hear the bats clearly. Are the bats here higher pitched than those California bats? Or am I just losing high frequencies as I get older? Maybe a combination of both.

Finally, a second bat, a little smaller than the first, appeared over the patio and both bats disappeared into the junipers. Of course I couldn't see either one well enough to tell whether the second bat was smaller because it was a different species, or a different gender of the same species. In Myotis bats, apparently the females are significantly larger than the males, so perhaps my first bat was a female Myotis and the male came to join her.

The two bats didn't reappear, and I reluctantly came inside.

Where are they roosting? In the trees? Or is it possible that one of them is using my bluebird house? I'm not going to check and risk disturbing anyone who might be roosting there.

I don't know if it's the same little brown bat I saw last week on the front porch, but it seems like a reasonable guess.

I've wondered how many bats there are flying around here, and how late they fly. I see them at dusk, but of course there's no reason to think they stop at dusk just because we're no longer able to see them. Perhaps I'll find out: I ordered parts for an Arduino-driven bat detector a few weeks ago, and they've been sitting on my desk waiting for me to find time to solder them together. I hope I find the time before summer ends and the bats fly off wherever they go in winter.

August 10, 2015 03:47 AM

August 09, 2015

Elizabeth Krumbach

UbuConLA 2015 in Lima

This week I had the honor of joining a couple hundred free software enthusiasts at UbuCon Latin America. I’d really been looking forward to it, even if I was a bit apprehensive about the language barrier, and the fact that mine was the only English talk on the schedule. But those fears melted away as the day began on Friday morning and I found myself loosely able to follow along with sessions with the help of slides, context and my weak understanding of Spanish (listening is much easier than speaking!).

The morning began by meeting a couple folks from Canonical and a fellow community member in the hotel lobby and getting a cab over to the venue. Upon arrival, we were brought into the conference speaker lounge to settle in before the event. Our badges had already been printed and were right there for us, along with bottles of water; it was quite the pleasant welcome.

José Antonio Rey kicked off the event at 10AM with a welcome, basic administrative notes about the venue, a series of thanks and schedule overview. Video (the audio in the beginning sounds like aliens descending, but it gets better by the end).

Immediately following him was a keynote by Pablo Rubianes, a contributor from Uruguay who I’ve known and worked with in the Ubuntu community for several years. As a member of the LoCo Council, he had a unique view into development and construction of LoCo (Local/Community) teams, which he shared in this talk. He talked some about how LoCos are organized, gave an overview of the types of events many of them do, like Ubuntu Hours, Global Jams and events in collaboration with other communities. I particularly enjoyed the photos he shared in his presentation. He left a lot of time for questions, which was needed as many people in the audience had questions about various aspects of LoCo teams. Also, I enjoyed the playful and good humored relationship they have with the title “LoCo” given the translation of the word into Spanish. Video.

My keynote was next, Building a Career in Free and Open Source Software (slides, English and Spanish). Based on audience reaction, I’m hopeful that a majority of the audience understood English well enough to follow along. For anyone who couldn’t, I hope there was value found in my bi-lingual slides. I had some great feedback following my talk both in person and on Twitter. Video (in English!).

Thanks to Pablo Rubianes for the photo (source)

For all the pre-conference jokes about a “cafeteria lunch” I was super impressed with my lunch yesterday. Chicken and spiced rice, some kind of potato-based side and a dessert of Chicha Morada pudding… which is what I called it until I learned the real name, Mazamorra Morada, a purple corn pudding that tastes like the drink I named it after. Yum!

After lunch we heard from Naudy Villaroel, who spoke about the value of making sure people of all kinds are included in technology, regardless of disability. He gave an overview of several accessibility applications available in Ubuntu and beyond, including the Orca screen reader, Enable Viacam (eViacam), a tool for controlling the mouse through movements captured by a camera, and Dasher, which lets small movements select the words and letters its algorithms anticipate the operator will want to use, making them easy to form. He then went on to talk about other sites and tools that could be used. Video.

Following Naudy’s talk was one by Yannick Warnier, president of Chamilo, which produces open source educational software. His talk was a tour of how online learning platforms, both open source and hosted (MOOC-style), have evolved over the past couple of decades. He concluded by speculating far into the future as to how online learning platforms will continue to evolve and how important education will continue to be. Video. The first day concluded with a duo of talks from JuanJo Ciarlante, the first about free software on clouds (video… it ran over, so it continues in the next link…) and a second that covered some basics around using Python to do data crunching, including some of the concepts around MapReduce-type jobs and Python-based libraries to accomplish them (video, which includes the conclusion of the cloud talk; the last half is about Python).

The evening was spent with several of my fellow speakers at La Bistecca. I certainly can’t say I haven’t been eating well while I’ve been here!

I also recommend reading Jose’s post about the first day, giving you a glimpse into the work he’s done to organize the conference here: UbuConLA 2015: The other side of things. Day 1.

And with that, we were on to day 2!

The day began at 10AM with a talk about Snappy by Sergio Schvezov. I was happy to have read a blog post by Ollie Ries earlier in the week that walked through all the Snappy/core/phone related names that have been floating around, but this talk went over several of the definitions again so I’m sure the audience was appreciative to get them straightened out. He brought along a BeagleBone and Ubuntu tablet that he did some demos on as he deployed Ubuntu Core and introduced Snapcraft for making Snappy packages. Video.

Following his talk was one by Luis Michael Ibarra about the Linux container hypervisor, LXD. I learned that LXD was an evolution of lxc-tools, and in his talk he dug through the filesystem and the system processes themselves to show how the containers he was launching worked. Unfortunately his talk was longer than his slot, so he didn’t get through all his carefully prepared slides; hopefully they’ll be published soon. Video.

Just prior to lunch, we enjoyed a talk by Sebastián Ferrari about Juju, where he went through the background of Juju, what it’s for and where it fits into the deployment and orchestration world. He gave demos of usage and of the web interface on both Amazon and Google Compute Engine. He also provided an introduction to the Juju Charm Store, where charms for various applications are shared, and pointed to the Juju documentation for folks looking to get started. Video.

After lunch the first talk was by Neyder Achahuanco, who talked about building a Computer Science curriculum for students using tools available in Ubuntu. He demonstrated Scratch, Juegos de Blockly (the Spanish version of Blockly Games, which is available in many languages; see the bottom right of the site) and MIT App Inventor. Video.

Break, with Ubuntu and Kubuntu stickers!

As the afternoon continued, Pedro Muñoz del Río spoke on using Ubuntu as a platform for data analysis. Video. The talks concluded with Alex Aragon, who gave an introduction to 3D animation with Blender, during which he played the delightful Monkaa film. He then talked about features and went through various settings. Video.

Gracias to all the organizers, attendees and folks who made me feel welcome. I had a wonderful time! As we left, I snagged a selfie with the flags flying outside the university. Why those flags? Jose picked them out upon learning which countries people would be flying in from; the stars and stripes were flying for me!

More photos from UbuConLA here:

by pleia2 at August 09, 2015 03:13 AM

August 07, 2015

Elizabeth Krumbach

Lima, dia uno y UbuConLA prep

Saying my Spanish is “weak” is being generous. I know lots of nouns and a smattering of verbs from “learning Spanish” in school, but it never quite stuck and I lacked the immersive experience that leads to actually learning a language. So I was very thankful to be spending yesterday with my friend and Ubuntu colleague José Antonio Rey as we navigated the city and picked up a SIM for my phone.

I’m staying at Hotel & Spa Golf Los Incas in Lima. Jose and his father were kind enough to meet me at the airport, late on Wednesday night when my flight came in. The hotel itself is a bit of a drive from the airport, but it’s not far from the university where the conference is being held today, an 8 minute Uber ride yesterday evening in brisk traffic. They offer a free shuttle to a nearby mall, where I met up with Jose come morning. The day kicked off by discovering that Lima has Dunkin’ Donuts, and I don’t (at home in San Francisco). Having already finished breakfast, I didn’t avail myself of the opportunity for a doughnut. We then searched the mall, waited in some lines, waited for processing and finally got a SIM for my phone! With the data plan along with it, I plan on taking lots of pictures of llamas when I reach Cusco and sharing them with everyone.

From the mall we took a bus down the main east-west avenue in Lima, Avenida Javier Prado, and then the Línea 1 del Metro de Lima, a train! The Metro goes north to south and was very speedy and new, if packed. We took it just a couple stops from La Cultura to Gamarra.

Gamarra is home to a shopping district with various open air markets and a lot of clothing and street food along the way. Our journey took us here to pick up the custom t-shirts that were printed for the staff and crew working the UbuConLA conference. The shirts look great.

It was then on to the train and bus again, which took us to Señor Limón for some amazing ceviche!

After lunch we went over to the Universidad de Lima to get a tour of the campus and see how things were coming together. Jose met up with several of his fellow conference planners as they tested audio, video and streaming, and sorted out all kinds of other logistics. We also picked up boxes of Ubuntu goodies from across campus and brought them over so setup of the tables could begin.

It was pretty fun to get a “behind the scenes” view of the pieces of the conference coming together. Huge thanks to everyone putting it together, it’s a real pleasure to be here.

My evening wound down at my hotel with a nice meal. At noon today I’ll be giving my keynote!

by pleia2 at August 07, 2015 02:40 PM

Meetup, baseball and kitties

I had fully intended to write this before sitting in a hotel in Peru, but pre-trip tasks crept up, I had last-minute things to finish with work (oh, leaving on a Wednesday!) and sitting on a plane all day is always much more exhausting than I expect it to be. So here we are!

Since returning from OSCON a couple weeks ago I’ve kept busy with other things, in addition to the continued work on my book. On the Thursday OSCON was still going on, I attended my first Write/Speak/Code SF & Bay Area event. It was a work evening where several women met up at a little eatery in SOMA, chatted about their work and each brought a project to work on. I had my keynote slides to perfect, and managed to do that and get them sent off to the friend who was translating them into Spanish. I also managed to talk about the work I’d been doing on my book and found a couple of people who may be interested in doing some review. It was also great to learn that some of them were interested in supporting Grace Hopper Conference speakers, and there may be an event in September to gather some of us who live and work in the area to support each other and practice and fine-tune our talks.

The following Monday MJ and I met at AT&T Park downtown to attend a Giants baseball game on Jewish Heritage Night. It had been a couple of years since I’d been to a Giants game (the season goes by so quickly!), so it was great to get to see a game again. Plus, the Kiddush cup they gave away as the special event gift now has a treasured spot in my home.

As the game began, I found myself sitting in front of the Rabbi for our congregation, who is a big baseball fan and is always fun to talk to about it. Since we bought tickets with other members we also found ourselves in the bleachers, which I’d never sat in before. It was a whole different angle and seating arrangement than I’m used to, but still lots of fun.

As an added bonus, it was a solid game that the Giants won. More photos from the game here:

In other “while I’m home” life news, I also started the sessions with a trainer at the gym. I have 5 paid for, and the first was a tour of pain, as advertised. I go running and have always been quite skilled at lifting things, but this trainer found muscles I’m not sure I’ve ever used. He also managed to put me in a state where it took me about 3 days to feel normal again, the first day of which I really struggled to walk down stairs! I’m sticking to it though and while I may ask him to tone it down slightly for my next session, I already have it on my schedule upon my return from Peru and the OpenStack Operators Meetup in Palo Alto.

I then spent a lot of time on work and getting some loose ends tied off for my book as I prepared for this trip to Peru. We’ve also had some vet visits interspersed as poor Simcoe has battled a bacterial problem that caused some eye trouble. Thankfully she was almost all healed up by the time I flew out on Wednesday and you can hardly tell there was an issue. Fortunately none of these troubles impacted her bouncy nature.

Or whatever is in their nature that makes them want to sleep on our suitcases.

Sitting on my suitcases aside, I already miss my fluffy critters, and am thankful that my husband is joining me on Sunday. Still, I’m super excited for UbuCon Latin America tomorrow!

by pleia2 at August 07, 2015 03:15 AM

July 31, 2015

Eric Hammond

AWS SNS Outage: Effects On The Unreliable Town Clock

It took a while, but the Unreliable Town Clock finally lived up to its name. Surprisingly, the fault was not mine, but Amazon’s.

For several hours tonight, a number of AWS services in us-east-1, including SNS, experienced elevated error rates according to the AWS status page.

Successful, timely chimes were broadcast through the Unreliable Town Clock public SNS topic up to and including:

2015-07-31 05:00 UTC

and successful chimes resumed again at:

2015-07-31 08:00 UTC

Chimes in between were mostly unpublished, though SNS appears to have delivered a few chimes during that period up to several hours late and out of order.

I had set up monitoring and alerting for the Unreliable Town Clock. This worked perfectly and I was notified within 1 minute of the first missed chime, though it turned out there was nothing I could do but wait for AWS to correct the underlying issue with SNS.

Since we now know SNS has the potential to fail in a region, I have launched an Unreliable Town Clock public SNS Topic in a second region: us-west-2. The infrastructure in each region is entirely independent.

The public SNS topic ARNs for both regions are listed at the top of this page:

You are welcome to subscribe to the public SNS topics in both regions to improve the reliability of invoking your scheduled functionality.

The SNS message content will indicate which region is generating the chime.
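
For anyone scripting the subscriptions, a minimal sketch with the AWS CLI might look like the following. The topic ARNs and the HTTPS endpoint below are placeholders for illustration only; substitute the real ARNs published for the service and whatever endpoint (or protocol) you actually use.

  # Sketch only: subscribe one HTTPS endpoint to the public chime topic in
  # each region. The ARNs here are placeholders, not the real topic ARNs.
  topics="arn:aws:sns:us-east-1:000000000000:unreliable-town-clock-topic
  arn:aws:sns:us-west-2:000000000000:unreliable-town-clock-topic"

  endpoint="https://example.com/chime-handler"   # your own HTTPS listener

  for topic in $topics; do
    region=$(echo "$topic" | cut -d: -f4)        # region is the 4th ARN field
    aws sns subscribe \
      --region "$region" \
      --topic-arn "$topic" \
      --protocol https \
      --notification-endpoint "$endpoint"
  done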

Original article and comments:

July 31, 2015 09:55 AM

July 30, 2015

Akkana Peck

A good week for critters

It's been a good week for unusual wildlife.

[Myotis bat hanging just outside the front door] We got a surprise a few nights ago when flipping the porch light on to take the trash out: a bat was clinging to the wall just outside the front door.

It was tiny, and very calm -- so motionless we feared it was dead. (I took advantage of this to run inside and grab the camera.) It didn't move at all while we were there. The trash mission accomplished, we turned out the light and left the bat alone. Happily, it wasn't ill or dead: it was gone a few hours later.

We see bats fairly regularly flying back and forth across the patio early on summer evenings -- insects are apparently attracted to the light visible through the windows from inside, and the bats follow the insects. But this was the first close look I'd had at a stationary bat, and my first chance to photograph one.

I'm not completely sure what sort of bat it is: almost certainly some species of Myotis (mouse-eared bats), and most likely M. yumanensis, the "little brown bat". It's hard to be sure, though, as there are at least six species of Myotis known in the area.

[Woodrat released from trap] We've had several woodrats recently try to set up house near the house or the engine compartment of our Rav4, so we've been setting traps regularly. Though woodrats are usually nocturnal, we caught one in broad daylight as it explored the area around our garden pond.

But the small patio outside the den seems to be a particular draw for them, maybe because it has a wooden deck with a nice dark space under it for a rat to hide. We have one who's been leaving offerings -- pine cones, twigs, leaves -- just outside the door (and less charming rat droppings nearby), so one night Dave set three traps all on that deck. I heard one trap clank shut in the middle of the night, but when I checked in the morning, two traps were sprung without any occupants and the third was still open.

But later that morning, I heard rattling from outside the door. Sure enough, the third trap was occupied and the occupant was darting between one end and the other, trying to get out. I told Dave we'd caught the rat, and we prepared to drive it out to the parkland where we've been releasing them.

[chipmunk caught in our rat trap] And then I picked up the trap, looked in -- and discovered it was a pretty funny looking woodrat. With a furry tail and stripes. A chipmunk! We've been so envious of the folks who live out on the canyon rim and are overloaded with chipmunks ... this is only the second time we've seen one here, and now it's probably too spooked to stick around.

We released it near the woodpile, but it ran off away from the house. Our only hope for its return is that it remembers the nice peanut butter snack it got here.

[Baby Great Plains skink] Later that day, we were on our way out the door, late for a meeting, when I spotted a small lizard in the den. (How did it get in?) Fast and lithe and purple-tailed, it skittered under the sofa as soon as it saw us heading its way.

But the den is a small room and the lizard had nowhere to go. After upending the sofa and moving a couple of tables, we cornered it by the door, and I was able to trap it in my hands without any damage to its tail.

When I let it go on the rocks outside, it calmed down immediately, giving me time to run for the camera. Its gorgeous purple tail doesn't show very well, but at least the photo was good enough to identify it as a juvenile Great Plains skink. The adults look more like Jabba the Hutt, nothing like the lovely little juvenile we saw. We actually saw an adult this spring (outside), when we were clearing out a thick weed patch and disturbed a skink from its hibernation. And how did this poor lizard get saddled with a scientific name of Eumeces obsoletus?

July 30, 2015 05:07 PM

July 27, 2015

Akkana Peck

Trackpad workarounds: using function keys as mouse buttons

I've had no end of trouble with my Asus 1015E's trackpad. A discussion of laptops on a mailing list -- in particular, someone's concerns that the nifty-looking Dell XPS 13, which is available preloaded with Linux, has had reviewers say that the trackpad doesn't work well -- reminded me that I'd never posted my final solution.

The Asus's trackpad has two problems. First, it's super sensitive to taps, so if any part of my hand gets anywhere near the trackpad while I'm typing, suddenly it sees a mouse click at some random point on the screen, and instead of typing into an emacs window suddenly I find I'm typing into a live IRC client. Or, worse, instead of typing my password into a password field, I'm typing it into IRC. That wouldn't have been so bad on the old style of trackpad, where I could just turn off taps altogether and use the hardware buttons; this is one of those new-style trackpads that doesn't have any actual buttons.

Second, two-finger taps don't work. Three-finger taps work just fine, but two-finger taps: well, I found when I wanted a right-click (which is what two-fingers was set up to do), I had to go TAP, TAP, TAP, TAP maybe ten or fifteen times before one of them would finally take. But by the time the menu came up, of course, I'd done another tap and that canceled the menu and I had to start over. Infuriating!

I struggled for many months with synclient's settings for tap sensitivity and right and left click emulation. I tried enabling syndaemon, which is supposed to disable clicks as long as you're typing then enable them again afterward, and spent months playing with its settings, but in order to get it to work at all, I had to set the timeout so long that there was an infuriating wait after I stopped typing before I could do anything.
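
For reference, a typical syndaemon invocation looks something like the line below; the idle time here is just an example value, not a recommendation.

  # Disable taps (-t: tapping and scrolling only, not pointer motion) while
  # typing, re-enabling them 1.0 seconds after the last keystroke.
  # -K ignores modifier+key combos, -R uses the XRecord extension,
  # -d runs it as a daemon.
  syndaemon -i 1.0 -t -K -R -d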

I was on the verge of giving up on the Asus and going back to my Dell Latitude 2120, which had an excellent trackpad (with buttons) and the world's greatest 10" laptop keyboard. (What the Dell doesn't have is battery life, and I really hated to give up the Asus's light weight and 8-hour battery life.) As a final, desperate option, I decided to disable taps completely.

Disable taps? Then how do you do a mouse click?

I theorized, with all Linux's flexibility, there must be some way to get function keys to work like mouse buttons. And indeed there is. The easiest way seemed to be to use xmodmap (strange to find xmodmap being the simplest anything, but there you go). It turns out that a simple line like

  xmodmap -e "keysym F1 = Pointer_Button1"
is most of what you need. But to make it work, you need to enable "mouse keys":
  xkbset m

But for reasons unknown, mouse keys will expire after some set timeout unless you explicitly tell it not to. Do that like this:

  xkbset exp =m

Once that's all set up, you can disable single-finger taps with synclient:

  synclient TapButton1=0
Of course, you can disable 2-finger and 3-finger taps by setting them to 0 as well. I don't generally find them a problem (they don't work reliably, but they don't fire on their own either), so I left them enabled.

I tried it and it worked beautifully for left click. Since I was still having trouble with that two-finger tap for right click, I put that on a function key too, and added middle click while I was at it. I don't use function keys much, so devoting three function keys to mouse buttons wasn't really a problem.

In fact, it worked so well that I decided it would be handy to have an additional set of mouse keys over on the other side of the keyboard, to make it easy to do mouse clicks with either hand. So I defined F1, F2 and F3 as one set of mouse buttons, and F10, F11 and F12 as another.

And yes, this all probably sounds nutty as heck. But it really is a nice laptop aside from the trackpad from hell; and although I thought Fn-key mouse buttons would be highly inconvenient, it took surprisingly little time to get used to them.

So this is what I ended up putting in my .config/openbox/autostart file. I wrap it in a test for hostname, since I like to be able to use the same configuration file on multiple machines, but I don't need this hack on any machine but the Asus.

if [ $(hostname) == iridum ]; then
  # Disable single-finger taps; map two- and three-finger taps to
  # right and middle click, and keep horizontal edge scrolling.
  synclient TapButton1=0 TapButton2=3 TapButton3=2 HorizEdgeScroll=1

  # Left, middle and right mouse buttons on F1/F2/F3 ...
  xmodmap -e "keysym F1 = Pointer_Button1"
  xmodmap -e "keysym F2 = Pointer_Button2"
  xmodmap -e "keysym F3 = Pointer_Button3"

  # ... and a second set on F10/F11/F12 for the other hand.
  xmodmap -e "keysym F10 = Pointer_Button1"
  xmodmap -e "keysym F11 = Pointer_Button2"
  xmodmap -e "keysym F12 = Pointer_Button3"

  # Enable mouse keys and keep the setting from expiring.
  xkbset m
  xkbset exp =m
fi

July 27, 2015 02:54 AM

July 25, 2015

Elizabeth Krumbach

OSCON 2015

Following the Community Leadership Summit (CLS), which I wrote about here, I spent a couple of days at OSCON.

Monday kicked off by attending Jono Bacon’s Community Leadership workshop. I attended one of these a couple of years ago, so it was really interesting to see how his advice has evolved with the changes in tooling and the ways communities in tech and beyond have changed. I took a lot of notes, but everything I wanted to say here has been summarized by others in a series of great posts.

…hopefully no one else went to Powell’s to pick up the recommended books, because I cleared them out of a couple of them.

That afternoon Jono joined David Planella of the Community Team at Canonical and Michael Hall, Laura Czajkowski and I of the Ubuntu Community Council to look through our CLS notes and come up with some talking points to discuss with the rest of the Ubuntu community regarding everything from in person events (stronger centralized support of regional Ubucons needed?) to learning what inspires people about the active Ubuntu phone community and how we can make them feel more included in the broader community (and helping them become leaders!). There was also some interesting discussion around the Open Source projects managed by Canonical and expectations for community members with regard to where they can get involved. There are some projects where part time, community contributors are wanted and welcome, and others where it’s simply not realistic due to a variety of factors, from the desire for in-person collaboration (a lot of design and UI stuff) to the new projects with an exceptionally fast pace of development that makes it harder for part time contributors (right now I’m thinking anything related to Snappy). There are improvements that Canonical can make so that even these projects are more welcoming, but adjusting expectations about where contributions are most needed and wanted would be valuable to me. I’m looking forward to discussing these topics and more with the broader Ubuntu community.

Laura, David, Michael, Lyz

Monday night we invited members of the Oregon LoCo out and had an Out of Towners Dinner at Altabira City Tavern, the restaurant on top of the Hotel Eastlund where several of us were staying. Unfortunately the local Kubuntu folks had already cleared out of town for Akademy in Spain, but we were able to meet up with long-time Ubuntu member Dan Trevino, who used to be part of the Florida LoCo with Michael, and who I last saw at Google I/O last year. I enjoyed great food and company.

I wasn’t speaking at OSCON this year, so I attended with an Expo pass and after an amazing breakfast at Mother’s Bistro in downtown Portland with Laura, David and Michael (…and another quick stop at Powell’s), I spent Tuesday afternoon hanging out with various friends who were also attending OSCON. When 5PM rolled around the actual expo hall itself opened, and surprised me with how massive and expensive some of the company booths had become. My last OSCON was in 2013 and I don’t remember the expo hall being quite so extravagant. We’ve sure come a long way.

Still, my favorite part of the expo hall is always the non-profit/open source project/organization area where the more grass-roots tables are. I was able to chat with several people who are really passionate about what they do. As a former Linux Users Group organizer and someone who still does a lot of open source work for free as a hobby, these are my people.

Wednesday was my last morning at OSCON. I did another walk around the expo hall and chatted with several people. I also went by the HP booth and got a picture of myself… with myself. I remain very happy that HP continues to support my career in a way that allows me to work on really interesting open source infrastructure stuff and to travel the world to tell people about it.

My flight took me home Wednesday afternoon and with that my OSCON adventure for 2015 came to a close!

More OSCON and general Portland photos here:

by pleia2 at July 25, 2015 12:27 AM

July 24, 2015

iheartubuntu

Linux Lite for older computers

At work I use several older desktops for various functions. By "older" I mean 2006 or so :) One system is used primarily for the internet if a customer needs internet access, another is set up with a cheap live webcam to monitor the outdoor premises, and so on.

In looking for an easy-to-install OS that is Ubuntu/Debian based, I have had MUCH success with Linux Lite. Linux Lite is a beginner-friendly Linux distribution based on Ubuntu 14.04 LTS and featuring the Xfce desktop.

Linux Lite is delightfully lightweight and runs fast and responsive on our old computers, which are single-core 3.0GHz Pentium 4s with 2GB of memory. I have had problems in the past with graphics while installing Xubuntu or Lubuntu, but not so with Linux Lite.

A ton of time is also saved with pre-installed programs like VLC, LibreOffice, GIMP, Firefox, Steam, and Thunderbird.

It also has its own built-in program called Lite Software, which makes it super easy to install other useful apps, including: the Chrome browser, Chromium browser, Dropbox, Ubuntu Games Pack, Pidgin chat, Google Talk plugin, Java, KeePassX password manager, PlayOnLinux for Windows games, Ubuntu Restricted Extras, Skype, TeamViewer, Deluge torrent app, OpenShot video editor, VirtualBox, and XBMC.

If you have older computers and other distros are not working out for you, definitely give Linux Lite a try!

by iheartubuntu at July 24, 2015 12:43 AM

July 21, 2015

Elizabeth Krumbach

Community Leadership Summit 2015

My Saturday kicked off with the Community Leadership Summit (CLS) here in Portland, Oregon.

CLS sign

Jono Bacon opened the event by talking about the growth of communities in the past several years as internet-connected communities of all kinds are springing up worldwide. Though this near-OSCON CLS is open source project heavy, he talked about communities that range from the Maker movement to political revolutions. While we work to develop best practices for all kinds of communities, it was nice to hear one of his key thoughts as we move forward in community building: “Community is not an extension of the Marketing department.”

The day continued with a series of plenaries, which were 15 minutes long and touched upon topics like empathy, authenticity and vulnerability in community management roles. The talks wrapped up with a Facilitation 101 talk to give tips on how to run the unconference sessions. We then did the session proposals and scheduling that would pick up after lunch.

CLS schedule

As mentioned in my earlier post we had some discussion points from our experiences in the Ubuntu community that we wanted to get feedback on from the broader leadership community so we proposed 4 sessions that lasted the afternoon.

Lack of new generation of leaders

The root of this session came from our current struggle in the Ubuntu community to find leaders, from those who wish to sit on councils and boards to leaders for the LoCo teams. In addition to several people who expressed similar problems in their own communities, there was some fantastic feedback from folks who attended, including:

  • Some folks don’t see themselves as “leaders,” so using that word can be intimidating; if you find this is the case, shift to using different types of titles that do more to describe the role they are taking.
  • Document tasks that you do as a leader and slowly hand them off to people in your community to build a supportive group of people who know the ins and outs and can take a leadership role in the future.
  • Evaluate your community every few years to determine whether your leadership structure still makes sense, and make changes with every generation of community leaders if needed (and it often is!).
  • If you’re seeking to get more contributions from people who are employed to do open source, you may need to engage their managers to prioritize appropriately. Also, make sure credit is given to companies who are paying employees to contribute.
  • Set a clear set of responsibilities and expectations for leadership positions so people understand the role, commitment level and expectations of them.
  • Actively promote people who are doing good work, whether by expressing thanks on social media, in blog posts and whatever other communications methods you employ, as well as inviting them to speak at other events, fund them to attend events and directly engage them. This will all serve to build satisfaction and their social capital in the community.
  • Casually mentor aspiring leaders, handing projects over to them once they’ve begun to grow and understand the steps required.

Making lasting friendships that are bigger than the project

This was an interesting session, proposed because many of us built strong relationships with people early on in Ubuntu but have noticed fewer of those developing in the past few years. Many of us have friendships that have lasted even as people left the project, or even left the tech industry entirely; for us Ubuntu wasn’t just an open source project, we were all building lasting relationships.

Recommendations included:

  • In person events are hugely valuable to this (what we used to get from Ubuntu Developer Summits). Empower local communities to host major events.
  • Find a way to have discussions that are not directly related to the project with your fellow project members, including creating a space where there’s a weekly topic, giving a space to share accomplishments, and perhaps not lumping it all together (some new off-topic threads on Discourse?)
  • Provide a space to have check-ins with members of and teams in your community, how is life going? Do you have the resources you need?
  • Remember that tangential interests are what bring people together on a personal level and seek to facilitate that

There was also some interesting discussion around handling contributors whose behavior has become disruptive (often due to personal things that have come up in their life), from making sure a Code of Conduct is in place to set expectations for behavior to approaching people directly to check in to make sure they’re doing all right and to discuss the change in their behavior.

Declining Community Participation

We proposed this session because we’ve seen a decline in community participation since before the Ubuntu Developer Summits ceased. We spent some time framing this problem in the space it’s in, with many Linux distributions and “core” components seeing similar decline and disinterest in involvement. It was also noted that when a project works well, people are less inclined to help because they don’t need to fix things, which may certainly be the case with a product like the Ubuntu server. In this vein, it was noted that 10 years ago the contributor to user ratio was much higher, since many people who used it got involved in order to file bugs and collaborate to fix things.

Some of the recommendations that came out of this session:

  • Host contests and special events to showcase new technologies to get people excited about involvement (made me think of Xubuntu testing with XMir, we had a lot of people testing it because it was an interesting new thing!)
  • In one company, the co-founder set a community expectation for companies who were making money from the product to give back 5% in development (or community management, or community support).
  • Put a new spin on having your code reviewed: it’s constructive criticism from programmers with a high level of expertise, you’re getting training while they chime in on reviews. Note that the community must have a solid code review community that knows how to help people and be kind to them in reviews.
  • Look at bright spots in your community and recreate them: Where has the community grown? (Ubuntu Phone) How can you bring excitement there to other parts of your project? Who are your existing contributors in the areas where you’ve seen a decline and how can you find more contributors like them?
  • Share stories about how your existing members got involved so that new contributors see a solid on-ramp for themselves, and know that everyone started somewhere.
  • Make sure you have clear, well-defined on-ramps for various parts of your project, it was noted that Mozilla does a very good job with this (Ubuntu does use Mozilla’s Asknot, but it’s hard to find!).

Barriers related to single-vendor control and development of a project

This session came about because of the obvious control that Canonical has in the direction of the Ubuntu project. We sought advice from other communities where there is single-vendor control. Perhaps unfortunately, the session trended heavily toward Ubuntu specifically, but we were able to get some feedback from other communities on how they handle decisions made in an ecosystem with both paid and volunteer contributors:

  • Decisions should happen in a public, organized space (not just an IRC log, Google Hangout or in person discussion, even if these things are made public). Some communities have used: Github repo, mailing list threads, Request For Comment system to gather feedback and discuss it.
  • Provide a space where community members can submit proposals that the development community can take seriously (we used to have a site for this, but it wound down over the years and became less valuable).
  • Make sure the company counts contributions as real, tangible things that should be considered for monetary value (non-profits already do this for their volunteers).
  • Make sure the company understands the motivation of community members so they don’t accidentally undermine this.
  • Evaluate expectations in the community, are there some things the company won’t budge on? Are they honest about this and do they make this clear before community members make an investment? Ambiguity hurts the community.

I’m really excited to have further discussions in the Ubuntu community about how these insights can help us. Once I’m home I’ll be able to collect my thoughts and bring ideas, and perhaps even action items, to the ubuntu-community-team mailing list (which everyone is welcome to participate in).

This first day concluded with a feedback session for the summit itself, which brought up some great points. On to day two!

As with day one, we began the day with a series of plenaries. The first was presented by Richard Millington, who talked about 10 “Social Psychology Hacks” you can use to increase participation in your community. These included “priming,” or using existing associations to encourage certain feelings, making sure you craft the story of your community, designing community rituals to make people feel included, and using existing contributors to gain more through referrals. It was then time for Laura Czajkowski’s talk about “Making your Marketing team happy”. My biggest take-away from this one was that not only has she learned to use the tools the marketing team uses, but she now attends their meetings so she can stay informed of their projects and chime in when a suggestion has been made that may cause disruption (or worse!) in the community. Henrik Ingo then gave a talk where he analyzed the governance types of many open source projects. He found that all the “extra large” projects, developer- and commit-wise, were run by a foundation, and that there seemed to be a limit to how big single-vendor controlled projects could get. I had suspected this was the case, but it was wonderful to have his data to back up my suspicions. Finally, Gina Likins of Red Hat spoke about her work to get universities and open source projects working together. She began her talk by explaining how few college Computer Science majors are familiar with open source, and suggested that a kind of “dating site” be created to match up open source projects with professors looking to get their students involved. Brilliant! I attended her session related to it later in the afternoon.

My afternoon was spent first by joining Gina and others to talk about relationships between university professors and open source communities. Her team runs a project devoted to this, and it turns out I subscribed to their mailing list some time ago. She outlined several goals, from getting students familiar with open source tooling (IRC, mailing lists, revision control, bug trackers) all the way up to more active roles directly in open source projects where the students are submitting patches. I’m really excited to see where this goes and hope I can some day participate in working with some students beyond the direct mentoring through internships that I’m doing now.

Aside from substantial “hallway track” time where I got to catch up with some old friends and meet some new people, I went to a session on having open and close-knit communities, where people talked about various things: reaching out to people when they disappear, the importance of conduct standards (and swift enforcement), and going out of your way to participate in discussions kicked off by newcomers in order to make them feel included. The last session I went to shared tips for organizing local communities, and drew from the off-line community organizing that has happened in the past. Suggestions for increasing participation in your group included cross-promotion of groups (either through sharing announcements or doing some joint meetups), not letting volunteers burn out or feel taken for granted, and making sure you’re not tolerating poisonous people in your community.

The Community Leadership Summit concluded with a Question and Answer session. Many people really liked the format, keeping the morning pretty much confined to the set presentations and setting up the schedule, allowing us to take a 90 minute lunch (off-site) and come back to spend the whole afternoon in sessions. In all, I was really pleased with the event, kudos to all the organizers!

by pleia2 at July 21, 2015 05:10 AM

July 20, 2015

Eric Hammond

Countdown Timer Microservice Built On Amazon API Gateway and AWS Lambda

A deceptively simple web service with super powers

This is a fully functional, fully scalable microservice built on the just-released Amazon API Gateway and increasingly popular AWS Lambda platforms. It is a public web service that maintains a practically unlimited number of countdown timers with one second resolution and no practical limit to the number of seconds each timer can run.

New timers can be created on a whim and each timer can be reset at any time to any number of seconds desired, whether it is still running or has already expired.


Let’s begin with an example to demonstrate the elegant simplicity of the interface.

1. Set timer - Any request of the following URL sets a timer named “YOURTIMERNAME” to start counting down immediately from 60 seconds:

You may click on that link now, or hit a URL of the same format with your own timer name and your chosen number of seconds. You may use a browser, a command like curl, or your favorite programming language.

2. Poll timer - The following URL requests the status of the above timer. Note that the only difference in the URL is that we have dropped the seconds count.

If the named timer is still running, the service will return HTTP status code 200 OK, along with a JSON structure containing information like how many seconds are left.

If the timer has expired, the service will return HTTP status code 504 Timeout.

That’s it!

No, really. That’s the entire API.
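
As a quick sketch from the shell, setting and then polling a timer looks roughly like this; BASE is a placeholder for the service’s base URL and TIMER is whatever name you picked.

  # BASE is a placeholder for the service's base URL; substitute the real one.
  BASE="https://example.invalid"
  TIMER="YOURTIMERNAME"

  # Set (or reset) the timer to 60 seconds.
  curl -s "$BASE/$TIMER/60" > /dev/null

  # Poll the timer: with -f, curl exits non-zero on the 504 expired response.
  if curl -sf "$BASE/$TIMER" > /dev/null; then
    echo "timer still running"
  else
    echo "timer expired (or the request failed)"
  fi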

And the whole service is implemented in about 60 lines of code, on top of a handful of powerful infrastructure services managed, protected, maintained, and scaled by Amazon.

Not Included

The service does not perform any action when a timer expires. The timer should be polled to find out if it has expired.

On first thought, this may cause you to wonder if this service might, in fact, be completely useless. Instead of polling, why not just have your code keep its own timer records or look at a clock and see if it’s time yet?

The answer is that this service is not created for situations where you can depend on your own code to be running and keeping track of things. It is designed for integration with existing third-party software packages and services that already support a polling mechanism, but do not implement timers.

For example…

Event Monitoring

There are many types of monitoring software packages and free/commercial services that poll resources to see if they are healthy and alert you if there is a problem, but they have no way to alert you if an expected event does not occur. For example, you may want to ensure that a batch job runs every hour, or a message is posted to an SNS topic at least every 15 minutes.

The service can be the glue between the existing events you wish to monitor and your existing monitoring system. Here’s how it works:

1. Set timer - When your event runs, have it hit the set-timer URL to reset the timer. In the URL, specify the name of the timer and the number of seconds after which your monitoring system should consider it a problem if no further event has run.

2. Poll timer - Add the polling URL for the same timer to your monitoring software, configuring it to alert you if the web request returns anything but success.

If your events keep resetting the timer before the timer expires, your monitoring system will stay happy and quiet, as the polling URL will always return success.

If the monitoring system polls the timer when no event has run in the specified number of seconds, then alarms sound, you will be woken up, and you can investigate why your batch job did not run on its expected schedule.

This is all possible using your existing monitoring system’s standard web check service, without any additional plugins or feature development.
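
If your monitoring system takes an arbitrary check command rather than a URL, a Nagios-style wrapper only needs a few lines. Again, the base URL here is a placeholder, and check_timer.sh is just a name I made up for the sketch.

  #!/bin/bash
  # check_timer.sh TIMERNAME -- exit 0 (OK) while the timer is running,
  # exit 2 (CRITICAL) once it has expired or can't be reached.
  BASE="https://example.invalid"   # placeholder for the service's base URL
  TIMER="$1"

  status=$(curl -s -o /dev/null -w '%{http_code}' "$BASE/$TIMER")
  if [ "$status" = "200" ]; then
    echo "OK - timer $TIMER still running"
    exit 0
  else
    echo "CRITICAL - timer $TIMER expired or unreachable (HTTP $status)"
    exit 2
  fi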

Naming

The service has no registration, no authentication, and no authorization. If you don’t want somebody else resetting your timer accidentally or on purpose, you should pick a timer name that is unguessable, even with brute force.

For example:

# A sensible timer name with some unguessable random bits
timer=$(pwgen -s 22 1)
echo $timer

# (OR)
timer=$(uuid -v4 -FSIV)
echo $timer

# Set the timer to 1 hour
seconds=3600
curl -s $timer/$seconds | jq .

# Check the timer
curl -s $timer | jq .

Cron Jobs

Say I have a cron job that runs once an hour. I don’t mind if it fails to complete successfully once, but if it fails to check in twice in a row, I want to be alerted.

This example will use a random number for the timer name. You should generate your own unique timer names (see previous section).

Here’s a sample crontab entry that runs my job, then resets the countdown timer:

0 * * * * $HOME/bin/create-snapshots && curl -s >/dev/null

The timer is being reset at the end of each job to 8100 seconds, which is two hours plus 15 minutes. The extra minutes give the hourly cron job some extra time to complete before we start sounding alarms.

All that’s left is to add the monitor poll URL to my monitoring service:


Though you can ignore the response content from the web service, here are samples of what it returns.

If the timer has not yet expired because your events are running on schedule and resetting the countdown, then the monitoring URL returns a 200 success code along with the current state of the timer. This includes things like when the timer set URL was last requested, and how many seconds remain before the timer goes into an error state.

  "timer": "YOURTIMERNAME",
  "request_id": "501abe10-2dad-11e5-80c1-35cdcb449e41",
  "status": "ok",
  "now": 1437265810,
  "start_time": 1437265767,
  "start_seconds": 60,
  "seconds_elapsed": 43,
  "seconds_remaining": 17,
  "message": "Timer still running"

If the timer has expired and no event has been run to reset it, then the monitor URL returns a 504 timeout error code and an error message. Once I figure out how to get the API Manager to return both an error code and some JSON content, I will expand this to include more details about when the timer expired.

  "errorMessage": "504: timer timed out"

When you call the event URL, passing in the number of seconds for resetting the timer, the API returns the previous state of the timer (as in the first example above) along with a note that it has set the new values.

  "timer": "YOURTIMERNAME",
  "request_id": "36a764b6-2dad-11e5-9318-f3b076dd2a3a",
  "status": "ok",
  "now": 1437265767,
  "start_time": 1437263674,
  "start_seconds": 60,
  "seconds_elapsed": 2093,
  "seconds_remaining": -2033,
  "message": "Timer countdown updated",
  "new_start_time": 1437265767,
  "new_start_seconds": 60

If this is the first time you have set the particular timer, the previous state keys will be missing.


There are none. This is a free public service intended, but not guaranteed, to be useful. It may return unexpected results. At any time and with no warning, it may become unavailable for short periods or forever.

Terms of Use

Don’t use the service in an abusive manner. If you are unsure whether your use case might be considered abusive, ask.


I am not aware of any services that operate the same way, with the ability to add dead man switch features to existing polling-based monitoring services, but here are a few services that are targeted specifically at event monitoring.

What do you use for monitoring and alerting? Are you using monitoring to make sure scheduled events are not missed?

Original article and comments:

July 20, 2015 11:26 AM

Akkana Peck

Plugging in those darned USB cables

I'm sure I'm not the only one who's forever trying to plug in a USB cable only to find it upside down. And then I flip it and try it the other way, and that doesn't work either, so I go back to the first side, until I finally get it plugged in, because there's no easy way to tell visually which way the plug is supposed to go.

It's true of nearly all of the umpteen variants of USB plug: almost all of them differ only subtly from the top side to the bottom.

[USB trident] And to "fix" this, USB cables are built so that they have subtly raised indentations which, if you hold them to the light just right so you can see the shadows, say "USB" or have the little USB trident on the top side:

In an art store a few weeks ago, Dave had a good idea.

[USB cables painted for orientation] He bought a white paint marker, and we've used it to paint the logo side of all our USB cables.

Tape the cables down on the desk -- so they don't flop around while the paint is drying -- and apply a few dabs of white paint to the logo area of each connector. If you're careful you might be able to fill in the lowered part so the raised USB symbol stays black; or to paint only the raised USB part. I tried that on a few cables, but after the fifth or so cable I stopped worrying about whether I was ending up with a pretty USB symbol and just started dabbing paint wherever was handy.

The paint really does make a big difference. It's much easier now to plug in USB cables, especially micro USB, and I never go through that "flip it over several times" dance any more.

July 20, 2015 02:37 AM

July 18, 2015

Elizabeth Krumbach

SF activities and arrival in Portland, OR

Time at home in San Francisco came to an end this week with a flight to Portland, OR on Friday for some open source gatherings around OSCON. This ended my nearly 2 months without getting on a plane, the longest stretch I’ve gone in over 2 years. My initial intention with this time was to spend a lot of time on my book, which I have, but not nearly as much as I’d hoped because the work and creativity required isn’t something you can just turn on and off. It was nice getting to spend so much time with my husband though, and the kitties. The stretch at home also led me to join a gym again (I’d canceled my last month to month membership when a stretch of travel had me gone for over a month). Upon my return next week I have my first of four sessions with a trainer at the gym scheduled.

While I haven’t exactly had a full social calendar of late, I have been able to go to a few events. Last Wednesday I hosted an Ubuntu Hour and Bay Area Debian Dinner in San Francisco.

The day after, SwiftStack hosted probably the only OpenStack 5th birthday party I’ll be able to attend this year (leaving before the OSCON one, will be in Peru for the HP one!). I got to see some familiar faces, meet some Swift developers and eat some OpenStack cake.

MJ had a friend in town last week too, which meant I had a lot of time to myself. In the spirit of not having to worry about my own meals during this time, I cooked up a pot of beef stew to enjoy through the week and learned quickly that I should have frozen at least half of it. Even a modest pot of stew is much more than I can eat by myself over the course of a week. I did enjoy it though; some day I’ll learn about spices so I can make one that’s not so bland.

I’ve also been running again, after a bit of a hiatus following the trip to Vancouver. Fortunately I didn’t lose much ground stamina-wise and was mostly able to pick up where I left off. It has been warmer than normal in San Francisco these past couple weeks though, so I’ve been playing around with the times of my runs, with early evenings as soon as the fog/coolness rolls in currently the winning time slot during the week. Sunday morning runs have been great too.

This week I made it out to a San Francisco DevOps meetup where Tom Limoncelli was giving a talk inspired by some of the less intuitive points in his book The Practice of Cloud Systems Administration. In addition to seeing Tom, it was nice to meet up with some of my local DevOps friends who I haven’t managed to connect with lately and meet some new people.

I had a busy week at home before my trip to Portland. Upon settling in to the hotel I’m staying at, I met up with my friend and fellow Ubuntu Community Council member Laura Czajkowski. We took the metro over the bridge to downtown Portland, and on the way she showed off her Ubuntu phone and its photo app by taking a selfie of us together!

Since it was Laura’s first time in Portland, our first stop downtown was to Voodoo Doughnuts! I got my jelly-filled voodoo guy doughnut.

From there we made our way to Powell’s Books, where we spent the rest of the afternoon, as you do with Powell’s. I picked up 3 books and learned that Powell’s Technical Books (Powell’s 2) has been absorbed into the big store, which was a little sad for me; it was fun to go to the store that had just science, transportation and engineering books. Still, it was a fun visit and I always enjoy introducing someone new to the store.

Then we headed back across the river to meet up with people for the Community Leadership Summit informal gathering event at the Double Tree. We had a really enjoyable time; I got to see Michael Hall of the Ubuntu Community Council and David Planella of the Community Team at Canonical, and we caught up with each other and chatted about Ubuntu things. Plus, I ran into people I know from the broader open source community. As an introvert, it was one of the more energizing social events I’ve been to in a long time.

Today the Community Leadership Summit that I’m in town for kicks off! Looking forward to some great discussions.

by pleia2 at July 18, 2015 03:17 PM

July 16, 2015

Elizabeth Krumbach

Ubuntu at the upcoming Community Leadership Summit

This weekend I have the opportunity to attend the Community Leadership Summit. While there, I’ll be able to take advantage of an opportunity that’s rare now: meeting up with my fellow Ubuntu Community Council members Laura Czajkowski and Michael Hall, along with David Planella of the community team at Canonical. At the Community Council meeting today, I was able to work with David on narrowing down a few topics that impact us and we think would be of interest to other communities and we’ll propose for discussion at CLS:

  1. Declining participation
  2. Community cohesion
  3. Barriers related to [the perception of] company-driven control and development
  4. Lack of a new generation of leaders

As it’s an unconference, we’ll be submitting these ideas for discussion and we’ll see how many of them gain the interest of enough people to warrant a session.


Community Leadership Summit 2015

Since we’ll all be together, we also managed to arrange some time together on Monday afternoon and Tuesday to talk about how these challenges impact Ubuntu specifically and get to any of the topics mentioned above that weren’t selected for discussion at CLS itself. By the end of this in person gathering we hope to have some action items, or at least some solidified talking points and ideas to bring to the ubuntu-community-team mailing list. I’ll also be doing a follow-up blog post where I share some of my takeaways.

What I need from you:

If you’re attending CLS, join us for the discussions! If you just happen to be in the area for OSCON in general, feel free to reach out to me by email to have a chat while I’m in town. I fly home Wednesday afternoon.

If you can’t attend CLS but are interested in these discussions, chime in on the ubuntu-community-team thread or send a message to the Community Council mailing list (community-council) with your feedback and we’ll work to incorporate it into the sessions. You’re also welcome to contact me directly and I’ll pass things along (anonymously if you’d like, just let me know).

Finally, a reminder that this time together is not a panacea. These are complicated concerns in our community that will not be solved over a weekend and a few members of the Ubuntu Community Council won’t be able to solve them alone. Like many of you, I’m a volunteer who cares about the Ubuntu community and am doing my best to find the best way forward. Please keep this in mind as you bring concerns to us. We’re all on the same team here.

by pleia2 at July 16, 2015 06:59 PM

July 15, 2015

Eric Hammond

Simple New Web Service: Testers Requested

Interested in adding scheduled job monitoring (dead man’s switch) to the existing monitoring and alerting framework you are already using (Nagios, Sensu, Zenoss, Zabbix, Monit, Pingdom, Montastic, Ruxit, and the like)?

Last month I wrote about how I monitor scheduled events, with an example using an SNS topic and AWS Lambda.

This week I spent a few hours building a simple web service that enables any polling-based monitoring software or service to automatically support alerting when a target event has not occurred within a desired timeframe.

The new web service is built on infrastructure technologies that are reliably maintained and scaled by Amazon:

  • API Gateway
  • AWS Lambda
  • DynamoDB
  • CloudFront
  • Route53
  • CloudWatch

The source code is about a page long and the web service API is as trivial as it gets; but the functionality it adds to monitoring services is quite powerful and hugely scalable.

Integration requires these simple steps:

Step 1: There is no step one! There is no registration, no setup, and no configuration of the new web service for your use.

Step 2: Hit one URL when your target event occurs.

Step 3: Tell your existing monitoring system to poll another URL and to alert you when it fails.

Result: When your scheduled task misses an appointment and doesn’t check in, the second URL monitored by your software will start returning a failure code, and you will be alerted.
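
One way to handle step 2 without touching the job itself is a small wrapper script along these lines; run-and-checkin.sh and the check-in URL are made-up names for the sketch, so substitute the real set-timer URL for your own timer.

  #!/bin/bash
  # run-and-checkin.sh -- run a command and, only if it succeeds, hit the
  # check-in (set-timer) URL so the polled URL keeps returning success.
  CHECKIN_URL="https://example.invalid/YOURTIMERNAME/5400"   # placeholder

  "$@" && curl -s "$CHECKIN_URL" > /dev/null

A crontab entry like "0 * * * * $HOME/bin/run-and-checkin.sh your-job-command" then keeps the timer fed as long as the job keeps succeeding.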


I’m still working on the blog post to introduce the web service, but would love to have some folks test it out this week and give feedback.

If you are interested, drop me an email and mention:

  • The monitoring/alerting frameworks you currently use

  • The type of scheduled activities you would like to monitor (cron job, SNS topic, Lambda function, web page view, email receipt, …)

  • The frequency of the target events (every 10 seconds, every 10 years, …)

Even if you don’t want to do testing this week, I’d love to hear your answers to the above three points, through email or in the comments below.

Original article and comments:

July 15, 2015 04:54 AM

July 14, 2015

Akkana Peck

Hummingbird Quidditch!

[rufous hummingbird] After months of at most one hummingbird at the feeders every 15 minutes or so, yesterday afternoon the hummingbirds here all suddenly went crazy. Since then, my patio looks like a tiny Battle of Britain. There are at least four males involved in the fighting, plus a couple of females who sneak in to steal a sip whenever the principals retreat for a moment.

I posted that to the local birding list and someone came up with a better comparison: "it looks like a Quidditch game on the back porch". Perfect! And someone else compared the hummer guarding the feeder to "an avid fan at Wimbledon", referring to the way his head keeps flicking back and forth between the two feeders under his control.

Last year I never saw anything like this. There was a week or so at the very end of summer where I'd occasionally see three hummingbirds contending at the very end of the day for their bedtime snack, but no more than that. I think putting out more feeders has a lot to do with it.

All the dogfighting (or quidditch) is amazing to watch, and to listen to. But I have to wonder how these little guys manage to survive when they spend all their time helicoptering after each other and no time actually eating. Not to mention the way the males chase females away from the food when the females need to be taking care of chicks.

[calliope hummingbird]

I know there's a rufous hummingbird (shown above) and a broad-tailed hummingbird -- the broad-tailed makes a whistling sound with his wings as he dives in for the attack. I know there's a black-chinned hummer around because I saw his characteristic tail-waggle as he used the feeder outside the nook a few days before the real combat started. But I didn't realize until I checked my photos this morning that one of the combatants is a calliope hummingbird. They're usually the latest to arrive, and the rarest. I hadn't realized we had any calliopes yet this year, so I was very happy to see the male's throat streamers when I looked at the photo. So all four of the species we'd normally expect to see here in northern New Mexico are represented.

I've always envied places that have a row of feeders and dozens of hummingbirds all vying for position. But I would put out two feeders and never see them both occupied at once -- one male always keeps an eye on both feeders and drives away all competitors, including females -- so putting out a third feeder seemed pointless. But late last year I decided to try something new: put out more feeders, but make sure some of them are around the corner hidden from the main feeders. Then one tyrant can't watch them all, and other hummers can establish a beachhead.

It seems to be working: at least, we have a lot more activity so far than last year, even though I never seem to see any hummers at the fourth feeder, hidden up near the bedroom. Maybe I need to move that one; and I just bought a fifth, so I'll try putting that somewhere on the other side of the house and see how it affects the feeders on the patio.

I still don't have dozens of hummingbirds like some places have (the Sopaipilla Factory restaurant in Pojoaque is the best place I've seen around here to watch hummingbirds). But I'm making progress.

July 14, 2015 06:45 PM

July 09, 2015

Akkana Peck

Taming annoyances in the new Google Maps

For a year or so, I've been appending "output=classic" to any Google Maps URL. But Google disabled Classic mode last month. (There have been a few other ways to get classic Google maps back, but Google is gradually disabling them one by one.)

I have basically three problems with the new maps:

  1. If you search for something, the screen is taken up by a huge box showing you what you searched for; if you click the "x" to dismiss the huge box so you can see the map underneath, the box disappears but so does the pin showing your search target.
  2. A big swath at the bottom of the screen is taken up by a filmstrip of photos from the location, and it's an extra click to dismiss that.
  3. Moving or zooming the map is very, very slow: it relies on OpenGL support in the browser, which doesn't work well on Linux in general, or on a lot of graphics cards on any platform.

Now that I don't have the "classic" option any more, I've had to find ways around the problems -- either that, or switch to Bing maps. Here's how to make the maps usable in Firefox.

First, for the slowness: the cure is to disable webgl in Firefox. Go to about:config and search for webgl. Then double-click on the line for webgl.disabled to make it true.

For the other two, you can add userContent.css rules to tell Firefox to hide those boxes.

Locate your Firefox profile. Inside it, edit chrome/userContent.css (create that file if it doesn't already exist), and add the following two lines:

div#cards { display: none !important; }
div#viewcard { display: none !important; }

Voilà! The boxes that used to hide the map are now invisible. Of course, that also means you can't use anything inside them; but I never found them useful for anything anyway.

July 09, 2015 04:54 PM

July 06, 2015

Elizabeth Krumbach

California Tourist

I returned from my latest conference on May 23rd, closing down what had been over 2 years of traveling every month to some kind of conference, event or family gathering. This was the longest stretch of travel I’ve done and I’ve managed to visit a lot of amazing places and meet some unforgettable people. However, with a book deadline creeping up and tasks at home piling up, I figured it was time to slow down for a bit. I didn’t travel in June and my next trip isn’t until the end of July when I’m going up to Portland for the Community Leadership Summit and a couple days of schmoozing with OSCON friends.

Complicated moods of late and continued struggles with migraines have made it so I’ve not been as productive as I’ve wanted, but I have made real progress on some things I’ve wanted to do, and my book is really finally coming together. In the spaces between work I’ve also managed a bit of much needed fun and relaxation.

A couple weekends ago MJ and I took a weekend trip up to an inn and spa in Sonoma to get some massages and soak in natural mineral water pools provided by on site springs. We had some amazing dinners at the inn, including one evening where we enjoyed s’mores at an outdoor fire pit. The time spent was amazingly relaxing and refreshing, and although it wasn’t a cure-all for the dip in my mood of late, it was some time well spent together.

Perfect weather, beautiful venue

On Sunday morning we checked out of the inn and enjoyed a fantastic brunch on the grounds that included lobster eggs benedict before venturing on. While in Sonoma, we decided to stop by a couple wineries that we were familiar with, starting with Imagery, which is the sister winery to the one we got engaged at, and our next stop, Benziger. At both we picked up several nice wines, which I’m looking forward to cracking open for Shabbats in our near future!

We also stopped by B.R. Cohn for a couple olive oils, and I picked up some delicious blackberry jam and some Chardonnay caramel sauce which has graced some bowls of ice cream since our return. On the trip back to San Francisco we made one final stop, at Jacuzzi Winery where we picked up several more interesting bottles of olive oil, which will soon make it into some salads, scrambled eggs and other dishes that we got recipe cards for.

Due to my backlog, I’ve been spending a lot of time at home and not much at local events, with the exception of a great gathering at the East Bay Linux Users Group a few weeks ago. In contrast with my professional colleagues who work on Linux full time as systems administrators, engineers and DevOps practitioners, it’s so refreshing to go to a LUG where I’m meeting long-term tech hobbyists who still distro-hop and come up with interesting questions about the distros I’m most familiar with and the Linux ecosystem in general. This group has also had interest in Partimus lately, so it was nice to get some feedback about our on-going efforts and volunteer recruitment activities.

In an effort to get out of the house more, I picked up the book Historic Walks in San Francisco: 18 Trails Through the City’s Past and finally took it out for a spin this weekend. I went on the Financial District walk which took me around what is essentially my own neighborhood but had me look at it with whole new eyes. I learned that the Hallidie Building tricked me into believing it was a new building with its glass exterior, but is actually from 1917 and one of the first American buildings to feature glass curtain walls.

Hallidie Building

One of my favorite buildings on the tour turned out to be the Kohl Building, which was built in 1901 and withstood the 1906 earthquake that leveled most of downtown San Francisco and so was used as a command post during the recovery. Erected for Alvinza Hayward, the “H” shape of the building is allegedly in honor of his last name.

Kohl Building

The tour had lots more fun landmarks and stories of recovery (or not) following the 1906 earthquake. Amusingly for my European friends, the young age of San Francisco itself and our shaky history mean that there was not much at all here 160 years ago, so “historical” for us means 50+ years. Go back over 110 years and you’re before the earthquake and fire that essentially leveled the city, to some truly impressive, sturdy buildings. The oldest on the tour, dating from 1877, is the oldest standing building downtown; it now houses the Pacific Heritage Museum, which I hope to visit one of these days when they’re open.

More photos from my walk here:

While on the topic of walking tours, doing this tour alone left something to be desired, even with Tony Bennett and company crooning in my ears. I think I might look up some of the free San Francisco Walking Tours for my next adventure.

My 4th of July weekend here has been pretty low-key. MJ has a friend in town, so they’ve been spending the days out and I’ll sometimes tag along for dinner. With an empty house, I got some reading done, plowed through several tasks on my to do list and started catching up on book related tasks. I still don’t feel like I got “enough” done, but there’s always tomorrow.

by pleia2 at July 06, 2015 01:23 AM

July 04, 2015

Akkana Peck

Create a signed app with Cordova

I wrote last week about developing apps with PhoneGap/Cordova. But there's one thing I didn't cover: when you type cordova build, you're building only a debug version of your app. If you want to release it, you have to sign it. Figuring out how turned out to be a little tricky.

Most pages on the web say you can sign your apps by creating a file under platforms/android/ with the same keystore information in it that you'd put in an ant build, then running cordova build android --release

But Cordova completely ignored my file and went on creating a debug .apk file and no signed one.

I found various other purported solutions on the web, like creating a build.json file in the app's top-level directory ... but that just made Cordova die with a syntax error inside one of its own files. This is the only method that worked for me:

Create a file under platforms/android/, and put this in it:

// if you don't want to enter the password at every build, use this:
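As a rough sketch, the keystore information such a file carries for an ant-style release build generally looks like the lines below; the exact file name and property keys depend on your Cordova/Android tooling, and the path, alias and passwords are placeholders to replace with your own:

key.store=/home/you/keystores/my-release-key.keystore
key.alias=my-key-alias
key.store.password=MyStorePassword
key.alias.password=MyAliasPassword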

Then cordova build android --release finally works, and creates a file called platforms/android/build/outputs/apk/android-release.apk

July 04, 2015 12:02 AM

June 30, 2015

Elizabeth Krumbach

Contributing to the Ubuntu Weekly Newsletter

Superstar Ubuntu Weekly Newsletter contributor Paul White was recently reflecting upon his work with the newsletter and noted that he was approaching 100 issues that he’s contributed to. Wow!

That caused me to look at how long I’ve been involved. Back in 2011 the newsletter went on a six-month hiatus when the former editor had to step down due to obligations elsewhere. After much pleading for the return of the newsletter, I spent a few weeks working with Nathan Handler to improve the scripts used in the release process and doing an analysis of the value of each section of the newsletter in relation to how much work it took to produce each week. The result was a slightly leaner, but hopefully just as valuable newsletter, which now took about 30 minutes for an experienced editor to release rather than 2+ hours. This change was transformational for the team, allowing me to be involved for a whopping 205 consecutive issues.

If you’re not familiar with the newsletter, every week we work to collect news from around our community and the Internet to bring together a snapshot of that week in Ubuntu. It helps people stay up to date with the latest in the world of Ubuntu and the Newsletter archive offers a fascinating glimpse back through history.

But we always need help putting the newsletter together. We especially need people who can take some time out of their weekend to help us write article summaries.

Summary writers receive an email every Friday evening (or early Saturday) US time with a link to the collaborative news links document for the past week, which lists all the articles that need 2-3 sentence summaries. These people are vitally important to the newsletter. The time commitment is limited and it is easy to get started from the first weekend you volunteer. No need to be shy about your writing skills; we have style guidelines to help you on your way and all summaries are reviewed before publishing, so it’s easy to improve as you go on.

Interested? Send us an email and we’ll get you added to the list of folks who are emailed each week.

I love working on the newsletter. As I’ve had to reduce my commitment to some volunteer projects I’m working on, I’ve held on to the newsletter because of how valuable and enjoyable I find it. We’re a friendly team and I hope you can join us!

Still just interested in reading? You have several options:

And everyone is welcome to drop by #ubuntu-news on Freenode to chat with us or share links to news we may find valuable for the newsletter.

by pleia2 at June 30, 2015 02:29 AM

June 29, 2015

Akkana Peck

Chollas in bloom, and other early summer treats

[Bee in cholla blossom] We have three or four cholla cacti on our property. Impressive, pretty cacti, but we were disappointed last year that they never bloomed. They looked like they were forming buds ... and then one day the buds were gone. We thought maybe some animal ate them before the flowers had a chance to open.

Not this year! All of our chollas have gone crazy, with the early rain followed by hot weather. Last week we thought they were spectacular, but they just kept getting better and better. In the heat of the day, it's a bee party: they're aswarm with at least three species of bees and wasps (I don't know enough about bees to identify them, but I can tell they're different from one another) plus some tiny gnat-like insects.

I wrote a few weeks ago about the piñons bursting with cones. What I didn't realize was that these little red-brown cones are all the male, pollen-bearing cones. The ones that bear the seeds, apparently, are the larger bright green cones, and we don't have many of those. But maybe they're just small now, and there will be more later. Keeping fingers crossed. The tall spikes of new growth are called "candles" and there are lots of those, so I guess the trees are happy.

[Desert willow in bloom] Other plants besides cacti are blooming. Last fall we planted a desert willow from a local native plant nursery. The desert willow isn't actually native to White Rock -- we're around the upper end of its elevation range -- but we missed the Mojave desert willow we'd planted back in San Jose, and wanted to try one of the Southwest varieties here. Apparently they're all the same species, Chilopsis linearis.

But we didn't expect the flowers to be so showy! A couple of blossoms just opened today for the first time, and they're as beautiful as any of the cultivated flowers in the garden. I think that means our willow is a 'Rio Salado' type.

Not all the growing plants are good. We've been keeping ourselves busy pulling up tumbleweed (Russian thistle) and stickseed while they're young, trying to prevent them from seeding. But more on that in a separate post.

As I write this, a bluebird is performing short aerobatic flights outside the window. Curiously, it's usually the female doing the showy flying; there's a male out there too, balancing himself on a piñon candle, but he doesn't seem to feel the need to show off. Is the female catching flies, showing off for the male, or just enjoying herself? I don't know, but I'm happy to have bluebirds around. Still no definite sign of whether anyone's nesting in our bluebird box. We have ash-throated flycatchers paired up nearby too, and I'm told they use bluebird boxes more than the bluebirds do. They're both beautiful birds, and welcome here.

Image gallery: Chollas in bloom (and other early summer flowers).

June 29, 2015 01:38 AM

June 23, 2015

Akkana Peck

Cross-Platform Android Development Toolkits: Kivy vs. PhoneGap / Cordova

Although Ant builds have made Android development much easier, I've long been curious about the cross-platform phone development apps: you write a simple app in some common language, like HTML or Python, then run something that can turn it into apps on multiple mobile platforms, like Android, iOS, Blackberry, Windows Phone, Ubuntu, FirefoxOS or Tizen.

Last week I tried two of the many cross-platform mobile frameworks: Kivy and PhoneGap.

Kivy lets you develop in Python, which sounded like a big plus. I went to a Kivy talk at PyCon a year ago and it looked pretty interesting. PhoneGap takes web apps written in HTML, CSS and Javascript and packages them like native applications. PhoneGap seems much more popular, but I wanted to see how it and Kivy compared. Both projects are free, open source software.

If you want to skip the gory details, skip to the summary: how do Kivy and PhoneGap compare?


I tried PhoneGap first. It's based on Node.js, so the first step was installing that. Debian has packages for nodejs, so apt-get install nodejs npm nodejs-legacy did the trick. You need nodejs-legacy to get the "node" command, which you'll need for installing PhoneGap.

Now comes a confusing part. You'll be using npm to install ... something. But depending on which tutorial you're following, it may tell you to install and use either phonegap or cordova.

Cordova is an Apache project which is intertwined with PhoneGap. After reading all their FAQs on the subject, I'm as confused as ever about where PhoneGap ends and Cordova begins, which one is newer, which one is more open-source, whether I should say I'm developing in PhoneGap or Cordova, or even whether I should be asking questions on the #phonegap or #cordova channels on Freenode. (The one question I had, which came up later in the process, I asked on #phonegap and got a helpful answer very quickly.) Neither one is packaged in Debian.

After some searching for a good, comprehensive tutorial, I ended up on a Cordova tutorial rather than a PhoneGap one. So I typed:

sudo npm install -g cordova

Once it's installed, you can create a new app, add the android platform (assuming you already have android development tools installed) and build your new app:

cordova create hello com.example.hello HelloWorld
cordova platform add android
cordova build


Error: Please install Android target: "android-22"
Apparently Cordova/Phonegap can only build with its own preferred version of android, which currently is 22. Editing files to specify android-19 didn't work for me; it just gave errors at a different point.

So I fired up the Android SDK manager, selected android-22 for install, accepted the license ... and waited ... and waited. In the end it took over two hours to download the android-22 SDK; the system image is 13Gb! So that's a bit of a strike against PhoneGap.

While I was waiting for android-22 to download, I took a look at Kivy.


As a Python enthusiast, I wanted to like Kivy best. Plus, it's in the Debian repositories: I installed it with sudo apt-get install python-kivy python-kivy-examples

They have a nice quickstart tutorial for writing a Hello World app on their site. You write it, run it locally in python to bring up a window and see what the app will look like. But then the tutorial immediately jumps into more advanced programming without telling you how to build and deploy your Hello World. For Android, that information is in the Android Packaging Guide. They recommend an app called Buildozer (cute name), which you have to pull from git, build and install.
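For a sense of scale, a minimal Kivy Hello World -- roughly what the quickstart walks you through -- is only a handful of lines of Python, saved as main.py (the entry-point name buildozer conventionally expects):

from kivy.app import App
from kivy.uix.label import Label


class HelloApp(App):
    def build(self):
        # The widget returned here becomes the root of the UI.
        return Label(text="Hello World")


if __name__ == "__main__":
    HelloApp().run()

With that in place, packaging for Android goes through Buildozer: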

buildozer init
buildozer android debug deploy run
Those two commands got started on building ... but then I noticed that buildozer was attempting to download and build its own version of apache ant (sort of a Java version of make). I already have ant -- I've been using it for weeks for building my own Java android apps. Why did it want a different version?

The file buildozer.spec in your project's directory lets you uncomment and customize variables like:

# (int) Android SDK version to use
android.sdk = 21

# (str) Android NDK directory (if empty, it will be automatically downloaded.)
# android.ndk_path = 

# (str) Android SDK directory (if empty, it will be automatically downloaded.)
# android.sdk_path = 

Unlike a lot of Android build packages, buildozer will not inherit variables like ANDROID_SDK, ANDROID_NDK and ANDROID_HOME from your environment; you must edit buildozer.spec.

But that doesn't help with ant. Fortunately, when I inspected the Python code for buildozer itself, I discovered there was another variable that isn't mentioned in the default spec file. Just add this line:

android.ant_path = /usr/bin

Next, buildozer gave me a slew of compilation errors:

kivy/graphics/opengl.c: No such file or directory
 ... many many more lines of compilation interspersed with errors
kivy/graphics/vbo.c:1:2: error: #error Do not use this file, it is the result of a failed Cython compilation.

I had to ask on #kivy to solve that one. It turns out that the current version of cython, 0.22, doesn't work with kivy stable. My choices were to uninstall kivy and pull the development version from git, or to uninstall cython and install version 0.21.2 via pip. I opted for the latter option. Either way, there's no "make clean", so removing the dist and build directories let me start over with the new cython.

apt-get purge cython
sudo pip install Cython==0.21.2
rm -rf ./.buildozer/android/platform/python-for-android/dist
rm -rf ./.buildozer/android/platform/python-for-android/build

Buildozer was now happy, and proceeded to download and build Python-2.7.2, pygame and a large collection of other Python libraries for the ARM platform. Apparently each app packages the Python language and all libraries it needs into the Android .apk file.

Eventually I ran into trouble because of what I'd named my python file; apparently the filename is something you're not allowed to change and they don't mention it in the docs, but that was easily solved. Then I ran into trouble again:

Exception: Unable to find capture version in ./ (looking for `__version__ = ['"](.*)['"]`)
The buildozer.spec file offers two types of versioning: by default "method 1" is enabled, but I never figured out how to get past that error with "method 1" so I commented it out and uncommented "method 2". With that, I was finally able to build an Android package.

The .apk file it created was quite large because of all the embedded Python libraries: for the little 77-line pong demo, /usr/share/kivy-examples/tutorials/pong in the Debian kivy-examples package, the apk came out 7.3Mb. For comparison, my FeedViewer native java app, roughly 2000 lines of Java plus a few XML files, produces a 44k apk.

The next step was to make a real mini app. But when I looked through the Kivy examples, they all seemed highly specialized, and I couldn't find any documentation that addressed issues like what widgets were available or how to lay them out. How do I add a basic text widget? How do I put a button next to it? How do I get the app to launch in portrait rather than landscape mode? Is there any way to speed up the very slow initialization?

I'd spent a few hours on Kivy and made a Hello World app, but I was having trouble figuring out how to do anything more. I needed a change of scenery.

PhoneGap, redux

By this time, android-22 had finally finished downloading. I was ready to try PhoneGap again.

This time,

cordova platforms add android
cordova build
worked fine. It took a long time, because it downloaded the huge gradle build system rather than using something simpler like ant. I already have a copy of gradle somewhere (I downloaded it for the OsmAnd build), but it's not in my path, and I was too beaten down by this point to figure out where it was and how to get cordova to point to it.

Cordova eventually produced a 1.8Mb "hello world" apk -- a quarter the size of the Kivy package, though 20 times as big as a native Java app. Deployed on Android, it initialized much faster than the Kivy app, and came up in portrait mode but rotated correctly if I rotated the phone.

Editing the HTML, CSS and Javascript was fairly simple. You'll want to replace pretty much all of the default CSS if you don't want your app monopolized by the Cordova icon.

The only tricky part was file access: opening a file:// URL didn't work. I asked on #phonegap and someone helpfully told me I'd need the file plugin. That was easy to find in the documentation, and I added it like this:

cordova plugin search file
cordova plugin add org.apache.cordova.file

My final apk, for a small web app I use regularly on Android, was almost the same size as their hello world example: 1.8Mb. And it works great: phonegap had no problem playing an audio clip, something that was tricky when I was trying to do the same thing from a native Android java WebView class.

Summary: How do Kivy and PhoneGap compare?

This has been a long article, I know. So how do Kivy and PhoneGap compare, and which one will I be using?

They both need a large amount of disk space for the development environment. I wish I had good numbers to give you, but I was working with both systems at the same time, and their packages are scattered all over the disk so I haven't found a good way of measuring their size. I suspect PhoneGap is quite a bit bigger, because it uses gradle rather than ant and because it insists on android-22.

On the other hand, PhoneGap wins big on packaged application size: its .apk files are a quarter the size of Kivy's.

PhoneGap definitely wins on documentation. Kivy seemingly has lots of documentation, but its tutorials jumped around rather than following a logical sequence, and I had trouble finding answers to basic questions like "How do I display a text field with a button?" PhoneGap doesn't need that, because the UI is basic HTML and CSS -- limited though they are, at least most people know how to use them.

Finally, PhoneGap wins on startup speed. For my very simple test app, startup was more or less immediate, while the Kivy Hello World app required several seconds of startup time on my Galaxy S4.

Kivy is an interesting project. I like the ant-based build, the straightforward .spec file, and of course the Python language. But it still has some catching up to do in performance and documentation. For throwing together a simple app and packaging it for Android, I have to give the win to PhoneGap.

June 23, 2015 06:09 PM

June 19, 2015

Jono Bacon

Rebasing Ubuntu on Android?

NOTE: Before you read this, I want to clear up some confusion. This post shares an idea that is designed purely for some intellectual fun and discussion. I am not proposing we actually do this, nor advocating for this. So, don’t read too much into these words…

The Ubuntu phone is evolving step by step. The team has worked their socks off to build a convergent user interface, toolkit, and full SDK. The phone exposes an exciting new concept, scopes, that while intriguing in their current form, after some refinement (which the team are already working on) could redefine how we use devices and access content. It is all to play for.

There is one major stumbling block though: apps.

While scopes offer a way of getting access to content quickly, they don’t completely replace apps. There will always be certain apps that people are going to want. The common examples are Skype, WhatsApp, Uber, Google Maps, Fruit Ninja, and Temple Run.

Now this is a bit of a problem. The way new platforms usually solve this is by spending hundreds of thousands of dollars to pay those companies to create and support a port. This isn’t really an option for the Ubuntu phone (there is much more than just the phone being funded by Canonical).

So, it seems to me that the opportunity of the Ubuntu phone is a sleek and sexy user interface that converges and puts content first, but the stumbling block is the lack of apps, and the lack of apps may well have a dramatic impact on adoption.

So, I have an idea to share based on a discussion last night with a friend.

Why don’t we rebase the phone off Android?

OK, bear with me…

In other words, the Ubuntu phone would be an Android phone but instead of the normal user interface it would be a UI that looks and feels like the Ubuntu phone. It would have the messaging menu, scopes, and other pieces, and select Android API calls could be mapped to the different parts of the Unity UI such as the messaging menu and online account support.

The project could even operate like how we build Ubuntu today. Every six months upstream Android would be synced into Launchpad, where a patchset would be maintained and applied to the codebase (in much the same way we do with Debian today).

This would mean that Ubuntu would continue to be an Open Source project, based on a codebase easily supported by hardware manufacturers (thus easier to ship), it would run all Android apps without requiring a cludgy porting/translation layer running on Ubuntu, it would look and feel like an Ubuntu phone, it would still expose scopes as a first-class user interface, the Ubuntu SDK would still be the main ecosystem play, Ubuntu apps would still stand out as more elegant and engaging apps, and it would reduce the amount of engineering required (I assume).

Now, the question is how this would impact a single convergent Operating System across desktop, phone, tablet, and TV. If Unity is essentially a UI that runs on top of Android and exposes a set of services, the convergence story should work well too, after all…it is all Linux. It may need different desktop, phone, tablet, and TV kernels, but I think we would need different kernels anyway.

So where does this put Debian and Ubuntu packages? Well, good question. I don’t know. The other unknown of course would be the impact of such a move on our flavors and derivatives, but then again I suspect the march towards snappy is going to put us in a similar situation if flavors/derivatives choose to stick with the Debian packaging system.

Of course, I am saying all this as someone who really only understands a small part of the picture, but this just strikes me as a logical step forward. I know there has been a reluctance to support Android apps on Ubuntu as it devalues the Ubuntu app ecosystem and people would just use Android apps, but I honestly think some kind of middle-ground is needed to get into the game, otherwise I worry we won’t even make it to the subs bench no matter how awesome our technology is.

Just a thought, would love to hear what everyone thinks, including if what I am suggesting is total nonsense. :-)

Again, remember, this is just an idea I am throwing out for the fun of the discussion; I am not suggesting we actually do this.

by jono at June 19, 2015 04:20 AM

June 18, 2015

Eric Hammond

lambdash: AWS Lambda Shell Hack: New And Improved!

easier, simpler, faster, better

Seven months ago I published the lambdash AWS Lambda Shell Hack that lets you run shell commands to explore the environment in which AWS Lambda functions are executed.

I also posted samples of command output that show fascinating properties of the AWS Lambda runtime environment.

In the last seven months, Amazon has released new features and enhancements that have made a completely new version of lambdash possible, with many benefits including:

  • Ability to use AWS CloudFormation to create all needed resources including the AWS Lambda function and the IAM role.

  • Ability to create AWS Lambda functions by referencing a ZIP file in an S3 bucket.

  • Simpler IAM role structure.

  • Increased AWS Lambda function memory limit, with correspondingly faster execution.

  • Ability to invoke an AWS Lambda function synchronously.

This last point means that we no longer need to put the shell command output into an S3 bucket and poll the bucket from the local host. Instead, we can simply return the shell command output directly to the client that invoked the AWS Lambda function.

The above have made the lambdash code much simpler, much easier to install, and much, much faster to execute and get results.

You can browse the source here:

There are three easy steps to get lambdash working:

1. CloudFormation Stack

Option 1: Here are sample steps to create the lambdash AWS Lambda function and to use a local command to invoke the function and output the results of commands run inside of Lambda:

git clone
cd lambdash

The lambdash-install script runs the aws-cli command aws cloudformation create-stack passing in the template file to create the AWS Lambda function in a CloudFormation stack.

The above assumes that you have installed aws-cli and have appropriate credentials configured.

Option 2: You may use the AWS Console to create a lambdash CloudFormation stack by pressing this button:

Launch Stack

Accept all the defaults, confirm the IAM role creation (after reading the CloudFormation template and verifying that I am not doing anything malicious), and perhaps add a Tag to help identify the lambdash CloudFormation stack.

2. Environment Variable

Since the CloudFormation stack creates the AWS Lambda function with a unique name, you need to find out what this name is before you can invoke it with the lambdash command.

If you ran the lambdash-install command, it printed the export statement you should use.

If you used the AWS Console, click on the lambdash CloudFormation stack’s [Output] tab and copy the export command listed there.

It will look something like this, with your own unique 12-character suffix:

export LAMBDASH_FUNCTION=lambdash-function-ABC123EXAMPL

Run this in your current shell and, perhaps, add it to your $HOME/.bashrc or equivalent.

3. Local lambdash Program

The previous step installs the AWS Lambda function in the AWS environment. You also need a complementary local command that will invoke the function with your requested command line then receive and print the stdout and stderr content.

This is the lambdash program, which is now a small Python script that uses boto3.
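The synchronous-invoke pattern it relies on boils down to roughly this boto3 sketch; the payload shape with a "command" key is just an illustrative assumption, so check the real source for lambdash's actual interface:

import json
import os

import boto3

# The function name exported in step 2 (e.g. lambdash-function-ABC123EXAMPL).
function_name = os.environ["LAMBDASH_FUNCTION"]

lambda_client = boto3.client("lambda")

# RequestResponse is a synchronous invocation: it waits for the function to
# finish and hands back its result directly -- no S3 bucket, no polling.
response = lambda_client.invoke(
    FunctionName=function_name,
    InvocationType="RequestResponse",
    Payload=json.dumps({"command": "cat /etc/issue"}).encode(),  # illustrative payload shape
)

print(response["Payload"].read().decode())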

You can either use the lambdash program in the GitHub repo you cloned above, or download it directly:

sudo curl -so/usr/local/bin/lambdash \
sudo chmod +x /usr/local/bin/lambdash

This Python program requires boto3, so install it using your favorite method. This worked for me:

sudo -H pip install boto3

Now you’re ready to run shell commands on AWS Lambda.


You can now execute shell commands in the AWS Lambda environment and see the output. This command shows us that Amazon has upgraded the AWS Lambda environment from Amazon Linux 2014.03 when it was launched, to 2015.03 today:

$ lambdash cat /etc/issue
Amazon Linux AMI release 2015.03
Kernel \r on an \m

Nodejs has been upgraded from v0.10.32 to v0.10.36:

$ lambdash node -v
v0.10.36

Here’s a command I use to occasionally check in on changes in Amazon’s awslambda nodejs framework that runs our Lambda functions:

mkdir awslambda-source
lambdash tar cvzf - -C /var/runtime/node_modules/awslambda . | 
  tar xzf - -C awslambda-source

For example, the most recent change was to “log only 256K of errorMessage into customer’s cloudwatch”. Good to know.


Deleting the lambdash CloudFormation stack removes all resources including the AWS Lambda function and the IAM role. You can do this by running this command in the GitHub repo:


Or, you can delete the lambdash CloudFormation stack in the AWS Console.


June 18, 2015 11:00 AM

June 15, 2015

Jono Bacon

New Forbes Column: From Piracy to Prosperity

My new Forbes column is published.

This article covers how technology has impacted how creatives, artists, and journalists create, distribute, and engage around their work.

For it I sat down with Mike Shinoda, co-founder of Grammy Award-winning Linkin Park, as well as Ali Velshi, host on Al Jazeera and former CNN Senior Business Correspondent.

Go and read the article here.

After that you may want to see my previous article where I interviewed Chris Anderson, founder of 3DR and author of The Long Tail, where we discuss building the open drone revolution. Read that article here.

by jono at June 15, 2015 04:49 PM

June 14, 2015

Elizabeth Krumbach

Weekends, street cars and red pandas

I’m home for the entire month of June! Looking back through my travel schedule, the last month I didn’t get on a plane was March of 2013. The travel-loving part of me is a little sad about breaking my streak, but given that it’s June and I’ve already given 8 presentations in 5 countries across 3 continents, I’m due for this break from travel. It’s not a break from work though, I’ve had to really hunker down on some projects I’m working on at work now that I have solid chunks of time to concentrate, and some serious due dates for my book are looming. I’ve also been tired, which prompted an extensive pile of blood work that had some troubling results that I’m now working with a specialist to get to the bottom of. I’m continuing to run and improve my diet by eating more fresh, green things which have traditionally helped bump my energy level because I’m treating my body better, but lately they both just make me more tired. And ultimately, tired means some evenings I spend more time watching True Blood and The Good Wife than I should with all these book deadlines creeping up. Don’t tell my editor ;)

I’m also getting lots of kitty snuggles as I remain at home, and lots of opportunities to take cute kitty pictures.

I continue to take Saturdays off, which continues to be my primary burnout protection mechanism. I’ve continued to evolve what this day off means. It was originally inspired by the Jewish tradition of Shabbat, we practice Shabbat rituals in our home (candles, challah, etc.) and I continue to avoid work, but the definition of work is in flux for me. Early on, I’d still check “personal” email and social media, until I discovered that there’s no such thing, with my open source volunteer work, open source day job and personal life so intertwined. There have also recently been some considerable stresses related to my volunteer open source work, which I want to have a break from on my day off. So currently I work hard to avoid checking email and social media, even though it’s still a struggle. It’s caused me to learn how much of a slave I’ve become to my phone. It beeps, I leap for it. Having a day off has caused me to create discipline around my relationship with my phone, so even on days when I’m working, I’m less inclined to prioritize phone beeps over the work I’m currently engaged in, leading to a greater ability to focus. Sorry to people who randomly text or direct message me on Twitter/Facebook expecting an immediate response: it will rarely happen.

So currently, my Saturdays often include either:

  • Attending Synagogue services with MJ and having a lunch out together
  • Going to some museum, movie or cultural event with MJ
  • Staying home and reading, writing, catching up with some online classes or working on hobby projects

I had played around with avoiding computers entirely on Saturdays, but on home days I realized I’d get bored too easily if I’m reading all day and sometimes I’m really not in the mood for my offline activities. When I get bored, I end up napping or watching TV instead, neither of which are rejuvenating or satisfying, and I end up just feeling sad about wasting the day. So my criteria have shifted from “not work” to including fun, enriching projects that I likely don’t have time or energy for on my other six “working” days. I have struggled with whether these hobbies should be on my to-do list or not, since putting them on my list adds a level of structure that can lead to stress, but my coping habit for task organization makes leaving them off a challenging mental exercise. Writing here in my blog also requires a computer, and these days off give me ample time for ideas to settle and finally have some quiet time to get my thoughts in order and write without distraction. Though I do have to admit that buying a vintage mechanical typewriter has crossed my mind more than a few times. Which reminds me, have any recommendations? Aside from divorce lawyers and a bigger home in the event that I drive MJ crazy. I also watch videos associated with various electronics projects and online classes I’m learning for fun (Arduinos! History and anthropology!), so a computer or tablet is regularly involved there.

It’s still not perfect. My stress levels have been high this year and we’ve booked a weekend at a beautiful inn and spa in Sonoma next weekend to unplug away from the random tasks that come from spending our weekends at home. I’m counting down the hours.

Last weekend was a lot of fun though, even if I was still stressed. On Saturday we went on a Blackpool Boat Tram Tour along the F-line. I’ve been looking for an opportunity to ride on this “topless” street car for years, but the charters always conflicted with my travel schedule, until last weekend! MJ and I booked tickets and at 1:30PM on Saturday we were on our way down Market Street.

As the title of the tour suggests, these unusually styled street cars come from Blackpool, England, a region known for its seaside activities, including Blackpool Pleasure Beach, which now has the first Wallace and Gromit theme park ride, Wallace & Gromit’s Thrill-O-Matic! Well, they also have a tramway where these cars came from, and California now has three of them – two functioning ones operated here in the city by MUNI and maintained by the Market Street Railway non-profit, which I’m a member of and which conducted this charter.

We met at 1:15 to pick up our tickets, browse through the little SF Railway Museum and capture some pre-travel photos of the boat tram.

Upon boarding, we took seats at the back of the street car. The tour was in two parts, half of it guided by a representative from Market Street Railway who gave some history of the transportation lines themselves as we glided up Market Street along the standard F-line until getting to Castro where a slightly different route was taken to turn back on to Market.

At the turnaround near Castro, the guides swapped places and we got a representative from San Francisco City Guides who typically does walking tours of the city. As a local enthusiast he was able to give us details about the major landmarks along Market and up the Embarcadero as we made our way up to Pier 39. I knew most of what both guides told us, but there were a few bits of knowledge I was excited to learn. I was also reminded of the ~12 minute film A Trip Down Market Street, 1906, which was taken just days before the earthquake in 1906 that destroyed many of the buildings seen in the video. Fascinating stuff.

At Pier 39 we had the opportunity to get out of the car and take some pictures around it, including the obligatory pictures of ourselves!

The trip lasted a couple hours, and with the open top of the car I managed to get a bit of sunburn on my face, oops!

More photos from the tram tour can be found here:

Sunday morning I took advantage of the de-stressing qualities of a visit to the zoo.

I finally got to see all three of the red pandas. It had been some time since I’d seen their exhibit, and last time only one of them was there. It was fun to see all three of them together, two of them climbing the trees (pictured below) and the third walking around the ground of the enclosure. I’m kind of jealous of their epic tree houses.

Also got to swing by the sea lions Henry and Silent Knight, with Henry playing king of the rock in the middle of their pool.

More photos here:

In other miscellaneous life things, MJ and I made it out to see Mad Max: Fury Road recently. It’s been several months since I’d been to a theater, and probably over a year since MJ and I had gone to a movie together, so it was a nice change of pace. Plus, it was a fun, mind-numbing movie that took my mind off my ever-growing task list. MJ and I have also been able to spend several nice dinners together, including indulging in a Brazilian Steakhouse one evening and fondue another night. In spite of these things, with running, improved breakfast and lunch and mostly skipping desserts I’ve dropped 5lbs in the past month, which is not rapid weight loss but is being done in a way that’s sustainable without completely eliminating the things I love (including my craft beer hobby). Hooray!

I’ve cut back on events, sadly turning down invitations to local panels and presentations in favor of staying home and working on my book during my off-work hours. I did host an Ubuntu Hour this week though.

Next week I’m planning on popping over to a nearby Ubuntu/Juju Mine and Mingle. I’ll also be heading down to the south end of the east bay for an EBLUG meeting where they’ve graciously offered to host space, time and expertise for an evening of discussing work on some servers that Partimus is planning on deploying in some of the schools we work with. It will be great to meet up and chat with some of the volunteers who I’ve thus far largely only worked with online, and to block off some of my own time for the raw technical tasks that Partimus needs to be focusing on but that I’ve struggled to find time for.

I really am looking forward to that spa weekend, but for now I’m rounding out my relaxing Saturday and preparing for get-things-done Sunday!

by pleia2 at June 14, 2015 01:38 AM

June 08, 2015

Akkana Peck

Adventure Dental

[Adventure Dental] This sign, in Santa Fe, always makes me do a double-take.

Would you go to a dentist or eye doctor named "Adventure Dental"?

Personally, I prefer that my dental and vision visits are as un-adventurous as possible.

June 08, 2015 02:54 PM