Planet Ubuntu California

July 24, 2014

Akkana Peck

Predicting planetary visibility with PyEphem

Part 1: Basic Planetary Visibility

All through the years I was writing the planet observing column for the San Jose Astronomical Association, I was annoyed at the lack of places to go to find out about upcoming events like conjunctions, when two or more planets are close together in the sky. It's easy to find out about conjunctions in the next month, but not so easy to find sites that will tell you several months in advance, like you need if you're writing for a print publication (even a club newsletter).

For some reason I never thought about trying to calculate it myself. I just assumed it would be hard, and wanted a source that could spoon-feed me the predictions.

The best source I know of is the RASC Observer's Handbook, which I faithfully bought every year and checked each month so I could enter that month's events by hand. Except for January and February, when I didn't have the next year's handbook yet by the time my column went to press and I was on my own. I have to confess, I was happy to get away from that aspect of the column when I moved.

In my new town, I've been helping the local nature center with their website. They had some great pages already, like a What's Blooming Now? page that keeps track of which flowers are blooming now and only shows the current ones. I've been helping them extend it by adding features like showing only flowers of a particular color, separating the data into CSV databases so it's easier to add new flowers or butterflies, and so forth. Eventually we hope to build similar databases of birds, reptiles and amphibians.

And recently someone suggested that their astronomy page could use some help. Indeed it could -- it hadn't been updated in about five years. So we got to work looking for a source of upcoming astronomy events we could use as a data source for the page, and we found sources for a few things, like moon phases and eclipses, but not much.

Someone asked about planetary conjunctions, and remembering how I'd always struggled to find that data, especially in months when I didn't have the RASC handbook yet, I got to wondering about calculating it myself. Obviously it's possible to calculate when a planet will be visible, or whether two planets are close to each other in the sky. And I've done some programming with PyEphem before, and found it fairly easy to use. How hard could it be?

Note: this article covers only the basic problem of predicting when a planet will be visible in the evening. A followup article will discuss the harder problem of conjunctions.

Calculating planet visibility with PyEphem

The first step was figuring out when planets were up. That was straightforward. Make a list of the easily visible planets (remember, this is for a nature center, so people using the page aren't expected to have telescopes):

import ephem
import math    # used below to convert degrees to radians

planets = [
    ephem.Moon(),
    ephem.Mercury(),
    ephem.Venus(),
    ephem.Mars(),
    ephem.Jupiter(),
    ephem.Saturn()
    ]

Then we need an observer with the right latitude, longitude and elevation. Elevation is apparently in meters, though they never bother to mention that in the PyEphem documentation:

observer = ephem.Observer()
observer.name = "Los Alamos"
# Note: PyEphem parses strings as degrees; a bare float would be radians.
observer.lon = '-106.2978'
observer.lat = '35.8911'
observer.elevation = 2286  # meters, though the docs don't actually say

Then we loop over the date range for which we want predictions. For a given date d, we're going to need to know the time of sunset, because we want to know which planets will still be up after nightfall.

sun = ephem.Sun()

observer.date = d
sunset = observer.previous_setting(sun)
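Pulling that together, here's a minimal sketch of the date loop (my reconstruction; the date range is illustrative, and observer and sun are the objects defined above). PyEphem dates are just floats counting days, so adding 1 advances one day:

d = ephem.date('2014/8/1')
end = ephem.date('2014/12/31')
while d < end:
    observer.date = d
    sunset = observer.previous_setting(sun)
    # ... check each planet at sunset and at midnight, as shown below ...
    d = ephem.date(d + 1)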

Then we need to loop over planets and figure out which ones are visible. It seems like a reasonable first approach to declare that any planet that's visible after sunset and before midnight is worth mentioning.

Now, PyEphem can tell you directly the rising and setting times of a planet on a given day. But I found it simplified the code if I just checked the planet's altitude at sunset and again at midnight. If either one of them is "high enough", then the planet is visible that night. (Fortunately, here in the mid latitudes we don't have to worry that a planet will rise after sunset and then set again before midnight. If we were closer to the arctic or antarctic circles, that would be a concern in some seasons.)

# Call a planet "high enough" at 10 degrees; PyEphem angles are radians.
min_alt = 10. * math.pi / 180.
for planet in planets:
    observer.date = sunset
    planet.compute(observer)
    if planet.alt > min_alt:
        print planet.name, "is already up at sunset"

Easy enough for sunset. But how do we set the date to midnight on that same night? That turns out to be a bit tricky with PyEphem's date class. Here's what I came up with:

    # observer.date.tuple() is (year, month, day, hour, minute, second), in UTC.
    midnight = list(observer.date.tuple())
    midnight[3:6] = [7, 0, 0]
    observer.date = ephem.date(tuple(midnight))
    planet.compute(observer)
    if planet.alt > min_alt:
        print planet.name, "will rise before midnight"

What's that 7 there? That's Greenwich Mean Time when it's midnight in our time zone. It's hardwired because this is for a web site meant for locals. Obviously, for a more general program, you should get the time zone from the computer and add accordingly, and you should also be smarter about daylight savings time and such. The PyEphem documentation, fortunately, gives you tips on how to deal with time zones. (In practice, though, the rise and set times of planets on a given day don't change much with time zone.)
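For instance, here's one way you might derive that hour from the system clock with just the standard library (a sketch, not what the site actually does):

import time

# Seconds west of UTC; time.altzone applies while DST is in effect.
if time.daylight and time.localtime().tm_isdst:
    offset = time.altzone
else:
    offset = time.timezone
midnight_utc_hour = offset // 3600    # 7 for US Mountain standard time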

And now you have your predictions of which planets will be visible on a given date. The rest is just a matter of writing it out into your chosen database format.
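For example, here's a minimal sketch of dumping the results as CSV (the filename and the visible list are illustrative, not the nature center's actual format):

import csv

# 'visible' is a hypothetical list of (date, planet name) pairs
# collected in the loop above.
with open('planet-predictions.csv', 'w') as fp:
    writer = csv.writer(fp)
    writer.writerow(['date', 'planet'])
    for date, name in visible:
        writer.writerow([date, name])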

In the next article, I'll cover planetary and lunar conjunctions -- which were superficially very simple, but turned out to have some tricks that made the programming harder than I expected.

July 24, 2014 03:32 AM

July 22, 2014

Elizabeth Krumbach

Surgery coming up, Pride, Tiburon and a painting

This year has been super packed with conferences and travel. I’ve done 13 talks across 3 continents and have several more coming up in the next few months. I’ve also been squeezing in the hosting of Ubuntu Hours each month.


Buttercup at his first Utopic Unicorn cycle Ubuntu Hour

Aside from all this, life-wise things have been pretty mellow due to my abdominal pain (sick of hearing about it yet?). I’ve been watching a lot of TV because of how exhausted the pain is making me. Exercise has totally taken a back seat, which compounds the tiredness and means I’ve put on some weight that I’m not at all happy about. Once I’m better I plan on starting Couch to 5K again and may also join a new gym to get back into shape.

The gallbladder removal surgery itself is on Thursday and I’m terribly nervous about it. Jet lag combined with surgery nervousness means I haven’t been sleeping exceptionally well either. I’m not looking forward to the recovery; it should be relatively fast (a couple of weeks), but I’m a terrible patient and get bored easily when I’m not doing things. It will take a lot of effort not to put too much stress on my system too quickly. I’ll be so happy when this is all over.

I did take some time to do a few things though. On June 29th our friend Danita was still in town and we got to check out the Pride parade, which is always a lot of fun, even if I did get a bit too much sun.

Lots more photos from the parade here: https://www.flickr.com/photos/pleia2/sets/72157645439712155/

MJ and I also took a Sunday to drive north a couple weeks ago to visit Tiburon for some brunch. It was a beautiful day for it, and it’s always nice to further explore the beautiful places around where we live. I hope we can make more time for it.


Sunny day in Tiburon!

Finally, I’m happy to report that after a couple months, I’ve gotten a painting back from Chandler Fine Art, who worked with a restoration artist to clean it up and have it framed. Not much can be done about the cracks without a significant amount of work (the nature of oil paintings!) but they were able to fix a dent in the canvas and clean up some stains; I can’t even tell where the defects were now.

It may not strictly match the decor of our home, but it was a favorite of my father’s growing up and it’s nice to have such a nice memory from my childhood hanging here now.

by pleia2 at July 22, 2014 04:05 PM

July 21, 2014

Elizabeth Krumbach

The Official Ubuntu Book, 8th Edition now available!

This past spring I had the great opportunity to work with Matthew Helmke, José Antonio Rey and Debra Williams of Pearson on the 8th edition of The Official Ubuntu Book.

Official Ubuntu Book, 8th Edition

In addition to the obvious task of updating content, one of our most important tasks was working to “future proof” the book by doing rewrites in a way that would keep the content useful until the next Long Term Support release, in 2016. This meant a fair amount of content refactoring, fewer specifics when it came to members of teams, and lots of goodies for folks looking to become power users of Unity.

Quoting the product page from Pearson:

The Official Ubuntu Book, Eighth Edition, has been extensively updated with a single goal: to make running today’s Ubuntu even more pleasant and productive for you. It’s the ideal one-stop knowledge source for Ubuntu novices, those upgrading from older versions or other Linux distributions, and anyone moving toward power-user status.

Its expert authors focus on what you need to know most about installation, applications, media, administration, software applications, and much more. You’ll discover powerful Unity desktop improvements that make Ubuntu even friendlier and more convenient. You’ll also connect with the amazing Ubuntu community and the incredible resources it offers you.

Huge thanks to all my collaborators on this project. It was a lot of fun to work with them, and I already have plans to work with all three of them on other projects in the future.

So go pick up a copy! As my first published book, I’d be thrilled to sign it for you if you bring it to an event I’m at, upcoming events include:

And of course, monthly Ubuntu Hours and Debian Dinners in San Francisco.

by pleia2 at July 21, 2014 04:21 PM

July 20, 2014

Elizabeth Krumbach

Tourist in Darmstadt

This past week I was in Germany! I’ve gone through Frankfurt many times over the years, but this was the first time I actually left the airport via ground transportation.


Trip began with a flight on a Lufthansa A380

Upon arrival I found the bus stop for the shuttle to Darmstadt, and after a 20-minute ride I was at the Hauptbahnhof (main transit station) in Darmstadt; a very short walk took me to the Maritim Konferenzhotel Darmstadt, where I’d be staying for the week.

The hotel was great, particularly for a European hotel. The rooms were roomy, the shower was amazing, and all the food I had was good.

Our timing on the sprint was pretty exceptional, with most of us arriving on Sunday just in time to spend the evening watching the World Cup final, which Germany was in! Unfortunately for us the beer gardens in the city required reservations and we didn’t have any, so we ended up camping out in the hotel bar and enjoying the game there, along with some beers and good conversations. In spite of my current gallbladder situation, I made an exception to my abstinence from alcohol that night and had a couple of beers to commemorate the World Cup and my first proper time in Germany.


Beer, World Cup

Unfortunately I wasn’t so lucky gallbladder-wise the rest of the week. I’m not sure if I was having some psychosomatic reaction to knowing the removal surgery is so close, but it definitely felt like I was in more pain this week. This kept me pretty close to the hotel and I sadly had to skip most of the evenings out with my co-workers at beer gardens because I was too tired, in pain and couldn’t have beer anyway.

I did make it out on Wednesday night, since I couldn’t resist a visit to Darmstädter Ratskeller, even if I did only have apple juice. This evening brought me into Darmstadt center where I got to take all my tourist photos, and also gave me an opportunity to visit the beer garden and chat with everyone.


Darmstädter Ratskeller

Plus, I managed to avoid pork by ordering Goulash – a dish I hadn’t had the opportunity to enjoy since my childhood.


Goulash! Accompanied by apple juice

I wish I had felt up to more adventuring. Had I felt better I probably would have spent a few extra days in Frankfurt proper giving myself a mini-vacation to explore. Next time.

All photos from my adventure that night in Darmstadt center (and planes and food and things!) here: https://www.flickr.com/photos/pleia2/sets/72157645839688233/

by pleia2 at July 20, 2014 03:58 PM

July 19, 2014

Elizabeth Krumbach

OpenStack QA/Infrastructure Meetup in Darmstadt

I spent this week at the QA/Infrastructure Meetup in Darmstadt, Germany.

Our host was Marc Koderer of Deutsche Telekom, who sorted out all the logistics for having our event at their office in Darmstadt. Aside from the summer heat (the conference room lacked air conditioning), it all worked out very well: we had a lot of space to work, the food was great, and we had plenty of water. It was also nice that the hotel most of us stayed at was an easy walk away.

The first day kicked off with an introduction by Deutsche Telekom that covered what they’re using OpenStack for in their company. Since they’re a network provider, networking support was a huge component, but they use other components as well to build an infrastructure, since they plan to have a quicker software development cycle that’s less tied to the hardware lifetime. We also got a quick tour of one of their data centers and a demo of some of the running prototypes for quicker provisioning and changing of service levels for their customers.

Monday afternoon was spent on an on-boarding tutorial for newcomers to contributing to OpenStack, and on Tuesday we transitioned into an overview of the OpenStack Infrastructure and QA systems that we’d be working on for the rest of the week. Beyond the overview of the infrastructure presented by James E. Blair, key topics included jeepyb presented by Jeremy Stanley, devstack-gate and Grenade presented by Sean Dague, Tempest presented by Matthew Treinish (including the very useful Tempest Field Guide) and our Elasticsearch, Logstash and Kibana (ELK) stack presented by Clark Boylan.

Wednesday we began the hacking/sprint portion of the event, where we moved to another conference room and moved tables around so we could get into our respective working groups. Anita Kuno presented the Infrastructure User Manual which we’re looking to flesh out, and gave attendees a task of helping to write a section to help guide users of our CI system. This ended up being a great thing for newcomers to get their feet wet with, and I hope to have a kind of entry level task at every infrastructure sprint moving forward. Some folks worked on getting support for uploading log files to Swift, some on getting multinode testing architected, and others worked on Tempest. In the early afternoon we had some discussions covering recheck language, next steps I’d be taking when it comes to the evaluation of translations tools, a “Gerrit wishlist” for items that developers are looking for as Khai Do prepares to attend a Gerrit hack event and more. I also took time on Wednesday to dive into some documentation I noticed needed some updating after the tutorial day the day before.

Thursday the work continued, I did some reviews, helped out a couple of new contributors and wrote my own patch for the Infra Manual. It was also great to learn and collaborate on some of the aspects of the systems we use that I’m less familiar with and explain portions to others that I was familiar with.


Zuul supervised my work

Friday was a full day of discussions, which were great but a bit overwhelming (might have been nice to have had more on Thursday). Discussions kicked off with strategies for handling the continued publishing of OpenStack Documentation, which is currently just being published to a proprietary web platform donated by one of the project sponsors.

A very long discussion was then had about managing the gate runtime growth. Managing developer and user expectations for our gating system (thorough, accurate testing) while balancing the human and compute resources that we have available on the project is a tough thing to do. Some technical solutions to ease the pain on some failures were floated and may end up being used, but the key takeaway I had from this discussion was that we’d really like the community to be more engaged with us and each other (particularly when patches impact projects or functionality that you might not feel is central to your patch). We also want to stress that the infrastructure is a living entity that evolves and we accept input as to ideas and solutions to problems that we’re encountering, since right now the team is quite small for what we’re doing. Finally, there were some comments about how we run tests in the process of reviewing, and how scalable the growth of tests is over time and how we might lighten that load (start doing some “traditional CI” post merge jobs? having some periodic jobs? leverage experimental jobs more?).

The discussion I was most keen on was around the refactoring of our infrastructure to make it more easily consumable by 3rd parties. Our vision early on was that we were an open source project ourselves, but that all of our customizations were a kind of example for others to use, not that they’d want to use them directly, so we hard coded a lot into our special openstack_projects module. As the project has grown and more organizations are starting to use the infrastructure, we’ve discovered that many want to use one largely identical to ours and that making this easier is important to them. To this end, we’re developing a Specification to outline the key steps we need to go through to achieve this goal, including splitting out our puppet modules, developing a separate infra system repo (what you need to run an infrastructure) and project stuff repo (data we load into our infrastructure) and then finally looking toward a way to “productize” the infrastructure to make it as easily consumable by others as possible.

The afternoon finished up with discussions about vetting and signing of release artifacts, ideas for possible adjustment of the job definition language and how teams can effectively manage their current patch queues now that the auto-abandon feature has been turned off.

And with that – our sprint concluded! And given the rise in temperature on Friday and how worn out we all were from discussions and work, it was well-timed.

Huge thanks to Deutsche Telekom for hosting this event, being able to meet like this is really valuable to the work we’re all doing in the infrastructure and QA for OpenStack.

Full (read-only) notes from our time spent throughout the week available here: https://etherpad.openstack.org/p/r.OsxMMUDUOYJFKgkE

by pleia2 at July 19, 2014 11:07 AM

July 17, 2014

Jono Bacon

Community Leadership Summit and OSCON Plans

As many of you will know, I organize an event every year called the Community Leadership Summit. The event brings together community leaders, organizers and managers and the projects and organizations that are interested in growing and empowering a strong community.

The event kicks off this week on Thursday evening (17th July) with a pre-CLS gathering at the Doubletree Hotel at 7.30pm, and then we get started with the main event on Friday (18th July) and Saturday (19th July). For more details, see http://www.communityleadershipsummit.com/.

This year’s event is shaping up to be incredible. We have a fantastic list of registered attendees and I want to thank our sponsors, O’Reilly, Citrix, Oracle, Mozilla, Ubuntu, and LinuxFund.

Also, be sure to join the new Community Leadership Forum for discussing topics that relate to community management, as well as topics for discussion at the Community Leadership Summit event each year. The forum is designed to be a great place for sharing and learning tips and techniques, getting to know other community leaders, and having fun.

The forum is powered by Discourse, so it is a pleasure to use, and I want to thank discoursehosting.com for generously providing free hosting for us.

Speaking Events and Training at OSCON

I also have a busy OSCON schedule. Here is the summary:

Community Management Training

On Monday 21st July from 9am – 6pm in D135 I will be providing a full day of community management training at OSCON. This full day of training will include topics such as:

  • The Core Mechanics Of Community
  • Planning Your Community
  • Building a Strategic Plan
  • Building Collaborative Workflow
  • Defining Community Governance
  • Marketing, Advocacy, Promotion, and Social Media
  • Measuring Your Community
  • Tracking and Measuring Community Management
  • Conflict Resolution

Office Hours

On Tues 22nd July at 10.40am in Expo Hall A I will be providing an Office Hours Meeting in which you can come and ask me about:

  • Building collaborative workflow and tooling
  • Conflict resolution and managing complex personalities
  • Building buzz and excitement around your community
  • Incentivized prizes and innovation
  • Hiring community managers
  • Anything else!

Dealing With Disrespect

Finally, on Wed 23rd July at 2.30pm in E144 I will be giving a presentation called Dealing With Disrespect that is based upon my free book of the same name for managing complex communications.

This is the summary of the talk:

In this new presentation from Jono Bacon, author of The Art of Community, founder of the Community Leadership Summit, and Ubuntu Community Manager, he discusses how to process, interpret, and manage rude, disrespectful, and non-constructive feedback in communities so the constructive criticism gets through but the hate doesn’t.

The presentation covers the three different categories of communications, how we evaluate and assess different attributes in each communication, the factors that influence all of our communications, and how to put in place a set of golden rules for handling feedback and putting it in perspective.

If you personally or your community has suffered rudeness, trolling, and disrespect, this presentation is designed to help.

I will also be available for discussions and meetings. Just drop me an email at jono@jonobacon.org if you want to meet.

I hope to see many of you in Portland this week!

by jono at July 17, 2014 12:44 AM

Akkana Peck

Time-lapse photography: a simple Arduino-driven camera intervalometer

[Arduino intervalometer] While testing my automated critter camera, I was getting lots of false positives caused by clouds gathering and growing and then evaporating away. False positives are annoying, but I discovered that it's fun watching the clouds grow and change in all those photos ... which got me thinking about time-lapse photography.

First, a disclaimer: it's easy and cheap to just buy an intervalometer. Search for timer remote control or intervalometer and you'll find plenty of options for around $20-30. In fact, I ordered one. But, hey, it's not here yet, and I'm impatient. And I've always wanted to try controlling a camera from an Arduino. This seemed like the perfect excuse.

Why an Arduino rather than a Raspberry Pi or BeagleBone? Just because it's simpler and cheaper, and this project doesn't need much compute power. But everything here should be applicable to any microcontroller.

My Canon Rebel XSi has a fairly simple wired remote control plug: a standard 2.5mm stereo phone plug. I say "standard" as though you can just walk into Radio Shack and buy one, but in fact it turned out to be surprisingly difficult, even when I was in Silicon Valley, to find them. Fortunately, I had found some, several years ago, and had cables already wired up waiting for an experiment.

The outside connector ("sleeve") of the plug is ground. Connecting ground to the middle ("ring") conductor makes the camera focus, like pressing the shutter button halfway; connecting ground to the center ("tip") conductor makes it take a picture. I have a wired cable release that I use for astronomy and spent a few minutes with an ohmmeter verifying what did what, but if you don't happen to have a cable release and a multimeter there are plenty of Canon remote control pinout diagrams on the web.

Now we need a way for the controller to connect one pin of the remote to another on command. There are ways to simulate that with transistors -- my Arduino-controlled robotic shark project did that. However, the shark was about a $40 toy, while my DSLR cost quite a bit more than that. While I did find several people on the web saying they'd used transistors with a DSLR with no ill effects, I found a lot more who were nervous about trying it. I decided I was one of the nervous ones.

The alternative to transistors is to use something like a relay. In a relay, voltage applied across one pair of contacts -- the signal from the controller -- creates a magnetic field that closes a switch and joins another pair of contacts -- the wires going to the camera's remote.

But there's a problem with relays: that magnetic field, when it collapses, can send a pulse of current back up the wire to the controller, possibly damaging it.

There's another alternative, though. An opto-isolator works like a relay but without the magnetic pulse problem. Instead of a magnetic field, it uses an LED (internally, inside the chip where you can't see it) and a photo sensor. I bought some opto-isolators a while back and had been looking for an excuse to try one. Actually two: I needed one for the focus pin and one for the shutter pin.

How do you choose which opto-isolator to use out of the gazillion options available in a components catalog? I don't know, but when I bought a selection of them a few years ago, it included a 4N25, 4N26 and 4N27, which seem to be popular and well documented, as well as a few other models that are so unpopular I couldn't even find a datasheet for them. So I went with the 4N25.

Wiring an opto-isolator is easy. You do need a resistor across the inputs (presumably because it's an LED). 380Ω is apparently a good value for the 4N25, but it's not critical. I didn't have any 380Ω but I had a bunch of 330Ω so that's what I used. The inputs (the signals from the Arduino) go between pins 1 and 2, with a resistor; the outputs (the wires to the camera remote plug) go between pins 4 and 5, as shown in the diagram on this Arduino and Opto-isolators discussion, except that I didn't use any pull-up resistor on the output.
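As a rough sanity check (my own back-of-envelope, assuming the Arduino's 5V output and a typical forward drop of about 1.2V across the opto-isolator's internal LED): I ≈ (5 V − 1.2 V) / 330 Ω ≈ 11.5 mA, a comfortable drive current for the 4N25's input LED.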

Then you just need a simple Arduino program to drive the inputs. Apparently the camera wants to see a focus half-press before it gets the input to trigger the shutter, so I put in a slight delay there, and another delay while I "hold the shutter button down" before releasing both of them.

Here's some Arduino code to shoot a photo every ten seconds:

int focusPin = 6;        // drives the opto-isolator on the focus (ring) contact
int shutterPin = 7;      // drives the opto-isolator on the shutter (tip) contact

int focusDelay = 50;     // ms to hold focus before firing the shutter
int shutterOpen = 100;   // ms to hold the "shutter button" down
int betweenPictures = 10000;  // ms between pictures: shoot every ten seconds

void setup()
{
    pinMode(focusPin, OUTPUT);
    pinMode(shutterPin, OUTPUT);
}

void snapPhoto()
{
    digitalWrite(focusPin, HIGH);
    delay(focusDelay);
    digitalWrite(shutterPin, HIGH);
    delay(shutterOpen);
    digitalWrite(shutterPin, LOW);
    digitalWrite(focusPin, LOW);
}

void loop()
{
    delay(betweenPictures);
    snapPhoto();
}

Naturally, since then we haven't had any dramatic clouds, and the lightning storms have all been late at night after I went to bed. (I don't want to leave my nice camera out unattended in a rainstorm.) But my intervalometer seemed to work fine in short tests. Eventually I'll make some actual time-lapse movies ... but that will be a separate article.

July 17, 2014 12:31 AM

July 12, 2014

Akkana Peck

Trapped our first pack rat

[White throated woodrat in a trap] One great thing about living in the country: the wildlife. I love watching animals and trying to photograph them.

One down side of living in the country: the wildlife.

Mice in the house! Pack rats in the shed and the crawlspace! We found out pretty quickly that we needed to learn about traps.

We looked at traps at the local hardware store. Dave assumed we'd get simple snap-traps, but I wanted to try other options first. I'd prefer to avoid killing if I don't have to, especially killing in what sounds like a painful way.

They only had one live mousetrap. It was a flimsy plastic thing, and we were both skeptical that it would work. We made a deal: we'd try two of them for a week or two, and when (not if) they didn't work, then we'd get some snap-traps.

We baited the traps with peanut butter and left them in the areas where we'd seen mice. On the second morning, one of the traps had been sprung, and sure enough, there was a mouse inside! Or at least a bit of fur, bunched up at the far inside end of the trap.

We drove it out to open country across the highway, away from houses. I opened the trap, and ... nothing. I looked in -- yep, there was still a furball in there. Had we somehow killed it, even in this seemingly humane trap?

I pointed the open end down and shook the trap. Nothing came out. I shook harder, looked again, shook some more. And suddenly the mouse burst out of the plastic box and went HOP-HOP-HOPping across the grass away from us, bounding like a tiny kangaroo over tufts of grass, leaving us both giggling madly. The entertainment alone was worth the price of the traps.

Since then we've seen no evidence of mice inside, and neither of the traps has been sprung again. So our upstairs and downstairs mice must have been the same mouse.

But meanwhile, we still had a pack rat problem (actually, probably, white-throated woodrats, the creature that's called a pack rat locally). Finding no traps for sale at the hardware store, we went to Craigslist, where we found a retired wildlife biologist just down the road selling three live Havahart rat traps. (They also had some raccoon-sized traps, but the only raccoon we've seen has stayed out in the yard.)

We bought the traps, adjusted one a bit where its trigger mechanism was bent, baited them with peanut butter and set them in likely locations. About four days later, we had our first captive little brown furball. Much smaller than some of the woodrats we've seen; probably just a youngster.

[White throated woodrat bounding away] We drove quite a bit farther than we had for the mouse. Woodrats can apparently range over a fairly wide area, and we didn't want to let it go near houses. We hiked a little way out on a trail, put the trap down and opened both doors. The woodrat looked up, walked to one open end of the trap, decided that looked too scary; walked to the other open end, decided that looked too scary too; and retreated back to the middle of the trap.

We had to tilt and shake the trap a bit, but eventually the woodrat gathered up its courage, chose a side, darted out and HOP-HOP-HOPped away into the bunchgrass, just like the mouse had.

No reference I've found says anything about woodrats hopping, but the mouse did that too. I guess hopping is just what you do when you're a rodent suddenly set free.

I was only able to snap one picture before it disappeared. It's not in focus, but at least I managed to catch it with both hind legs off the ground.

July 12, 2014 06:05 PM

July 08, 2014

Elizabeth Krumbach

OpenStack Infrastructure July 2014 Bug Day

Today the OpenStack Infrastructure team hosted our first bug day of the cycle.

The Killing Jar; the last moments of a Pararge aegeria

The steps we have for running a bug day can be a bit tedious, but it’s not hard. Here’s the rundown:

  1. I create our etherpad: cibugreview-july2014 (see etherpad from past bug days on the wiki at: InfraTeam#Bugs)
  2. I run my simple infra_bugday.py script and populate the etherpad.
  3. Grab the bug stats from Launchpad and copy them into the pad so we (hopefully) have inspiring statistics at the end of the day.
  4. Then comes the real work. I open up the old etherpad and go through all the bugs, copying over comments from the old etherpad where applicable and making my own comments as necessary about obvious updates I see (and updating my own bugs).
  5. Let the rest of the team dive in on the etherpad and bugs!

Throughout the day we chat in #openstack-infra about bug statuses, discuss whether we should continue pursuing certain strategies outlined in bugs, and reach out to folks who have outstanding bugs in the tracker that we’d like to see movement on but haven’t in a while. Plus, we get to triage a whole pile of New bugs and close others we may have lost track of.

As we wrap up, here are the stats from today:

Bug day start total open bugs: 281

  • 64 New bugs
  • 41 In-progress bugs
  • 5 Critical bugs
  • 22 High importance bugs
  • 2 Incomplete bugs

Bug day end total open bugs: 231

  • 0 New bugs
  • 33 In-progress bugs
  • 4 Critical bugs
  • 16 High importance bugs
  • 10 Incomplete bugs

Thanks again everyone!

by pleia2 at July 08, 2014 10:43 PM

Akkana Peck

Big and contrasty mouse cursors

[Big mouse cursor from Comix theme] My new home office with the big picture windows and the light streaming in comes with one downside: it's harder to see my screen.

A sensible person would, no doubt, keep the shades drawn when working, or move the office to a nice dim interior room without any windows. But I am not sensible and I love my view of the mountains, the gorge and the birds at the feeders. So accommodations must be made.

The biggest problem is finding the mouse cursor. When I first sit down at my machine, I move my mouse wildly around looking for any motion on the screen. But the default cursors, in X and in most windows, are little subtle black things. They don't show up at all. Sometimes it takes half a minute to figure out where the mouse pointer is.

(This wasn't helped by a recent bug in Debian Sid where the USB mouse would disappear entirely, and need to be unplugged from USB and plugged back in before the computer would see it. I never did find a solution to that, and for now I've downgraded from Sid to Debian testing to make my mouse work. I hope they fix the bug in Sid eventually, rather than porting whatever "improvement" caused the bug to more stable versions. Dealing with that bug trained me so that when I can't see the mouse cursor, I always wonder whether I'm just not seeing it, or whether it really isn't there because the kernel or X has lost track of the mouse again.)

What I really wanted was bigger mouse cursor icons in bright colors that are visible against any background. This is possible, but it isn't documented at all. I did manage to get much better cursors, though different windows use different systems.

So I wrote up what I learned. It ended up too long for a blog post, so I put it on a separate page: X Cursor Themes for big and contrasty mouse cursors.

It turned out to be fairly complicated. You can replace the existing cursor font, or install new cursor "themes" that many (but not all) apps will honor. You can change theme name and size (if you choose a scalable theme), and some apps will honor that. You have to specify theme and size separately for GTK apps versus other apps. I don't know what KDE/Qt apps do.
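For the record, here's roughly what those settings look like (a sketch, with placeholder theme name and size, not a recommendation). Most X apps honor X resources, e.g. in ~/.Xresources:

Xcursor.theme: some-theme-name
Xcursor.size: 48

while GTK 3 apps read their own settings, e.g. in ~/.config/gtk-3.0/settings.ini:

[Settings]
gtk-cursor-theme-name = some-theme-name
gtk-cursor-theme-size = 48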

I still have a lot of unanswered questions. In particular, I was unable to specify a themed cursor for xterm windows, and for non text areas in emacs and firefox, and I'd love to know how to do that.

But at least for now, I have a great big contrasty blue mouse cursor that I can easily see, even when I have the shades on the big windows open and the light streaming in.

July 08, 2014 04:25 PM

July 04, 2014

Elizabeth Krumbach

Google I/O 2014

Last week I had the opportunity to attend Google I/O. As someone who has only done hardware-focused development as a hobby, I’d never been to a vendor-specific developer conference. Google’s was a natural choice for me: I’m a fan of Android (yay Linux!) and their events always have cool goodies for attendees, plus it was 2 blocks from home. My friend Danita attended with me, so it was also nice to not go alone.

We registered on Tuesday, before the conference. Wednesday we headed over at what we thought was early, but after picking up breakfast and getting in line for the 9:00 keynote at 8:25 we found ourselves in a line that had wrapped around the whole block of Moscone West + Intercontinental! The keynote began while we were very much still in line, and we didn’t get into the main room until around 9:30. The line was still wrapped around the building when we got in, so I can’t imagine how late the other folks got in; many of them must have ended up in some kind of overflow room, since we got some of the last few seats in the main room.

Once we got in, the keynote itself was fun. It covered Android design, Android Wear including a couple of watches that we later learned we’d get to take home (woo! One we could pick up the next day, the other later this year when it’s released), and Android Auto, which has partnerships with several vehicle manufacturers that will start coming out later this year (they didn’t give us one of these though). They also talked about Android TV, which was nice to hear about since it always seemed a bit strange that they had separate divisions for the OS they run on tablets/phones, on TVs and for Google Fiber. The keynote wrapped up by talking about Google’s cloud offerings.

By the time the keynote had finished at 11:40 the first session was pretty much over, so I grabbed some lunch and then made my way over to Who cares about new domain names? We do. If you want happy users, then you should too. In this session they announced their initiative to sell domain names as a registrar and then, most interestingly, dove into the details related to how the new naming scheme will impact web and application development when it comes to URL and email validation. Beyond just parsing of more domains, there are now considerations for UTF-8 characters included in some new domain names and how that works with DNS. I particularly liked that they showed off some of the problems Google itself was having with applications like GMail when it comes to these new domains, and how they fixed them.
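For example (my own illustration, not from the talk), an internationalized label reaches DNS as its Punycode/IDNA form, which validation code has to accept:

>>> u'münchen.example'.encode('idna')
'xn--mnchen-3ya.example'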

The next session I went to was Making sense of online course data. I’m a big fan of MOOCs, so this was of particular interest to me. Peter Norvig and Julia Wilkowski discussed some of Google’s initiatives in developing MOOCs and what they’ve learned from their students following each one. It was refreshing to hear that they were catering to the educational needs of the students by going as far as completely breaking down old course models and doing things like offering search tools for classes if students only want to complete a portion of it, making all materials and schedule (including quiz dates and deadlines) available at the beginning and largely giving the students the ability to create their own lesson plans based on what they want to learn.

We also found time in the day to check out vendor booths and get our pictures taken with giant Androids!

The last session I attended the first day was HTTPS Everywhere. As a technical person, I’m very careful about my online activities and simply avoid authentication for non-HTTPS sites when I’m not on a trusted network. The main argument that kicked off this talk was that most people don’t do that, plus the cumulative effect of having all your HTTP-based traffic sniffed can be a major privacy violation even if there are no authentication details leaked. Fair enough. The rest of the talk covered tools and tips for migrating a site to be HTTPS-only, including how to do things properly so that your search rankings don’t plummet during the switch. Some of the key resources I gleaned from this talk include:

The first day after-party was held by the conference in Yerba Buena park, and I got plenty of rest before Thursday morning, when we got our chance to check out Android Auto in one of the several cars they got up to the 3rd floor for demoing! As someone who has almost always driven older cars, I am concerned about how dated the Android Auto technology will quickly become, but it does seem better than many of the current dead-end technologies shipping with cars today that are fully built in.

We also got to pick up our Android watch! After finally tracking down the developer info and charging it for a bit, I was able to get mine going. It’s still pretty buggy, but it is nice to get alerts on my wrist without having to pull my phone out of my purse.

Session-wise, we started out with the packed Cardboard: VR for Android session. Google Cardboard sure seems like a joke, but it’s actually a pretty cool way to use your phone for a cheap Virtual Reality environment. The session covered some of the history of the project (and of VR in general), the current apps available to try out for Cardboard and some ideas for developers.

From there I went to Transforming democracy and disasters with APIs. After seeing a presentation on Poplus when I was in Croatia, I was interested to see what Google was doing in the space of civic hacking, and was pleasantly surprised! Many of these sorts of organizations – Code for America, Poplus, Google’s initiatives – actually make an effort to work together in this space. Some of the things Google has been focusing on include getting voting data to people (including who their representative is and where polling places are), accessible via the Google Civic Information API. They also talked some about the Common Alerting Protocol (CAP), an XML standard that Google is trying to help encourage adoption of so that their services and others can more easily consume alerts worldwide for tools that use these feeds to alert populations. From this, they talked about various other sites, including:

And many more 3rd party APIs documented in this Google doc.

After lunch I went to the very crowded Nest for developers session. Even after watching this I am somewhat skeptical about how much more home automation you can get from a system that started with a thermostat and still focuses on environmental control. On the flip side, I’ve actually seen Nest “in the wild” so perhaps it gets closer to home automation than most other technologies have in this space.

Continuing my interest in sessions about civic good, I then attended Maps for good: Saving trees and saving lives with petapixel-scale computing. Presenter Rebecca Moore started off with this great story about how she stopped a very bad logging plan in her area by leveraging maps and other technology tools to give presentations around her community (see here for more). Out of her work here, and further 20% work at Google, came the birth of the initiative she currently works on full time, Google Earth Outreach.

Google Earth Outreach gives nonprofits and public benefit organizations the knowledge and resources they need to visualize their cause and tell their story in Google Earth & Maps to hundreds of millions of people.

Pretty cool stuff. She spoke more in depth about some really map geek stuff, including collection and inclusion of historical and current Landsat data in Google Earth, as well as the tools now available for organizations looking to process map data now and over time for everything from disaster relief to tracking loss of habitat.

The last slot of the day was a contentious one, so many things I wanted to see! Fortunately it’s all recorded so I can go back and see the ones I missed. I decided to go to Strengthening communities with technology: A discussion with Bay Area Impact Challenge finalists. This session featured three bay-area organizations who have been doing good in the community:

  • One Degree – “The easiest way to find, manage, and share nonprofit services for you and your family.”
  • Hack the Hood – “Hack the Hood provides technical training in high in-demand multimedia and tech skills to youth who will then apply their learning through real-world consulting projects with locally-owned businesses and non-profits.”
  • Beyond 12 – “Ensuring all students have the opportunity to succeed in college and beyond.”

Google brought these organizations together as finalists in their Bay Area Impact Challenge, from which they all received large grants. There were some interesting observations from all these organizations. On the technical side, I learned that most low-income people in the bay area have a smartphone, whereas only half have a computer and internet at home. There was also higher access to text messaging than to email, which was an important consideration when some organizations were launching their online services – it’s better to rely on text for registration than email. They also all work with existing organizations and are very involved with the communities they serve, so they make sure they are meeting the needs of those communities – which may seem obvious, but many technical initiatives for under-served communities fail because they are solving the wrong problem, have the wrong solution or aren’t very accessible.

And with that, Google I/O came to a close!

In all, it was a worthwhile experience, but as someone who is not doing application development as my core job function it was more “fun and interesting” than truly valuable (particularly with the $900 price tag). I sure do enjoy my Android watch though! And am looking forward to the round face version coming out in a few months (which we’ll also get one of!).

More photos from the event here: https://www.flickr.com/photos/pleia2/sets/72157645456636793/

by pleia2 at July 04, 2014 03:18 AM

Akkana Peck

Detecting wildlife with a PIR sensor (or not)

[PIR sensor] In my last crittercam installment, the NoIR night-vision crittercam, I was having trouble with false positives, where the camera would trigger repeatedly after dawn as leaves moved in the wind and the morning shadows marched across the camera's field of view. I wondered if a passive infra-red (PIR) sensor would be the answer.

I got one, and the answer is: no. It was very easy to hook up, and didn't cost much, so it was a worthwhile experiment; but it gets nearly as many false positives as camera-based motion detection. It isn't as sensitive to wind, but as the ground and the foliage heat up at dawn, the moving shadows are just as much a problem as they were with image-based motion detection.

Still, I might be able to combine the two, so I figure it's worth writing up.

Reading inputs from the HC-SR501 PIR sensor

[PIR sensor pins]

The PIR sensor I chose was the common HC-SR501 module. It has three pins -- Vcc, ground, and signal -- and two potentiometer adjustments.

It's easy to hook up to a Raspberry Pi because it can take 5 volts in on its Vcc pin, but its signal is 3.3v (a digital signal -- either motion is detected or it isn't), so you don't have to fool with voltage dividers or other means to get a 5v signal down to the 3v the Pi can handle. I used GPIO pin 7 for signal, because it's right on the corner of the Pi's GPIO header and easy to find.

There are two ways to track a digital signal like this. Either you can poll the pin in an infinite loop:

import time
import RPi.GPIO as GPIO

pir_pin = 7
sleeptime = 1

GPIO.setmode(GPIO.BCM)
GPIO.setup(pir_pin, GPIO.IN)

while True:
    if GPIO.input(pir_pin):
        print "Motion detected!"
    time.sleep(sleeptime)

or you can use interrupts: tell the Pi to call a function whenever it sees a low-to-high transition on a pin:

import time
import RPi.GPIO as GPIO

pir_pin = 7
sleeptime = 300

def motion_detected(pir_pin):
    print "Motion Detected!"

GPIO.setmode(GPIO.BCM)
GPIO.setup(pir_pin, GPIO.IN)

GPIO.add_event_detect(pir_pin, GPIO.RISING, callback=motion_detected)

while True:
    print "Sleeping for %d sec" % sleeptime
    time.sleep(sleeptime)

Obviously the second method is more efficient. But I already had a loop set up checking the camera output and comparing it against previous output, so I tried that method first, adding support to my motion_detect.py script. I set up the camera pointing at the wall, and, as root, ran the script telling it to use a PIR sensor on pin 7, and the local and remote directories to store photos:

# python motion_detect.py -p 7 /tmp ~pi/shared/snapshots/

Whenever I walked in front of the camera, it triggered and took a photo. That was easy!

Reliability problems with add_event_detect

So easy that I decided to switch to the more efficient interrupt-driven model. Writing the code was easy, but I found it triggered more often: if I walked in front of the camera (and stayed the requisite 7 seconds or so that it takes raspistill to get around to taking a photo), when I walked back to my desk, I would find two photos, one showing my feet and the other showing nothing. It seemed like it was triggering when I got there, but also when I left the scene.

A bit of web searching indicates this is fairly common: that with RPi.GPIO a lot of people see triggers on both rising and falling edges -- e.g. when the PIR sensor starts seeing motion, and when it stops seeing motion and goes back to its neutral state -- when they've asked for just GPIO.RISING. Reports for this go back to 2011.

On the other hand, it's also possible that instead of seeing a GPIO falling edge, what was happening was that I was getting multiple calls to my function while I was standing there, even though the RPi hadn't finished processing the first image yet. To guard against that, I put a line at the beginning of my callback function that disabled further callbacks, then I re-enabled them at the end of the function after the Pi had finished copying the photo to the remote filesystem. That reduced the false triggers, but didn't eliminate them entirely.
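Something like this (a reconstruction of the idea, not my exact code; take_photo() is a hypothetical stand-in):

def take_photo():
    # hypothetical: run raspistill and copy the photo to the
    # remote filesystem
    pass

def motion_detected(pir_pin):
    GPIO.remove_event_detect(pir_pin)    # ignore triggers while busy
    take_photo()
    GPIO.add_event_detect(pir_pin, GPIO.RISING, callback=motion_detected)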

Oh well. The sun was getting low by this point, so I stopped fiddling with the code and put the camera out in the yard with a pile of birdseed and peanut suet nuggets in front of it. I powered on, sshed to the Pi and ran the motion_detect script, came back inside and ran a tail -f on the output file.

I had dinner and worked on other things, occasionally checking the output -- nothing! Finally I sshed to the Pi and ran ps aux and discovered the script was no longer running.

I started it again, this time keeping my connection to the Pi active so I could see when the script died. Then I went outside to check the hardware. Most of the peanut suet nuggets were gone -- animals had definitely been by. I waved my hands in front of the camera a few times to make sure it got some triggers.

Came back inside -- to discover that Python had gotten a segmentation fault. It turns out that nifty GPIO.add_event_detect() code isn't all that reliable, and can cause Python to crash and dump core. I ran it a few more times and sure enough, it crashed pretty quickly every time. Apparently GPIO.add_event_detect needs a bit more debugging, and isn't safe to use in a program that has to run unattended.

Back to polling

Bummer! Fortunately, I had saved the polling version of my program, so I hastily copied that back to the Pi and started things up again. I triggered it a few times with my hand, and everything worked fine. In fact, it ran all night and through the morning, with no problems except the excessive number of false positives, already mentioned.

[piñon mouse] False positives weren't a problem at all during the night. I'm fairly sure the problem happens when the sun starts hitting the ground. Then there's a hot spot that marches along the ground, changing position in a way that's all too obvious to the infra-red sensor.

I may try cross-checking between the PIR sensor and image changes from the camera. But I'm not optimistic about that working: they both get the most false positives at the same times, at dawn and dusk when the shadow angle is changing rapidly. I suspect I'll have to find a smarter solution, doing some image processing on the images as well as cross-checking with the PIR sensor.

I've been uploading photos from my various tests here: Tests of the Raspberry Pi Night Vision Crittercam. And as always, the code is on github: scripts/motioncam with some basic documentation on my site: motion-detect.py: a motion sensitive camera for Raspberry Pi or other Linux machines. (I can't use github for the documentation because I can't seem to find a way to get github to display html as anything other than source code.)

July 04, 2014 02:13 AM

July 01, 2014

Jono Bacon

Getting Started in Community Management

If there is one question I get more than most, it is the proverbial:

How do I get started in community management?

While there are many tactical things to learn about building strong communities (which I cover in depth in The Art of Community), the main guidance I am keen to share is the importance of leadership.

Last night, while working in my hotel room, I bored myself writing up my thoughts in a blog post and just fired up my webcam:

Can’t see it? See it here

If you want to get involved in community management, be sure to join the awesome community that is forming on the Community Leadership Forum and if possible, join us on the 18th and 19th July 2014 in Portland for the Community Leadership Summit.

by jono at July 01, 2014 04:31 PM

June 28, 2014

Elizabeth Krumbach

Symphony, giraffes and pinnipeds

Prior to my trips to Texas and Croatia, MJ and I were able to make it over to Sherith Israel to enjoy the wonderful acoustics at a show by the Musicians of the San Francisco Symphony, a concert to benefit the SF-Marin Food Bank. It was a wonderful concert, and a wonderful way to round out a busy weekend before my trips.


During intermission

Last weekend our friend Danita came into town to visit for a week. Saturday we spent with a leisurely brunch at the Beach Chalet, one of my favorites. From there we went to the San Francisco Zoo to catch up with our new little friend, the baby patas monkey, who has grown even more since my last visit a couple weeks ago!

We visited with the giraffes, as is appropriate since it was World Giraffe Day. I also got to finally see the peccary babies, but we were too late to make it into the lion house by 4pm to visit the two-toed sloth who I’ve never met. Next time.

On Sunday we went the amusement park route and made our way up to Six Flags Discovery Kingdom. Given my health lately, I wasn’t keen on going on any rides, but I learned a while back that this park has walruses (the only place in the bay area that does), along with lots of other animals, so I was pretty excited.

The walruses didn’t disappoint. One of the larger of the three seemed thrilled to delight the humans who were visiting their tank:

And the rest swam around doing walrus things. It was awesome to see them, I’m a general pinniped fan but I don’t get to see walruses all that often.

I also got to visit the seals and sea lions, and got to feed a mamma sea lion, the baby was a bit too shy.

Continuing on our giraffe trend, we also got to visit the giraffes there at the park as they celebrated a whole weekend of World Giraffe Day!

More photos from Six Flags here (I even got one of a roller coaster!): https://www.flickr.com/photos/pleia2/sets/72157645359472733/

Then I had a busy week. I attended Google I/O for the first time, which I’ll write about later. I also had an Upper Endoscopic Ultrasound (EUS) done to poke around to see what was going on with my gallbladder. The worst part about the procedure was the sore throat and mild neck bruising I had following it, which hasn’t made me feel great when coupled with the cough I’m recovering from. The doctor looking at the initial results mentioned sludge, but didn’t think there was cause for concern; upon follow-up with the surgeon I’ve been working with, though, I learned that the amount of sludge, combined with my symptoms and family history, made him think the right course of action would be gallbladder removal. I’m scheduled to have it removed on July 24th. I’ve never had surgery aside from wisdom teeth removal, so I’m pretty apprehensive about the procedure, but thankful that they finally found something, so there is hope that the abdominal pain I’ve been having since April will finally go away.

by pleia2 at June 28, 2014 07:18 PM

June 27, 2014

Jono Bacon

Exponential Community

As some of you will know, recently I moved from Canonical to XPRIZE to work as Sr. Dir. Community. My charter here at XPRIZE is to infuse the organization and the incentive prizes it runs with community engagement.

For those unfamiliar with XPRIZE, it was created by Peter H. Diamandis to solve the grand challenges of our time through incentive prize competitions. The first XPRIZE was the $10 million Ansari XPRIZE for a spacecraft that could fly to space and back twice in two weeks while carrying three crew. It was won by Scaled Composites with their SpaceShipOne craft, and the technology ultimately led to the birth of the commercial space-flight industry. Other prizes have focused on ocean health, more efficient vehicles, portable health diagnosis, and more.

The incentivized prize model is powerful. It is accessible to anyone with the drive to compete; it results in hundreds of teams engaging in extensive R&D; only the winner gets paid; and the competitive nature generally spills over into market competition, which in turn drives even more affordable and accessible technology to be built.

The XPRIZE model is part of Peter Diamandis’s vision of exponential technology. In a nutshell, Peter has identified that technology is doubling every year, across a diverse range of areas (not just computing), and that technology can ultimately solve our grand challenges such as scarcity, clean water, illiteracy, space exploration, clean energy, and more. If you are interested in finding out more, read Abundance; it really is an excellent and inspirational read.

When I was first introduced to XPRIZE, the piece of the model that inspired me was that collaboratively we can solve grand challenges we couldn’t solve alone. Regular readers of my work will know that this is precisely the same attribute in communities that I find so powerful.

As such, connecting the dots between incentivized prizes that solve grand challenges and effective, empowering community management has the potential for a profound impact on the world.


The XPRIZE Lobby.

My introduction to XPRIZE helped me realize that the exponential growth Peter sees in technology is also a key ingredient in how communities work. While not as crisply predictable (a doubling of community does not necessarily mean a doubling of output), we have seen time and time again that when communities build momentum, their overall output (irrespective of their specific size) can grow exponentially.

An example of this is Wikipedia. From the inception of the site, the tremendous growth of the community resulted in huge growth not just in the site but in the value the site brought to users (as value is often defined by completeness). Another example is Linux. When the Linux kernel was authored only by Linus Torvalds, it had limited growth; the huge community that formed around it has produced a technology that has changed how the world’s technology infrastructure runs. We also have political examples such as the Arab Spring, in which social media helped empower large swathes of citizens to challenge their leaders. Again, as the community grew, so did the potency of its goals.

XPRIZE plays a valuable role because exponential growth in technology does not necessarily mean that the technology will be built. Traditionally, only governments were solving the grand challenges of our time because companies found it difficult to understand or define a market. XPRIZE competitions put a solid stake in the ground that highlights the problem, legitimizes the development of the technology with a clear goal and prize purse, and empowers fair participation.

The raw ingredients (smart people with drive and passion) for solving these challenges are already out there, and XPRIZE works to mobilize them. In a similar fashion, the raw ingredients for creating globally impactful communities are there; we just need to activate them.

So what will I be doing at XPRIZE to build community engagement?

Well, I have only been here a few weeks, so my priorities right now are some near-term goals and getting to know the team and culture; I don’t have anything concrete I can share yet. I assure you though, I will be talking more about my work in the coming months.

You can stay connected to this work via this blog, my Twitter account, and my Google+ account. Also, be sure to follow XPRIZE to stay up to date with the general work of the organization.

by jono at June 27, 2014 04:42 PM

June 26, 2014

Akkana Peck

A Raspberry Pi Night Vision Camera

[Mouse caught on IR camera]

When I built my Raspberry Pi motion camera (http://shallowsky.com/blog/hardware/raspberry-pi-motion-camera.html, and part 2), I always had the NoIR camera in the back of my mind. The NoIR is a version of the Pi camera module with the infra-red blocking filter removed, so you can shoot IR photos at night without disturbing nocturnal wildlife (or alerting nocturnal burglars, if that's your target).

After I got the daylight version of the camera working, I ordered a NoIR camera module and plugged it in to my RPi. I snapped some daylight photos with raspistill and verified that it was connected and working; then I waited for nightfall.

In the dark, I set up the camera and put my cup of hot chocolate in front of it. Nothing. I hadn't realized that although CCD cameras are sensitive in the near IR, the wavelengths only slightly longer than visible light, they aren't sensitive anywhere near the IR wavelengths that hot objects emit. For that, you need a special thermal camera. For a near-IR CCD camera like the Pi NoIR, you need an IR light source.

Knowing nothing about IR light sources, I did a search and came up with something called an "Infrared IR 12 Led Illuminator Board Plate for CCTV Security CCD Camera" for about $5. It seemed similar to the light sources used on a few pages I'd found for home-made night vision cameras, so I ordered it. Then I waited, because I stupidly didn't notice until a week and a half later that it was coming from China and wouldn't arrive for three weeks. Always check the shipping time when ordering hardware!

When it finally arrived, it had a tiny 2-pin connector that I couldn't match locally. In the end I bought a package of female-female SchmartBoard jumpers at Radio Shack, which were small enough to make decent contact on the light's tiny-gauge power and ground pins. I soldered up a connector that would let me use a universal power supply, taking a guess that it wanted 12 volts (most of the cheap LED rings for CCD cameras seem to be 12V, though this one came with no documentation at all). I was ready to test.

Testing the IR light

[IR light and NoIR Pi camera]

One problem with buying a cheap IR light with no documentation: how do you tell whether your power supply is working, when the light it emits is completely invisible?

The only way to find out was to check on the Pi. I didn't want to have to run back and forth between the dark room where the camera was set up and the desktop where I was viewing raspistill images. So I started a video stream on the RPi:

$ raspivid -o - -t 9999999 -w 800 -h 600 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/}' :demux=h264

Then, on the desktop, I ran vlc and opened the network stream:
rtsp://pi:8554/
(I have a "pi" entry in /etc/hosts, but using an IP address also works).

Now I could fiddle with hardware in the dark room while looking through the doorway at the video output on my monitor.

It took some fiddling to get a good connection on that tiny connector ... but eventually I got a black-and-white view of my darkened room, just as I'd expect under IR illumination. I poked some holes in the milk carton and used twist-ties to secure the light source next to the NoIR camera.

Lights, camera, action

Next problem: mute all the blinkenlights, so my camera wouldn't look like a Christmas tree and scare off the nocturnal critters.

The Pi itself has a relatively dim red run light, and it's inside the milk carton so I wasn't too worried about it. But the Pi camera has quite a bright red light that goes on whenever the camera is being used. Even through the thick milk carton bottom, it was glaring and obvious. Fortunately, you can disable the Pi camera light: edit /boot/config.txt and add this line:

disable_camera_led=1

My USB wi-fi dongle has a blue light that flickers as it gets traffic. Not super bright, but attention-grabbing. I addressed that issue with a triple thickness of duct tape.

The IR LEDs -- remember those invisible, impossible-to-test LEDs? Well, it turns out that in darkness, they emit a faint but still easily visible glow. Obviously there's nothing I can do about that -- I can't cover the camera's only light source! But it's quite dim, so with any luck it's not spooking away too many animals.

Results, and problems

For most of my daytime testing I'd used a threshold of 30 -- meaning a pixel was considered to have changed if its value differed by more than 30 from the previous photo. That didn't work at all in IR: changes are much more subtle, since we're seeing an essentially black-and-white image, and I had to divide the threshold by three, down to a sensitivity of 10 or 11, if I wanted the camera to trigger at all.
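
To make the threshold idea concrete, here's a minimal Python sketch of that sort of pixel-difference test, using PIL. It's not the actual motion-camera code, and the filenames and the 1% trigger level are just placeholders:

from PIL import Image

def count_changed_pixels(file1, file2, threshold=10):
    # Count pixels whose grayscale value differs by more than
    # threshold between two same-sized images.
    img1 = Image.open(file1).convert('L')   # 'L' = 8-bit grayscale
    img2 = Image.open(file2).convert('L')
    pix1, pix2 = img1.load(), img2.load()
    w, h = img1.size
    return sum(1 for x in range(w) for y in range(h)
               if abs(pix1[x, y] - pix2[x, y]) > threshold)

w, h = Image.open('prev.jpg').size
# Trigger if more than 1% of the pixels changed:
if count_changed_pixels('prev.jpg', 'cur.jpg', threshold=10) > w * h / 100:
    print("Motion detected!")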

With that change, I did capture some nocturnal visitors, and some early morning ones too. Note the funny colors on the daylight shots: that's why cameras generally have IR-blocking filters if they're not specifically intended for night shots.

[mouse] [rabbit] [rock squirrel] [house finch]

Here are more photos, and larger versions of those: Images from my night-vision camera tests.

But I'm not happy with the setup. For one thing, it has far too many false positives. Maybe one out of ten or fifteen images actually has an animal in it; the rest just triggered because the wind made the leaves blow, or because a shadow moved or the color of the light changed. A simple count of differing pixels is clearly not enough for this task.

Of course, the software could be smarter about things: it could try to identify large blobs that had changed, rather than small changes (blowing leaves) all over the image. I already know SimpleCV runs fine on the Raspberry Pi, and I could try using it to do object detection.
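
Blob detection looks like the natural tool for that. Here's a rough sketch of the idea with SimpleCV -- untested on the Pi, and the 500-pixel minimum blob size is a pure guess:

from SimpleCV import Image

# Difference two frames, then look for large connected regions of
# change, ignoring small scattered flickers like blowing leaves:
prev = Image("prev.jpg").grayscale()
cur = Image("cur.jpg").grayscale()
diff = cur - prev                       # pixelwise difference image
blobs = diff.findBlobs(minsize=500)     # ignore blobs under 500 pixels
if blobs:
    biggest = max(blobs, key=lambda b: b.area())
    print("Biggest changed region: %d pixels" % biggest.area())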

But there's another problem with detection purely through camera images: the Pi is incredibly slow to capture an image. It takes around 20 seconds per cycle; some of that is waiting for the network but I think most of it is the Pi talking to the camera. With quick-moving animals, the animal may well be gone by the time the system has noticed a change. I've caught several images of animal tails disappearing out of the frame, including a quail who visited yesterday morning. Adding smarts like SimpleCV will only make that problem worse.

So I'm going to try another solution: hooking up an infra-red motion detector. I'm already working on setting up tests for that, and should have a report soon. Meanwhile, pure image-based motion detection has been an interesting experiment.

June 26, 2014 07:31 PM

June 25, 2014

Jono Bacon

Community Leadership Forum

A little while ago I set up the Community Leadership Forum. The forum is designed to be a place where community leaders and managers can learn and share experience about how to grow fun, productive, and empowered communities.

The forum is open and accessible to all communities – technology, social, environmental, entertainment, or anything else. It is intended to be diverse and pull together a great set of people.

It is also designed to be another tool (in addition to the Community Leadership Summit) to further the profession, art, and science of building great communities.

We are seeing some wonderful growth on the forum, and because the forum is powered by Discourse it is a simple pleasure to use.

I am also encouraging organizations who are looking for community managers to share their job descriptions on the forum. This forum will be a strong place to find the best talent in community management and for the talent to find great job opportunities.

I hope to see you there!

Join the Community Leadership Forum

by jono at June 25, 2014 04:29 PM

June 24, 2014

Jono Bacon

The Return of my Weekly Q&A

As many of you will know, I used to do a weekly Q&A on Ubuntu On Air for the Ubuntu community where anyone could come and ask any question about anything.

I am pleased to announce my weekly Q&A is coming back but in a new time and place. Now it will be every Thursday at 6pm UTC (6pm UK, 7pm Europe, 11am Pacific, 2pm Eastern), starting this week.

You can join each weekly session at http://www.jonobacon.org/live/

You are welcome to ask questions about:

  • Community management, leadership, and how to build fun and productive communities.
  • XPRIZE, our work there, and how we solve the world’s grand challenges.
  • My take on Ubuntu from the perspective of an independent community member.
  • My views on technology, Open Source, news, politics, or anything else.

As ever, all questions are welcome! I hope to see you there!

by jono at June 24, 2014 05:09 AM

June 23, 2014

Eric Hammond

EBS-SSD Boot AMIs For Ubuntu On Amazon EC2

With Amazon’s announcement that SSD is now available for EBS volumes, they have also declared this the recommended EBS volume type.

The good folks at Canonical are now building Ubuntu AMIs with EBS-SSD boot volumes. In my preliminary tests, running EBS-SSD boot AMIs instead of EBS magnetic boot AMIs speeds up the instance boot time by approximately… a lot.

Canonical now publishes a wide variety of Ubuntu AMIs including:

  • 64-bit, 32-bit
  • EBS-SSD, EBS-SSD pIOPS, EBS-magnetic, instance-store
  • PV, HVM
  • in every EC2 region
  • for every active Ubuntu release

Matrix that out for reasonable combinations and you get 492 AMIs actively supported today.

On the Alestic.com blog, I provide a handy reference to the much smaller set of Ubuntu AMIs that match my generally recommended configurations for most popular uses.

I list AMIs for both PV and HVM, because different virtualization technologies are required for different EC2 instance types.

Where SSD is not available, I list the magnetic EBS boot AMI (e.g., Ubuntu 10.04 Lucid).

To access this list of recommended AMIs, select an EC2 region in the pulldown menu towards the top right of any page on Alestic.com.

If you like using the AWS console to launch instances, click on the orange launch button to the right of the AMI id.

The AMI ids are automatically updated using an API provided by Canonical, so you always get the freshest released images.
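
If you want to poke at that data yourself, the published image lists are easy to fetch and filter. Here's a rough Python sketch; the query URL and tab-separated layout reflect my understanding of Canonical's cloud-images service, not necessarily what Alestic.com actually uses:

import urllib2

# Released Ubuntu 14.04 ("trusty") EC2 images, one tab-separated
# line per AMI; adjust the release name for other versions:
URL = ('http://cloud-images.ubuntu.com/query/trusty/server/'
       'released.current.txt')
REGION = 'us-east-1'    # the EC2 region to filter on

for line in urllib2.urlopen(URL):
    fields = line.strip().split('\t')
    if REGION in fields:
        # Fields include the release, root store, architecture,
        # region, and AMI id:
        print('\t'.join(fields))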

Original article: http://alestic.com/2014/06/ec2-ebs-ssd-ami

by Eric Hammond at June 23, 2014 06:12 AM

June 21, 2014

Akkana Peck

Mirror a website using lftp

I'm helping an organization with some website work. But I'm not the only one working on the website, and there's no version control. I wanted an easy way to make sure all my files were up-to-date before I start to work on one ... a way to mirror the website, or at least specific directories, to my local disk.

Normally I use rsync -av over ssh to mirror directories, but this website is on a server that only offers ftp access. I've been using ncftp to copy files up one by one, but although ncftp's manual says it has a mirror mode and I found a few web references to that, I couldn't find anything telling me how to activate it.

Making matters worse, there are some large files that I don't need to mirror. The first time I tried to use get * in ncftp to get one directory, it spent 15 minutes trying to download a huge powerpoint file, then stalled and lost the connection. There are some big .doc and .docx files, too. And ncftp doesn't seem to have a way to exclude specific files.

Enter lftp. It has a mirror mode (with documentation, even!) which includes a -X option to exclude files matching specified patterns.

lftp also includes a -e option to pass commands -- like "mirror" -- to it on the command line. But the documentation doesn't say whether you can use more than one command at a time. So it seemed safer to start up an lftp session and pass a series of commands to it.

And that works nicely. Just set up the list of directories you want to mirror, and you can write a nice shell function to put in your .zshrc or .bashrc:

sitemirror() {
  # Build one lftp "mirror" command per directory, skipping the
  # big presentation and document files we don't want:
  commands=""
  for dir in thisdir thatdir theotherdir
  do
    commands="$commands
mirror --only-newer -vvv -X '*.ppt' -X '*.doc*' -X '*.pdf' htdocs/$dir $HOME/web/webmirror/$dir"
  done

  # Show what we're about to do (quoted, to preserve the newlines):
  echo "Commands to be run:"
  echo "$commands"
  echo

  # Run all the mirror commands in a single lftp session:
  lftp <<EOF
open -u 'user,password' ftp.example.com
$commands
bye
EOF
}

Super easy -- all I do is type sitemirror and wait a little. Now I don't have any excuse for not being up to date.

June 21, 2014 06:39 PM

Elizabeth Krumbach

Tourist in Zagreb, Croatia

In addition to attending and presenting at DORS/CLUC, I had the opportunity to see some sights while I was in Zagreb, Croatia this past week.


View from my room at Panorama Zagreb

My tourist days began in the late afternoon on Monday, when my local friend Jasna could pull herself away from conference things. Huge thanks to her for doing this; I know the exhaustion and pressure of working with a conference, and I’m really grateful that she was willing to take the time in the midst of this to walk around the city with me.

We did about 7 miles of walking around the center of the city. Our first stop was to visit the Nikola Tesla statue! I learned on my trip that Tesla was born in what is modern-day Croatia, so visiting the statue quickly became a must.

From the statue, we walked north, picked up a snack at one of the dozens of small bakeries that are all over the city, and sat down next to the beautiful Croatian National Theatre to enjoy it.

I was able to get some shopping done, and when we made it to the main square in Upper Town I noticed that it had been almost completely taken over by World Cup festivities. Most of the United States doesn’t get too excited about the World Cup, so being in a country that cares about it while it was happening was a treat. In addition to the giant screens put up, there were little soccer (er, football?) games set up at pubs, roadside stands selling fan goodies, and even cars sporting the iconic red and white checkers of the Croatian team.

As our adventures wound down, I also got to see the outside of the main railway station in the city, which we’d go back to on Tuesday to catch a tram down to the zoo.

Monday night after tourist adventures, the conference organizers had a wonderful dinner for the keynote speakers at Pod grickim topom, or “Under the Cannon.” The food was exceptional, and even though I’m off alcohol at the moment (no honey schnapps for me!), I really enjoyed the family-style dinner that was prepared for us.

Photos from the rest of my touristing adventures here: https://www.flickr.com/photos/pleia2/sets/72157645274192044/

On Tuesday evening we went to the Zagreb Zoo! It’s always interesting to visit zoos in other countries, but I’m also a bit apprehensive, since they often aren’t accredited by organizations like the AZA, as many zoos in the United States are, so I’m never sure what to expect. I was pleasantly surprised by the quality of the Zagreb Zoo – many of the animals had big, very natural enclosures. The new lion enclosure was particularly impressive. As a city zoo in a park it reminded me a lot of the Central Park Zoo, but it’s definitely larger, if not as big as some of the other zoos I’ve been to.

More photos from the zoo here: https://www.flickr.com/photos/pleia2/sets/72157645264459992/

Unfortunately I had to cut my touristing short by Wednesday, as I had come down with a cold and decided that my time would be better spent getting some rest before my trip home. Still, I got a lot in during my stay. Next time I’ll have to visit the coast; I hear the beaches on the Adriatic are well worth the visit.

by pleia2 at June 21, 2014 06:21 AM

June 20, 2014

Elizabeth Krumbach

DORS/CLUC 2014 OpenStack CI Keynote and more

Several months ago I was invited to give a keynote at the DORS/CLUC conference in Croatia on the OpenStack Continuous Integration System. I’ve been excited about this opportunity since it came up, so it was a real pleasure to spend this past week at the conference, getting to know my friend Jasna better and meeting the rest of the conference crew and attendees.

I attended the keynotes each day, as they were all in English, and on Monday I also participated in a Women in FLOSS panel. The evenings were spent exploring the beautiful city of Zagreb, which I’ll write about in another post once I upload my photos.

The first keynote was by Georg Greve who spoke on Kolab in his talk “Kolab: Do it right, or don’t do it at all” (slides here). I evaluated Kolab for use about 3 years ago and it was a bit rough around the edges, and I believe it was still using Horde as the webmail client. It was interesting to learn about where they’ve gone with development and the progress they’ve made. I was happy to learn that they are still fully open source (no “open core” or other kinds of proprietary modules for paying customers). Today it uses RoundCube for webmail, which I’d also go with if I were in a position to deploy a webmail client again. Finally, he spoke some about the somewhat unexpected success of their hosted Kolab solution, MyKolab, which had me seriously thinking again about my non-free use of Google Apps for email.

Next up was Dave Whiteland in a talk he called “Sharing things that work or ‘hey I just had somebody else’s really good idea’”, where he talked about the work that MySociety and Poplus are doing in the space of civic coding. I’m a big fan of civic coding projects, and it was great to hear that the existing projects are working together to provide platforms for governments and municipalities all over the world. He also talked about public engagement and the success of email alerts about politics in the UK, saying that if people have access to structured data, they will use it. This really resonated with me: as someone who is interested in being better informed, I struggle to find the time to stay informed. We’re all busy people, and if we can get access to the facts in a clean, simple interface that draws from officially released information (which is hopefully largely unbiased), it’s super helpful. It was really cool to hear about the Poplus components available today, including MapIt and WriteIt, which make civic mapping and contact projects easier.


Dave Whiteland

It was then time for the “Women in FLOSS technology” round table, in which I participated with Ana Mandić, Jasna Benčić, Lucija Pilić and Marta Milaković. Jasna did a great job of rounding up really technical women for this panel, with a variety of experiences and experience levels in the FLOSS sphere. After introductions, we talked about the challenges we’ve encountered in our work, which tended to be those that every new contributor runs into (not especially gender-based), and the ways in which we’ve been helped, from women-focused groups like LinuxChix to more formal programs like the Outreach Program for Women, organized by the GNOME Foundation and serving a vast array of FLOSS projects. Huge thanks to all my fellow round table participants, and for the great, positive comments from the audience about how they can help.


Women in FLOSS round table participants, thanks to Milan Rajačić for this photo

Tuesday kicked off with a keynote by Miklos Vajna on “LibreOffice – what we fixed in 4.2/4.3.” During preparation for my recent talks on Ubuntu 14.04, I had reviewed the release notes for 4.2, so I was somewhat familiar with changes like the bigger, improved Start Center screen you get upon launch, but some of the other features were new to me. They’ve added LibCMIS support (perhaps most notably for GDrive), preliminary support for importing Keynote slide decks into Impress, per-character border support, a spreadsheet (Calc) storage rewrite for improved functionality and speed, the ability to print notes, and more. Upcoming features include Impress slide control from Android and iPhone, and the ability to collaborate on documents using the Jabber protocol.


Miklos Vajna

We then heard from Georg Greve in his second keynote, “Living in the Free Software Society” (slides here). He began his talk by covering some of the fundamentals of FLOSS philosophies and then went into the importance of having people understand their rights when it comes to software they use and depend upon. He had several quotes from Lawrence Lessig’s recent commentary on free software and civic involvement. There was the sad realization that even though FLOSS has had better arguments for being the preferred solution for users (transparency, rights), it often hasn’t “won” as the preferred solution. As a result, he stressed the importance of helping those in power understand the technological fundamentals of bills and laws they are getting through Congress/Parliament and our role there. I also appreciated the observation that companies need to make an investment in implementing FLOSS technologies the “right way” with upstream collaboration (not internal forking) to avoid the massive internal maintenance problem that so many companies have encountered when going down this path and causing their FLOSS deployment to ultimately fail. Finally, I learned about the Terms of Service; Didn’t Read project which seeks “to rate and label website terms & privacy policies, from very good Class A to very bad Class E.” Cool.


Georg Greve

Wednesday morning was my keynote, and I had unfortunately developed a cold by this point in the week! Fortunately, I was able to get a lot of rest prior to my talk, and my familiarity with the material and slide deck made the talk go well in spite of it. Several of my colleagues have given this Overview of the Continuous Integration for OpenStack before, so I was excited about my own opportunity to present this fully open source system, particularly to an audience pretty new to the CI concept – hopefully they’ll think of us when they do get around to setting up their own CI systems.

Slides from my talk are available here. We manage the slide deck collaboratively in git and you can always view the most recent rendered version here: http://docs.openstack.org/infra/publications/overview/


Thanks to Vedran Papeš for this photo, source

I really enjoyed this conference, huge thanks to Jasna Benčić and the whole conference crew for helping organize my trip and providing meals and entertainment for all of us while we were in town. It means so much to be welcomed so warmly into a country I’m not familiar with!

A few more of my photos from the event available here: https://www.flickr.com/photos/pleia2/sets/72157644861010398/

And photos from others in the DORS/CLUC 2014 Group on Flickr: https://www.flickr.com/groups/dc2014/

by pleia2 at June 20, 2014 09:08 PM

iheartubuntu

Retrieve Your Ubuntu One Data

PUBLIC SERVICE ANNOUNCEMENT

Canonical has announced that the file services for Ubuntu One have been discontinued. Your data is available for download until the end of July - if you haven't taken action already, you need to do so now to ensure you have a copy of all your data.

In order to make it easy for you to retrieve all of your content, Ubuntu has released a new feature that lets you download all your content at once. The website https://one.ubuntu.com/ has been updated with instructions on how to conveniently download all your files.

In addition, you can still use Mover.io's offer to transfer your data to another cloud provider for free. And the Ubuntu One web interface is available for you to download individual files.

https://mover.io/connectors/ubuntu-one/

The previously announced option of downloading your files as a zip file is producing unreliable results for a small number of users and therefore that option has been removed. If you already retrieved your files as a zip file, Ubuntu encourages you to check for the validity of the zip file contents. If there are problems with that file, please use one of the options above to retrieve a complete copy of your data.

Remember that you will have until 31st July 2014 to collect all of your content. After that date, all remaining content will be deleted.

The Ubuntu One team

NOTE: To remove the annoying "Ubuntu One is closing down soon" pop-ups, you can remove Ubuntu One with the following terminal command (I used this and it worked fine):

sudo apt-get autoremove --purge python-ubuntuone-storageprotocol

Just make sure it doesn't try to remove "ubuntu-desktop" :) Alternatively, if you don't trust the command line, go into the Software Center, search for "python-ubuntuone-storageprotocol" and uninstall that.

by iheartubuntu (noreply@blogger.com) at June 20, 2014 04:15 AM

June 18, 2014

Akkana Peck

Fuzzy house finch chicks

[house finch chick] The wind was strong a couple of days ago, but that didn't deter the local house finch family. With three hungry young mouths to feed, and considering how long it takes to crack sunflower seeds, poor dad -- two days after Father's Day -- was working overtime trying to keep them all fed. They emptied my sunflower seed feeder in no time and I had to refill it that evening.

The chicks had amusing fluffy "eyebrow" feathers sticking up over their heads, and one of them had an interesting habit of cocking its tail up like a wren, something I've never seen house finches do before.

More photos: House finch chicks.

June 18, 2014 08:40 PM

June 16, 2014

Elizabeth Krumbach

Texas Linuxfest wrap-up

Last week I finally had the opportunity to attend Texas Linuxfest. I first heard about this conference back when it started from some Ubuntu colleagues who were getting involved with it, so it was exciting when my talk on Code Review for Systems Administrators was accepted.

I arrived late on Thursday night, much later than expected after some serious flight delays due to weather (including 3 hours on the tarmac at a completely different airport due to running out of fuel over DFW). But I got in early enough to get some rest before the expo hall opened on Friday afternoon, where I helped staff the HP booth.

At the HP booth, we were showing off the latest developments in the high-density Moonshot system, including the ARM-based processors that are coming out later this year (currently it’s sold with server-grade Atom processors). It was cool to be able to see one, learn more about it and chat with some of the developers at HP who are focusing on ARM.


HP Moonshot

That evening I joined others at the Speaker dinner at one of the Austin Java locations in town. Got to meet several cool new people, including another fellow from HP who was giving a talk, an editor from Apress who joined us from England and one of the core developers of BusyBox.

On Saturday the talks portion of the conference began!

The keynote was by Karen Sandler, titled “Identity Crisis: Are we who we say we are?”, which was a fascinating look at how we all present ourselves in the community. As a lawyer, she gave some great insight into the multiple loyalties that many contributors to Open Source have, and explored some of them. This was quite topical for me, as I continue to do a considerable amount of volunteer work with Ubuntu while working at HP on the OpenStack project as my paid job. But am I always speaking for HP in my role in OpenStack? I am certainly proud to represent HP’s considerable efforts in the community, but in my day-to-day work I’m largely passionate about the project on a personal level, and my views tend to be my own. During the Q&A there was also an interesting discussion about the use of email aliases, which got me thinking about my own. I have an Ubuntu address which I pretty strictly use for Ubuntu mailing lists and private Ubuntu-related correspondence, and an HP address that I pretty much just use for internal HP work; everything else in my life goes to my main personal address – including all correspondence on the OpenStack, local Linux and other mailing lists.


Karen Sandler beginning her talk with a “Thank You” to the conference organizers

The next talk I went to was by Corey Quinn on “Selling Yourself: How to handle a technical interview” (slides here). I had a chat with him a couple weeks back about this talk and was able to give some suggestions, so it was nice to see the full talk laid out. His experience comes from work at Taos where he does a lot of interviewing of candidates and was able to make several observations based on how people present themselves. He began by noting that a resume’s only job is to get you an interview, so more time should be spent on actually practicing interviewing rather than strictly focusing on a resume. As the title indicates, the key take away was generally that an interview is the place where you should be selling yourself, no modesty here. He also stressed that it’s a 2 way interview, and the interviewer is very interested in making sure that the person will like the job and that they are actually interested to some degree in the work and the company.

It was then time for my own talk, “Code Review for Systems Administrators,” where I talked about how we do our work on the OpenStack Infrastructure team (slides here). I left more time for questions than I usually do, since my colleague Khai Do was giving a presentation later that did a deeper dive into our continuous integration system (“Scaling the Openstack Test Environment“). I’m glad I did: there were several questions from the audience about some of our additional systems-administration-focused tooling, how we determine what we use (why Puppet? why Cacti?), and what our review process for those systems looks like.

Unfortunately this was all I could attend of the conference, as I had a flight to catch in order to make it to Croatia in time for DORS/CLUC 2014 this week. I do hope to make it back to Texas Linuxfest at some point, the event had a great venue and was well-organized with speaker helpers in every room to do introductions, keep things on track (so nice!) and make sure the A/V was working properly. Huge thanks to Nathan Willis and the other organizers for doing such a great job.

by pleia2 at June 16, 2014 05:23 AM

June 15, 2014

Akkana Peck

Vim: Set wrapping and indentation according to file type

Although I use emacs for most of my coding, I use vim quite a lot too, for quick edits, mail messages, and anything I need to edit when logged onto a remote server. In particular, that means editing my procmail spam filter files on the mail server.

The spam rules are mostly lists of regular expression patterns, and they can include long lines, such as:
gift ?card .*(Visa|Walgreen|Applebee|Costco|Starbucks|Whitestrips|free|Wal.?mart|Arby)

My default vim settings for editing text, including line wrap, don't work if I get a flood of messages offering McDonald's gift cards and decide I need to add a "|McDonald" on the end of that long line.

Of course, I can type ":set tw=0" to turn off wrapping, but who wants to have to do that every time? Surely vim has a way to adjust settings based on file type or location, like emacs has.

It didn't take long to find an example of Project specific settings on the vim wiki. Thank goodness for the example -- I definitely wouldn't have figured that syntax out just from reading manuals. From there, it was easy to make a few modifications and set textwidth=0 if I'm opening a file in my procmail directory:

" Set wrapping/textwidth according to file location and type
function! SetupEnvironment()
  let l:path = expand('%:p')
  if l:path =~ '/home/akkana/Procmail'
    " When editing spam filters, disable wrapping:
    setlocal textwidth=0
endfunction
autocmd! BufReadPost,BufNewFile * call SetupEnvironment()

Nice! But then I remembered other cases where I want to turn off wrapping. For instance, editing source code in cases where emacs doesn't work so well -- like remote logins over slow connections, or machines where emacs isn't even installed, or when I need to do a lot of global substitutes or repetitive operations. So I'd like to be able to turn off wrapping for source code.

I couldn't find any way to just say "all source code file types" in vim. But I can list the ones I use most often. While I was at it, I threw in a special wrap setting for mail files:

" Set wrapping/textwidth according to file location and type
function! SetupEnvironment()
  let l:path = expand('%:p')
  if l:path =~ '/home/akkana/Procmail'
    " When editing spam filters, disable wrapping:
    setlocal textwidth=0
  elseif (&ft == 'python' || &ft == 'c' || &ft == 'html' || &ft == 'php')
    setlocal textwidth=0
  elseif (&ft == 'mail')
    " Slightly narrower width for mail (and override mutt's override):
    setlocal textwidth=68
  else
    " default textwidth slightly narrower than the default
    setlocal textwidth=70
  endif
endfunction
autocmd! BufReadPost,BufNewFile * call SetupEnvironment()

As long as we're looking at language-specific settings, what about doing language-specific indentation like emacs does? I've always suspected vim must have a way to do that, but it doesn't enable it automatically like emacs does. You need to set three variables, assuming you prefer to use spaces rather than tabs:

" Indent specifically for the current filetype
filetype indent on
" Set indent level to 4, using spaces, not tabs
set expandtab shiftwidth=4

Then you can also use handy commands like << and >> for indenting and outdenting blocks of code, or == for indenting to the right level. It turns out vim's language indenting isn't all that smart, at least for Python, and gets the wrong answer a lot of the time. You can't rely on it as a syntax checker the way you can with emacs. But it's a lot better than no language-specific indentation.

I will be a much happier vimmer now!

June 15, 2014 05:29 PM

June 12, 2014

Jono Bacon

FirefoxOS and Developing Markets

It seems Mozilla is targeting emerging markets and developing nations with $25 cell phones. This is tremendous news, and an admirable focus for Mozilla, but it is not without risk.

Bringing simple, accessible technology to these markets can have a profound impact. As an example, in 2001, 134 million Nigerians shared 500,000 land-lines (as covered by Jack Ewing in Businessweek back in 2007). That year the government started encouraging wireless market competition and by 2007 Nigeria had 30 million cellular subscribers.

This generated market competition and better products, but more importantly, we have seen time and time again that access to technology such as cell phones improves education, provides opportunities for people to start small businesses, and in many cases is a contributing factor for bringing people out of poverty.

So, cell phones are having a profound impact in these nations, but the question is, will it work with FirefoxOS?

I am not sure.

In Mozilla’s defence, they have done an admirable job with FirefoxOS. They have built a powerful platform, based on open web technology, and they lined up a raft of carriers to launch with. They have a strong brand, an active and passionate community, and like so many other success stories, they already have a popular existing product (their browser) to get them into meetings and headlines.

Success though is judged by many different factors, and having a raft of carriers and products on the market is not enough. If they ship in volume but get high return rates, it could kill them, as is common for many new product launches.

What I don’t know is whether this volume/return-rate balance plays such a critical role in developing markets. I would imagine that return rates could be higher (such as someone who has never used a cell phone before taking it back because it is just too alien to them). On the other hand, I wonder if those consumers there are willing to put up with more quirks just to get access to the cell network and potentially the Internet.

What seems clear to me is that success here has little to do with the elegance or design of FirefoxOS (or any other product, for that matter). It is instead about delivering incredibly dependable hardware. In developing nations people have less access to energy (for charging devices) and have to work harder to obtain it, and they have less access to support resources for learning how to use new technology. As such, it really needs to just work. This factor, I imagine, is going to be largely outside of Mozilla’s hands.

So, in a nutshell, if the $25 phones fail to meet expectations, it may not be Mozilla’s fault. Likewise, if they are successful, it may not be to their credit.

by jono at June 12, 2014 11:40 PM

Akkana Peck

Comcast actually installed a cable! Or say they did.

The doorbell rings at 10:40. It's a Comcast contractor.

They want to dig across the driveway. They say the first installer didn't know anything; he was wrong about not being able to use the box that's already on this side of the road. They say they can run a cable from the other side of the road through an existing conduit to the box by the neighbor's driveway, then dig a trench across the driveway to run the cable to the old location next to the garage.

They don't need to dig across the road since there's an existing conduit; they don't even need to park in the road. So no need for a permit.

We warn them we're planning to have driveway work done, so the driveway is going to be dug up at some point, and they need to put it as deep as possible. We even admit that we've signed a contract with CenturyLink for DSL. No problem, they say, they're being paid by Comcast to run this cable, so they'll go ahead and do it.

We shrug and say fine, go for it. We figure we'll mark the trench across the driveway afterward, and when we finally have the driveway graded, we'll make sure the graders know about the buried cable. They do the job, which takes less than an hour.

If they're right that this setup works, that means, of course, that this could have been done back in February or any time since then. There was no need to wait for a permit, let alone a need to wait for someone to get around to applying for a permit.

So now, almost exactly 4 months after the first installer came out, we may have a working cable installed. No way to know for sure, since we've been happily using DSL for over a month. But perhaps we'll find out some day.

The back story, in case you missed it: Getting cable at the house: a Comcast Odyssey.

June 12, 2014 09:48 PM

June 11, 2014

Jono Bacon

Community Management Training at OSCON, LinuxCon North America, and LinuxCon Europe

I am a firm believer in building strong and empowered communities. We are in an age of a community management renaissance, in which we are defining repeatable best practice that can be applied to many different types of communities, whether internal to companies, external to volunteers, or a mix of both.

I have been working to further this growth in community management via my books, The Art of Community and Dealing With Disrespect, the Community Leadership Summit, the Community Leadership Forum, and by delivering training to our next generation of community managers and leaders.

Last year I ran my first community management training course, and it was very positively received. I am delighted to announce that I will be running an updated training course at three events over the coming months.

OSCON

On Sunday 20th July 2014 I will be presenting the course at the OSCON conference in Portland, Oregon. This is a tutorial, so you will need to purchase a tutorial ticket to attend. Attendance is limited, so be sure to get to the class early on the day to reserve a seat!

Find Out More

LinuxCon North America and Europe

I am delighted to bring my training to the excellent LinuxCon events in both North America and Europe.

Firstly, on Fri 22nd August 2014 I will be presenting the course at LinuxCon North America in Chicago, Illinois and then on Thurs Oct 16th 2014 I will deliver the training at LinuxCon Europe in Düsseldorf, Germany.

Tickets are $300 for the day’s training. This is a steal; I usually charge $2500+/day when delivering the training as part of a consultancy arrangement. Thanks to the Linux Foundation for making this available at an affordable rate.

Space is limited, so go and register ASAP.

What Is Covered

So what is in the training course?

My goal with each training day is to discuss how to build and grow a community, including building collaborative workflows, defining a governance structure, planning, marketing, and evaluating effectiveness. The day is packed with Q&A and discussion, and I encourage my students to raise questions, challenge me, and explore ways of optimizing their communities. This is not a sit-down-and-listen-to-a-teacher-drone-on kind of session; it is interactive and designed to spark discussion.

The day is mapped out like this:

  • 9.00am – Welcome and introductions
  • 9.30am – The core mechanics of community
  • 10.00am – Planning your community
  • 10.30am – Building a strategic plan
  • 11.00am – Building collaborative workflow
  • 12.00pm – Governance: Part I
  • 12.30pm – Lunch
  • 1.30pm – Governance: Part II
  • 2.00pm – Marketing, advocacy, promotion, and social
  • 3.00pm – Measuring your community
  • 3.30pm – Tracking, measuring community management
  • 4.30pm – Burnout and conflict resolution
  • 5.00pm – Finish

I will warn you: it is an exhausting day, but ultimately a rewarding one. It covers a lot of ground in a short period of time, and you can follow up with further discussion of these and other topics on our Community Leadership discussion forum.

I hope to see you there!

by jono at June 11, 2014 05:55 PM

June 10, 2014

Elizabeth Krumbach

Simcoe’s June Checkup

On June 7th I brought Simcoe in to the vet for her regular-ish checkup to see how she’s handling her kidney disease. Her last visit was back in January. She wasn’t happy about this visit; with MJ out of town for a week, I think she was feeling pretty out of sorts, and the poor thing is always terrified at the vet.

But I’m happy to report that it’s now been over 2.5 years since she was diagnosed, and while things aren’t getting better, she’s pretty stable. It was nice to see her weight holding steady at 9.62 lbs.

Her BUN and CRE levels have both shifted slightly, from 57 to 51 on BUN and 3.6 to 3.9 on CRE.

BUN: 51 (normal range: 14-36)
CRE: 3.9 (normal range: .6-2.4)

Of course we’re not thrilled to see CRE continue to creep up; it had managed to stick around high normal for quite some time. But neither value is near where it was when she was diagnosed back in December of 2011. We’ll continue to stick to her care schedule of subcutaneous fluids and Pepcid AC.

by pleia2 at June 10, 2014 04:04 AM

June 08, 2014

Elizabeth Krumbach

Life, critters and upcoming travel

MJ has been traveling a fair amount lately, so I’ve been working harder at meeting up with friends, which I don’t do nearly enough. I had brunch with a friend just before Memorial Day, and last Sunday ended up at The View for drinks and dessert with a couple who just moved into town. This week I was able to meet up with my old friend Mark down in the Mission to chat about life and enjoy some foods that were much too greasy for my own good (and that I quickly learned I should be avoiding right now, oops!).

Before leaving on his last trip, we were able to meet up with my cousin Melissa who was in town for an event, and my in-town (for now) cousin Brendan who I manage to only meet up with when other relatives are in town. We ate over at Fang before inviting them up to our condo for a bit.

After MJ left on his trip, I was looking to take a walk and decided to head over to the San Francisco Zoo to finally renew my membership and visit some of their new critters. My first stop was Tenzing, their new Red Panda! He popped out for a few minutes, and when I came back after a loop around that part of the zoo, he was wandering around his enclosure.

From there I went over to visit one of their newest babies – a baby Patas Monkey! They are probably my favorite monkeys at the zoo, so getting to see a baby was pretty awesome, especially once he stopped being latched to his mother and wandered around a bit on his own. So cute.

More photos from my short zoo visit here: https://www.flickr.com/photos/pleia2/sets/72157644958797102

I still need to visit the baby peccaries before they get too big and see if I can get a glimpse of the two-toed sloth that’s living in the Lion House.

Work has been busy, but continues to be enjoyable. I mentioned in a previous post that I regretfully had to miss the OpenStack Summit in Atlanta due to being sick, but some of my awesome colleagues sent me a small gift and postcard. I also have more travel coming up this week. On Thursday I fly to Austin to speak at the Texas Linux Festival, which I’ve been hearing great things about for years but never had the opportunity to attend until now. From there, I’ll be flying to Zagreb, Croatia, to participate in DORS/CLUC 2014, where I will be giving a keynote on the Continuous Integration system for OpenStack developers and also participating in a Women in Tech panel with Ana Mandić, Jasna Benčić and Lucija Pilić.

Other than that, health stuff has taken up much of my time as my abdominal pain continues and I have to avoid alcohol and fatty/fried foods to keep the pain in check. On the bright side, this means I’m now eating healthier! I’ve gone in for 2 major diagnostic tests, both of which came back inconclusive. I’m continuing to work with doctors and will be making an appointment with a surgeon soon to see about my options for removing organs-I-don’t-need-but-may-be-causing-pain. Given my travel schedule this year, I’m terribly stressed about the timing of all this and hope that we find the cause and can solve it quickly and efficiently with limited recovery time, as I’m planning to travel again in July.

by pleia2 at June 08, 2014 07:43 PM

June 06, 2014

Akkana Peck

Santa Fe Highway Art, and the Digestive Deer

Santa Fe is a city that prides itself on its art. There are art galleries everywhere, and glossy magazines scattered around town point visitors to the various galleries and museums.

Why, then, is Santa Fe county public art so bad?

[awful Santa Fe art with eagle, jaguar and angels] Like this mural near the courthouse. It has it all! It combines motifs of crucifixions, Indian dancing, hermaphroditism, eagles, jaguars, astronomy, menorahs (or are they power pylons?), an armed and armored angel attempting to stab an unarmed angel, and a peace dove smashing its head into a baseball. All in one little mural!

But it's really the highway art north of Santa Fe that I wanted to talk about today.

[roadrunner highway art] [horned toad highway art] [rattlesnake highway art] Some of it isn't totally awful. The roadrunner and the horned toad are actually kind of cute, and the rattlesnake isn't too bad.

[rooster highway art] [turkey highway art] On the other hand, the rooster and turkey are pretty bad ...

[rabbit highway art] and the rabbit is beyond belief.

As you get farther away from Santa Fe, you get whole overpasses decorated with names and symbols:
[Posuwaegeh and happy dancing shuriken]

[Happy dancing shuriken] I think of this one near Pojoaque as the "happy dancing shuriken" -- it looks more like a Japanese throwing star, a shuriken, than anything else, though no doubt it has some deeper meaning to the Pojoaque pueblo people.

But my favorite is the overpass near Cuyamungue.

[K'uuyemugeh and digestive deer]

See those deer in the upper right and left corners?

[Cuyamungue digestive deer highway art] Here it is in close-up. We've taken to calling it "the digestive deer".

I can't figure out what this is supposed to tell us about a deer's alimentary tract. Food goes in ... and then we don't want to dwell on what happens after that? Is there a lot of foliage near Cuyamungue that's particularly enticing to deer? A "land of plenty", at least for deer? Do they then go somewhere else to relieve themselves?

I don't know what it means. But as we drive past the Cuyamungue digestive deer on the way to Santa Fe ... it's hard to take the city's airs of being a great center of art and culture entirely seriously.

June 06, 2014 06:40 PM

June 03, 2014

Nathan Haines

Ubuntu Installfest with OCLUG

Nathan Haines and Stephen Ingram

Last Saturday, Ubuntu held an installfest along with the Orange County Linux Users Group (OCLUG) in Fullerton, California. Thanks to the enthusiasm of OCLUG and its members, and the assistance of volunteers from the Ubuntu California Local Community Team, the event was a success.

OCLUG used to hold Linux installfests all the time, but has been fairly dormant the past couple of years, with meeting attendance small but consistent. Late last year, they considered holding an installfest as a way to get more interest from students and the community. The LUG agreed that it was best to promote a single distribution to reduce confusion and that teasing or jokes about other software—even though good-natured—was to be avoided during the event. A simple majority agreed that a default Ubuntu install was the best distro to offer to new users and it was agreed that anyone who came in wanting to install specific software would be welcomed as well. This was a compromise that everyone was happy with and it allowed the installfest to be a focused event.

OCLUG meets once a month at California State University Fullerton, and so advertising for the event was done with flyers, which were posted around the campus and in nearby coffee shops. It contained a simple pitch for Ubuntu, a URL for OCLUG and a QR code for the OCLUG installfest information page. We also emailed school faculty with information about the installfest, attaching a PDF of the flyers as well as a single-page “talking points” flyer that had a bulleted list talking about Ubuntu, installfests, and OCLUG, to encourage faculty to discuss the event with their students.

Ubuntu California supplied their secondary banner and table cloth, and Canonical arranged for reimbursement for pizza costs. Both were funded via the Ubuntu community donations from the Ubuntu download page, so I am very grateful to the generosity of the community. Canonical also provided Ubuntu 14.04 LTS discs and a conference pack with giveaway items. I designed name badges for both the OCLUG volunteers and the installfest attendees, and I also adapted the installfest liability release forms and data sheet forms from the Installfest HOWTO so that they matched the flyers and other documents.

When the day of the installfest finally arrived, we had four Ubuntu volunteers and nine OCLUG volunteers. We had 7 attendees, with 4 who brought their computers for an install and 3 more who simply wanted to attend and learn more about Ubuntu. Everyone arrived on time and enjoyed the donuts and coffee provided by OCLUG, as is usual for their meetings. We had a greeter or two by the parking structure to direct attendees to the classroom. An OCLUG volunteer passed out the installfest forms, and I had the Ubuntu volunteers distribute a standard swag pack for each attendee: Ubuntu lanyard, sticker sheet, pen, button, and Desktop install disc. Stephen Ingram, the president of OCLUG, welcomed everyone and introduced me, and then I gave my presentation to the group. I briefly discussed operating systems and the ideals of Free Software so that I could go into detail about what Ubuntu offers and why it is a complete computing solution that is elegant and easy to use. I described the Ubuntu and local Linux communities, then quickly explained the release forms and discussed some USB keys that were available for purchase. Then installation began.

Everyone helped out, the attendees were able to get Ubuntu installed on their machines and have conversations about computers and software, and everyone had a good time. I burned some 32-bit Ubuntu discs for a couple of attendees and passed out the Xubuntu discs I had prepared for slower machines. The pizza came, and while everyone was eating I showed off Ubuntu on my phone, demonstrating phone and desktop convergence using the Weather app. After the pizza was finished, I returned to the front of the room to demonstrate the key features of the Unity desktop interface: the benefits of Unity’s online search and how to turn it off, how to enable Autohide, how to change the desktop background, and how to use the Ubuntu Software Center and the Unity Dash to search for and install applications. I then launched and demonstrated Stellarium, a virtual planetarium, as an example of the rich content available with Free Software solutions.

We ended the installfest with a giveaway. I drew names of attendees and we gave away an Android tablet and an external phone/tablet battery provided by OCLUG members, and then we gave away three exclusive Ubuntu Cloud t-shirts provided by Canonical in their conference pack. By the time the installfest was over, we had installed Ubuntu successfully on every target machine, passed out 35 Ubuntu Desktop discs and 3 Ubuntu Server discs, sold 5 USB drives, and impressed a faculty member who promised to promote the next installfest to his students because he said there was no reason they should have to pay for scientific software if they could have high quality software for free. He also discussed academic year timing with the OCLUG president and based on that there are preliminary plans to repeat the installfest in September when we should be able to attract more students.

Looking back, the flyers were designed for on-campus use but traveled further, so they should give a little more location context, and the installfest page should probably include specific event information instead of relying on the OCLUG main page. We only had a 4-hour window for the event, and I still feel this isn’t quite long enough. I didn’t have much time to dedicate to the Ubuntu volunteers, all of whom were volunteering for the first time; while I felt bad about this, they all stepped up and excelled in a way that made me very proud. For September, I intend to engage the university’s radio, television, and newspapers to help spread the word a bit further on campus.

Photos of the event are available to download at http://people.ubuntu.com/~nhaines/images/events/2014/oc-installfest-may/

I’d like to encourage anyone in the Ubuntu community to modify and adapt any printable resource that would be helpful to them. All printable media as well as the source documents, the main presentation, and sanitized attendee records are available to download at http://people.ubuntu.com/~nhaines/documents/events/2014/oc-installfest-may/

June 03, 2014 03:35 AM

Akkana Peck

Cicada Rice Krispies

[Cicadas mating] Late last week we started hearing a loud buzz in the evenings. Cicadas? We'd heard a noise like that last year, when we visited Prescott during cicada season while house-hunting, but we didn't know they had them here in New Mexico. The second evening, we saw one in the gravel off the front walk -- but we were meeting someone to carpool to a talk, so I didn't have time to race inside and get a camera.

A few days later they started singing both morning and evening. But yesterday there was an even stranger phenomenon.

"It sounds like Rice Krispies out in the yard. Snap, crackle, pop," said Dave. And he was right -- a constant, low-level crackling sound was coming from just about all the junipers.

Was that cicadas too? It was much quieter than their loud buzzing -- quiet enough to be a bit eerie, really. You had to stop what you were doing and really listen to notice it.

It was pretty clearly an animal of some kind: when we moved close to a tree, the crackling (and snapping and popping) coming from that tree would usually stop. If we moved very quietly, though, we could get close to a tree without the noise entirely stopping. It didn't do us much good, though: there was no motion at all that we could see, no obvious insects or anything else active.

Tonight the crackling was even louder when I went to take the recycling out. I stopped by a juniper where it was particularly noticeable, and must have disturbed one, because it buzzed its wings and moved enough that I actually saw where it was. It was black, maybe an inch long, with narrow orange stripes. I raced inside for my camera, but of course the bug was gone by the time I got back out.

So I went hunting. It almost seemed like the crackling was the cicadas sort of "tuning up", like an orchestra before the performance. They would snap and crackle and pop for a while, and then one of them would go snap snap snap-snap-snap-snapsnapsnapsnap and then break into its loud buzz -- but only for a few seconds, then it would go back to snapping again. Then another would speed up and break into a buzz for a bit, and so it went.

One juniper had a particularly active set of crackles and pops coming from it. I circled it and stared until finally I found the cicadas. Two of them, apparently mating, and a third about a foot away ... perhaps the rejected suitor?

[Possible cicada emergence holes]
Near that particular juniper was a section of ground completely riddled with holes. I don't remember those holes being there a few weeks ago. The place where the cicadas emerged?

[Fendler's Hedgehog Cactus flower] So our Rice Krispies mystery was solved. And by the way, I don't recommend googling for combinations like cicada rice krispies ... unless you want to catch and eat cicadas.

Meanwhile, just a few feet away from the cicada action, a cactus had sprung into bloom. Here, have a gratuitous pretty flower. It has nothing whatever to do with cicadas.

Update: in case you're curious, the cactus is apparently called a Fendler's Hedgehog, Echinocereus fendleri.

June 03, 2014 03:20 AM

May 30, 2014

Akkana Peck

Punctuation Reveals Truth in advertising

This ad appeared in one of the free Santa Fe weeklies. It's got to be one of the funniest misuses of quotes I've seen.

Does she not know that putting quotes around something means that you're quoting someone, you're introducing an unfamiliar word or phrase, or you're trying to draw attention to the quoted phrase and cast doubt on it or make fun of it? That third use, by the way, is called scare quotes. Like you'd see in a phrase like this:

One expects lawyers to have a good command of English, and to pay attention to detail, so ... what should we think?

"Injured" isn't an unfamiliar word, so it has to be either the first or third use. And whether she's soliciting clients who only say they're injured, or she's casting doubt on the injury, it's hard not to read this as an offer to help people pretend to be injured to collect a payout.

Which I'm sure happens all the time ... but I don't think I've previously seen an ad implying it so strongly.

May 30, 2014 07:32 PM

May 29, 2014

Elizabeth Krumbach

Elasticsearch blog features OpenStack elastic-recheck

At the Southern California Linux Expo this year I ran into Leslie Hawthorn of Elasticsearch and told her a bit about how we’re using Elasticsearch in elastic-recheck, a newly developed tool for identifying test failures that are unrelated to the code being tested. The result was an invitation to write a guest post on the Elasticsearch blog.

Thanks to Leslie and editors on the OpenStack side the post went live today!

OpenStack elastic-recheck: Powered by the ELK Stack

The post describes how the tool works and is used by developers and features a presentation by Sean Dague from the OpenStack Summit in Atlanta this month.

by pleia2 at May 29, 2014 04:51 PM

Jono Bacon

Last Day Today

Recently I announced I am stepping down as Ubuntu Community Manager at Canonical and moving to XPRIZE as Senior Director of Community. Today is my last day at Canonical.

I just want to say how touched I have been by the response. The comments, social media posts, emails, and calls from you have been so kind and supportive. You are all good people, and I am going to miss every single one of you.

The reason why I have devoted my life to understanding communities is that I believe communities bring out the best in people, and all of you are a perfect example of that. I cannot express just how much I appreciate it.

Over the course of the next few weeks my replacement will be sourced and announced, and in the interim my team (Daniel Holbach, Michael Hall, David Planella, Nicholas Skaggs, Alan Pope) will take over my duties. Everything has been transitioned over, and remember, the weekly Q&As will continue at 6pm UTC every Tuesday on Ubuntu On Air with my team filling in for me. As ever, any and all Ubuntu questions are welcome!

Of course, I will still be around. I am going to continue to be a member of the Ubuntu community and an avid Ubuntu user, tester, and supporter. I will continue to be on IRC, you can email me at jono@jonobacon.org, I will continue to do Bad Voltage, and I have a busy schedule at the Community Leadership Summit, OSCON, and more. I am also going to continue to have my own Q&A session every week where you can ask questions about my perspectives on Ubuntu, Canonical, community management, XPRIZE, and more; I will announce this soon.

Ubuntu has a tremendous future ahead of it, built on the hard work and passion of a global community. We are only just getting started with a new era of Ubuntu convergence and cloud orchestration and while I will miss being there in an official capacity, I am just thankful that I can continue to be along for the ride in the very community I played a part in building.

I now have a few weeks off and then my new adventure begins. Stay tuned. :-)

by jono at May 29, 2014 03:34 PM

Community Leadership Summit 2014, New Forum, OSCON, Training, and More!

As many of you will know, I organize an event every year called the Community Leadership Summit. The event brings together community leaders, organizers and managers and the projects and organizations that are interested in growing and empowering a strong community.

The event pulls together these leading minds in community management, relations and online collaboration to discuss, debate and continue to refine the art of building an effective and capable community.

This year’s event is shaping up to be incredible. We have a fantastic list of registered attendees and I want to thank our sponsors, O’Reilly, Citrix, and LinuxFund.

The event is taking place on 18 – 19 July 2014 in Portland, Oregon. I hope to see you all there, it is going to be a fantastic CLS this year!

I also have a few other things to share too…

Community Leadership Forum

My goal as a community manager is to help contribute to the growth of the community management profession. I started this journey by publishing The Art of Community and ensuring it is available freely as well as in stores. I then set up the Community Leadership Summit as just discussed, and now I am keen to put together a central community for community management and leadership discussion.

As such, I am proud to launch the new Community Leadership Forum for discussing topics that relate to community management, as well as topics for discussion at the Community Leadership Summit event each year. The forum is designed to be a great place for sharing and learning tips and techniques, getting to know other community leaders, and having fun.

The forum is powered by Discourse, so it is a pleasure to use, and I want to thank discoursehosting.com for generously providing free hosting for us.

Be sure to go and sign up!

Speaking Events and Training

I also wanted to share that I will be at OSCON this year and I will be giving a presentation called Dealing With Disrespect that is based upon my free book of the same name for managing complex communications.

This is the summary of the talk:

In this new presentation from Jono Bacon, author of The Art of Community, founder of the Community Leadership Summit, and Ubuntu Community Manager, he discusses how to process, interpret, and manage rude, disrespectful, and non-constructive feedback in communities so the constructive criticism gets through but the hate doesn’t.

The presentation covers the three different categories of communications, how we evaluate and assess different attributes in each communication, the factors that influence all of our communications, and how to put in place a set of golden rules for handling feedback and putting it in perspective.

If you personally or your community has suffered rudeness, trolling, and disrespect, this presentation is designed to help.

This presentation is on Wed 23rd July at 2.30pm in E144.

In addition to this I will also be providing a full day of community management training at OSCON on Sunday 20th July in D135.

I will also be providing full day community management training at LinuxCon North America and LinuxCon Europe. More details of this to follow soon in a dedicated blog post.

Lots of fun things ahead and I hope to see you there!

by jono at May 29, 2014 04:51 AM

May 26, 2014

Elizabeth Krumbach

On college and my father

Higher education is a tricky subject for me. I work in an industry where a relatively high percentage of my professional peers are autodidacts who do not have formal higher education (myself included). I live in a world where I see a majority of students leaving their higher education institution burdened with substantial student debt and struggling to launch a career, or even get a job at all that will cover basic needs and school debt.

When I look at candidates I find myself largely disregarding higher education and focusing on experience (whether that be tinkering with a Raspberry Pi at a Hackerspace or running a large scale infrastructure) and their level of passion for technology. If the passion and personal excitement are there, our team can bring them up to speed with what they need to know. Plus, technology changes so fast that you need people who can keep up and learn new things quickly, not be tied into old technologies or development paradigms.

Now, there are certainly fundamentals in computer science that are valuable as you progress in your career, but today services like Coursera can get a committed professional up to speed on fundamentals online for free (an aside: The Hardware/Software Interface is a great class; I did it last year and it’s starting again on June 30th).

For me, being an autodidact was largely related to my learning disability, one that I share with my father, though we took two very different paths to overcome it.

I graduated high school and swore never to go back to school. School was hard for me; I had to study a lot to do well. Today I still employ strategies of note-taking and multimedia to grasp concepts. I also learn through immersion and have a much easier time understanding something if I spend a lot of time drilling down to the bottom and working my way up from specific to general. This takes longer and is really hard to do in a school setting where you’re taught pieces, quizzed on them, and then brought to the next part. Over time I have learned that I often have a much greater understanding than my peers at the end, but along the way it certainly doesn’t feel that way, because I struggle a lot more and ask a lot of strange questions.

Overall, my philosophy has been that if I need additional education, there are plenty of opportunities for me to get it. Putting myself into substantial debt (and the stress of a school setting) in the beginning of adulthood when I don’t actually know what I want to do anyway seemed like it would have been a serious waste.

My father graduated high school, and reading letters he sent to my grandfather I learned that he was a dedicated intellectual who also struggled. But college was the Thing To Do if you wanted a professional career in the late 1960s and early 1970s. It was a sign of prestige and what was expected from an upper middle class family. It seems to me that it was less about “smart not being enough” and more “if you have the means, you go to college.”

With this in mind, I spent last week going through some of my father’s papers from Bethany College in West Virginia. My father was very proud of the time he spent at Bethany getting his Journalism degree, participating in school publications, a fraternity and a youth conservative/libertarian party.

I received a Super8 film from my Aunt Meg in which he walks around the Bethany campus, showing off his dorm room and what looks like the fraternity house.

Of course graduation itself was a big deal for him as well, so I was happy to find photos from that occasion.


Posing with my grandfather

And in the collection of paintings sent to me, his diploma (which I’m shipping off to my youngest sister this week).

A handful of other photos from his graduation here: https://www.flickr.com/photos/pleia2/sets/72157644277944608/

I also learned that in 1978 he established the “James W. Carty Jr. Award” presented to “an outstanding student who excels in work with the campus print media.” Going through papers, it looks like he got a notification letter as late as 1999 for a student who had received it.

I’m really proud of my father. He never lost his passion for learning in spite of difficulties, and took the step that I never did in going to and completing college. I’ve also found myself softening on the idea of higher education. There are so many experiences that he and others had in these higher learning institutions, and professional networking opportunities, that I have simply missed out on.

He also left an inspiring legacy for me that stressed hard work as the route to success and brought me up in a culture that saw exceptional performance as the norm.

by pleia2 at May 26, 2014 08:23 PM

May 25, 2014

Akkana Peck

Raspberry Pi Motion Camera: Part 2, using gphoto2

I wrote recently about the hardware involved in my Raspberry Pi motion-detecting wildlife camera. Here are some more details.

The motion detection software

I started with the simple and clever motion-detection algorithm posted by "brainflakes" in a Raspberry Pi forum. It reads a camera image into a PIL (Python Imaging Library) Image object, then compares bytes inside that Image's buffer to see how many pixels have changed, and by how much. It allows for monitoring only a test region instead of the whole image, and can even create a debug image showing which pixels have changed. A perfect starting point.
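The core of the idea is easy to sketch. Here's a stripped-down illustration (my own simplification, not brainflakes' actual code) that counts how many pixels changed between two frames, comparing just the green channel as a cheap stand-in for brightness:

from PIL import Image

def count_changed_pixels(file1, file2, threshold=20):
    # Load both frames and get their pixel-access buffers.
    im1 = Image.open(file1)
    im2 = Image.open(file2)
    buf1, buf2 = im1.load(), im2.load()
    width, height = im1.size
    changed = 0
    for x in range(width):
        for y in range(height):
            # Index 1 of an RGB pixel is the green channel.
            if abs(buf1[x, y][1] - buf2[x, y][1]) > threshold:
                changed += 1
    return changed

Run that on successive snapshots and declare motion when more than some number of pixels (a couple hundred, say) have changed.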

Camera support

As part of the PiDoorbell project, I had already written a camera wrapper that could control either a USB webcam or the pi camera module, if it was installed. Initially that plugged right in.

But I was unhappy with the Pi camera's images -- it can't focus closer than five feet (though a commenter on my previous article pointed out that it's possible to break the seal on the lens and refocus it manually). Without refocusing, the wide-angle lens means that a bird five feet away is pretty small, and even when you get something in focus the images aren't very sharp. And a web search for USB webcams with good optical quality was unhelpful -- the few people who care about webcam image quality seem to care mostly about getting the widest-angle lens possible, the exact opposite of what I wanted for wildlife.

[Motion detector camera with external  high-res camera] Was there any way I could hook up a real camera, and drive it from the Pi over USB as though it were a webcam? The answer turned out to be gphoto2.

But only a small subset of cameras are controllable over USB with gphoto2. (I think that's because the cameras don't allow control, not because gphoto doesn't support them.) That set didn't include any of the point-and-shoot cameras we had in the house; and while my Rebel DSLR might be USB controllable, I'm not comfortable about leaving it out in the backyard day and night.

With gphoto2's camera compatibility list in one tab and ebay in another, I looked for a camera that was available, cheap (since I didn't know if this was going to work at all), and controllable. I ordered a used Canon A520.

As I waited for it to arrive, I fiddled with my USB-or-pi-camera to make a start at adding gphoto2 support. I ended up refactoring the code quite a bit to make it easy to add new types of cameras besides the three it supports now -- pi, USB webcam, and gphoto2. I called the module pycamera.
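The shape of that refactor is simple enough to sketch. Roughly speaking (the names here are illustrative, not the actual pycamera API), every backend implements the same minimal interface, so the motion-detection code never has to care which camera it's driving:

class CameraBase(object):
    """Common interface: each camera backend subclasses this."""
    def capture(self, filename):
        """Take one photo and write it to filename."""
        raise NotImplementedError

    def close(self):
        """Release the hardware; a no-op by default."""
        pass

The pi, USB webcam, and gphoto2 backends each subclass CameraBase; a sketch of the gphoto2 one follows in the next section.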

Using gphoto2

When the camera arrived, I spent quite a while fiddling with gphoto2 learning how to capture images. That turns out to be a bit tricky -- there's no documentation on the various options, apparently because the options may be different for every camera, so you have to run

$ gphoto2 --set-config capture=1 --list-config

to get a list of options the camera supports, and then, for each of those options, run

$ gphoto2 --get-config [option name]

to see what values that option can take.
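Once you know the right options, driving the camera from Python is mostly a matter of shelling out to gphoto2. Here's a minimal sketch (my illustration, not the actual pycamera code) using gphoto2's --capture-image-and-download flag:

import subprocess

def gphoto_capture(filename):
    # Trigger the attached camera and download the shot to filename.
    # Assumes the gphoto2 CLI is installed and a supported camera
    # is connected over USB.
    subprocess.check_call(["gphoto2",
                           "--capture-image-and-download",
                           "--filename", filename,
                           "--force-overwrite"])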

Dual-camera option

Once I got everything working, the speed and shutter noise of capturing made me wonder if I should worry about the lifespan of the Canon if I used it to capture snapshots every 15 seconds or so, day and night.

Since I still had the Pi cam hooked up, I fiddled the code so that I could use the pi cam to take the test images used to detect motion, and save the real camera for the high-resolution photos when something actually changes. Saves wear on the more expensive camera, and it's certainly a lot quieter that way.
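In outline, the dual-camera loop looks something like this (test_cam and good_cam are hypothetical camera objects with the capture() interface sketched earlier, and count_changed_pixels() is the PIL comparison from the motion detection section):

import os, time

def watch(test_cam, good_cam, threshold=200, interval=15):
    # The quiet, cheap camera takes the test frames; the good camera
    # only fires when something has actually changed.
    test_cam.capture("/tmp/prev.jpg")
    while True:
        time.sleep(interval)
        test_cam.capture("/tmp/cur.jpg")
        if count_changed_pixels("/tmp/prev.jpg", "/tmp/cur.jpg") > threshold:
            good_cam.capture(time.strftime("/tmp/snap-%Y%m%d-%H%M%S.jpg"))
        os.rename("/tmp/cur.jpg", "/tmp/prev.jpg")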

Uploading

To get the images off the Pi to where other computers can see them, I use sshfs to mount a filesystem from another machine on our local net.

Unfortunately, sshfs on the pi doesn't work quite right. Apparently it uses out-of-date libraries (and gives a warning to that effect). You have to be root to use it at all, unlike newer versions of sshfs, and then, regardless of the permissions of the remote filesystem or where you mount it locally, you can only access the mounted filesystem as root.

Fortunately I normally run the motion detector as root anyway, because the picamera Python module requires it, and I've just gotten in the habit of using it even when I'm not using python-picamera. But if you wanted to run as non-root, you'd probably have to use NFS or some other remote filesystem protocol. Or find a newer version of sshfs.

Testing the gphoto setup

[Rock squirrel using Raspberry Pi camera] For reference, here's an image using the previous version of the setup, with the Raspberry Pi camera module. Click on the image to see a crop of the full-resolution image in daylight -- basically the best the camera can do. Definitely not what I was hoping for.

So I eagerly set up the tripod and hooked up the setup with the Canon. I had a few glitches in trying to test it. First, no birds; then it turned out Dave had stolen my extension cord, though I didn't discover that until after the camera's batteries needed recharging.

A new extension cord and an external power supply for the camera, and I was back in business the next day.

[Rock squirrel using Raspberry Pi camera] And the results were worth it. As you can see here, using a real camera does make a huge difference. I used a zoom setting of 6 (it goes to 12). Again, click on the image to see a crop of the full-resolution photo.

In the end, I probably will order one of the No-IR Raspberry Pi cameras, just to have an easy way of seeing what sorts of critters visit us at night. But for daylight shots, an external camera is clearly the way to go.

The scripts

The current version of the script is motion_detect.py and of course it needs my pycamera module. And here's documentation for the motion detection camera.

May 25, 2014 02:09 AM

May 24, 2014

Elizabeth Krumbach

My father traveled

Following the passing of my grandmother this winter my aunts Elaine and Meg sent me a few boxes of my father’s possessions that my grandparents had kept in their care since his passing in 2004.

Going through all of this paperwork has been sad and I’ve learned more about my father than I knew growing up, but one thing I did know was that he traveled a lot in his youth. As a result, I’ve had the travel bug for as long as I can remember, but it’s only in the past few years that I’ve finally had the opportunity to do it. South America and Antarctica are the last two continents I have to cross off my list, plus various specific countries I want to visit.

A few boxes of slides that came along with the paperwork spoke to his travels, showing trips to Spain and a safari in Africa. I had 57 slides in total transferred into digital, which was an expensive endeavor at $2.25 each (need to look into their bulk deal next time!), but I’m really impressed with how clear they came out.

In the collection I found an article about a trip taken in what the photos indicate was 1967, making my father 15 or 16 at the time. The article describes how they touched down in Lisbon and then spent a day in Madrid and Rome. From the slides from when he was in Spain, I learned that they got to see and go inside Las Ventas Bullring. From the article:

They also saw the bull ring (but no bull fight) and were interested in the fact that on fight days a complete hospital was set up right at the ring with surgeons, nurses, orderlies and everything else – a silent commentary on what a bull fight may be expected to produce.


My father and Uncle Paul outside Las Ventas (and inside too)

They were also able to visit the Royal Palace, of which the article writes:

They went to the Royal Palace which they were told has somewhere around 2,800 rooms. “But we only saw about 50 of them” Carl says.

More photos from the trip here: https://www.flickr.com/photos/pleia2/sets/72157644678697471/

As a favorite of my father’s, Madrid is still in the top places I’d like to visit some time.

I also had a box of slides labeled “Extra Africa” which were dated 1973 and showed one of my father’s safari trips in Africa, including a rare photo of rhinos that I remember him being quite proud of:

And a hilarious photo of a large sleeping lion:

Now I’ve been to Africa, but never on a safari (though feeding monkeys in Ghana was one of the best experiences of my life). Based on his passport stamps, I was able to determine that he went to Kenya and Tanzania during his travels. Doing a proper safari to see big wild animals in Africa is totally still on my list.

More photos from their time in Africa here: https://www.flickr.com/photos/pleia2/sets/72157644733378853/

Finally, there was also a post card from when my father visited Hong Kong – somewhere I actually have been!

On the back of the postcard my father writes that they visited Kowloon. Based on passport stamps it looks like he went to Hong Kong in 1973.

My father also had stories about traveling to Egypt that I remember slides from (Cairo is totally on my list once things settle down) and I know there are more Africa slides floating around. I’ll have to follow up with my relatives to see if I can borrow them to be processed to digital.

by pleia2 at May 24, 2014 09:12 PM

May 21, 2014

Elizabeth Krumbach

Ubuntu 14.04 Presentations at FeltonLUG and BALUG

This past week I’ve had the opportunity to join two separate Linux Users Groups (LUGs) to give presentations on Ubuntu 14.04.

The talks were a full talk version of the mini talk I gave at the release party in San Francisco last month, covering:

  • Unity Desktop
  • Server
  • Phablet
  • Xubuntu

I’m a member of the Xubuntu team and use it primarily myself, which is why that flavor got special treatment ;)

The first talk was on Saturday for FeltonLUG down near Santa Cruz. Since I had a series of laptops already installed and set up from when we did the release party, I packed them up and brought them along with me.

We had a small group, so the meeting was a bit more on the informal side and folks had a lot of great questions and comments throughout the presentation. Given the group size it was also possible to have everyone give my Nexus 7 with Ubuntu on it a try, which folks had a lot of questions about.

Thanks to Bob Lewis and Larry Cafiero for being such great hosts; following their scenic drive recommendation, my husband and I had a wonderful trip up Route 1 along the coast on our way home.

Last night I joined BALUG here in San Francisco. I brought along my trusty tahr and a pile of demo laptops for this presentation as well.

In addition to the great questions about the direction of Ubuntu in general (desktops! servers! clouds! tablets! phones!), I was really happy to have server folks in my audience for this talk who were eager to hear about the changes to virtualization technologies and such on the server side. I was even able to have a chat with a sysadmin who is doing a lot of virtualization and who told me that her team is looking at deploying OpenStack in the near future.

Slides from both presentations are available online; the BALUG one includes some screenshots from Xubuntu, since I was using a Unity-based laptop to present there:

The .odp versions of these slides are also available, just swap out .pdf for .odp in each url.

by pleia2 at May 21, 2014 06:11 PM

Akkana Peck

Comcast contractors showed up! but wouldn't do the actual job

There's a new wrinkle in our ongoing Comcast Odyssey.

I was getting ready to be picked up to carpool to a meeting when the doorbell rang. It was Comcast's contractor!

Dave and I went out and showed them the flags marking the route the cable was supposed to take. They nodded, then asked, in broken English, "Okay to dig under driveway?"

"Whaa-aa?" we said? "The cable goes from there" (indicating the box across the street) "to here" (indicating the line of flags across from the box, same side of the driveway.

They went over and pointed to the box on our side of the street, on the neighbor's property -- the box the Comcast installer had explicitly told us could in no way be used for our cable service. No, we don't know why, we told them, but every Comcast person who's been here has been insistent that we can't use that box, we have to use the one across the street.

We pointed to the painted lines on the street, the ones that have been there for a month or more, the ones that the county people left after inspecting the area and determining it safe to dig. We point out that digging across the street is the reason they had to get a traffic permit. We tell them that the cable under the driveway is why the cable was torn up in the first place, and that we're expecting to have our driveway graded some time soon, so if they put a new cable there, it will probably just get torn up again. Not that any of that matters since Comcast says we can't use that box anyway.

They look at us blankly and say "We dig across driveway?"

My ride arrives. I have to leave. Dave tries for another five or ten minutes, but he has to leave too. So he finally gives up, tells them no, don't put the cable across the driveway, go back and confirm with their supervisor about the job they're here to do because that isn't it.

I guess they left. There were no signs of digging when we got back.

Later, I checked the dates. It's been 18 days since they applied for a permit. I'm pretty sure the county told me a permit is only good for 11 days, or was it two weeks? Anyway, less than 18 days. So they probably didn't have a permit any more to dig across the street anyway ... not that that necessarily has any bearing on whether they'd dig.

May 21, 2014 03:20 PM

Elizabeth Krumbach

Back East and Out West

A few weeks ago MJ and I flew to Philadelphia to do some visiting with family and so I could speak at LOPSA-East. The timing worked out well since it was also the week of our first anniversary and we got married in Pennsylvania.

We had a wonderful dinner on our anniversary at the same restaurant that we took our family to prior to the wedding.

Our wedding cake came from Bredenbeck’s Bakery in Chestnut Hill, Philadelphia and they offer a free anniversary cake in order to curb the “freeze top of cake” tradition. We picked up ours on Monday afternoon and fortunately we were staying in an Extended Stay hotel for most of our stay so we had a refrigerator (and even a big knife, fork and plates!) so we could enjoy the cake over a few days, yum!

As always, it was really nice to be able to catch up with some family and friends while we were in town. It was quite a busy trip though. I’m hoping we can schedule a proper vacation some time this year instead of only having “short trips attached to a conference.” As fun as they can be, I could really use a beach.

Well, a warm beach. We have beaches in northern California. On Saturday, on our way home from FeltonLUG (where I gave a presentation on Ubuntu 14.04), we took the long route home and were able to enjoy the scenery of Route 1 up the coast. California is truly my favorite place; it is always nice to take a coastal drive for the beautiful reminder.

Finally, it’s not all been roses over here. I’ve been pretty sick on and off since just prior to my trip to Montreal in April for PyCon. I had to take a couple of days off of work, missed the OpenStack Summit in Atlanta last week after an emergency room visit (thankfully we mostly ruled out appendicitis) and have generally been trying to take it easy. Next Tuesday I go in for more diagnostic tests that will hopefully determine what is causing my abdominal pain. In the meantime, small meals and plenty of liquids are getting me through with minimal pain.

Running has taken a back seat since getting sick, but I’m hoping I can get back to it once I get a diagnosis and work out some kind of treatment plan that can factor in cardio workouts again. Being in constant pain (even dull pain) is also exhausting. Every day I’ve had to carefully plan out my work schedule so I can get a full 8 hours in, be cautious about how much personal work I’m doing, and cut back on going out so as not to get too exhausted and make things worse. Also, as much as I enjoyed catching up with Once Upon a Time and binge watching the first season of The Paradise, I’m now terribly bored of this whole “taking it easy” thing. There’s a reason I work so much!

by pleia2 at May 21, 2014 12:13 AM

May 19, 2014

Jono Bacon

Goodbye Canonical, Hello XPRIZE

After nearly eight years of service at Canonical, I will be stepping down as the Ubuntu Community Manager and leaving my fellow warthogs at Canonical on 29th May 2014.

I have always been passionate about two things in my life. Firstly, I want to go to work every day and feel that my efforts are having a wider impact on the world. Secondly, I believe that community and collaboration are at the core of what makes us human and what drives us to create beautiful things.

Canonical has provided room for me to explore both of these areas in droves. Free Software is an undeniable power for good in making technology accessible to all. Ubuntu has been at the forefront of this; focusing on simplicity, elegance, and ease of use to make technology as accessible and widely available as possible. Canonical and the Ubuntu Community have also provided an environment in which I could explore the many facets of community building, leadership, and growth…trying lots of ideas, learning from what worked and what didn’t, and evolving what we do.

This has resulted in me having the opportunity to learn from great people, in fun and challenging situations, and to further the art and science of building great communities.

A new chapter

…and this is where a new chapter in my life opens.

Recently I was presented with the opportunity to go and work at the XPRIZE Foundation.

For those of you unfamiliar with XPRIZE, their focus is to solve the major problems facing humanity. This work is delivered by incentivized competitions to solve these grand challenges.

This started with the $10 million Ansari XPRIZE that spawned the commercial space-flight industry. Other examples include the Qualcomm Tricorder XPRIZE (to create an affordable handheld device to diagnose health issues), the Google Lunar XPRIZE (to achieve the safe landing of a private craft on the surface of the moon), the Wendy Schmidt Ocean Health XPRIZE (improving our understanding of ocean acidification), and the A.I. XPRIZE (create the first A.I. to walk or roll out on stage and present a TED Talk so compelling that it commands a standing ovation).

XPRIZE is an organization with significant ideas and ambitions to have a profound impact on the world. If you want to get a better feel for this, I recommend you watch this video by founder Peter Diamandis; it is tremendously inspiring.

Peter believes that competition is in our DNA. I believe that collaboration and community is in our DNA. As you can imagine, these concepts are complementary to each other, and this is why this feels like such a natural fit for me.

As such, I will be joining XPRIZE as Senior Director of Community. I will be there to look at the full breadth of what XPRIZE does and inject community and collaboration into its many different layers: how the prizes are picked, how teams are formed, how R&D is created, how technologies go into production, and more. I am tremendously excited about the opportunity.

Difficult decisions

Although XPRIZE is an exciting (if unknown) road forward, leaving Canonical is bittersweet.

To put this in starker terms, Canonical quite literally changed my life. It helped to transform my career from a position of observation of communities to one of structured best practice. It helped me to think differently, challenge myself, and be open to being challenged by others. It afforded me the opportunity to travel the world, meet incredible people, see incredible things, and ultimately led me to meet my wife, Erica, who has become the cornerstone of our family. This was never a job, it was a way of life, and Canonical provided every ounce of support in helping me to achieve what I did here and to be the best that I could be.

Working with the Ubuntu community has not just been a privilege, it has been a pleasure. One of the many reasons why I love what I do is that I am exposed to so many incredible people, minds, and ideas, and the Ubuntu community is a textbook definition of what makes community so powerful and such an agent for making the world a better place. I will be forever thankful for not just the opportunity to meet so many different members of the global Ubuntu family, but to also continue these many friendships into my next endeavour.

Now, some of you reading this may be concerned by this move. Some of you may be worried that my departure is due to a negative experience at Canonical, or that the community is somehow less important than it used to be. I want to be very clear in responding to this.

I am not leaving Canonical due to annoyance, frustration, bureaucracy, lack of support or anything else negative. I have a wonderful relationship with Mark Shuttleworth, Jane Silber, Rick Spencer and the other executives. I have a great relationship with my peers and my team, and I love going to work every single day. These people are not just colleagues, they are friends. I have long said I have the very best job in community management and I feel as strong about that today as I did when I joined.

I am not leaving Canonical due to problems, I am moving on to a new opportunity at XPRIZE. I actually wasn’t looking for a move; I was quite content in my role at Canonical, but XPRIZE came out of nowhere, and it felt like the right next step forward.

Likewise, I can assure you that the relationship with community at Canonical has not changed at all. Mark Shuttleworth and the rest of the leadership team are passionate about our community and they are intimately aware that our community is critical to the success of Ubuntu.

I believe in Ubuntu as much as I did when I joined. I have long talked about how Free Software and Open Source is only truly game-changing if the technology is simple, powerful, and accessible. Ubuntu is the very best place to get Open Source across the desktop, cloud, and now the mobile space too. Canonical has hired a phenomenal team over the years to drive this, and we are seeing the fruits of this success. I look forward to seeing this story unfold more and more and seeing Canonical achieve wider and wider ambitions.

Before I wrap up, I just want to offer some thanks to Mark Shuttleworth, Jane Silber, Rick Spencer, my team, my peers in the Ubuntu Engineering Management Team, my fellow warthogs at Canonical, and everyone in the Ubuntu community for being so supportive over the years. You all helped me turn my dream into a reality and help me become the person I am today.

I also want to say a special thank-you to Mark who gave me a shot in 2006 and has been a constant beacon of support and inspiration for so many years. I consider Mark a mentor, but more importantly a friend.

We have taken on some tough challenges over the years in Ubuntu, challenges that were necessary for us to grow. I have never once questioned Mark’s commitment to our values and our success as a project, and I am thankful to him for leading Ubuntu toward success; successful projects need leaders who can constantly ask new questions and explore new territory.

You don’t get rid of me that easily

Now, I won’t actually be going anywhere. I will still be hanging out on IRC, posting on my social media networks, still responding to email, and will continue to do Bad Voltage and run the Community Leadership Summit. I will continue to be an Ubuntu Member, to use Ubuntu on my desktop and server, and continue to post about and share my thoughts about where Ubuntu is moving forward. I am looking forward in many ways to the true Ubuntu community experience now that I will be on the other side of the garden.

As I step out of my position at Canonical, I am hugely proud of the accomplishments of my team (Daniel Holbach, David Planella, Michael Hall, Nicholas Skaggs, Alan Pope (and alumni, Jorge Castro, Kyle Nitzsche, Ahmed Kamal)). I can’t think of a better group of people to continue to help our community to do great work and be successful.

To wrap things up, I will be doing my very last Q&A session on Tuesday 27th May 2014 at 6pm UTC on Ubuntu On Air – I hope to see you all there!

So, here is to fun and fond memories, and here is to a new set of challenges helping to create a better world with XPRIZE. Thanks!

by jono at May 19, 2014 03:15 PM

Elizabeth Krumbach

Open Business Conference 2014

Back in 2010 I attended the Open Source Business Conference for the first time. It was only a few months after moving to San Francisco, and that and the Linux Foundation Collaboration Summit were my first exposure to major companies coming to conferences to get serious about open source adoption. It was a really inspiring event, as a passionate advocate of open source for years, watching it go mainstream has been a big deal for me personally (and, it turns out, my career).

The conference has since been rebranded the Open Business Conference, focusing on all kinds of open, from open infrastructure planning to open data. Even better, I was excited to see that so many major companies are now not only advocating use of open source software, but are employing programmers and engineers like myself to contribute directly to the open source projects they are using.

It was held at The Palace Hotel, which I can see from my bedroom window. Ostensibly I was attending as a local to help staff the HP booth, where I happily joined Jeanne and another local representative from the printer team at HP. But I was fortunate that the conference closed the booth areas during talks and booth staff were encouraged to attend the keynotes and sessions, hooray!

The first keynote from Matt Asay of MongoDB was slightly more toned down than the “we have made it!” excitement of the event in 2010. His message was that while open source can be called mainstream at this point, we have not yet saturated the industry and there are key spaces where open source still isn’t doing a great job of competing with proprietary vendors in the enterprise.

It was great to hear from Dr. Sanjiva Weerawarana, Founder, Chairman & CEO of WSO2, about their commitment to open source in their middleware product offerings. I also enjoyed hearing from Dr. Ibrahim Haddad of Samsung that they’ve launched the OSS Group to work toward becoming more of a leader in the open source world. Both of these companies were showing dedication to the open source ecosystem for similar reasons, centered around their own products depending upon it, a faster path to innovation (starting from a solid, open source core), and the proverbial writing on the wall when it comes to companies pushing back against vendor lock-in and running vital business functions on too much proprietary code.

There were also a couple OpenStack related keynotes from Alan Clark of the OpenStack Foundation and Bill Hilf of HP Cloud. Clark echoed some of the business reasons for choosing open source covered by others, and specifically cited the success of OpenStack in an ecosystem of multiple vendors collaborating under a foundation, rather than a central company driving development. Hilf of HP focused more on defining hybrid clouds in the complex enterprise networks of today where customers can’t easily fit into defined boxes for solutions.

Other highlights the first day included a talk by Donnie Berkholz about the current divide in DevOps between communities that come from an Operations background and those that come from a Development background. Ops folks tend to focus on configuration management, whereas devs are more interested in APIs, SDKs, and Continuous Integration. Both communities could benefit from working more closely together to merge their efforts, and even their conferences, into a more unified DevOps movement. I also enjoyed Svetlana Sicular’s talk on “the Internet of data,” where she spoke about some of the current challenges that organizations and our society as a whole have with big data. There is a considerable amount of data out there that could be doing everything from solving small problems to saving lives, if we can just learn how to appropriately (and safely) share and process it all.

It was also great to hear from Intel and Dish Network on the Open Source work they’ve been doing. I also enjoyed much of the talk from Sanjib Sahoo of tradeMONSTER about their use of open source, but was pretty disappointed that they seem to almost actively choose not to contribute features back to the open source projects they use. I find the excuse that “open source development is not our business” to be wearing pretty thin these days when you see companies like Samsung and HP making such major efforts.

After the opening keynotes, I think my favorite presentation was by Dianne Marsh of Netflix, talking about the Continuous Delivery system and “monkeys” that they have in production to test the resiliency, conformity, and more of their infrastructure and applications. This work is cool enough that it made for a valuable session on its own, but what made it noteworthy was that key portions of this infrastructure are open source at netflix.github.io. Awesome!

The last two talks I went to were related once again to OpenStack. The first was by Alex Freedland of Mirantis, whose points about the current open source ecosystem were very valuable, most notably that innovation is now happening in the open source space itself, rather than open source playing catch-up by offering alternatives to proprietary solutions. The second was by Andrew Clay Shafer, whose talk was related to a blog post from last November about some of the weaknesses of OpenStack. While I don’t agree with all of his points, his slides are here and get to his specific critiques around slide 24. He also offered some suggestions on how to improve development, most notably by increasing the focus on OpenStack users, particularly smaller organizations who currently struggle with it.

In all, it’s definitely a more business-centric conference than I’m used to attending (which is by design) but I found a lot of value in many of the sessions even from a technical perspective.

More photos from the conference available here: https://www.flickr.com/photos/pleia2/sets/72157644220587259/

by pleia2 at May 19, 2014 01:27 AM

May 16, 2014

Elizabeth Krumbach

Ridgewood Schoolhouse Museum

Back in February I lost my grandmother. Due to her wishes, timing (the middle of a rough winter in New Hampshire), and our family being spread all over the world, there wasn’t a service immediately following her passing. So when I learned I’d be in New Jersey in April, I made time in my schedule to visit the Schoolhouse Museum, maintained by the Ridgewood Historical Society, where my grandmother volunteered for years.

I have fond memories of the Schoolhouse as a child. When I last went it was still set up as a one room schoolhouse with a desk for the teacher in front, blackboards and desks for students. Displays of historical artifacts lined the edges of the room and I remember stories from my grandfather extolling the benefits of the one room schoolhouse.

Since I had been there last, there were a lot of changes. Instead of being set up like a classroom, it’s now a series of more sparsely spaced exhibits which gives the museum a much brighter feel. And while I do miss the traditional feel of the old place, this new format has allowed them to have more extensive revolving exhibits, which keep the museum relevant to locals and visitors alike.

It was nice to see the back rooms still had some of the artifacts I was familiar with, from uniforms of various soldiers in various wars who called Ridgewood home, to the farm exhibit that spoke to the farming origins of the village.

During our visit they were showcasing a diversity exhibit in the main room, seeking to highlight some of the historically less celebrated (and even actively discriminated against) communities that have made up Ridgewood over the years. The beautiful exhibit highlighted artifacts from the Native Americans who first lived on the land, and community members of Jewish, African American, Korean and Irish ancestry.

Perhaps best of all, I was able to meet a couple of the docents, who were exceptionally welcoming to us. One of them had worked with my grandmother, and they both were able to fill me in on some of the extraordinary work my grandmother did organizing some of their collections. We even had the honor of going upstairs to the attic to browse their storage; walking up that narrow staircase also brought back a flood of memories!

Huge thanks to the docents I met with for making the visit such a pleasure, and to Board of Trustees president Sheila Brogan for making me feel welcome via email prior to our visit!

More photos from our visit are available here: https://www.flickr.com/photos/pleia2/sets/72157644219844720/

by pleia2 at May 16, 2014 02:28 AM

May 15, 2014

Akkana Peck

A Raspberry Pi motion-detecting wildlife camera

I've been working on an automated wildlife camera, to catch birds at the feeder, and the coyotes, deer, rabbits and perhaps roadrunners (we haven't seen one yet, but they ought to be out there) that roam the juniper woodland.

This is a similar project to the PiDoorbell project presented at PyCon, and to my much earlier proximity camera project that used an Arduino and a plug computer; but for a wildlife camera I didn't want to use a sonar rangefinder. For one thing, it won't work with a bird feeder -- the feeder is always there, so the addition of a bird won't change anything as far as a sonar rangefinder is concerned. For another, the rangefinders aren't very accurate beyond about six feet.

Starting with a Raspberry Pi was fairly obvious. It's low power and cheap, it even has an optional integrated camera module with reasonable resolution, and I could re-use a lot of the camera code I'd already written for PiDoorbell.

I patched together some software for testing. I'll write in more detail about the software in a separate article, but I started with the simple motion detection code posted by "brainflakes" in the Raspberry Pi forums. It's a slick little piece of code you'll find in various versions all over the net; it uses PIL, the Python Imaging Library, to compare a specified region from successive photos to see how much has changed.

One aside about the brainflakes code: most of the pages you'll find referencing it tell you to install python-imaging-tk. But there's nothing in the code that uses tk, and python-imaging is really all you need to install. I wrote a GUI wrapper for my motion detection code using gtk, so I had no real need to learn the Tk equivalent.

Once I had some software vaguely working, it was time for testing.

The hardware

One big problem I had to solve was the enclosure. I needed something I could put the Pi in that was moderately waterproof -- maybe not enough to handle a raging thunderstorm, but rain or snow can happen here at any time without much warning. I didn't want to have to spend a lot of time building and waterproofing it, because this is just a test run and I might change everything in the final version.

I looked around the house for plastic objects that could be repurposed into a camera enclosure. A cookie container from the local deli looked possible, but I wasn't quite happy with it. I was putting the last of the milk into my morning coffee when I realized I held in my hand a perfect first-draft camera enclosure.

[Milk carton camera enclosure] A milk carton must be at least somewhat waterproof, right? Even if it's theoretically made of paper.

[cut a hole to mount the Pi camera] I could use the flat bottom as a place to mount the Pi camera with its two tiny screw holes,

[Finished milk carton camera enclosure] and then cut a visor to protect the camera from rain.

[bird camera, installed] It didn't take long to whip it all together: a little work with an X-acto knife, a little duct tape. Then I put the Pi inside it, took it outside and bungeed it to the fence, pointing at the bird feeder.

A few issues I had to resolve:

Raspbian has rather complicated networking. I was using a USB wi-fi dongle, but I had trouble getting the Pi to boot properly configured to talk to our WPA router. In Raspbian, networking is configured in about six different places, any one of which might do something like prioritize the not-connected eth0 over the wi-fi dongle, making it impossible to connect anywhere. I ended up uninstalling Network Manager and turning off ifplugd and everything else I could find so it would use my settings in /etc/network/interfaces, and in the end, even though ifconfig says it's still prioritizing eth0 over wlan0, I got it talking to the wi-fi.
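For reference, a typical /etc/network/interfaces stanza for a WPA network looks roughly like this (a sketch with placeholder credentials, assuming the wpasupplicant package is installed; your setup may need more):

auto wlan0
allow-hotplug wlan0
iface wlan0 inet dhcp
    wpa-ssid "YourNetworkName"
    wpa-psk "YourPassphrase"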

I also had to run everything as root. The python-picamera module imports RPi.GPIO and needs access to /dev/mem, and even if you chmod /dev/mem to give yourself adequate permissions, it still won't work except as root. But if I used ssh -X to the Pi and then ran my GUI program with sudo, I couldn't display any windows because the ssh permission is for the "pi" user, not root.

Eventually I gave up on sudo, set a password for root, and used ssh -X root@pi to enable X.

The big issue: camera quality

But the real problem turned out to be camera quality.

The Raspberry Pi camera module has a resolution of 2592 x 1944, or 5 megapixels. That's terrific, far better than any USB webcam. Clearly it should be perfect for this task.

[House finch with the bad Raspberry Pi camera module] Update: see below. It's not a good camera, but it turns out I had a lens problem and it's not this bad.

So, the Pi camera module might be okay if all I want is a record of what animals visit the house. This image is good enough, just barely, to tell that we're looking at a house finch (only if we already rule out similar birds like purple finch and Cassin's finch -- the photo could never give us enough information to distinguish among similar birds). But what good is that? I want decent photos that I can put on my web site.

I have a USB camera, but it's only one megapixel and gives lousy images, though at least they're roughly in focus so they're better than the Pi cam.

So now I'm working on a setup where I drive an external camera from the Pi using gphoto2. I have most of the components working, but the code was getting ugly handling three types of cameras instead of just two, so I'm refactoring it. With any luck I'll have something to write about in a week or two.

Meanwhile, the temporary code is in my github rpi directory -- but it will probably move from there soon.

I'm very sad that the Pi camera module turned out to be so bad. I was really looking forward to buying one of the No-IR versions and setting up a night wildlife camera. I've lost enthusiasm for that project after seeing how bad the images were. I may have to investigate how to remove the IR filter from a point-and-shoot camera, after I get the daylight version working.

[rock squirrel with cheeks full of sunflower seeds] Update, a few days later: It turns out I had some spooge on the lens. It's not quite as bad as I made it out to be. Here's a sample. It's still not a great camera, and it can't focus anywhere near as close as the 2 feet I've seen claimed -- 5 feet is about the closest mine can focus, which means I can't get very close to the wildlife, which was a lot of the point of building a wildlife camera. I've seen suggestions of putting reading glasses in front of the lens as a cheap macro adaptor.

Instead, I'm going ahead with the gphoto2 option, which is about ready to test -- but the noIR Pi camera module might be marginally acceptable for a night wildlife camera.


May 15, 2014 07:30 PM

May 13, 2014

Jono Bacon

Announcing Ubuntu Pioneers

Ubuntu has always been about breaking new ground. We broke the ground with the desktop back in 2004, we have broken the ground with cloud orchestration across multiple clouds and providers, and we are building a powerful, innovative mobile and desktop platform that is breaking ground with convergence.

The hardest part about breaking new ground and innovating is not having the vision and creating the technology, it is getting people on board to be part of it.

We knew this was going to be a challenge when we first took the wraps off the Ubuntu app developer platform: we had a brand new platform that was still being developed, and when we started, many of the key pieces were not there, such as a solid developer portal, documentation, API references, and training. Today the story is very different, with a compelling, end-to-end developer story for building powerful convergent apps.

We have always believed in the power of this platform, and every single person who also believed in what we are doing and wrote apps has shared the same spirit of pioneering a new platform that we have.

As such, we want to acknowledge those people.

And with this, I present Ubuntu Pioneers.

The idea is simple: we want to celebrate the first 200 app developers who get their apps into Ubuntu. We are doing this in two ways.

Firstly, we have created http://developer.ubuntu.com/pioneers which displays all of these developers and lists the apps that they have created. This will provide a permanent record of those who were there right at the beginning.

Secondly, we have designed a custom, limited-edition Ubuntu Pioneers t-shirt that we want to send to all of our pioneers. For those of you who are listed on this page, please ensure that your email address is correct in MyApps as we will be getting in touch soon.

Thank you so much to every single person listed on that page. You are an inspiration to me, my team, and the wider Ubuntu project.

If you have that pioneering spirit and wish you were up there, fear not! We still have some space before we hit 200 developers, so go here to get started building an app.

by jono at May 13, 2014 04:24 PM

May 11, 2014

Akkana Peck

Sonograms in Python

I went to a terrific workshop last week on identifying bird songs. We listened to recordings of songs from some of the trickier local species, and discussed the differences and how to remember them. I'm not a serious birder -- I don't do lists or Big Days or anything like that, and I dislike getting up at 6am just because the birds do -- but I do try to identify birds (as well as mammals, reptiles, rocks, geographic features, and pretty much anything else I see while hiking or just sitting in the yard) and I've always had trouble remembering their songs.

[Sonogram of ruby-crowned kinglet] One of the tools birders use to study bird songs is the sonogram. It's a plot of frequency (on the vertical axis) and intensity (represented by color, red being louder) versus time. Looking at a sonogram you can identify not just how fast a bird trills and whether it calls in groups of three or five, but whether it's buzzy/rattly (a vertical line, lots of frequencies at once) or a purer whistle, and whether each note is ascending or descending.

The class last week included sonograms for the species we studied. But what about other species? The class didn't cover even all the local species I'd like to be able to recognize. I have several collections of bird calls on CD (which I bought to use in combination with my "tweet" script -- yes, the name messes up google searches, but my tweet predates Twitter -- available as a tweet Python script and as tweet in HTML for Android). It would be great to be able to make sonograms from some of those recordings too.

But a search for "Linux sonogram" turned up nothing useful. Audacity has a spectrogram visualization mode with lots of options, but none of them seem to result in a usable sonogram, and most discussions I found on the net agreed that it couldn't do it. There's another sound editor program called snd which can do sonograms, but it's fiddly to use, and none of the many color schemes produces a sonogram that I found very readable.

Okay, what about python scripts? Surely that's been done?

I had better luck there. Matplotlib's pylab package has a specgram() call that does more or less what I wanted, and here's an example of how to use pylab.specgram(). (That post also has another example using a library called timeside, but timeside's PyPI package doesn't have any dependency information, and after playing the old RPM-chase game installing another dependency, trying it, then installing the next dependency, I gave up.)

The only problem with pylab.specgram() was that it shows the full range of the sound, both in time and frequency. The recordings I was examining can last a minute or more and go up to 20,000 Hz -- and when pylab tries to fit that all on the screen, you end up with a plot where the details are too small to show you anything useful.

You'd think there would be a way for pylab.specgram() to show only part of the spectrum, but there doesn't seem to be one. I finally found a Stack Overflow discussion where "edited" gives an excellent rewritten version of pylab.specgram() which allows setting minimum and maximum frequency cutoffs. Worked great!

Then I did some fiddling to allow for analyzing only part of the recording -- Python's wave package has no way to read in just the first six seconds of a .wav file, so I had to read in the whole file, load the data into a numpy array, then take a slice representing the seconds of the recording I actually wanted.
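
Put together, the core of the approach looks roughly like this -- a minimal sketch, with a made-up filename, assuming a 16-bit mono recording:

import wave
import numpy as np
import pylab

# wave can't seek by seconds, so read everything and slice afterward.
wav = wave.open('birdsong.wav', 'r')
framerate = wav.getframerate()
data = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)
wav.close()

seconds = 6
clip = data[:seconds * framerate]    # just the first six seconds

# Frequency vs. time, intensity as color: a sonogram.
pylab.specgram(clip, NFFT=256, Fs=framerate)
pylab.show()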

But now I can plot nice sonograms of any bird song I want to see, print them out, or stick them on my Android device so I can carry them with me.

Update: Oops! I forgot to include a link to the script. Here it is: Sonograms in Python.


May 11, 2014 03:17 PM

May 05, 2014

Elizabeth Krumbach

LOPSA East 2014 wrap-up

On Friday and Saturday I had the opportunity to finally attend and participate in a conference I’ve had my eyes on for years: LOPSA-East. I first heard about this conference several years ago while living in Philadelphia, but could never gather up the time or funds to attend. This year I was delighted to see an invitation to submit a proposal land in my inbox and I submitted a talk on Code Review for Systems Administrators which was accepted. Even better, they also asked if I could give the closing keynote on attracting more women to IT!

One of the things I always admired from afar about this conference was its passion for systems administration/ops work: the people who voluntarily run the conference, and many of the speakers, spend vast amounts of time outside work hours on the community. It syncs up well with my own passions and those of many of the local groups nearby, so I was really delighted when I saw PLUG represented on their supporters board near the entrance to registration (along with the great Franklin tux logo by Stephanie A. Fox!).

Friday was a tutorial day for the conference, where I chose to attend Jennifer Davis’ “Implementing Kanban to Improve your Workflow” in the morning and “How to Interview a System Administrator” by Adam Moskowitz in the afternoon.

Jennifer’s tutorial was a real treat. She had group activities throughout that made it more engaging, and since they were about our work I was happy to engage, rather than being uncomfortable (group activities don’t tend to be my thing). Even better, she managed to sneak in a game of Fluxx as one of the activities, to demonstrate the disruptive and interrupt-driven environment that systems administrators often find ourselves in. The Kanban scheduling system is something I’m seeing increasingly in the industry, including on a team I work with in OpenStack. I’ve also been reading The Phoenix Project, where Kanban boards appear prominently, but it was great to sit down and have a tutorial that helped me better understand how other teams are using them in production. We also got to make a demo board ourselves with post-its, which was a lot of fun, especially if you love office supplies like I do (doesn’t everyone?).

Adam’s session on interviewing systems administrators was really great too. The team I work on has been doing a fair amount of hiring lately, so I’ve been asked to help conduct interviews. The first good news out of this session was that I generally have the right idea with interviews, but there are always improvements to be made! He suggested an approach that centers around the key question of “Tell me about a time when you…”, which shows you how the applicant solved a problem and teaches you a lot about their skills in that area. The goal is to show that the applicant is a smart problem-solver who is able to learn and adapt to new applications, as the job of systems administration often requires, not someone who is solidly attached to a single technology – “don’t ask them what the -z flag of ls does.” He also explained the process at his company where an applicant must give a presentation on a subject (typically a technology or problem they’ve solved) to the interview panel. That was quite a contentious suggestion, but he argued that communication skills are vital for an applicant and that they wouldn’t be judging the presentation on public speaking ability. Finally, one of my favorite things he mentioned was making the applicant feel comfortable. Interviews are stressful enough on their own; just by seeing how an applicant performs in an interview you get some idea of how they handle stress, so there is no need to manufacture stress for them.

Friday night was the first keynote, by OpenStack guru Vishvananda Ishaya. He gave a talk on the history of OpenStack and some details about its current uses in the industry. I’ve heard a similar talk from him before, but this was the first time I’d seen it at an operations-focused conference, so that was pretty exciting. It was also notable that both the keynotes this year were by folks who work full time on OpenStack. First we took over open source conferences, now operations!

Saturday kicked off the talks of the conference. I had a chance to catch up with Kartik Subbarao, who recently published a book, Enlightening Technical Leadership. I’ve recently jumped on the meditation bandwagon and have sought to bring mindfulness into standard practice in my life, so the timing of his book, and the related talk I went to first thing on Saturday, was great. His proposal was to change the mental models we use for handling various situations, and he brought up in-person vs. email discussions as an example: body language and tone tell us a lot in person, while in email so many things are much less clear; a phrase like “Good luck” can be interpreted many ways. He implored the audience to take a mindful step back and seek to adjust their reactions to be more positive, constructive and rational.

The second talk of the day was mine. I’ve given my Code Review for Sys Admins talk at a few open source conferences, but this was my first time giving it to an audience full of sysadmins at an ops-focused conference, so I was eager to hear feedback. I ended up having a lot of great chats after my talk with folks coming from various backgrounds who were interested in learning more about the tools and where the bottlenecks were in our workflow. But perhaps the most exciting part about my talk came during someone else’s – Adam Moskowitz did a talk in the afternoon called “The Future of System Administration (and what you should do to prepare)” where he described an almost identical workflow he was using at his company, with automated developer testing and systems administration code being pushed through code review too! His premise was that sysadmins will increasingly need coding ability as we dive further into automation of everything. It sure was exciting to see the work we do in the OpenStack project being called the future.

The next talk I went to was “Git Hooks for Sys Admins: With Puppet Examples” by Thomas Uphill. I’ve been using git on a day-to-day basis for over a year now, and over the past few months I’ve been thinking “I really should figure out these git hook things”. This presentation was the kick I needed, particularly since his `puppet parser validate` example is something I totally should be using instead of manually running my own script prior to commit. It was really nice to hear some of the details about what all the pieces in the hook files actually do, so I’ll be more familiar once I start digging in. His slides, including the examples, are available here: goo.gl/dg5TVw
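
The basic shape of such a hook is easy to sketch, though. Something like the following – my own rough guess at it, not taken from his slides – saved as .git/hooks/pre-commit and made executable would refuse any commit containing a Puppet manifest that doesn’t validate:

#!/usr/bin/env python
# pre-commit hook: run 'puppet parser validate' on staged manifests.
import subprocess
import sys

# List the files staged for this commit (added, copied or modified).
staged = subprocess.check_output(
    ["git", "diff", "--cached", "--name-only", "--diff-filter=ACM"]
).decode().split()

status = 0
for f in staged:
    if f.endswith(".pp"):
        # 'puppet parser validate' exits nonzero on a syntax error.
        if subprocess.call(["puppet", "parser", "validate", f]) != 0:
            status = 1

sys.exit(status)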

Early in the morning I was approached to participate as a panelist in the “Professional Topics Panel Discussion”, so that was my next stop of the day. The first topic brought up by an audience member was how people handle change review processes that end up really getting in the way of the work and goals of the team. After some discussion, the consensus was that it’s important to work with your manager and other teams to make sure efficiency goals are synced up with the needs of the change review process, and above all else, communication is key. Too many teams get too wrapped up in process and how things “have to be” when things actually could be vastly improved. The second topic was the position of an IT team in a small company running interference with the larger company that just bought them, to make sure the small company’s employees could continue to do their work with their preferred workflow. Buy-in from management was another key thing here, but there were also comments about how the smaller company can be valuable to the larger one when it comes to IT innovations, and how honest communication between both IT teams was key.

The rest of my afternoon I spent in a series of talks, starting with “Don’t Be That Guy”, which went through some of the typical archetypes of technology types and offered advice on how to handle them, from the BOFH to the Downer to the person who seems to live in a cave and rarely collaborates with the team. The conference had a great series of lightning talks, and then I headed over to “Packing It In: Images, Containers, and Config Management”, where Michael Goetz discussed using Packer, Docker and Chef to build an environment where virtualization, containerization and configuration management work well together. He also gave some tips about using containers, stressing that one should not overload a container the way you might be tempted to with a full virtual machine.

And with that, the talks came to an end! All that was left was my closing keynote.

My talk was titled “Universal Design for Tech: Improving Gender Diversity in our Industry” and it’s one I was more nervous about than any other talk I’ve given recently. This is a tough topic, and one that’s quite personal for me. I’ve been on panels on the topic in the past couple of years, but this was the first time I’d done a solo talk on it since 2009, when I did an improving-participation talk for a Linux Users Group. Since then I’ve either argued to adjust the topic or declined the invitation to speak on it, because it’s too stressful. When the opportunity to give this keynote came up I was hesitant at first, but it’s important and I decided it was time to get back out there, even if it’s just for one talk. Over the years, through the success I’ve seen in my own career and that of women I work with, I came to feel I could add value to the discussion and that it would be worth the stress and risk one takes in giving this kind of talk. I also had valuable input from several women I know, without whom I don’t think I could have crafted such an effective presentation.

There were a lot of great questions from the audience as I wrapped up, and I ended up being late for dinner due to post-talk discussions (oops!). Thanks to everyone who was so engaged and interested in this topic, it was really great to have such a supportive audience. Slides here: LOPSA-East-2014-Keynote-ekjoseph.pdf

In all, this was a great conference and I will be encouraging others to attend. Audience members were regularly engaged with the speakers (agreements and disagreements!). Even though I’m shy, I was able to have a lot more discussions with folks I don’t know than I usually do, a sure sign that I was pretty comfortable. So thanks to everyone who took time to talk to me and be friendly, it makes all the difference. Also thanks to the organizers for crafting such a great environment that I am proud to have participated in.

by pleia2 at May 05, 2014 03:08 AM

Akkana Peck

Getting internet at the new house: a Comcast Odyssey

We finally have internet at the house!

Not through any success on the Comcast front, mind you. They're still stalling and dithering. We finally did what we should have done the very first time they failed to install, and ordered DSL.

I've been keeping track of the progress on the Comcast debacle, though, with the intention of blogging it. But as I updated it last week, I realized that it was up to nearly 400 lines already -- at least triple my usual maximum for a blog post -- and though we're over ten weeks in, it's still nowhere near over.

So I've decided to post what I have, as a running web page that I can add to as the odyssey continues, and just post a link to it here. It's a tale of woe, lies, misleading information, crossed wires, chopped-up wires, and about every other problem you could imagine.

If you're thinking of signing up with Comcast but you have any other options, it'll definitely convince you to look elsewhere.

Here's the tale so far: Getting internet at the new house: a Comcast Odyssey.

Now excuse me while I go enjoy my DSL some more.

May 05, 2014 01:13 AM

May 02, 2014

Jono Bacon

Unwrapping ‘Dealing With Disrespect’

With the growth of the Internet and the ease of publishing content, more and more creative minds are coming online to share videos, music, software, products, services, opinions, and more. While the technology has empowered a generation to build new audiences and share interesting things, an unfortunate side-effect has been a culture in which some consumers of this content have provided feedback in a form that is personalized, mean-spirited, disrespectful, and in some cases, malicious.

We have all seen it…the trolls, the haters, the comment boxes filled with venom and vitriol, typically pointed at people just trying to do good and interesting things.

Unfortunately, this conduct can be jarring for many people, with some going so far as to give up sharing their creative endeavours so as not to deal with the “wrath of the Internet”.

As some of you will know, this has been bothering me for a while now. While there is no silver bullet for solving these issues, one thing I have learned over the years is how to put negative, anti-social, and non-constructive comments and feedback into perspective.

To help others with this I have written a free book called Dealing With Disrespect.

Dealing With Disrespect is a short, simple-to-read, free book that provides a straightforward guide to handling this kind of challenging feedback: picking out the legitimate criticism to learn from, and not just ignoring the haters but actually managing them. The book helps put all communication, whether on- or offline, into perspective and helps you become a better communicator yourself.

My goal with the book is that when someone reads something anti-social that demotivates them, a friend can recommend ‘Dealing With Disrespect’ as something that can help put things in perspective.

Go check out the new website, watch the introductory video, and grab the PDF, read it online, or get it for your Kindle. There is also a FAQ.

The book is licensed under a Creative Commons license, and I encourage everyone who enjoys it and finds it useful to share it.

by jono at May 02, 2014 05:28 AM

April 29, 2014

Akkana Peck

The evil HTML double-dash problem in Emacs is still there

Long ago (in 2006!), I blogged on an annoying misfeature of Emacs when editing HTML files: you can't type double dashes. Emacs sees them as an SGML comment and insists on indenting all subsequent lines in strange ways.

I wrote about finding a fix for the problem, involving commenting out four lines in sgml-mode.el. That file had a comment at the very beginning suggesting that the developers knew about the problem and had guarded against it; but obviously the guard didn't work, and the variable that was supposed to control the behavior had been overridden by other hardwired behaviors.

That fix has worked well for eight years. But just lately, I've been getting a lot of annoying warnings when I edit HTML files: "Error: autoloading failed to define function sgml_lexical_context". Apparently the ancient copy of sgml-mode.el that I'd been using all these years was no longer compatible with ... something else somewhere inside emacs. I needed to update it.

Maybe, some time during the intervening 8 years, they'd actually fixed the problem? I was hopeful. I moved my old patched sgml-mode.el aside and edited some files. But the first time I tried typing double dashes -- like this, with text inside that's long enough to wrap to a new line -- I saw that the problem wasn't fixed at all.

I got a copy of the latest sgml-mode.el -- on Debian, that meant:

apt-get install emacs23-el
cp /usr/share/emacs/23.4/lisp/textmodes/sgml-mode.el.gz ~/.emacs-lisp
gunzip ~/.emacs-lisp/sgml-mode.el.gz

Then I edited the file and started searching for strings like font-lock and comment.

Unfortunately, the solution I documented in my old blog post is no longer helpful. The code has changed too much, and now there are many, many different places where automatic comment handling happens. I had to comment out each of them bit by bit before I finally found the section that's now causing the problem. Commenting out these lines fixed it:

   (set (make-local-variable 'indent-line-function) 'sgml-indent-line)
   (set (make-local-variable 'comment-start) "")
   (set (make-local-variable 'comment-indent-function) 'sgml-comment-indent)
   (set (make-local-variable 'comment-line-break-function)
        'sgml-comment-indent-new-line)

I didn't have to remove any .elc files, like I did in 2006; just putting the sgml-mode.el file in my Emacs load-path was enough. I keep all my customized Emacs code in a directory called .emacs-lisp, and in my .emacs I make sure it's in my path:

(setq load-path (cons "~/.emacs-lisp/" load-path))

And now I can type double dashes again. Whew!

April 29, 2014 06:42 PM

April 25, 2014

Elizabeth Krumbach

San Francisco Ubuntu 14.04 Release Party

On Thursday, April 24th, the Ubuntu California team celebrated the 14.04 release with a party in San Francisco.

Our parties prior to this one had been more loosely organized, typically meeting up at a brewery or restaurant to just enjoy some food, drinks and the company of each other. This time, for the LTS, I wanted to do something bigger and more organized. Laptops! Tablets! A quiz with prizes! A presentation!

James Ouyang of the Ubuntu California team put me in touch with Patrick Adair of AdRoll in downtown San Francisco, where they offered to host the event. I’m really excited that we were able to make this partnership work. The venue ended up being perfectly sized for our needs, with plenty of space for food and demos and places for folks to sit, without feeling like an overwhelming space to fill.

The day before the event I met up with Eric P. Scott to head over to Costco to pick up cookies and chips for the event. We also put together a wiki page where people could share what hardware they were bringing. Michael Paoli and Robert Wall brought laptops to demo. Robert also brought along a Nexus 7 tablet, as did Grant Bowman, which with mine gave us a total of three tablets running Ubuntu for folks to play with. Thankfully Eric had the foresight to bring along some hand wipes so our pizza + tablets event wasn’t a problem.

With the help of James and Will, who came by my home prior to the event to help haul all my stuff over, I brought along three of my own laptops: Ubuntu 64-bit on my main personal laptop, Xubuntu 14.04 32-bit on my mini9, and Lubuntu PPC on my old PowerBook.

We wanted some structure to the event, so Robert and I made up some quiz questions prior to the event, which I printed up for folks to fill out. Michael Paoli and I brought along a couple Ubuntu books to give away, and Jono Bacon brought along 3 copies of his Art of Community to give out as well.

I also gave a short presentation about some of the new features of this Ubuntu release, and then left them on rotate on the projector for the rest of the event. Slides from my presentation are available on SpreadUbuntu: http://spreadubuntu.org/en/material/presentation/ubuntu-1404-whats-new

We had pizzas delivered (thanks to the Community Donations Funding distributed by Canonical), and with 40-50 people attending as the event progressed, I managed to get the number right: eleven 18″ pizzas of various types.

The event ran from 6-9PM, which felt like a good amount of time for demos and lively chatting among the attendees. It was really great to see some folks who hadn’t come out to events in a while, and as always the new faces who I was able to chat with about projects and work they’re doing with Ubuntu, plus sharing tips as newcomers navigated the interface of the tablets.

Huge thanks to everyone who contributed to this event and joined us! It was a lot of work, but I think it would be great to at least strive to continue these for the LTS releases.

Photos I (and James) took during the event here: https://www.flickr.com/photos/pleia2/with/14023115873/

And thanks to Sameer Verma for also taking photos, available here: https://plus.google.com/photos/+SameerVerma/albums/6006302887746284977

by pleia2 at April 25, 2014 10:39 PM

April 23, 2014

Akkana Peck

Some code from PiDoorbell

If anyone has been waiting for the code repository for PiDoorbell, the Raspberry Pi project we presented at PyCon a couple of weeks ago, at least part of it (the parts I wrote) is now available in my GitHub scripts repo, in the rpi subdirectory. It's licensed as GPLv2-or-later.

That includes the code that drives the HC-SR04 sonar rangefinder, and the script that takes photos and handles figuring out whether you have a USB camera or a Pi Camera module.
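
The rangefinder part is short enough to sketch here. This is a minimal version of the technique, not the repo code -- the BCM pin numbers are arbitrary, and the real script does more error handling:

import time
import RPi.GPIO as GPIO

TRIG = 23    # arbitrary BCM pin numbers; match however you've wired it
ECHO = 24

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)

def distance_cm():
    # A 10-microsecond pulse on TRIG tells the sensor to ping.
    GPIO.output(TRIG, True)
    time.sleep(0.00001)
    GPIO.output(TRIG, False)

    # ECHO stays high for as long as the ping's round trip took.
    start = time.time()
    while GPIO.input(ECHO) == 0:
        start = time.time()
    end = time.time()
    while GPIO.input(ECHO) == 1:
        end = time.time()

    # Sound travels about 34300 cm/sec; halve for the round trip.
    return (end - start) * 34300 / 2

print("%.1f cm" % distance_cm())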

It doesn't include the Dropbox or Twilio code. For that I'm afraid you'll have to wait for the official PiDoorbell repo. I'm not clear what the holdup is on getting the repo opened up.

The camera script, piphoto.py, has changed quite a bit in the couple of weeks since PyCon. I've been working on a similar project that doesn't use the rangefinder, and relies only on the camera to detect motion, by measuring changes between the previous photo and the current one. I'm building a wildlife camera, and the rangefinder trick doesn't work well if there's a bird feeder already occupying the target range.
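
The frame-comparison idea is easy to sketch with PIL -- this is just the basic technique, with a made-up threshold, not what my script will end up doing:

from PIL import Image, ImageChops, ImageStat

def motion_detected(prev_file, cur_file, threshold=10):
    # Compare successive snapshots in grayscale; if the average
    # pixel difference is big enough, assume something moved.
    prev = Image.open(prev_file).convert('L')
    cur = Image.open(cur_file).convert('L')
    diff = ImageChops.difference(prev, cur)
    return ImageStat.Stat(diff).mean[0] > threshold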

Of course, using motion detection means I'll get a lot of spurious photos of shadows, tree limbs bending in the wind and so forth. It'll be an interesting challenge seeing if I can make the code smart enough to handle that. Of course, I'll write about the project in much more detail once I have it working.

It looks like the biggest issue will be finding a decent camera I can control from a Pi. The Pi Camera module looked so appealing -- and it comes in a night version, with the IR filter removed, perfect for those coyote, rabbit and deer pictures! -- but sadly, it looks like its quality is so poor that it really isn't useful for much of anything. It's great for detecting what types of animals visit you (especially at night), but, sadly, no good for taking photos you'd actually want to display.

If anyone knows of a good camera that can be driven from Linux over USB -- perhaps a normal digital camera that supports the USB camera protocol? -- please let me know! My web searches so far haven't been very illuminating.

Meanwhile, I hope someone finds the rangefinder and camera driving software useful. And stay tuned for more detailed articles about my wildlife camera project!

April 23, 2014 05:57 PM

April 20, 2014

Elizabeth Krumbach

Tourist in Montreal

A couple weeks ago I was in Montreal for PyCon 2014. It was an amazing conference, but I was also glad to have some time to explore the beautiful city that is Montreal.

On Thursday (the 2nd day of tutorials) I didn’t have anything scheduled conference-wise, so I met up with my friend and long-time Ubuntu contributor John Chiazzese (IdleOne). We’ve worked together online on Ubuntu for several years, and at one point even lived in the same area at the same time, but we had never managed to meet. My love of zoos landed us at the Montreal Biodome, housed in a former Olympic building.

The Biodome takes you through 4 different environments where they have mini-ecosystems for each and animals that populate the zones. The lynx were a big draw for me:

The river otter was also quite adorable and looking for attention. I also quite enjoyed the monkeys! And the penguins!

One of the evenings after the conference I joined a few of my colleagues to see And Then There Was Light, a sound and light show at the Notre Dame Basilica, not far from the convention center.

As a fan of historical religious buildings, I was eager for my chance to walk around the basilica as a tourist. The “sound and light show” portion of the show was a bit cheesy, giving folks a history of the French colonists and the basilica itself, but we had fun. Afterwards, we had 15 minutes to walk around and take photos, hooray!

Once they had pulled up the curtains used during the show, the interior did not disappoint. The altar in particular was spectacular:

I was also exposed to a lot of great food in Montreal, only a fraction of which I could eat. I had unfortunately fallen ill just before my trip and was on a strict bland diet – no red meat, no alcohol, no fatty foods. In a city full of steakhouses, wine and cheese this was a special kind of torture, but it did allow me to explore the menus beyond what I might typically order (and I did cheat a bit with the cheese). I ate a lot of chicken, fish and vegetables.

I was fortunate to have decent walking weather during most of the trip, but as the event wound down the chilly weather came back; I even heard that there were some flurries the day after I left. Montreal is great, but it was nice to be on my way back to California when the snow returned!

More photos from my tourist adventures in Montreal here: https://www.flickr.com/photos/pleia2/sets/72157643982902633/

by pleia2 at April 20, 2014 05:11 PM

April 17, 2014

Jono Bacon

Ubuntu 14.04 Is Out!

My apologies in advance for the shorter blog post about this, but like many other Ubuntu folks, I am absolutely exhausted right now. Everyone, across the board, has been working their collective socks off to make Ubuntu 14.04 LTS a fantastic release on desktop, server, and cloud, and to pull together our next iteration of Ubuntu for smartphones and tablets. Consequently, when the trigger is pulled to share our final product with the world, release day is often less of a blistering and energetic woo-hoo and more of an exhausted but satisfying oh-yeah (complete with beer firmly clenched in hand).

I am hugely proud of this release. The last six months have arguably been our busiest yet. No longer are we just working on desktop and server editions of Ubuntu, but we are building for the cloud and full convergence across the client. No longer are we “just” pulling together the fruits of upstream software projects but we are building our own platform too; the Ubuntu SDK, developer eco-system, charm store, image-based updates, push notifications, app lifecycle, and more. While the work has been intense and at times frantic, it has always been measured and carefully executed. Much of this has been thanks to many of our most under-thanked people; the members of our tremendous QA and CI teams.

Today, tomorrow, and for weeks to come our users, the press, the industry, and others will assess our work in Ubuntu 14.04 across these different platforms, and I am very confident they will love what they see. Ubuntu 14.04 embodies the true spirit of Ubuntu; innovation, openness, and people.

But as we wait to see the reviews, let’s take a moment for each other. Now is a great time to reach out to those Ubuntu folks you know (and don’t know) and share some kudos, some thanks, and some great stories. Until we get to the day when machines make software, software is made by people, and great software is built by great people.

Thanks everyone for every ounce of effort you fed into Ubuntu and our many flavors. We just took another big leap forward towards our future.

by jono at April 17, 2014 10:58 PM

Akkana Peck

Back from PyCon

I'm back from Montreal, settling back in.

The PiDoorbell tutorial went well, in the end. Of course just about everything that could go wrong did. The hard-wired ethernet connection we'd been promised didn't materialize, and there was no way to get the Raspberry Pis onto the conference wi-fi because it used browser authentication (it still baffles me why anyone still uses that! Browser authentication made sense in 2007, when lots of people only had 802.11g and couldn't do WPA; it makes absolutely zero sense now).

Anyway, lacking a sensible way to get everyone's Pis on the net, Deepa stepped up as network engineer for the tutorial and hooked up the router she had brought to her laptop's wi-fi connection so the Pis could route through that.

Then we found we had too few SD cards. We didn't realize why until afterward: when we compared the attendee count to the sign-up list we'd gotten, we had quite a few more attendees than we'd planned for. We had a few extra SD cards, but not enough, so I and a couple of the other instructors/TAs had to loan out SD cards we'd brought for our own Pis. ("Now edit /etc/network/interfaces ... okay, pretend you didn't see that, that's the password for my home router, now delete that and change it to ...")

Then some of the SD cards turned out not to have been updated with the latest packages, Mac users couldn't find the drivers to run the serial cable, Windows users (or was it Macs?) had trouble setting static ethernet addresses so they could ssh to the Pi, all the problems we'd expected and a few we hadn't.

But despite all the problems, the TAs -- Deepa (who was more like a co-presenter than a TA), Serpil, Lyz and Stuart -- plus Rupa and I were able to get everyone working. All the attendees got their LEDs blinking, their sonar rangefinders rangefinding, and the PiDoorbell script running. Many people brought cameras and got their Pis snapping pictures when the sensor registered someone in front of it. Time restrictions and network problems meant that most people didn't get the Dropbox and Twilio registration finished to get notifications sent to their phones, but that's okay -- we knew that was a long shot, and everybody got far enough that they can add the network notifications later if they want.

And the most important thing is that everybody looked like they were having a good time. We haven't seen the reviews (I'm not sure if PyCon shares reviews with the tutorial instructors; I hope so, but a lot of conferences don't) but I hope everybody had fun and felt like they got something out of it.

The rest of PyCon was excellent, too. I went to some great talks, got lots of ideas for new projects and packages I want to try, had fun meeting new people, and got to see a little of Montreal. And ate a lot of good food.

Now I'm back in the land of enchantment, with its crazy weather -- we've gone from snow to sun to cold breezes to HOT to threatening thunderstorm in the couple of days I've been back. Never a dull moment! I confess I'm missing those chocolate croissants for breakfast just a little bit. We still don't have internet: it's nearly 9 weeks since Comcast's first visit, and their latest prediction (which changes every time I talk to them) is a week from today.

But it's warm and sunny this morning, there's a white-crowned sparrow singing outside the window, and I've just seen our first hummingbird (a male -- I think it's a broad-tailed, but it'll take a while to be confident of IDs on all these new-to-me birds). PyCon was fun -- but it's nice to be home.

April 17, 2014 04:20 PM

April 16, 2014

Elizabeth Krumbach

Finding a Tahr (or two!)

Tomorrow the next Ubuntu Long Term Support (LTS) release comes out, 14.04, development code name Trusty Tahr. In preparation, I was putting together some materials for our release event next week and found myself looking for the Tahr artwork when I remembered that it was included in the installer. So now I’ll share it with you as well!

If you go to this source page you will see a “download file” link which will allow you to download a .png of the tahr artwork.

Trusty Tahr

I haven’t found an svg version of this logo, but I’ll be sure to update this post if I do.

Update: Thanks to Tom Macfarlane of Canonical for emailing me a copy of the svg version! You can get a copy here.

Looking for something slightly different? The Xubuntu team also included a tahr in our installer, created by Simon Steinbeiß:


This png has transparency, which makes it show grey on white, but you can flavor it with any color you wish!

You can grab it at this source page where you will see the “download file” link. I’ve also uploaded the svg: art_tahr.svg

Enjoy! And happy release everyone!

by pleia2 at April 16, 2014 10:16 PM