Planet Ubuntu California

March 28, 2015

Elizabeth Krumbach

Simcoe’s March 2015 Checkup

Our little Siamese, Simcoe, has Chronic Renal Failure (CRF). She has been doing well for over 3 years now with subcutaneous fluid injections every other day to keep her hydrated and quarterly check-ins with the vet to make sure her key blood levels and weight are staying within safe parameters.

On March 14th she went in for her latest visit and round of blood work. As usual, she wasn’t thrilled about the visit and worked hard to stay in her carrier the whole time.

She came out long enough for the exam, and the doctor was happy with her physical, though her weight had dropped a little again, going from 9.74lbs to 9.54lbs.

Both her BUN and CRE levels remained steady.

Unfortunately her Calcium levels continue to come back a bit high, so the vet wants her in for an ionized Calcium test. She has explained that it’s only the ionized Calcium that is a concern, because it can build up in the kidneys and lead to more rapid deterioration, so we’d want to get her on something to reduce the risk if that turns out to be the case. We’ll probably make an appointment once I return from my travels in mid-April to get this test done.

In the meantime, she gets to stay at home and enjoy a good book.

…my good book.

by pleia2 at March 28, 2015 02:07 AM

The spaces between

It’s been over 2 months since I’ve done a “miscellaneous life stuff” blog post. Anyone reading this blog recently might think I only write about travel and events! Since that last post I have had other things pop up here and there, but I am definitely doing too many events. That should calm down a bit in the second quarter of the year and almost disappear in the third, with the notable exception of a trip to Peru, part work and part pleasure.

Unfortunately it looks like the stress I mentioned in that last post flipped the switch on my already increasingly frequent migraines. I’ve seen my neurologist twice this year and we’ve worked through several medications, finally finding one that seems to work. And at least a visit to my neurologist affords me some nice views.

So I have been working on stress reduction, part of which is making sure I keep running. It doesn’t reduce stress immediately, but a routine of exercise does help even me out in the long term. To help clear my head, I’ve also been refining my todo lists to make them more comprehensive. I’m also continuing to let projects go when I find they’re causing my stress levels to spike for little gain. This is probably the hardest thing to do; I care about everything I work on, and I know some things will just drop on the ground if I don’t do them, but I really need to be more realistic about what I can actually get done and focus my energy accordingly.

And to clear the way in this post for happier things, I did struggle with the loss of Eric in January. My Ubuntu work here in San Francisco simply won’t be the same without him, and every time I start thinking about planning an event I am reminded that he won’t be around to help or attend. Shortly after learning of his passing, several of us met up at BerkeleyLUG to share memories. Then on March 18th a more organized event was put together to gather friends from his various spheres of influence to celebrate his life at one of his favorite local pizzerias. It was a great event, I met some really good people and saw several old friends. It also brought some closure for me that I’d been lacking in dealing with this on my own.

On to happier things! I actually spent 30 days in a row off a plane in March. Home time means I got to do lots of enjoyable home things, like actually spending time with my husband over some fantastic meals, as well as finally finishing watching Breaking Bad together. I also think I’ve managed to somewhat de-traumatize my cats, who haven’t been thrilled about all my travel. We’ve been able to take some time to do some “home things” – like get some painting estimates so we can get some repairs done around the condo. I also spent a day down in Mountain View so I could meet up with a local colleague who I hadn’t yet met to kick off a new project, and then have dinner with a friend who was in the area visiting. Plus, I got to see cool things like a rare storm colliding with a sunset one evening:

I’ve been writing some: in January my article 10 entry points to tech (for girls, women, and everyone) went live on opensource.com. In early March I was invited to publish an article on the Tech Talk Live Blog, Five Ways to Get Involved with Ubuntu as an Educator, based on my experience working with teachers over the past several years. I’ve also continued work toward a new book in progress, which has been time-consuming but which I hope will be ready for more public discussion in the coming months. Mark G. Sobell’s A Practical Guide to Ubuntu Linux, 4th Edition also came out earlier this year, and while I didn’t write that, I did spend a nice chunk of time last summer doing review for it. I came away with a quote on the cover endorsing the great work Mark did with the book!

Work-wise, aside from travel and conferences I’ve talked about in previous posts, I was recently promoted to root and core for OpenStack Infrastructure. This has meant a tremendous amount to me, both in the trust the team has placed in me and in the increased ability to contribute to the infrastructure I’ve spent so much time with over these past couple of years. It also means I’ve been learning a lot and sorting through the tribal knowledge that should be formally documented. I was also able to participate as a Track Chair selecting talks for the Related OSS Projects track at the OpenStack Summit in Vancouver in May; I did this for Atlanta last year but ended up not being able to attend due to being too sick (stupid gallbladder). And while on the topic of Vancouver, a panel proposed by the Women of OpenStack that I’m participating in has been accepted, Standing Tall in the Room, where we hope to give other women in our community some tips for success. My next work trip comes up before Vancouver: I’m heading off to South Carolina for Posscon, where I’ll be presenting on Tools for Open Source Systems Administration, a tour of tools we use to make collaborating online with a distributed team of systems administrators from various companies possible (and even fun!).

In the tech goodies department, I recently purchased a Nexus 6. I was compelled to after I dropped my Galaxy S3 while sitting up on the roof deck. I was pretty disappointed by the demise of my S3; it was a solid phone, and the stress of replacement wasn’t something I was thrilled to deal with immediately upon my return from Oman. I did a bunch of research before I settled on the Nexus 6 and, for the first time in my life, spent my hard-earned cash on the full retail price of a phone. It’s now been almost a month and I’m still not quite used to how BIG the Nexus 6 is, but it is quite a pleasure to use. I still haven’t quite worked out how to carry it on my runs: it’s too big for my pockets and the arm band solution isn’t working (too bulky, among other reasons), so I might switch to a small backpack that can carry water too. It’s a great phone though, so much faster than my old one, which honestly did deserve to be replaced, even if not in the way I face-planted it on the concrete. Sorry, S3.


Size difference: Old S3 in new Nexus 6 case

I also found my old Chumby while searching through the scary cave that is our storage unit for the paint that was used for previous condo painting. The service has been resurrected for a small monthly fee; now I just need to find a place to plug it in near my desk…

I actually made it out of the house to be social a little too. My cousin Steven McCorry is the lead singer in a band called Exotype, which signed a record deal last year and has since been on several tours. This one brought him to San Francisco, so I finally made my way out to the famous DNA Lounge to see the show. It was a lot of fun, but as much as I can appreciate metal, I’m pleased with their recent trend toward rock, which I prefer. It was also great to visit with my cousin and his band mates.

This week it was MJ’s turn to be out of the country for work. While I had Puppet Camp to keep me busy on Tuesday, I did a poor job of scheduling social engagements and it’s been a pretty lonely time. It gave me space to do some organization and get work done, but I wasn’t as productive as I really wanted to be, and I may have binge-watched the latest slew of Mad Men episodes that landed on Netflix one night. It was nice to have snuggle time with the kitties, though.

MJ comes home Sunday afternoon, at which time we have to swap out the contents of his suitcase and head back to the airport to catch a red eye flight to Philadelphia. We’re spending next week moving a storage unit, organizing our new storage situation and making as many social calls as possible. I’m really looking forward to visiting PLUG on Wednesday to meet up with a bunch of my old Philadelphia Linux friends. And while I’m not actively looking forward to the move, it’s something we’ve needed to do for some time now, so it’ll be nice for that to be behind us.

by pleia2 at March 28, 2015 01:58 AM

March 27, 2015

Akkana Peck

Hide Google's begging (or any other web content) via a Firefox userContent trick

Lately, Google is wasting space at the top of every search with a begging plea to be my default search engine.

[Google begging: Switch your default search engine to Google] Google already is my default search engine -- that's how I got to that page. But if you don't have persistent Google cookies set, you have to see this begging every time you do a search. (Why they think pestering users is the way to get people to switch to them is beyond me.)

Fortunately, in Firefox you can hide the begging with a userContent trick. Find the chrome directory inside your Firefox profile, and edit userContent.css in that directory. (Create a new file with that name if you don't already have one.) Then add this:

#taw { display: none !important; }

Restart Firefox, do a Google search and the begs should be gone.
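
If you'd rather do the whole thing from a shell, it boils down to a few lines. A minimal sketch, assuming a single default profile (check ~/.mozilla/firefox/profiles.ini if yours is named differently):

cd ~/.mozilla/firefox/*.default*   # the profile directory
mkdir -p chrome                    # create the chrome directory if it doesn't exist
cat >> chrome/userContent.css <<'EOF'
/* hide Google's "switch your default search engine" box */
#taw { display: none !important; }
EOF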

In case you have any similar pages where there's pointless content getting in the way and you want to hide it: what I did was to right-click inside the begging box and choose Inspect element. That brings up Firefox's DOM inspector. Mouse over various lines in the inspector and watch what gets highlighted in the browser window. Find the element that highlights everything you want to remove -- in this case, it's a div with id="taw". Then you can write CSS to address that: hide it, change its style or whatever you're trying to do.

You can even use Inspect element to remove elements immediately. That won't help you prevent them from showing up later, but it can be wonderful if you need to use a page that has an annoying blinking ad on it, or a mis-designed page that has images covering the content you're trying to read.

March 27, 2015 02:17 PM

March 20, 2015

kdub

A few years of Mir TDD

asm header

We started the Mir project a few years ago guided by the principles in the book Growing Object-Oriented Software, Guided by Tests. I recommend a read, especially if you’ve never been exposed to test-driven development.

Compared to other projects that I’ve worked on, I find that as a greenfield TDD project, Mir has really benefited from the process in terms of ease of development and reliability. Just a few quick thoughts:

  • I’ve found the Mir code to be ready to ship as soon as it lands. There’s very little going back and figuring out how a new feature has caused regressions in other parts of the code.
  • There’s much less debugging in the initial rounds of development, as you’ve already planned and written out tests for what you want the code to do.
  • It takes a bit more faith when you’re starting a new line of work that you’ll be able to get the code completed. Test-driven development forces more exploratory spikes (which tend to have exploratory interfaces), and then forces you to revisit and methodically introduce refactorings and new interfaces that are clearer than the ropey interfaces seen in the ‘spike’ branches. That is, the interfaces that land tend to be second-attempt interfaces, selected from a fuller understanding of the problem, and they tend to be more coherent.
  • You end up with more modular, object-oriented code, because generally you’re writing a minimum of two implementations of any interface you’re working on (the production code, and the mock/stub).
  • The reviews tend to be less about whether things work, and more about the sensibility of the interfaces.

by Kevin at March 20, 2015 11:31 PM

March 19, 2015

Akkana Peck

Hints on migrating Google Code to GitHub

Google Code is shutting down. They've sent out notices to all project owners suggesting they migrate projects to other hosting services.

I moved all my personal projects to GitHub years ago, back when Google Code still didn't support git. But I'm co-owner on another project that was still hosted there, and I volunteered to migrate it. I remembered that being very easy back when I moved my personal projects: GitHub had a one-click option to import from Google Code. I assumed (I'm sure you know what that stands for) that it would be just as easy now.

Nope. Turns out GitHub no longer has any way to import from Google Code: it tells you it can't find a repository there when you give it the address to Google's SVN repository.

Google's announcement said they were providing an exporter to GitHub. So I tried that next. I had the new repository ready on GitHub -- under the owner's account, not mine -- and I expected Google's exporter to ask me for the repository.

Not so. As soon as I gave it my OAuth credentials, it immediately created a new repository on GitHub under my name, using the name we had used on Google Code (not the right name, since Google Code project names have to be globally unique while GitHub projects don't).

So I had to wait for the export to finish; then, on GitHub, I went to our real repository, and did an import there from the new repository Google had created under my name. I have no idea how long that took: GitHub's importer said it would email me when the import was finished, but it didn't, so I waited several hours and decided it was probably finished. Then I deleted the intermediate repository.

That worked fine, despite being a bit circuitous, and we're up and running on GitHub now.

If you want to move your Google Code repository to GitHub without the intermediate step of making a temporary repository, or if you don't want to give Google OAuth access to your GitHub account, here are some instructions (which I haven't tested) on how to do the import via a local copy of the repo on your own machine, rather than going directly from Google to GitHub: krishnanand's steps for migrating Google code to GitHub
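
The kernel of that local approach looks something like this (again, untested by me; PROJECT and USER/REPO are placeholders, and it assumes the standard trunk/branches/tags Subversion layout):

# clone the Google Code Subversion repository, preserving history
git svn clone -s http://PROJECT.googlecode.com/svn PROJECT
cd PROJECT
# point it at an empty repository already created on GitHub, then push
git remote add origin git@github.com:USER/REPO.git
git push -u origin master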

March 19, 2015 07:11 PM

March 18, 2015

Elizabeth Krumbach

Elastic{ON} 2015

I’m finally home for a month, so I’ve taken advantage of some of this time to attend and present at some local events. The first of these was Elastic{ON}, the first user conference for Elasticsearch and related projects now under the Elastic project umbrella. The conference venue was Pier 27, a cruise terminal on the bay. It was a beautiful venue with views of the bay, and a clever use for a terminal while no ships are coming in.

The conference kicked off with a keynote where they welcomed attendees (of which there were over 1300 from 35 countries!) and dove into project history from the first release in 2010. A tour of old logos and websites built up to the big announcement, the “Elastic” rebranding, as the scope of their work now goes beyond search in the former Elasticsearch name. The opening keynotes continued with several leads from projects within the Elastic family, including updates from Logstash and Kibana.

At lunch I ended up sitting with 3 other women who were attending the conference on behalf of their companies (when gender ratios are skewed, this type of congregation tends to happen naturally). We all got to share details about how we were using Elasticsearch, so that was a lot of fun. One woman was doing data analysis against it for her networking-related work, another was using it to store metadata for videos, and the third, from the USGS, was actually speaking that afternoon on how they’re supplementing traditional earthquake data with social media data about earthquakes.

Track sessions began after lunch, and I spent my afternoon camped out in the Demo Theater. The first talk was by the Elastic Director of Developer Relations, Leslie Hawthorn. She talked about the international trio of developer evangelists she works with, focusing on their work to support and encourage meetup groups worldwide, noting that 75 cities now have meetups with a total of over 17,000 individual members. She shared some tips from successful meetup groups, including offering a 2nd track during meetups for beginners, using an unconference format rather than a set schedule, and mixing things up sometimes with hack nights on Elastic projects. It was interesting to learn how they track community metrics (code/development stats, plus IRC and mailing list activity) and she wrapped up by noting the new site at https://www.elastic.co/community where they’re working to add more how-tos and on-ramping content, helped along by their recent acquisition of Found, which has maintained a lot of that kind of material.


Leslie Hawthorn on “State of the Community”

The next session was “Elasticsearch Data Journey: Life of a Document in Elasticsearch” by Alexander Reelsen & Boaz Leskes. When documents enter Elasticsearch as json output from a service like Logstash, it can seem like a bit of a black box as far as what exactly happens to them on the way in, and this talk went through just that. Which node a document is stored on is determined by several criteria analyzed on ingest, and the data is normalized and sorted. While the data is coming in, it’s held in a buffer and also written to a transaction log; even once it’s committed to disk, it stays in the transaction log until it can be replicated across the Elasticsearch cluster. From there, they went on to discuss data retrieval and cluster scaling and, while stressing that replication is NOT backups, how to actually back up each node and how to restore from those backups. Finally, they talked about the data deletion process, which queues data for deletion on each node in segments, and noted that this is not reversible.
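
(The entry point itself is just an HTTP request. For illustration, indexing a single json document looks something like this, with made-up index and type names:)

curl -XPUT 'http://localhost:9200/logs/event/1' -d '
{
  "message": "hello world",
  "timestamp": "2015-03-18T10:00:00"
}'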

Continuing the “Life of” theme, I also attended “Life of an Event in Logstash” by Colin Surprenant. Perhaps my favorite talk of the day: Colin did an excellent job of explaining and defining all the terms he used in his talk. Contrary to popular belief, this isn’t just useful to folks new to the project; as a systems administrator who maintains dozens of different types of applications over hundreds of servers, I am not necessarily familiar with what Logstash in particular calls everything terminology-wise, so having it made clear during the talk was great. His talk walked us through the 3 stages that events coming into Logstash go through: Input, Filter and Output, as well as the sized queues between them. The Input stage takes whatever data you’re feeding into Logstash and uses plugins to transform it into a Logstash event. The Filter stage modifies the data from the event so that the data is made uniform. The Output stage translates the uniform data into whatever output you’re sending it to, whether it’s STDOUT or sending it off to Elasticsearch as json. Knowing the pieces of this system is really valuable for debugging loss of documents; I look forward to having the video online to share with my colleagues. EDIT 3/20/2015: Detailed slides online here.


Colin Surprenant on “Life of an Event in Logstash”
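
To get a rough feel for those three stages yourself, you can run a tiny pipeline from a Logstash checkout using the -e flag, which takes a configuration on the command line. A sketch, with syntax as per the Logstash docs of this era:

# stdin input -> mutate filter tags each event -> pretty-printed stdout output
echo 'hello world' | bin/logstash -e '
input  { stdin { } }
filter { mutate { add_tag => [ "demo" ] } }
output { stdout { codec => rubydebug } }'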

I tended to avoid many of the talks by Elasticsearch users about how they use it. While I’m sure there are valuable insights to be gained by learning how others use it, we’re pretty much convinced about our use and things are going well. So use cases were fresh to me when the day 2 keynotes kicked off with a discussion with Don Duet, Co-head of Technology at Goldman Sachs. It was interesting to learn that nearly 1/3 of the employees at Goldman Sachs are in engineering or working directly with engineering in some kind of technical analysis capacity. The company was also framed as very tech-conscious and a long-time open source advocate. In exploring some of their work with Elasticsearch, he used legal documents as an example: previously they were difficult to search and find, but with Elasticsearch an engineer was empowered to work with the legal department to make details about contracts and more searchable and easier to find.

The next keynote was a surprising one, from Microsoft! As a traditionally proprietary, closed-source company, they haven’t historically been known for their support of open source software, at least in public. This has changed in recent years as the world around them has changed and they’ve found themselves needing to not only support open source software in their stacks but contribute to things like the Linux kernel as well. Speaker Pablo Castro had a good sense of humor about all this as he walked attendees through three major examples of Elasticsearch use at Microsoft. It was fascinating to learn that it’s used for content on MSN.com, which gets 18 billion hits per month. They’re using Elasticsearch in Microsoft Dynamics CRM for social media data, and in this case they’re actually using Ubuntu as well. Finally, they’re using it for the search tool in their cloud offering, Azure. They’ve come a long way!


Pablo Castro of Microsoft

The final keynote was from NASA JPL. The room was clearly full of space fans, so this was a popular presentation. They talked about how they use Elasticsearch to hold data about user behavior from browsers on internal sites so they can improve them for employees. They also noted the terribly common practice of putting data (in this case, for the Mars rover) into Excel or Powerpoint and emailing it around as a mechanism for data sharing, and how they’ve managed to get this data into Elasticsearch instead, clearly improving the experience for everyone.

After the keynotes, it was time to do my presentation! The title of my talk was “elastic-Recheck Yourself Before You Wreck Yourself: How Elasticsearch Helps OpenStack QA” and I can’t take credit for the title; my boring title was replaced by a suggestion from the talk selection staff. The talk was fun: I walked through our use of Elasticsearch to power our elastic-recheck (status page, docs) tooling in OpenStack. It’s been valuable not only for developer feedback (“your patch failed tests because of $problem, not your code”), but also for giving the QA and Infrastructure teams a much better view into what the fleet of test VMs are up to in the aggregate so we can fix problems more efficiently. Slides from my talk are here (pdf).


All set up for elastic-Recheck Yourself Before You Wreck Yourself

Following my talk, I ended up having lunch with the excellent Carol Willing. We got to geek out on all kinds of topics from Python to clouds as we enjoyed an outdoor lunch by the bay. Until it started drizzling.

The most valuable talk in the afternoon for me was “Resiliency in Elasticsearch and Lucene” with Boaz Leskes & Igor Motov. They began by talking about how, with scale, came the realization that more attention needed to be paid to recovering from various types of failures, which show up more often when you have more workers. The talk walked through various failure scenarios and how they’ve worked (and are working) on making improvements in these areas, including “pulling the plug” for a full shutdown, various hard disk failures, data corruption, several types of cluster and HA failures (split-brain and otherwise), out-of-memory resiliency and external pressures. This is another one I’m really looking forward to seeing the video from.

The event wrapped up with a panel from GuideStar, kCura and E*Trade on how they’re using Elasticsearch and several “war stories” from their experiences working with the software itself, open source in general and Elastic the company.

In all, the conference was a great experience for me, and it was an impressive inaugural conference, though perhaps I should have expected that given the expertise and experience of the community team they have working there! They plan on doing a second one, and I recommend attendance to folks working with Elasticsearch.

More of my photos from the conference here: https://www.flickr.com/photos/pleia2/sets/72157650940379129/

by pleia2 at March 18, 2015 10:58 PM

March 14, 2015

Akkana Peck

Making a customized Firefox search plug-in

It's getting so that I dread Firefox's roughly weekly "There's a new version -- do you want to upgrade?" With every new upgrade, another new crucial feature I use every day disappears and I have to spend hours looking for a workaround.

Last week, upgrading to Firefox 36.0.1, it was keyword search: the feature where, if I type something in the location bar that isn't a URL, Firefox would instead search using the search URL specified in the "keyword.URL" preference.

In my case, I use Google but I try to turn off the autocomplete feature, which I find distracting and unhelpful when typing new search terms. (I say "try to" because complete=0 only works sporadically.) I also add the prefix allintext: to tell Google that I only want to see pages that contain my search term. (Why that isn't the default is anybody's guess.) So I set keyword.URL to: http://www.google.com/search?complete=0&q=allintext%3A+ (%3A is URL code for the colon character).

But after "up"grading to 36.0.1, search terms I typed in the location bar took me to Yahoo search. I guess Yahoo is paying Mozilla more than Google is now.

Now, Firefox has a Search tab under Edit->Preferences -- but that just gives you a list of standard search engines' default searches. It would let me use Google, but not with my preferred options.

If you follow the long discussions in bugzilla, there are a lot of people patting each other on the back about how much easier the preferences window is, with no discussion of how to specify custom searches except vague references to "search plugins". So how do these search plugins work, and how do you make one?

Fortunately a friend had a plugin installed, acquired from who knows where. It turns out that what you need is an XML file inside a directory called searchplugins in your profile directory. (If you're not sure where your profile lives, see Profiles - Where Firefox stores your bookmarks, passwords and other user data, or do a systemwide search for "prefs.js" or "search.json" or "cookies.sqlite" and it should lead you to your profile.)

Once you have one plugin installed, it's easy to edit it and modify it to do anything you want. The XML file looks roughly like this:

<SearchPlugin xmlns="http://www.mozilla.org/2006/browser/search/" xmlns:os="http://a9.com/-/spec/opensearch/1.1/">
<os:ShortName>MySearchPlugin</os:ShortName>
<os:Description>The search engine I prefer to use</os:Description>
<os:InputEncoding>UTF-8</os:InputEncoding>
<os:Image width="16" height="16">data:image/x-icon;base64,ICON GOES HERE</os:Image>
<SearchForm>http://www.google.com/</SearchForm>
<os:Url type="text/html" method="GET" template="https://www.google.com/search">
  <os:Param name="complete" value="0"/>
  <os:Param name="q" value="allintext: {searchTerms}"/>
  <!--os:Param name="hl" value="en"/-->
</os:Url>
</SearchPlugin>

There are four things you'll want to modify. First, and most important, os:Url and os:Param control the base URL of the search engine and the list of parameters it takes. {searchTerms} in one of those Param arguments will be replaced by whatever terms you're searching for. So <os:Param name="q" value="allintext: {searchTerms}"/> gives me that allintext: parameter I wanted.

(The other parameter I'm specifying, <os:Param name="complete" value="0"/>, used to make Google stop the irritating autocomplete every time you try to modify your search terms. Unfortunately, this has somehow stopped working at exactly the same time that I upgraded Firefox. I don't see how Firefox could be causing it, but the timing is suspicious. I haven't been able to figure out another way of getting rid of the autocomplete.)

Next, you'll want to give your plugin a ShortName and Description so you'll be able to recognize it and choose it in the preferences window.

Finally, you may want to modify the icon: I'll tell you how to do that in a moment.

Using your new search plugin

[Firefox search prefs]

You've made all your modifications and saved the file to something inside the searchplugins folder in your Firefox profile. How do you make it your default?

I restarted Firefox to make sure it saw the new plugin, though that may not have been necessary. Then Edit->Preferences and click on the Search icon at the top. The menu near the top, under Default search engine, is what you want: your new plugin should show up there.
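
In shell terms, the whole installation amounts to something like this, where mysearch.xml stands in for whatever you named your plugin file and the glob assumes a single default profile:

profile=$(echo ~/.mozilla/firefox/*.default*)   # find the profile directory
mkdir -p "$profile/searchplugins"
cp mysearch.xml "$profile/searchplugins/"
# then restart Firefox and select the plugin under Edit->Preferences->Search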

Modifying the icon

Finally, what about that icon?

In the plugin XML file I was copying, the icon line looked like:

<os:Image width="16"
height="16">data:image/x-icon;base64,AAABAAEAEBAAAAEAIABoBAAAFgAAACgAAAAQAAAAIAAAAAEAIAAAAAAAAAAAAAAA
... many more lines like this then ... ==</os:Image>

So how do I take that and make an image I can customize in GIMP?

I tried copying everything after "base64," and pasting it into a file, then opening it in GIMP. No luck. I tried base64 decoding it (you do this with base64 -d filename >outfilename) and reading it in with GIMP. Still no luck: "Unknown file type".

The method I found is roundabout, but works:

  1. Copy everything inside the tag: data:image/x-icon;base64,AA ... ==
  2. Paste that into Firefox's location bar and hit return. You'll see the icon from the search plugin you're modifying.
  3. Right-click on the image and choose Save image as...
  4. Save it to a file with the extension .ico -- GIMP won't open it without that extension.
  5. Open it in GIMP -- a 16x16 image -- and edit to your heart's content.
  6. File->Export as...
  7. Use the type "Microsoft Windows icon (*.ico)"
  8. Base64 encode the file you just saved, like this: base64 yourfile.ico >newfile
  9. Copy the contents of newfile and paste that into your os:Image line, replacing everything after data:image/x-icon;base64, and before </os:Image>

Whew! Lots of steps, but none of them are difficult. (Though if you're not on Linux and don't have the base64 command, you'll have to find some other way of encoding and decoding base64.)

But if you don't want to go through all the steps, you can download mine, with its lame yellow smiley icon, as a starting point: Google-clean plug-in.

Happy searching! See you when Firefox 36.0.2 comes out and they break some other important feature.

March 14, 2015 06:35 PM

March 10, 2015

kdub

Mir Android-platform Multimonitor

My latest work on the Mir Android platform includes multimonitor support! It should work with slimport/MHL; Mir happily sits at an abstraction level above the details of MHL/slimport. This should be available in the next release (probably Mir 0.13), or you can grab lp:mir now to start tinkering.
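
Grabbing the code is a one-liner, assuming you have Bazaar installed:

bzr branch lp:mir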

by Kevin at March 10, 2015 01:33 PM

March 08, 2015

Akkana Peck

GIMP: Turn black to another color with Screen mode

[20x20 icon, magnified 8 times] I needed to turn some small black-on-white icons to blue-on-white. Simple task, right? Except, not really. If there are intermediate colors that are not pure white or pure black -- which you can see if you magnify the image a lot, like this 800% view of a 20x20 icon -- it gets trickier.

[Bucket fill doesn't work for this] You can't use anything like Color to Alpha or Bucket Fill, because all those grey antialiased pixels will stay grey, as you see in the image at left.

And the Hue-Saturation dialog, so handy for changing the hue of a sky, a car or a dress, does nothing at all -- because changing hue has no effect when saturation is zero, as for black, grey or white. So what can you do?

I fiddled with several options, but the best way I've found is the Screen layer mode. It works like this:

[Make a new layer] In the Layers dialog, click the New Layer button and accept the defaults. You'll get a new, empty layer.

[Set the foreground color] Set the foreground color to your chosen color.

[Set the foreground color] Drag the foreground color into the image, or do Edit->Fill with FG Color.

Now it looks like your whole image is the new color. But don't panic!

[Use screen mode] Use the menu at the top of the Layers dialog to change the top layer's mode to Screen.

Layer modes specify how to combine two layers. (For a lot more information, see my book, Beginning GIMP). Multiply mode, for example, multiplies each pixel in the two layers, which makes light colors a lot more intense while not changing dark colors very much. Screen mode is sort of the opposite of Multiply mode: GIMP inverts each of the layers, multiplies them together, then inverts them again. All those white pixels in the image, when inverted, are black (a value of zero), so multiplying them doesn't change anything. They'll still be white when they're inverted back. But black pixels, in Screen mode, take on the color of the other layer -- exactly what I needed here.
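
In formula terms, with channel values normalized to the range [0, 1], Screen mode computes:

result = 1 - (1 - A)(1 - B)

where A and B are the corresponding pixels of the two layers. A white pixel (value 1) in either layer forces the result to 1, so white stays white; a black pixel (value 0) makes the result equal to the other layer's value, exactly the behavior described above.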

Intensify the effect with contrast

[Mars sketch, colorized orange] One place I use this Screen mode trick is with pencil sketches. For example, I've made a lot of sketches of Mars over the years, like this sketch of Lacus Solis, the "Eye of Mars". But it's always a little frustrating: Mars is all shades of reddish orange and brown, not grey like a graphite pencil.

Adding an orange layer in Screen mode helps, but it has another problem: it washes out the image. What I need is to intensify the image underneath: increase the contrast, make the lights lighter and the darks darker.

[Colorized Mars sketch, enhanced  with brightness/contrast] Fortunately, all you need to do is bump up the contrast of the sketch layer -- and you can do that while keeping the orange Screen layer in place.

Just click on the sketch layer in the Layers dialog, then run Colors->Brightness/Contrast...

This sketch needed the brightness reduced a lot, plus a little more contrast, but every image will be different. Experiment!

March 08, 2015 01:22 AM

March 02, 2015

Elizabeth Krumbach

Tourist in Muscat, Oman

I had the honor of participating in FOSSC Oman this February, which I wrote about here. Our gracious hosts were very accommodating to all of our needs, starting with arranging assistance at the airport and lodging at a nearby Holiday Inn.

The Holiday Inn was near the airport without much else around, so it was my first experience with a familiar property in a foreign land. It was familiar enough for me to be completely comfortable, but different enough to never let me forget that I was in a new, interesting place. In keeping with standards of the country, the hotel didn’t serve alcohol or pork, which was fine by me.

During my stay we had one afternoon and evening to visit the sights with some guides from the conference. Speakers and other guests convened at the hotel and boarded a bus which first took us to the Sultan Qaboos Grand Mosque. Visiting hours for non-Muslims were in the morning, so we couldn’t go inside, but we did get to visit the outside gardens and take some pictures in front of the beautiful building.

From there we went to a downtown area of Muscat and were able to browse through some shops that seemed aimed at tourists and enjoy the harbor for a bit. Browsing the shops allowed me to identify some of the standard pieces I may want to purchase later, like the style of traditional incense burner. The harbor was quite enjoyable, a nice breeze coming in to take the edge off the hot days, which topped out around 90F while we were there (and it was their winter!).

We were next taken to Al Alam Palace, where the Sultan entertains guests. This was another outside only tour, but the walk through the plaza up to the palace and around was well worth the trip. There were also lit up mountainside structures visible from the palace which looked really stunning in the evening light.

That evening we headed up to the Shangri-La resort area on what seemed like the outskirts of Muscat. It was a whole resort complex, where we got to visit a beach before meeting up with other conference folks for a buffet dinner and musical entertainment for the evening.

I really enjoyed my time in Oman. It was safe, beautiful and, in spite of being hot, the air conditioning in all the buildings made up for the time we spent outdoors, and the mornings and evenings were nice and cool. There was some apprehension, as it was my first trip to the Middle East and I was a woman traveling alone, but I had no problems, and everyone I worked with throughout the conference and my stay was professional, welcoming and treated me well. I’d love the opportunity to go back some day.

More photos from my trip here: https://www.flickr.com/photos/pleia2/sets/72157650553216248/

by pleia2 at March 02, 2015 02:47 AM

February 24, 2015

Akkana Peck

Tips for developing on a web host that offers only FTP

Generally, when I work on a website, I maintain a local copy of all the files. Ideally, I use version control (git, svn or whatever), but failing that, I use rsync over ssh to keep my files in sync with the web server's files.

But I'm helping with a local nonprofit's website, and the cheap web hosting plan they chose doesn't offer ssh, just ftp.

While I have to question the wisdom of an ISP that insists that its customers use insecure ftp rather than a secure encrypted protocol, that's their problem. My problem is how to keep my files in sync with theirs. And the other folks working on the website aren't developers and are very resistant to the idea of using any version control system, so I have to be careful to check for changed files before modifying anything.

In web searches, I haven't found much written about reasonable workflows on an ftp-only web host. I struggled a lot with scripts calling ncftp or lftp. But then I discovered curlftpfs, which makes things much easier.

I put a line in /etc/fstab like this:

curlftpfs#user:password@example.com/ /servername fuse rw,allow_other,noauto,user 0 0

Then all I have to do is type mount /servername and the ftp connection is made automagically. From then on, I can treat it like a (very slow and somewhat limited) filesystem.

For instance, if I want to rsync, I can

rsync -avn --size-only /servername/subdir/ ~/servername/subdir/

for any particular subdirectory I want to check. A few things to know about this:
  1. I have to use --size-only because timestamps aren't reliable. I'm not sure whether this is a problem with the ftp protocol, or whether this particular ISP's server has problems with its dates. I suspect it's a problem inherent in ftp, because if I ls -l, I see things like this:
    -rw-rw---- 1 root root 7651 Feb 23  2015 guide-geo.php
    -rw-rw---- 1 root root 1801 Feb 14 17:16 guide-header.php
    -rw-rw---- 1 root root 8738 Feb 23  2015 guide-table.php
    
    Note that a file modified a week ago shows a modification time, but files modified today show only a day and year, not a time. I'm not sure what to make of this.
  2. Note the -n flag. I don't automatically rsync from the server to my local directory, because if I have any local changes newer than what's on the server they'd be overwritten. So I check the diffs by hand with tkdiff or meld before copying.
  3. It's important to rsync only the specific directories you're working on. You really don't want to see how long it takes to get the full file tree of a web server recursively over ftp.

How do you change and update files? It is possible to edit the files on the curlftpfs filesystem directly. But at least with emacs, it's incredibly slow: emacs likes to check file modification dates whenever you change anything, and that requires an ftp round-trip so it could be ten or twenty seconds before anything you type actually makes it into the file, with even longer delays any time you save.

So instead, I edit my local copy, and when I'm ready to push to the server, I cp filename /servername/path/to/filename.

Of course, I have aliases and shell functions to make all of this easier to type, especially the long pathnames: I can't rely on autocompletion like I usually would, because autocompleting a file or directory name on /servername requires an ftp round-trip to ls the remote directory.
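
For the record, helpers along these lines do the job (the names here are made up for illustration):

# mount the ftp filesystem defined in /etc/fstab
alias mserv='mount /servername'

# dry-run check of one subdirectory against the local copy
servdiff() {
    rsync -avn --size-only /servername/"$1"/ ~/servername/"$1"/
}

# push one locally edited file to the same relative path on the server
servpush() {
    cp ~/servername/"$1" /servername/"$1"
}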

Oh, and version control? I use a local git repository. Just because the other people working on the website don't want version control is no reason I can't have a record of my own changes.

None of this is as satisfactory as a nice git or svn repository and a good ssh connection. But it's a lot better than struggling with ftp clients every time you need to test a file.

February 24, 2015 02:46 AM

February 23, 2015

Elizabeth Krumbach

FOSSC Oman 2015

This past week I had the honor of speaking at FOSSC Oman 2015 in Muscat, following an invitation last fall from Professor Hadj Bourdoucen and the organizing team. Prior to my trip I was able to meet up with 2013 speaker Cat Allman, who gave me invaluable tips about visiting the country, but above all made me really excited to visit the Middle East for the first time and meet the extraordinary people putting on the conference.


Some of the speakers and organizers meet on Tuesday, from left: Wolfgang F. Finke, Matthias Stürmer, Khalil Al Maawali, me and Hadj Bourdoucen

My first observation was that the conference staff really went out of their way to be welcoming to all the speakers, greeting us at the hotel the day before the conference and making sure all our needs were met. My second was that the conference was really well planned and funded. They did a wonderful job finding a diverse speaker list (both topic- and gender-wise) from around the world. I was really happy to learn that the conference was also quite open and free to attend, so there were participants from other nearby companies, universities and colleges. I’ll also note that there were more women at this conference than I’ve ever seen at an open source conference: at least half the audience, perhaps slightly more.

The conference itself began on Wednesday morning with several introductions and welcome speeches from officials of Sultan Qaboos University (SQU), the Information Technology Authority (ITA) and Professor Hadj Bourdoucen who gave the opening FOSSC 2015 speech. These introductions were all in Arabic and we were all given headsets for live translations into English.

The first formal talk of the conference was Patrick Sinz on “FOSS as a motor for entrepreneurship and job creation.” In this talk he really spoke to the heart of why the trend has been leaning toward open source, with companies tired of being beholden to vendors for features, being surprised by changes in contracts, and the general freedom of not needing “permission” to alter the software that’s running your business, or your country. After a break, his talk was followed by one by Jan Wildeboer titled “Open is default.” He covered a lot in his talk, first talking about how 80% of most software stacks can easily be shared between companies without harming any competitive advantage, since everyone needs all the basics of hardware interaction, basic user interaction and more, thus making use of open source for this 80% an obvious choice. He also talked about open standards and how important it is to innovation that they exist. While on the topic of innovation he noted that instead of trying to make copies of proprietary offerings, open source is now leading innovation in many areas of technology, and has been for the past 5 years.

My talk came up right after Jan’s, and with a topic of “Building a Career in FOSS” it dovetailed nicely with what Patrick and Jan had said before me. In a world where companies need developers for features and are paying good money for open source deployments, a lot of jobs are cropping up in the open source space. My talk gave a tour of some of the reasons one may contribute (aside from money, there’s passion for openness, recognition, and the opportunity to work with contributors from around the world), ways to get involved (aside from programming, people are paid for deployments, documentation, support and more) and companies to aim for when looking for a job working on open source (fully open source, open source core, open source division of a larger company). Slides from my talk are available here (pdf).

Directly following my talk, I participated in a panel with Patrick, Jan and Matthias (who I’d met the previous day) where we talked about some more general issues in the open source career space, including how language barriers can impact contributions, how the high profile open source security issues of 2014 have impacted the industry and some of the biggest mistakes developers make regarding software licenses.

The afternoon began with a talk by Hassan Al-Lawati on the “FOSS Initiative in Oman, Facts and Challenges,” where he outlined the work they’ve been doing in their multi-year plan to promote the use and adoption of FOSS inside Oman. Initiatives began with awareness campaigns to familiarize people with the idea of open source software, development of training material and programs in addition to existing certificate programs in the industry, and the deployment of Open Source Labs where classes on and development of open source can be promoted. He talked about some of the further future plans, including more advanced training, and wrapped up by discussing some of the challenges, including continued fears about open source among established technologists and IT managers working with proprietary software, and generally less historical demand for open source solutions.

Flavia Marzano spoke next on “The role and opportunities of FOSS in Public Administrations,” drawing upon her 15 years of experience working in the public sector in Italy to promote open source solutions. Her core points centered on the importance of governments releasing data in open formats and the value of laws that make government organizations consider FOSS solutions, if not compel them. She also stressed that business leaders need to understand the value of using open source software: even if they themselves aren’t the ones who will get to read the source code, it’s important that someone in your organization can.

Afternoon sessions wrapped up with a panel on open source in government, which talked about how cost is often not a motivator, and how much of the work with governments is not a technical issue, but a political one.


FOSS in Government panel: David Hurley, Hassan Al-Lawati, Ali Al Shidhani and Flavia Marzano

The conference wrapped up with lunch around 2:30PM and then we all headed back to our hotels before an evening out, which I’ll talk more about in an upcoming post about my tourist fun in Muscat.

Thursday began a bit earlier than Wednesday, with the bus picking us up at the hotel at 7:45AM and first talks beginning at 8:30AM.

Matthias Stürmer kicked off the day with a talk on “Digital sustainability of open source communities” where he outlined characteristics of healthy open source communities. He first talked about the characteristics that defined digital sustainability, including transparency and lack of legal or policy restrictions. The characteristics of healthy open source communities included:

  • Good governance
  • Heterogeneous community (various motivations, organizations involved)
  • Nonprofit foundation (doing marketing)
  • Ecosystem of commercial service providers
  • Opportunity for users to get things done

It was a really valuable presentation, and his observations were similar to mine when it comes to healthy communities, particularly as they grow. His slides are pretty thorough with main points clearly defined and are up on slideshare here.

After his presentation, several of us speakers were whisked off to have a meeting with the Vice-chancellor of SQU to talk about some of the work that’s been done locally to promote open source education, adoption and training. Can’t say I was particularly useful at this session, lacking experience with formal public sector migration plans, but it was certainly interesting for me to participate in.

I then met up with Khalil for another adventure, over to Middle East College to give a short open source presentation to students in an introductory Linux class. The class met in one of the beautiful Open Source Labs that Hassan had mentioned in his talk; it was a real delight to visit one. It was also fascinating to see that the vast majority of the class was made up of women, with only a handful of men – quite the opposite of what I’m used to! My presentation quickly covered the basics of open source, the work I’ve done both as a paid and volunteer contributor, examples of some types of open source projects (different sizes, structures and volunteer-to-paid ratios) and common motivations for companies and individuals to get involved. The session concluded with a great Q&A, followed by a bunch of pictures and chats with students. Slides from my talk are here (pdf).


Khalil and me at the OSL at MEC

My day wound down back at SQU by attending the paper sessions that concluded the conference and then lunch with my fellow speakers.

Now for some goodies!

There is a YouTube video of each day up, so you can skim through it along with the schedule to find specific talks.

There was also press at the conference, so you can see one release published on Zawya: FOSSC-Oman Kicks Off; Forum Focuses on FOSS Opportunities and Communities and an article by the Oman Tribune: Conference on open source software begins at SQU.

And more of my photos from the conference are here: https://www.flickr.com/photos/pleia2/sets/72157650553205488/

by pleia2 at February 23, 2015 02:15 AM

February 19, 2015

Akkana Peck

Finding core dump files

Someone on the SVLUG list posted about a shell script he'd written to find core dumps.

It sounded like a simple task -- just locate core | grep -w core, right? I mean, any sensible packager avoids naming files or directories "core" for just that reason, don't they?

But not so: turns out in the modern world, insane numbers of software projects include directories called "core", including projects that are developed primarily on Linux so you'd think they would avoid it ... even the kernel. On my system, locate core | grep -w core | wc -l returned 13641 filenames.

Okay, so clearly that isn't working. I had to agree with the SVLUG poster that using "file" to find out which files were actual core dumps is now the only reliable way to do it. The output looks like this:

$ file core
core: ELF 32-bit LSB core file Intel 80386, version 1 (SYSV), too many program headers (375)

The poster was using a shell script, but I was fairly sure it could be done in a single shell pipeline. Let's see: you need to run locate to find any files with "core" in the name.

Then you pipe it through grep to make sure the filename is actually core: since locate gives you a full pathname, like /lib/modules/3.14-2-686-pae/kernel/drivers/edac/edac_core.ko or /lib/modules/3.14-2-686-pae/kernel/drivers/memstick/core, you want lines where only the final component is core -- so core has a slash before it and an end-of-line (in grep that's denoted by a dollar sign, $) after it. So grep '/core$' should do it.

Then take the output of that locate | grep and run file on it, and pipe the output of that file command through grep to find the lines that include the phrase 'core file'.

That gives you lines like

/home/akkana/geology/NorCal/pinnaclesGIS/core: ELF 32-bit LSB core file Intel 80386, version 1 (SYSV), too many program headers (523)

But those lines are long and all you really need are the filenames; so pass it through sed to get rid of anything to the right of "core" followed by a colon.

Here's the final command:

file `locate core | grep '/core$'` | grep 'core file' | sed 's/core:.*//'

On my system that gave me 11 files, and they were all really core dumps. I deleted them all.
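
One caveat: the backquoted version will break on any paths that contain spaces. A loop does the same job a bit more robustly:

locate core | grep '/core$' | while read -r f; do
    file "$f" | grep -q 'core file' && printf '%s\n' "$f"
done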

February 19, 2015 07:54 PM

February 18, 2015

Jono Bacon

Bobbing for Influence

Companies, communities, families, clubs, and other clumps of humans all have some inherent social dynamics. At a simple level there are leaders and followers, but in reality the lines are rarely as clear as that.

Many leaders, with a common example being some founders, have tremendous vision and imagination, but lack the skills to translate that vision into actionable work. Many followers need structure to their efforts, but are dynamic and creative in the execution. Thus, the social dynamic in organizations needs a little more nuance.

This is where traditional management hierarchies break down in companies. You may have your SVPs, then your VPs, then your Senior Directors, then your Directors, and so on, but in reality most successful companies don’t observe those hierarchies stringently. In many organizations a junior-level employee who has been there for a while can have as much influence and value, if not more, than a brand new SVP.

As such, the dream is that we build organizations with crisp reporting lines but in which all employees feel they have the ability to bring their creativity and ideas to logically influence the scope, work, and culture of the organization.

Houston, we have a problem

Sadly, this is where many organizations run into trouble. It seems to be the same ol’ story time after time: as the organization grows, the divide between the senior leadership and the folks on the ground widens. Water cooler conversations and bar-side grumblings fuel the fire, and resentment, frustration, and resume-editing often set in.

So much of this is avoidable though. Of course, there will always be frustration in any organization: this is part and parcel of people working together. Nothing will be perfect, and it shouldn’t be…frustration and conflict can often lead to organizations re-pivoting and taking a new approach. I believe though, that there are a lot of relatively simple things we can do to make organizations feel more engaging.

Influence

A big chunk of the problems many organizations face is around influence. More specifically, the problems set in when employees and contributors feel that they no longer have the ability to have a level of influence or impact in an organization, and thus, their work feels more mechanical, is not appreciated, and there is little validation.

Now, influence here is subtle. It is not always about being involved in the decision-making or being in the cool meetings. Some people won’t, and frankly shouldn’t, be involved in certain decisions: when we have too many cooks in the kitchen, you get a mess. Or Arby’s. Choose your preferred mess.

The influence I am referring to here is the ability to feed into the overall culture and to help shape and craft the organization. If we want to build truly successful organizations, we need to create a culture in which the very best ideas and perspectives bubble to the surface. These ideas may come from SVPs or it may come from the dude who empties out the bins.

The point being, if we can figure out a formula in which people can feel they can feed into the culture and help shape it, you will build a stronger sense of belonging and people will stick around longer. A sense of empowerment like this keeps people around for the long haul. When people feel unengaged or pushed to the side, they will take the next shiny opportunity that bubbles up on LinkedIn.

Some Practical Things To Do

So, we get what the challenge ahead is. How do we beat it? Well, while there are many books written on the subject, I believe there are ten simple approaches we can get started with.

You don’t have to execute them in this order (in fact, these are not in any specific order), and you may place different levels of importance in some of them. I do believe though, they are all important. Let’s take a spin through them.

1. Regularly inform

A lack of information is a killer in an organization. If an organization has problems and is working to resolve them, it is critically important to share the knowledge and assurance that those challenges are being solved.

In the Seven Habits, Covey talks about the importance of working on Important, and not just Urgent, things. In the rush to solve problems we often forget to communicate where changes, improvements, and engagement are happening. No one ever cried about getting too much clarity, but the inverse has resulted in a few gin and tonics in an evening.

There are two key types of updates here: informational and engagement. For the former, this is the communication to the wider organization. It is the memo, or if you are more adventurous, the podcast, video presentation, all-hands meeting or otherwise. These updates are useful, but everyone expects them to be very formal, lack specifics, and speak in generalities.

The latter, engagement updates, are within specific teams or with individuals. These should be more specific and, where appropriate, share some of the back-story. This gives a sense of feeling “in” on the story. Careful use of both approaches can do wondrous things to build a sense of engagement with leadership.

2. Be collaborative around the mission and values

Remember that mission statement you wrote and stuck on a web page or plaque somewhere? Yeah, so do we. Looked at it recently? Probably not.

Mission statements are often broad and ambiguous: written once, and mostly forgotten. They are typically drafted by a select group of people, and everyone on the ground in service of that very mission typically feels rather disconnected from it.

Let’s change that. Dig out the mission statement and engage with your organization to bring it up to date. Have an interactive conversation about what people feel the broader goals and opportunities are, take practical input from people, and merge it into the mission. You will end up with a mission that is more specific, more representative, and that people really feel a part of.

Do the same for your organizational values, code of conduct, and other key documents.

3. Provide opportunities for success

The very best organizations are ones where everyone has the opportunity to bring their creativity into the fold and further our overall mission and goals. The very worst organizations shut their people down because their business card doesn’t have the right thing written on it, or because of a clique of personalities.

We want an environment where everyone has the opportunity to step up to the plate. An example of this was when I hired a translations coordinator for my team at Canonical. He did great work, so I offered him opportunities to challenge himself and his skills. That same guy filled my shoes when I left Canonical a few years later.

Now, let’s be honest. This is tough. It relies on leaders really knowing their teams. It relies on seeing potential, not just ticked-off work items. If you create a culture, though, where you can read potential, tap it, and bring it into new projects, it will create an environment in which everyone feels opportunity is around the corner if they work hard.

4. If you make plans, action them

This is going to sound like a doozy, but it blows me away how much this happens. This is one for the leaders of organizations. Yes, you reading this: this includes you.

If you create a culture in which people can be more engaged, this will invariably result in new plans, ideas, and platforms. When these plans are shared, those people will feel engaged and excited about contributing to the wider team.

If that then goes into a black hole never to be assessed, actioned, or approved, discontentment will set in.

So, if you want to have a culture of engagement, take the time to actually follow up and make sure people can actually do something. Accepting great ideas, agreeing to them, and not following up will merely spark frustration for those who take the initiative to think holistically about the organization.

5. Regularly survey

It never ceases to amaze me how valuable surveys can be. You often have an idea of what you think people’s perspectives are, you decide to survey them, and the results are in many cases enlightening.

Well-structured surveys are an incredibly useful tool. You don’t need to do any crazy data analysis on these things: you often just need to see the general trends and feedback. It is important in these surveys to always have a general open-ended question that can gather all the feedback that didn’t fit neatly into your question matrix.

Of course, there is a whole science around running great surveys, and some great books to read, but my primary point here is to do them, do them often, and learn from and action the results.

One final point: surveys will often freak managers out as they will worry about accountability. Don’t treat these worries with a sledgehammer: help them to understand the value of learning from feedback and to embrace a culture in which we constantly improve. This is not about yelling about mistakes, it is about exploring how we improve.

6. Create a thoughtful management culture

OK, that title might sound a little fluffy, but this is a key recommendation.

I learned from an old manager a style of management that I have applied subsequently and that I feel works well.

The idea is simple: when someone joins my team, I tell them that I want to help them in two key ways. Firstly, I want them to be successful in their role, to have all the support they need, to get the answers they need, and to be able to do a great job and enjoy doing it. Most managers focus their efforts here.

What is important is the second area of focus as a manager. I tell my team members that I want to help them be the very best they can be in their career; to support, mentor, and motivate them to not just do a great job here at the organization, but to feel that their time working here was a wider investment in their career.

I believe both of these pledges from a manager are critical. Think about the best managers and teachers you have had: they paid attention to your immediate as well as long-term success.

If you are on the executive team of a company, you should demand that your managers make both of these pledges to their teams. This should be real and authentic, not just words.

7. Surprise your staff

This is another one for leaders in an organization.

We are all people, and in business we often forget that. We all have hobbies, interests, ideas, jokes, stories, and experiences to share. When we infuse our organizations with this humanity they feel more real and more engaging.

In any melting pot of an organization, some people will freely share their human side…their past experiences, stories, families, hobbies, favorite movies and bands…but in many cases, the more senior up the chain you go, the more these human elements become isolated and shared only with people of a similar rank in the organization. This creates leadership cliques.

In many cases, seeing leaders surprise their staff by being relaxed, open, and engaging can send remarkably positive messages. It shows the human side of someone who may be primarily experienced by staff as merely giving directives and reviewing performance. Remember, folks, we are all animals.

8. Set expectations

Setting expectations is a key part of many successful projects. Invariably though, we often think only about the expectations of the consumers of our work: stakeholders, customers, partners, etc.

It is equally important to set expectations with our teams that we welcome input, ideas, and perspectives for how the team and the wider organization works.

I like to make this bluntly clear to anyone I work with: I want all feedback, even if that feedback is deeply critical of me or the work I am doing. I would rather have an uncomfortable conversation and be able to tend to those concerns than never hear them in the first place and keep screwing up.

Thus, even if you think it is well understood that feedback and engagement are welcome, make it bluntly clear, from the top level and throughout the ranks, that this is not only welcome, but critical for success.

9. Focus on creativity and collaboration

I hated writing that title. It sounds so buzzwordy, but it is an important point. The most successful organizations are ones that feel creative and collaborative, and where people have the ability to explore new ideas.

Covey talks about the importance of synergy and that working with others not only brings the best out of us, but helps us to challenge broken or misaligned assumptions. As such, getting people together to creatively solve problems is not just important for the mission, but also for the wellbeing of the people involved.

As discussed earlier though, we want to infuse specific teams with this, but also create a general culture of collaboration. To do this on a wider level you could have organization-wide discussions, online/offline planning events, incentive competitions and more.

10. Should I stay or should I go?

This is going to be a tough pill to swallow for some founders and leaders, but sometimes you just need to get out of the way and let your people do their jobs.

Organizations that are too directed and constrained by leadership, either senior or middle-management, feel restrictive and limiting. Invariably this will quash the creativity and enthusiasm in some staff.

We want to strike a balance where teams are provided the parameters of what success looks like, and then leadership trusts them to succeed within those parameters. Regular gate reviews make perfect sense, but daily wittering over specifics does not.

This means that for some leaders, you just need to get out of the way. I learned this bluntly when a member of my team at Canonical told me over a few beers one night that I needed to stop meddling and leave the team alone to get on with a project. They were right: I was worried about my team’s delivery and projecting that down by micro-managing them. I gave them the air they needed, and they succeeded.

On the flip side, we also need to ensure leadership is there for support and guidance when needed. Regular check-ins, 1-on-1s, and water-cooler time are a great way to do this in a more comfortable manner.

I hope this was useful and if nothing else, provided some ideas for further thinking about how we build organizations where we can tap into the rich chemistry of ideas, creativity, and experience in our wider teams. As usual, feedback is always welcome. Thanks for reading!

by jono at February 18, 2015 05:45 PM

February 17, 2015

Jono Bacon

Video Phone Review and Wider Thoughts

I recorded and posted a video with a detailed review of the bq Aquaris E4.5 Ubuntu phone, complete with wider commentary on the scopes and convergence strategy and the likelihood of success.

See it below:

Can’t see it? See it here.

by jono at February 17, 2015 05:26 PM

February 14, 2015

Akkana Peck

The Sangre de Cristos wish you a Happy Valentine's Day

[Snow hearts on the Sangre de Cristo mountains]

The snow is melting fast in the lovely sunny weather we've been having; but there's still enough snow on the Sangre de Cristos to see the dual snow hearts on the slopes of Thompson Peak above Santa Fe, wishing everyone for miles around a happy Valentine's Day.

Dave and I are celebrating for a different reason: yesterday was our 1-year anniversary of moving to New Mexico. No regrets yet! Even after a tough dirty work session clearing dead sage from the yard.

So Happy Valentine's Day, everyone! Even if you don't put much stock in commercial Hallmark holidays. As I heard someone say yesterday, "Valentine's day is coming up, and you know what that means. That's right: absolutely nothing!"

But never mind what you may think about the holiday -- you just go ahead and have a happy day anyway, y'hear? Look at whatever pretty scenery you have near you; and be sure to enjoy some good chocolate.

February 14, 2015 10:01 PM

February 11, 2015

Elizabeth Krumbach

Wrap up of the San Francisco Ubuntu Global Jam at Gandi

This past Sunday I hosted an Ubuntu Global Jam at the Gandi office here in downtown San Francisco. Given how close this event was to a lot of travel, I had to juggle a lot to make it happen. A fair amount of work goes into an event like this, from the logistics of securing a venue, food and drinks, and giveaways, to the actual prep for the event and actually telling people about it. In this case we were working on Quality Assurance for Xubuntu (and a little Lubuntu on a PPC Mac).

It’s totally worth it though, so I present to you the full list of prep, should you wish to do a QA event in your region:

  • Secure venue: Completed in December (thanks AJ at Gandi!).
  • Secure refreshments funding: Completed in January via the Ubuntu donations funding.
  • Create LoCo Team Portal event and start sharing it everywhere (social media, friendly mailing lists for locals who may be interested). Do this for weeks!
  • Prepare goodies. I had leftover pens and stickers from a previous event. I then met up with Mark Sobell earlier in the week to have him sign copies of A Practical Guide to Ubuntu Linux, 4th Edition we received from the publisher (thank you Mark and Prentice Hall!).
  • Collect and stage all the stuff you’re bringing.
  • Print out test cases, since it can be tricky for attendees to juggle reading the test case while also navigating the actual test on their laptop.
  • Also print out signs for the doors at the venue.
  • Tour venue and have final chat with your host about what you need (plates, cups and utensils? power? wifi? projector?).
  • Send out last minute email to attendees as a reminder and in case of any last minute info.
  • Make sure dietary requirements of attendees are met. I did go with pizza for this event, but I made sure to go with a pizzeria that offered gluten free options and I prepared a gluten free salad (which people ate!).
  • Download and burn/copy the daily ISOs as soon as they come out on the day of the event, and put them on USB sticks or discs as needed: Xubuntu went on USB sticks, Lubuntu for PPC went on a CD-R (alternate) and DVD-R (desktop, currently oversized). See the sketch after this list for writing an ISO to a USB stick.
  • Bring along any extra laptops you have so folks who don’t bring one, or have trouble doing testing on theirs, can participate.
  • Make penguin-shaped cookies (this one may be optional).
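
For the ISO step in the list above, here’s a minimal sketch of writing a daily image to a USB stick on Linux. The image filename is hypothetical and /dev/sdX is a placeholder for your actual stick; double-check the device name first, since dd will happily overwrite the wrong disk:

lsblk                                                  # identify which device is the USB stick
sudo dd if=xubuntu-daily-amd64.iso of=/dev/sdX bs=4M   # destructive: writes the ISO over the whole device
sync                                                   # flush writes before pulling the stick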

With all of this completed, I think the event went pretty smoothly. My Ubuntu California teammates James Ouyang and Christian Einfeldt met me at my condo nearby to help me carry everything over. AJ met us upon arrival and we were able to get set up quickly.

I had planned on doing a short presentation to give folks a tour of the ISO Tracker, but the flow of attendees made it such that I could get the experienced attendees off and running pretty quickly (some had used the tracker before), and by the time they were starting we had some newcomers joining us who I was able to guide one-on-one.

I did a lot of running around, but attendees were able to help each other out too, and it was a huge help to bring along some extra laptops. I was also surprised to see that another PPC Mac showed up at the event! I thought the one I brought would be the only one used for Lubuntu. Later in the event we were joined by some folks who came over after the nearby BerkeleyLUG meeting wrapped up at 3PM, which caused us to push the event a full hour later than expected (thanks to AJ for putting up with us for another hour!).

Prior to the event, I had worried some about attendance, but throughout the event we had about 12 people total come and go, which was the perfect number for me and a couple of other Ubuntu Members to manage so that attendees didn’t feel ignored as they worked through their tests. Post event, I’ve been able to provide some feedback to the Ubuntu Quality team about some snafus we encountered while doing testing. Hopefully these can be fixed next time around so other teams don’t run into the same issues we did.

Aside from some of the hiccups with the trackers, I received really positive feedback from attendees. Looking forward to doing this again in the future!

More photos from the event available here: https://www.flickr.com/photos/pleia2/sets/72157650663176996/

by pleia2 at February 11, 2015 04:26 AM

February 10, 2015

Akkana Peck

Making flashblock work again; and why HTML5 video doesn't work in Firefox

Back in December, I wrote about Problems with Firefox 35's new deprecation of flash, and a partial solution for Debian. That worked to install a newer version of the flash plug-in on my Debian Linux machine; but it didn't fix the problem that the flashblock program no longer works properly on Firefox 35, so that clicking on the flashblock button does nothing at all.

A friend suggested that I try Firefox's built-in flash blocking. Go to Tools->Add-ons and click on Plug-ins if that isn't the default tab. Under Shockwave flash, choose Ask to Activate.

Unfortunately, the result of that is a link to click, which pops up a dialog that requires clicking a button to dismiss it -- a pointless and annoying extra step. And there's no way to enable flash for just the current page; once you've enabled it for a domain (like youtube), any flash from that domain will auto-play for the remainder of the Firefox session. Not what I wanted.

So I looked into whether there was a way to re-enable flashblock. It turns out I'm not the only one to have noticed the problem with it: the FlashBlock reviews page is full of recent entries from people saying it no longer works. Alas, flashblock seems to be orphaned; there's no comment about any of this on the main flashblock page, and the links on that page for discussions or bug reports go to a nonexistent mailing list.

But fortunately there's a comment partway down the reviews page from user "c627627" giving a fix.

Edit your chrome/userContent.css in your Firefox profile. If you're not sure where your profile lives, Mozilla has a poorly written page on it here, Profiles - Where Firefox stores your bookmarks, passwords and other user data, or do a systemwide search for "prefs.js" or "search.json" or "cookies.sqlite" and it will probably lead you to your profile.
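
If you’d rather hunt for it from a terminal, here’s a quick sketch, assuming a typical Linux setup where profiles live under ~/.mozilla/firefox:

find ~/.mozilla/firefox -maxdepth 2 -name prefs.js   # the directory containing prefs.js is your profile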

Inside yourprofile/chrome/userContent.css (create it if it doesn't already exist), add these lines:

@namespace url(http://www.w3.org/1999/xhtml);
@-moz-document domain("youtube.com"){
#theater-background { display:none !important;}}

Now restart Firefox, and flashblock should work again, at least on YouTube. Hurray!

Wait, flash? What about HTML5 on YouTube?

Yes, I read that too. All the tech press sites were reporting the week before last that YouTube was now streaming HTML5 by default.

Alas, not with Firefox. It works with most other browsers, but Firefox's HTML5 video support is too broken. And I guess it's a measure of Firefox's increasing irrelevance that almost none of the reportage two weeks ago even bothered to try it on Firefox before reporting that it worked everywhere.

It turns out that using HTML5 video on YouTube depends on something called Media Source Extensions (MSE). You can check your MSE support by going to YouTube's HTML5 info page. In Firefox 35, it's off by default.

You can enable MSE in Firefox by flipping the media.mediasource preference, but that's not enough; YouTube also wants "MSE & H.264". Apparently if you care enough, you can set a new preference to enable MSE & H.264 support on YouTube even though it's not supported by Firefox and is considered too buggy to enable.
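
If you want to experiment anyway, one way to flip a preference outside of about:config is to append it to user.js in your profile, which Firefox reads at startup. A sketch, assuming the full pref key is media.mediasource.enabled (verify the exact name in about:config for your version; "yourprofile" is a placeholder for your profile directory):

echo 'user_pref("media.mediasource.enabled", true);' >> ~/.mozilla/firefox/yourprofile/user.js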

If you search the web, you'll find lots of people talking about how HTML5 with MSE was enabled by default for Firefox 32 on YouTube. But here we are at Firefox 35 and it requires jumping through hoops. What gives?

Well, it looks like they enabled it briefly, discovered it was too buggy and turned it back off again. I found bug 1129039: Disable MSE for Firefox 36, which seems an odd title considering that it's off in Firefox 35, but there you go.

Here is the dependency tree for the MSE tracking bug, 778617. Its dependency graph is even scarier. After taking a look at that, I switched my media.mediasource preference back off again. With a dependency tree like that, and nothing anywhere summarizing the current state of affairs ... I think I can live with flash. Especially now that I know how to get flashblock working.

February 10, 2015 12:08 AM

February 04, 2015

Akkana Peck

Studying Glaciers on our Roof

[Roof glacier as it slides off the roof] A few days ago, I wrote about the snowpack we get on the roof during snowstorms:

It doesn't just sit there until it gets warm enough to melt and run off as water. Instead, the whole mass of snow moves together, gradually, down the metal roof, like a glacier.

When it gets to the edge, it still doesn't fall; it somehow stays intact, curling over and inward, until the mass is too great and it loses cohesion and a clump falls with a Clunk!

The day after I posted that, I had a chance to see what happens as the snow sheet slides off a roof if it doesn't have a long distance to fall. It folds gracefully and gradually, like a sheet.

[Underside of a roof glacier] [Underside of a roof glacier] The underside as they slide off the roof is pretty interesting, too, with varied shapes and patterns in addition to the imprinted pattern of the roof.

But does it really move like a glacier? I decided to set up a camera and film it on the move. I set the Rebel on a tripod with an AC power adaptor, pointed it out the window at a section of roof with a good snow load, plugged in the intervalometer I bought last summer, located the manual to re-learn how to program it, and set it for a 30-second interval. I ran that way for a bit over an hour -- long enough that one section of ice had detached and fallen and a new section was starting to slide down. Then I moved to another window and shot a series of the same section of snow from underneath, with a 40-second interval.

I uploaded the photos to my workstation and verified that they'd captured what I wanted. But when I stitched them into a movie, the way I'd used for my time-lapse clouds last summer, it went way too fast -- the movie was over in just a few seconds and you couldn't see what it was doing. Evidently a 30-second interval is far too slow for the motion of a roof glacier on a day in the mid-thirties.

But surely that's solvable in software? There must be a way to get avconv to make duplicates of each frame, if I don't mind that the movie comes out slightly jumpy. I read through the avconv manual, but it wasn't very clear about this. After a lot of fiddling and googling and help from a more expert friend, I ended up with this:

avconv -r 3 -start_number 8252 -i 'img_%04d.jpg' -vcodec libx264 -r 30 timelapse.mp4

In avconv, -r specifies a frame rate for the next file, input or output, that will be specified. So -r 3 specifies the frame rate for the set of input images, -i 'img_%04d.jpg'; and then the later -r 30 overrides that 3 and sets a new frame rate for the output file, timelapse.mp4. The start number is because the first file in my sequence is named img_8252.jpg. 30, I'm told, is a reasonable frame rate for movies intended to be watched on typical 60FPS monitors; 3 is a number I adjusted until the glacier in the movie moved at what seemed like a good speed.
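
To make those two rates easier to experiment with, the same invocation can be wrapped in a tiny shell sketch; the variable names are mine, not part of avconv:

IN_RATE=3      # playback rate of the source frames; lower = slower glacier
OUT_RATE=30    # frame rate of the finished movie, for smooth playback
START=8252     # number of the first image in the sequence
avconv -r "$IN_RATE" -start_number "$START" -i 'img_%04d.jpg' \
    -vcodec libx264 -r "$OUT_RATE" timelapse.mp4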

The movies came out quite interesting! The main movie, from the top, is the most interesting; the one from the underside is shorter.

Roof Glacier
Roof Glacier from underneath.

I wish I had a time-lapse of that folded sheet I showed above ... but that happened overnight on the night after I made the movies. By the next morning there wasn't enough left to be worth setting up another time-lapse. But maybe one of these years I'll have a chance to catch a sheet-folding roof glacier.

February 04, 2015 02:46 AM

Elizabeth Krumbach

Afternoon in Brussels

My trip to Brussels for FOSDEM was a short one, I have a lot of work to do at home so it was impossible for me to make the case for staying more than three days. But since I got in early Friday morning, I did have Friday afternoon to do a bit of exploring.

First stop: get some mussels and frites!

For the rest of the afternoon I had planned on taking one of the tourist buses around town, but by the time I was ready to go it was 2PM and the last loop started at 2:30 that day, not giving me enough time to snag the last bus, and even if I had, where’s the fun in never getting off it? So I made my way toward Grand Place, where there were loads of shops, drinks and museums.

I decided to spend my afternoon at the Museum of the City of Brussels, which is dedicated to the history of the city and housed at Grand Place in the former King’s Mansion (Maison du Roi).

I’m glad I went, the museum had some beautiful pieces and I enjoyed learning about some of the history of the city. They were also running a special exhibit about the German occupation around World War I, which offered some interesting and sad insight into how the Belgians handled the occupation and the suffering endured by citizens during that time. Finally, I thoroughly enjoyed the browse through the amusing array of costumes made for the famous Manneken Pis.

The museum closed at 5PM and I made my way to visit the actual Manneken Pis fountain, located a few blocks south of the Grand Place. It was starting to get quite chilly out and I was glad I had packed mittens. I snapped my photo of the fountain and then meandered my way back north until I found a little cafe where I got myself a nice cup of hot chocolate and warmed up while I waited for the Software Freedom Conservancy dinner at Drug Opera.

I also spent time scouring shop fronts for a Delirium Tremens stuffed toy elephant (as seen here). I saw one through a shop window the last time I was in Brussels in 2010, but it was late at night and the shop was closed. Alas, I never did find the elephant… until after dinner when I was walking back to my hotel once again late at night and the shop was closed! Argh! May we meet again some day, dear pink elephant.

In general the short length of the trip meant that I also didn’t get to enjoy many Belgian beers on my trip, quite the tragedy, but I did have to be alert for the actual conference I came to speak at and attend.

More photos from my tourist adventure here: https://www.flickr.com/photos/pleia2/sets/72157650562831526/

by pleia2 at February 04, 2015 02:25 AM

February 02, 2015

Jono Bacon

Bad Voltage: Live @ SCALE13x

As regular readers of my blog will know, I rather like the SoCal Linux Expo, more commonly known as SCALE. I have been going for over eight years and every year it delivers an incredible balance of content and community spirit. I absolutely love going every year.

Other readers may also know that I do a podcast with three other idiots every two weeks called Bad Voltage. The show is a soup of Linux, Open Source, technology, digital rights, politics, and more, all mixed together with reviews, interviews, and plenty more. I am really proud of the show: I think it is fun but also informative, and has developed an awesome community around it.

Given my love of SCALE and Bad Voltage, I am therefore tickled pink that we are going to be taping Bad Voltage: Live at SCALE. This will be our very first show in front of a live audience, and in fact, the first time the whole team has been in the same building.

The show takes place on the evening of Fri 20th Feb 2015 in the main La Jolla room.

The show will be packed with discussions, contests, give-aways, challenges, and more. It will be a very audience participatory show and we will be filming it as well as recording the podcast, ready for release post-SCALE.

So, be sure to get along and join the show on the evening of Fri 20th Feb 2015, currently slated to start at 9pm, but the time may adjust, so keep your eye on the schedule!

by jono at February 02, 2015 06:55 AM

Elizabeth Krumbach

FOSDEM 2015

This weekend I spent in Brussels for my first FOSDEM. As someone who has been actively involved with open source since 2003, stories of FOSDEM have floated around in communities I’ve participated in for a long time, so I was happy to finally have the opportunity to attend and present.

Events kicked off Friday night with a gathering at a dinner with the Software Freedom Conservancy. It was great to start things off with such a friendly crowd, most of whom I’ve known for years. I sat with several of my OpenStack colleagues as we enjoyed dinner and conversation about StoryBoard and bringing the OpenStack activity board formally into our infrastructure with Puppet modules. It was a fun and productive dinner, I really appreciated that so many at this event took the initiative to gather in team tables so we could have our own little mini-meetups during the SFC event. After dinner I followed some colleagues over to Delirium Cafe for the broader pre-FOSDEM beer event, but the crowd was pretty overwhelming and I was tired, so I ended up just heading back to my hotel to get some rest.

On Saturday I met up with my colleague Devananda van der Veen and we headed over to the conference venue. The conference began with a couple keynotes. Karen Sandler was the first, giving her talk on Identity Crisis: Are we who we say we are? where she addressed the different “hats” we wear as volunteers, paid contributors, board members, and more in open source projects. She stressed how important it is that we’re clear about who and what we’re representing when we contribute to discussions and take actions in our communities. I was excited to see that she also took the opportunity to announce Outreachy, the successor to the Outreach Program for Women, which not only continues the work of bringing women into open source beyond GNOME, but also “from groups underrepresented in free and open source software.” This was pretty exciting news, congratulations to everyone involved!

The next keynote was by Antti Kantee who spoke on What is wrong with Operating Systems (and how do we make things better). Antti works on the NetBSD Rump Kernels and is a passionate advocate for requiring as little as possible from an underlying Operating System in today’s world. He argues that a complicated OS only serves to introduce instability and unnecessary complexity into most of the ways we do computing these days, with its aggressive support of multi-user environments on devices that are single user, and more. He demonstrated how you can strip away massive amounts of the kernel and still have a viable, basic user environment with a TCP/IP stack that applications can then interface with.

The next talk I went to was Upstream Downstream: The relationship between developer and package maintainer by Norvald H. Ryeng of the MySQL project. Over the years I’ve been a contributor on both sides of this, but it’s been a few years since I was directly involved in the developer-packager relationship so it was great to hear about the current best practices of communities working in this space. He walked through what a release of MySQL looks like, including all the types of artifacts created and distribution mechanisms utilized (source, packages, FTP, developer site direct downloads) and how they work with distribution package maintainers. He had a lot of great tips for both upstream developers and downstream packagers about how to have an effective collaboration, much of it centering around communication. Using MySQL as an example, he went through several things they’ve done, including:

  • Being part of Ubuntu’s Micro Release Exception program so packagers don’t cherry-pick security vulnerabilities, instead they can take the full micro-release from the trusted, well-tested upstream.
  • Participating in downstream bug trackers, sometimes even bumping the priority of packaged software bugs because they know a huge number of users are using the distro packages.
  • Running their own package repos, which gives users more options version-wise but has also taught their upstream team about some of the challenges in packaging so they can be more effective collaborators with the in-distro packagers and even catch pain points and issues earlier. Plus, then packaging is integrated into their QA processes!

He also talked some about how cross-distro collaboration doesn’t really happen on the distro level, so it’s important for upstream to stay on top of that so they can track things like whether the installation is interactive (setting passwords, other config options during install), whether the application is started upon install and more. Their goal being to make the experience of using their application as consistent as possible across platforms, both by similar configuration and reduction of local patches carried by distributions.

At lunch I met up with Louise Corrigan of Apress, who I met last year at the Texas Linux Fest. We also grabbed some much needed coffee, as my jet lag was already starting to show. From there I headed over to the OpenStack booth for my 2PM shift, where I met Adrien Cunin (who I also knew from the Ubuntu community) and later Marton Kiss, who I work with on the OpenStack Infrastructure team. It was one of my more fun booth experiences, with lots of folks I knew dropping by, like Jim Campbell, who I’d worked with on Documentation in Ubuntu in the past, and a couple of the people I met at DORS/CLUC in Croatia last year. I also got to meet Charles Butler of Canonical, whose Juju talk I attended later in the afternoon.

At 5PM things got exciting for my team, with Spencer Krum presenting Consuming Open Source Configuration: Infrastructure and configuration is now code, and some of it is open source. What is it like to be downstream of one of these projects? In addition to working with us upstream in the OpenStack Infrastructure team, Spencer works on a team within HP that is consuming our infrastructure for projects that need a Continuous Integration workflow. The OpenStack Infrastructure team has always been first about providing for the needs of the OpenStack community, and with Spencer’s help as an active downstream contributor we’ve slowly shifted our infrastructure to be more consumable by his team and others. In this talk he covered the value of consuming our architecture, including not having to do all the work, and benefiting from a viable architecture that’s been used in production for several years. He noted that any divergence from upstream incurs technical debt for the downstream team, so he’s worked upstream to help decouple components and avoid assumptions about things like users and networks, reducing the need for these patches downstream. The biggest takeaway from this was how much Spencer has been involved with the OpenStack Infrastructure team. His incremental work over time to make our infrastructure more consumable, coupled with his desire to also further the goals of our team (I can always depend upon him for a review of one of my Puppet changes), makes his work as a downstream much easier. Slides from his presentation are online (html-based) here.

My day of talks wrapped up with one of my own! In The open source OpenStack project infrastructure: Fully public Puppet I gave a talk complementary to Spencer’s, where I spoke from the upstream side about the lessons we’ve learned over the past year crafting an effective upstream infrastructure project using Puppet, to make our infrastructure more consumable by downstreams like the team at HP. I outlined the reasons we had for going with a fully open source Puppet configuration (rather than just releasing modules) and why you might want to (others can contribute! sharing is nice!). Then I outlined the work we did in a couple of specs we’ve finished to break some of our components out of the previously monolithic configuration. I think the talk went well, and it was great to talk with some folks afterwards about their own infrastructure challenges and how our thorough specifications about splitting modules may help them too. Slides from the talk are available as a PDF here.

I spent the evening with some of my colleagues at HP who are working on OpenStack Designate. I had intended to call it a somewhat early night, but dinner didn’t manage to wrap up until 11PM, cutting severely into beer time!

Sunday morning I headed over to the conference venue at 9AM, noticing that it had snowed overnight. I spent the morning at the OpenStack booth, my booth volunteer slot sadly overlapping with Thierry Carrez’s talk on our OpenStack infrastructure tools. Wrapping up booth duty, I met up with a friend and made our way through the campus as the snow came down to check out another building with project booths.

I then made my way over to the Testing and automation dev room to see Aleksandra Fedorova speak on CI as an Infrastructure. The talk diverged from the typical “process” talks about Continuous Integration (CI), which often talk pretty abstractly about theory and workflows. She instead talked about the technical infrastructure that is actually required for running such a system, and how it ends up being much more complicated in practice. Beyond the general workflow, you need artifact management (logs and other things that result from builds), service communication coordination (CIs are chatty! particularly when there are failures) and then hooks into all the pieces of your infrastructure, from the bug tool to your revision control system and perhaps a code review system. Even when running a very simple test like flake8 you need a place to run it, proper isolation set up, and a pinning process for flake8 versions (you need to test new versions when they come out – else they could break your whole process!), and preferably you do all of this using QA and language-specific tools created for the purpose. Perhaps my favorite part of her talk was the stress she placed upon putting infrastructure configuration into revision control. I’ve been a fan of doing this for quite some time, particularly in our world of configuration management where it’s now easy to do, but perhaps her most compelling point was keeping track of your Jenkins jobs over time. By putting your Jenkins configurations into revision control, you have a proper history of how you ran your tests months ago, which can be a valuable resource as your project matures.
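
As a concrete illustration of that last point, one way to keep Jenkins jobs in revision control is Jenkins Job Builder, the tool the OpenStack Infrastructure team uses to define jobs as YAML files in git. A rough sketch of the workflow, with a hypothetical repository and job file:

git clone https://example.org/ci-config.git && cd ci-config   # hypothetical repo of job definitions
$EDITOR jobs/unit-tests.yaml               # edit the YAML job definitions
jenkins-jobs test jobs/ -o /tmp/rendered   # render the Jenkins XML locally, without touching the server
git commit -am "Tweak unit test job"       # git history now records how the jobs evolved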

I attended one more talk, but spent much of the rest of the event meeting up with open source friends who I hadn’t seen in a while. Astonishingly, even though I got to catch up with a number of people, the conference was so big and spread out around the campus that there were people who I knew were there but I never managed to see! One of my colleagues at HP I never saw until after the conference when a group met up for dinner on Sunday night.

The closing keynote was by Ryan MacDonald who spoke on Living on Mars: A Beginner’s Guide: Can we Open Source a society? He spoke about the Mars One program which seemed well on its way. I’m looking forward to the video being published, as I know more than a few people who’d be interested in seeing it from the perspective he presented.

Finally, the wrap-up. Looking back to the introduction to the conference, one of the organizers told the audience that unlike other conferences recently, they didn’t feel the need to adopt a Code of Conduct. They cited that we’re “all adults here” and pretty much know how to act toward each other. I was pretty disappointed by this, particularly at a conference that served alcohol throughout the day and had a pretty bad gender ratio (it’s one of the worst I’ve ever seen). Apparently I wasn’t the only one. Prior to the keynote, a tweet from FOSDEM said “message received” regarding the importance of a Code of Conduct. I’m really proud of them for acknowledging the importance and promising to improve, it makes me feel much better about coming back in the future.

Huge thanks to all the volunteers who make this conference happen every year, I hope I can make it back next year! A few more photos from the event here: https://www.flickr.com/photos/pleia2/sets/72157650191787498/

by pleia2 at February 02, 2015 05:30 AM

January 31, 2015

Akkana Peck

Snow day!

We're having a series of snow days here. On Friday, they closed the lab and all the schools; the ski hill people are rejoicing at getting some real snow at last.

[Snow-fog coming up from the Rio Grande] It's so beautiful out there. Dave and I had been worried about this business of living in snow, being wimpy Californians. But how cool (literally!) is it to wake up, look out your window and see a wintry landscape with snow-fog curling up from the Rio Grande in White Rock Canyon?

The first time we saw it, we wondered how fog can exist when the temperature is below freezing. (Though just barely below -- as I write this the nearest LANL weather station is reporting 30.9°F. But we've seen this in temperatures as low as 12°F.) I tweeted the question, and Mike Alexander found a reference that explains that freezing fog consists of supercooled droplets -- they haven't encountered a surface to freeze upon yet. Another phenomenon, ice fog, consists of floating ice crystals and only occurs below 14°F.

['Glacier' moving down the roof] It's also fun to watch the snow off the roof.

It doesn't just sit there until it gets warm enough to melt and run off as water. Instead, the whole mass of snow moves together, gradually, down the metal roof, like a glacier.

When it gets to the edge, it still doesn't fall; it somehow stays intact, curling over and inward, until the mass is too great and it loses cohesion and a clump falls with a Clunk!

[Mysterious tracks in the snow] When we do go outside, the snow has wonderful collections of tracks to try to identify. This might be a coyote who trotted past our house on the way over to the neighbors.

We see lots of rabbit tracks and a fair amount of raccoon, coyote and deer, but some are hard to identify: a tiny carnivore-type pad that might be a weasel; some straight lines that might be some kind of bird; a tail-dragging swish that could be anything. It's all new to us, and it'll be great fun learning about all these tracks as we live here longer.

January 31, 2015 05:17 PM

January 27, 2015

Jono Bacon

Designers Needed to Help Build Software to Teach Kids Literacy

Designers! Imagine you could design a piece of Open Source tablet software that teaches a child how to read, write, and perform arithmetic, without the aid of a teacher. This is not designed to replace teachers, but to bring education where little or none exists.

Just think of the impact. UNESCO tells us that 54 million children have zero access to education. 250 million kids have rudimentary access to education but don’t have any literacy skills. If we can build software that teaches kids literacy, think of the opportunities this opens up in their lives, and the ability to help bring nations out of poverty.

The Global Learning XPRIZE is working to solve that problem and help build this technology.

A Foundation of Awesome Design

This is where designers come in.

We want to encourage designers to use their talent and imagination to explore and share ideas of how this software could look and work. Designers create and craft unique and innovative experiences, and these ideas can form the basis of great discussions with other members of the community.

We are asking designers to explore and create those experiences and then share those wireframes/mockups in the XPRIZE community. This will inspire discussion and ideas for how we create this important software. This is such an important way in which you can participate.

Find out more about how to participate by clicking right here and please share this call for designers widely – the more designs we can see, the more designers involved, the more ideas we can explore. Every one of you can play such a key role in building this technology. Thanks!

by jono at January 27, 2015 06:00 PM

kdub

SVG Hardware Drawer Labels

I recently made a set of SVG labels in Inkscape for my hardware small parts bin, the common Akro-Mills 10164 small parts organizer. It’s sized to print the labels at the correct size on an 11″x8.5″ sheet of paper (results may vary, so make sure to resize for whatever drawer and printer you have).

[The labels in action]

I thought I’d share them here in SVG format, which should make it pretty easy for you to download and customize (e.g., you could change the resistor color codes to match your set of resistors, change the values, etc.). If you do sink a lot of effort into adapting the file, please share back (open source!) via the comments, and I’ll update the file so others can use it.

[Drawer labels]

SVG file (copyright (c) 2015 Kevin DuBois, Licensed under CC BY-NC-SA)

by Kevin at January 27, 2015 03:28 AM

January 26, 2015

Jono Bacon

Global Learning XPRIZE: Call For Teams!

As many of my regular readers will know, I joined the XPRIZE Foundation last year. At XPRIZE we run large competitions that incentivize the solution of some of the grandest challenges that face humanity.

My role at XPRIZE is to create a global community that can practically change the world via XPRIZE, inside and outside of these competitions. You will be reading more about this in the coming months.

Back in September we launched the Global Learning XPRIZE. This is a $15 million competition that has the ability to impact over 250 million kids. From the website:

The Global Learning XPRIZE challenges teams from around the world to develop open source and scalable software that will enable children in developing countries to teach themselves basic reading, writing and arithmetic within the 18 month period of competition field testing. Our goal is an empowered generation that will positively impact their communities, countries and the world.

Many of my readers here are Open Source folks, and this prize is an enormous Open Source opportunity. Here we can not only change the world, but we can create Open Source technology that is at the core of this revolution in education.

Not only that, but a key goal we have with the competition is to encourage teams and other contributors to collaborate around common areas of interest. Think about collaboration around storytelling platforms, power management, design, voice recognition, and more. We will be encouraging this collaboration openly on our forum and in GitHub.

You will be hearing more and more about this in the coming months, but be sure to join our forum to keep up to date.

Call For Teams

Since we launched the prize, we have seen an awesome number of teams registering to participate. Our view, though, is that the more teams the better…it creates a stronger environment of collaboration and competition. We want more!

Can’t see the video? See it here!

As such, I want to encourage you all to consider joining up as a team. We recommend people form diverse teams of developers, designers, artists, scientists, and more to feed into and explore how we build software that can automate the teaching of literacy. Just think about the impact that this software could have on the world, and also how interesting a technical and interaction challenge this is.

To find out more, and to sign up, head over to learning.xprize.org and be sure to join our community forum to be a part of our community as it grows!

by jono at January 26, 2015 05:46 PM

January 24, 2015

Elizabeth Krumbach

Remembering Eric P. Scott (eps)

Last night I learned the worst kind of news: my friend and valuable member of the Linux community here in San Francisco, Eric P. Scott (eps), recently passed away.

In an excerpt from a post by Chaz Boston Baden, he cites the news from Ron Hipschman:

I hate to be the bearer of bad news, but It is my sad duty to inform you that Eric passed away sometime in the last week or so. After a period of not hearing from Eric by phone or by email, Karil Daniels (another friend) and I became concerned that something might be more serious than a lost phone or a trip to a convention, so I called his property manager and we met at Eric’s place Friday night. Unfortunately, the worst possible reason for his lack of communication was what we found. According to the medical examiner, he apparently died in his sleep peacefully (he was in bed). Eric had been battling a heart condition. We may learn more next week when they do an examination.

He was a good friend, the kind who was hugely supportive of any local events I had concocted for the Ubuntu California community, but he was also the kind of man who would spontaneously give me thoughtful gifts. Sometimes they were related to an idea he had for promoting Ubuntu, like a new kind of candy we could use for our candy dishes at the Southern California Linux Expo, a toy penguin we could use at booths, or a foldable origami-like street car he thought could serve as inspiration for a similar giveaway to promote the latest animal associated with an Ubuntu LTS release.

He also went beyond having ideas: several times we spent time together scouring local shops for giveaway booth candy, and once met at Costco to buy cookies and chips in bulk for an Ubuntu release party last spring, which he then helped me cart home on a bus! Sometimes after the monthly Ubuntu Hours, which he almost always attended, we’d go out to explore options for candy to include at booth events, pursuing another amusing idea he came up with: candy dishes that came together to form the Ubuntu logo.

In 2012 we filled the dishes with M&Ms:

The next year we became more germ conscious, so he suggested we go with individually wrapped candies, searching the city for ones that would taste good and not be too expensive. Plus, he found a California-shaped bowl which fit our Ubuntu California theme astonishingly well!

He also helped with Partimus, often coming out to hardware triage and installfests we’d have at the schools.


At a Partimus-supported school, back row, middle

As a friend, he was also always willing to share his knowledge with others. Upon learning that I don’t cook, he gave me advice on some quick and easy things I could do at home, which culminated in the gift of a plastic container built for cooking pasta in the microwave. Though I was skeptical of all things microwave, it’s actually something I now use routinely when I’m eating alone; I even happened to use it last night before learning of his passing.

He was a rail fan and advocate for public transportation, so I could always count on him for the latest transit news, or just a pure geek out about trains in general, which often happened with other rail fans at our regular Bay Area Debian dinners. He had also racked up the miles on his favorite airline alliance, so there were plenty of air geek conversations around ticket prices, destinations and loyalty programs. And though I haven’t really connected with the local science fiction community here in San Francisco (so many hobbies, so little time!), we definitely shared a passion for scifi too.

This is a hard and shocking loss for me. I will deeply miss his friendship and support.

by pleia2 at January 24, 2015 08:10 PM

January 20, 2015

Elizabeth Krumbach

Stress, flu, Walt’s Trains and a scrap book

I’ve spent this month at home. Unfortunately, I’ve been pretty stressed out. Now that I’m finally home I have a ton to catch up on here, I’m getting back into the swing of things with the purely technical (not event, travel, talk) part of my day job, and I have my book to work on. I know I haven’t backed off enough from projects I’m part of, even though I’ve made serious efforts to move away from a few leadership roles in 2014, so keeping up with everything remains challenging. Event-wise, I’ve managed to arrange my schedule so I only have 4 trips during this half of the year (down from 5, thanks to retracting a submission to one domestic conference), and 1-3 major local events that I’m either speaking at or hosting. It still feels like too much.

Perhaps adding to my stress was the complete loss of 5 days last week to the flu. I had some sniffles and a cough on Friday morning, which quickly turned into a fever that sent me to bed as soon as I wrapped up work in the early evening. Saturday through most of Tuesday are a bit of a blur. I attempted to get some things done, but honestly I should have just stayed in bed and not tried to work on anything, because nothing I did was useful and it actually made it more difficult to pick up where I left off come late Tuesday and into Wednesday. I always forget how truly miserable having the flu is; sleep is the only escape, and even something as mind-numbing as TV isn’t easy when everything hurts. However, kitty snuggles are always wonderful.

Sickness aside, strict adherence to taking Saturdays off has helped my stress. I really look forward to my Saturdays, when I can relax for a bit, read, watch TV, play video games, visit an exhibit at a museum or make progress in learning how to draw. I’m finally at the point where I no longer feel guilty for taking this time, and it’s pretty refreshing to simply ignore all email and social media for a day, even if I do have the impulse to check both. It turns out it’s not so bad to disconnect for a weekend day, and I come back somewhat refreshed on Sunday. It ultimately makes me more productive during the rest of the week too, and less likely to just check out in the middle of the week with a guilty and poorly-timed evening of pizza, beer and television.

This Saturday MJ and I enjoyed All Aboard: A Celebration of Walt’s Trains exhibit at the Walt Disney Family Museum. It was a fantastic exhibit. I’m a total sucker for the entrepreneurial American story of Walt Disney and I love trains, so the mix of the two was really inspiring. This is particularly true as I find my own hobbies being as work-like and passion-driven as my actual work. Walt’s love of trains and creation of a train at his family home in order to have a hobby outside work led to trains at Disney parks around the world. So cool.

No photos are allowed in the exhibit, but I did take some time around the buildings to capture some signs and the beautiful day in the Presidio: https://www.flickr.com/photos/pleia2/sets/72157650347931082/

One evening over these past few weeks I took time to put together a scrap book, which I’d been joking about for years (“ticket stub? I’ll keep it for my scrap book!”). Several months ago I dug through drawers and things to find all my “scrap book things” and put them into a bag, collecting everything from said ticket stubs to conference badges from the past 5 years. I finally swung by a craft store recently and picked up some rubber cement, good clear tape and an empty book made for the purpose. Armed with these tools, I spent about 3 hours one evening after work gluing and taping things into the book. The result is a mess, not at all beautiful, but one that I appreciate now that it exists.

I mentioned in my last “life” blog post that I was finishing a services migration from one of my old servers. That’s now done; I shut off my old VPS yesterday. It was pretty sad when I realized I’d been using that VPS for 7 years, since back when the plan I had offered a mere 360MB of RAM (the same plan offers up to 2GB now). I had gotten kind of attached! But that faded today when I did an upgrade on my new server and realized how much faster it is. On to bigger and better things! In other computer news, I’m really pushing hard on promoting the upcoming Ubuntu Global Jam here in the city and spent Wednesday evening of this week hosting a small Ubuntu Hour, thankful that it was the only event of the evening as I continued to need rest post-flu.

Today is a Monday, but a holiday in the US. I spent it catching up with work for Partimus in the morning, Ubuntu in the afternoon and this evening I’m currently avoiding doing more work around the house by writing this blog post. I’m happy to say that we did get some tricky light bulbs replaced and whipped out the wood glue in an attempt to give some repair love to the bathroom cabinet. Now off to do some laundry and cat-themed chores before spending a bit more time on my book.

by pleia2 at January 20, 2015 02:07 AM

January 19, 2015

Elizabeth Krumbach

San Francisco Ubuntu Global Jam at Gandi.net on Sunday February 8th

For years Gandi.net has been a strong supporter of Open Source communities and non-profits. From their early support of Debian to their current support of Ubuntu via discounts to Ubuntu Members they’ve been directly supportive of projects I’m passionate about. I was delighted when I heard they had opened an office in my own city of San Francisco, and they’ve generously offered to host the next Ubuntu Global Jam for the Ubuntu California team right here at their office in the city.

Gandi.net + Ubuntu = Jam!

What’s an Ubuntu Global Jam? From the FAQ on the wiki:

A world-wide online and face-to-face event to get people together to work on Ubuntu projects – we want to get as many people online working on things, having a great time doing so, and putting their brick in the wall for free software as possible. This is not only a great opportunity to really help Ubuntu, but to also get together with other Ubuntu fans to make a difference together, either via your LoCo team, your LUG, other free software group, or just getting people together in your house/apartment to work on Ubuntu projects and have a great time.

The event will take place on Sunday, February 8th from noon – 5PM at the Gandi offices on 2nd street, just south of Mission.

Community members will gather to do some Quality Assurance testing on Xubuntu ISOs and packages for the upcoming release, Vivid Vervet, using the trackers built for this purpose. We’re focusing on Xubuntu because that’s the project I volunteer with and I can help put us into contact with the developers as we test the ISOs and submit bugs. The ISO tracker and package tracker used for Xubuntu are used for all recognized flavors of Ubuntu, so what you learn from this event will transfer into testing for Ubuntu, Kubuntu, Ubuntu GNOME and all the rest.

No experience with testing or Quality Assurance is required, and Quality Assurance is not as boring as it sounds, honest :) Plus, one of the best things about doing testing on your own hardware is that your bugs are found and submitted before release, significantly increasing the chances that any bugs that exist with your hardware are fixed before the release ships!

The event will begin with a presentation that gives a tour of how manual testing is done on Ubuntu releases. From there we’ll be able to do Live Testing, Package Testing and Installation testing as we please, working together as we confirm bugs and when we get stuck. Installation Testing is the only one that requires you to make any changes to the laptop you bring along, so feel free to bring along one you can do Live and Package testing on if you’re not able to do installations on your hardware.

I’ll also have two laptops on hand for folks to do testing on if they aren’t able to bring along one of their own.

I’ll also be bringing along DVDs and USB sticks with the latest daily builds, along with some notes about how to go about submitting bugs.
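If you’d like to grab a daily build yourself ahead of the event, it’s worth confirming the download is intact before writing it to a USB stick. Here’s a minimal Python sketch of that check; the filename and checksum below are placeholders, and SHA256 sums are published alongside the images on cdimage.ubuntu.com:

import hashlib
import sys

def sha256_of(path):
    """Compute the SHA256 digest of a file, reading in 1MB chunks
    so a full DVD image doesn't have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b''):
            digest.update(chunk)
    return digest.hexdigest()

# Usage: python verify_iso.py vivid-desktop-amd64.iso <expected-sha256>
iso_path, expected = sys.argv[1], sys.argv[2].lower()
if sha256_of(iso_path) == expected:
    print("Checksum matches -- safe to write to a USB stick and test.")
else:
    print("Checksum MISMATCH -- re-download before testing.")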

Please RSVP here (full address also available at this link):

http://loco.ubuntu.com/events/ubuntu-california/2984-ubuntu-california-san-francisco-qa-jam/

Or email me at lyz@ubuntu.com if you’re interested in attending and have trouble with or don’t wish to RSVP through the site. Also please feel free to contact me if you’re interested in helping out (it’s ok if you don’t know about QA, I need logistical and promotional help too!).

Food and drinks will be provided, the current menu is a platter of sandwiches and some pizzas, so please let me know if you have dietary restrictions so we can place orders accordingly. I’d hate to exclude folks because of our menu, so I’m happy to accommodate vegan, gluten free, whatever you need, I just need to know :)

Finally, giveaways of Ubuntu stickers and pens for everyone and a couple Ubuntu books (hopefully signed by the authors!) will also be available to a few select attendees.

Somewhere other than San Francisco and interested in hosting or attending an event? The Ubuntu Global Jam is an international event with teams focusing on a variety of topics, details at: https://wiki.ubuntu.com/UbuntuGlobalJam. Events currently planned for this Jam can be found via this link: http://loco.ubuntu.com/events/global/2967/

by pleia2 at January 19, 2015 11:00 PM

Jono Bacon

Bridging Marketing and Community

In the last five years we have seen tremendous growth in community management. Organizations large and small are striving to build strong, empowered communities that contribute to and support their work. These efforts are focused on a new form of engagement: building communities into the very fabric of how success is achieved.

This growth in community management has been disruptive. Engineering, governance, and other areas have been turned upside down by this new art and science. This disruption has been positive though, producing new cultures and relationships, and adding a new string to our collective bows as we pursue our grander ambitions.

If there is one area where this disruption has struck like a lightning bolt, it has been marketing and brand management.

Every year I run the Community Leadership Summit in Portland, and every year I hear the same feedback: frustration over the philosophical, strategic, and tactical differences between marketing and community managers. These concerns have also been shared with me in my work as a community strategy and management consultant.

The Community Leadership Summit

This worries me. When I see this feedback shared it tells a narrative of “us and them“, as if marketing and brand managers are people intent on standing in the way of successful communities.

This just isn’t true.

Marketing and brand managers are every bit as passionate and engaged about success as community managers. What we are seeing here is a set of strategic and tactical differences which can be bridged. To build unity though we need to first see and presume the good in people; we are all part of the same team, and we all want to do right by our organizations.

Philosophy

For most organizations, marketing operations are fairly crisply defined and controlled. You specify your brand and values and build multiple marketing campaigns to achieve the goals of brand awareness and engagement. Brand, values, mission, and campaigns are usually pretty tightly controlled by the organization. This is designed to ensure consistency across brand, voice, and messaging, and legal protection for your marks.

This kind of brand marketing is critical. We live in a world dominated by brands, and brand managers have to balance a delicate line between authentic engagement and feckless shlepping of their wares. There is an art and science to brand marketing and many tremendous leaders in this area such as Brendon Burchard, Aaliyah Shafiq, and Gary Briggs. These fine people and others have guided organizations through challenging times and an increasingly over-subscribed audience with shorter and shorter attention spans.

Community management takes a similar but different approach. Community managers seek to build open-ended engagement in which you create infrastructure, process, and governance, and then you invite a wider diversity of people and groups to join a central mission. With this work we see passionate and inspired communities that span the world, bringing a diverse range of skills and talents. Philosophically this is very much a “let a thousand roses bloom” approach to engagement.

The Spider and the Starfish

I believe the strategic and tactical difference between many marketing and community managers can be best explained with the inspiring and excellent work of Ori Brafman and Rod Beckstrom in their seminal book, The Starfish and the Spider. The book outlines the differences between the centralized methods of organization (the spider), and the decentralized method (the starfish).

I don’t like spiders, so this was uncomfortable to add to this post.

In many traditional organizations the structure is very much like a spider. While there are multiple legs, there is a central body that is in charge. The body provides strategy, execution, and guidance from a small group of people in charge, and the outer legs serve those requirements.

Brand management commonly uses the spider model: the parameters of the brand, structure, values, and execution are typically defined by a central hand-picked team of people. While the brand may be open to multiple possibilities and opportunities, the center of the spider has to approve or reject new ideas. In many cases the center of the spider has oversight and approval over everything externally facing.

There are two core challenges with the spider model: innovation and agility. You may have the very best folks in the middle of that spider but like any group of human beings, they will reach the natural limits of their own creativity and innovation. Likewise, that team will face a limit in agility; there is only so much the center of the spider can do, and this will impact the spider as a whole.

The other organizational management model is the starfish. Here we empower teams to do great work and provide guidelines to help them be successful. This doesn’t lack accountability or quality, but we achieve it by defining strong standards of quality and trusting the teams to execute within them. We then deal with suboptimal cases where appropriate. Many modern organizations work this way, such as YouTube, Wikipedia, and many start-ups, and this is the inherent model in the community management world.

Now, let’s be honest here. I am a community management guy. Much as I like to think I am an objective thinker and unbiased, everyone is biased in some way. You are probably expecting me to pronounce these spider-orientated marketing and brand organizations dead and to hail the new starfish king of community management.

Not at all.

As I said earlier, brand management is critical to our success. What we need to do is first understand we are all on the same team, and secondly bridge the agility of community management with the consistency of brand management.

Focus on the mission

In reality, we don’t want an entirely spider model or an entirely starfish model, we want a mixture of both; a spiderfish, if you will.

I am a strong believer in Covey’s philosophy of “begin with the end in mind“. We should sit down, dream a little, and then rigorously define our mission for our organization. With this mission in mind, every project, every initiative, every idea, should be assessed within the parameters of whether it furthers that mission. If it doesn’t, we should do something else.

Always think about where we want to get to.

When most organizations think with the end in mind they want their audience to feel a personal sense of connection to their work, and therefore their brand. The world of broadcast media is withering on the vine: We don’t just sit there and mindlessly devour content with a bag of Cheetos in hand. We want to engage, to interact, to be a part of that message and that content. If we are passionate about a brand, we want to play an active role in how we can make that brand successful. We want to transition from being a member of the audience to being a member of the team.

Most brand managers want this. All community managers want and should achieve this. Thus, brand and community managers are really singing from the same hymn sheet and connected to the same broader mission. Brand and community managers are simply people with different skill-sets putting different jigsaw pieces into the same puzzle.

So how do we strike that balance between brand and community? Well, I have some practical suggestions that may be useful:

1. Align strategy

Your marketing and community strategies need to be well understood and aligned. Both teams should have regular meetings and a clear understanding of what the other is doing. This serves two key functions. Firstly, it means that everyone knows what everyone else is working on. Secondly, it clearly demonstrates the importance and value of both teams, and makes it possible to identify positive and negative touch points and bring balance to them.

Now, this is easier said than done. Strategy will change and adapt and it can be tough to keep everyone in the loop at once. As such, at a minimum focus on connecting the team leads together; they can then communicate this to their respective teams.

2. Your future won’t be 100% of what you expect it to be

There is a great rule of thumb in project management: “you will achieve your goals, but what you achieve will be different to what you expect”. We should always remind our brand and community managers that part of bridging two different skill sets and philosophies means that our work will be a little different than we may expect.

Our goal here is the consistency of an awesome brand manager with the engagement of an awesome community manager. This may mean that a community manager’s work may be a little more tempered and conservative and a brand manager’s work may be a little more agile and freeform. This will feel weird and awkward at first, but sends us in the right direction to achieve our broader mission in our organization.

3. Have a flexible brand/trademark policy and communicate it clearly

One of the key challenges in balancing brand and community management is that communities typically want to use brands themselves in their work in a freeform way. This can include printing signs for events, using the brand on websites and social media, printing t-shirts and merchandise, creating presentation slides, and more. The brand is our shared identity, both for the organization and the community.

It is important that we clearly define the lines of how the brand can and cannot be used. We want to empower our community to freely utilize the brand (and associated trade dress, fonts, colors, and more) to do amazing work, but we want to avoid our brand being cheapened and diluted.

To do this we should create a rigorous brand and trademark policy that outlines these freedoms and restrictions and clearly communicate it to the community.

A good example of this is the Ubuntu Trademark Policy; it crisply states these restrictions and freedoms and has resulted in a large and capable community and fantastic brand awareness.

4. Focus quality where it really matters

As I mentioned earlier, we really want to take a “spiderfish” approach to our organizations. This means that we centrally define some aspects of policy, but we focus those central pieces on the most valuable and important areas.

The trick is that we want to focus quality assurance on the right places. The way in which we assess the brand consistency of a keynote presentation that will be beamed around the world should be different to how we assess a small presentation given at a local community group. If we treat everything the same we will burn our teams out and limit agility and creativity.

Likewise, our assessment of quality should be around consistency as opposed to stylistic differences. We want to encourage different styles and voices: our community will present a multitude of different narratives and ideas. Our goal is to ensure that they feel consistent and connected to our central mission.

As such, focus your spider body on the most critical pieces. If you don’t, those teams will be overworked and stressed as opposed to creatively inspired and engaged.

5. Always focus on the mission

I know I have banged this drum a few times already in this article, but we have to focus on our mission every single day.

Covey teaches us that we should collaboratively define and share our missions and that these missions should guide our work every day, not just be shoved in a cupboard or stuck to a dusty wall, never to be seen again. We should assess every idea, every project, every motivation within the parameters of what we are here to do.

This is critical at a tactical level (“should project foo be something we invest in?”) but also at a strategic level (“how do we balance marketing and community management to further our mission?”).

Enforcing this is a key responsibility for senior executives. It is senior leadership that really defines the culture and tenor of our organizations so it can trickle down, and reminding and inspiring everyone about the bigger picture is essential.

I hope you find some of this useful. My primary goal with this article was to help bridge the divide between what I consider to be two critical roles in successful organizations: marketing and community management. While the cultures may be a little different, both have much to learn from each other, and much to bring to the world. I look forward to hearing from you all about your experiences and perspectives on how we continue to work together to do interesting and important work.

by jono at January 19, 2015 08:37 PM

January 18, 2015

Akkana Peck

Another stick figure in peril

One of my favorite categories of funny sign: "Stick figures in peril". This one was on one of those automated gates, where you type in a code and it rolls aside, and on the way out it automatically senses your car.

[Moving gate can cause serious injury or death]

January 18, 2015 05:19 PM

January 17, 2015

kdub

Saleae Logic 8 Review

Over the break, I got to play a bit with the Saleae Logic 8 logic analyzer. It’s the mid-range model from Saleae, and it works with Ubuntu. I wrote about the predecessor to the Logic 8 a while back, before Linux support was around. I finally got to do a bit of tinkering with the new device, under Ubuntu Vivid.

logic 8

Logic 8

The device itself came packaged only in the carrying case that is provided. Inside the zippered carrying case was the Logic 8 itself, 2 4×2 headers with 6-inch leads, 16 logic probes, a micro-USB cable, a postcard directing you to the support site, and a poster of Buzz Aldrin in the Apollo cockpit.
The Logic 8 is made out of machined anodized aluminum and is only about 2×2 inches. It’s sturdy-feeling, and the only ports are the micro-USB port to connect to the computer and the 16 logic probe pins (8x ground+signal). There’s a blue LED on the top.

IMG_1917

Package Contents

The test leads seem pretty good. I’m used to the J-hook type leads, and these have two pincers that come out. I’ve been able to get the leads into more places than I would have with a J-hook type logic probe.

Bonus Inspiration

Another really interesting feature is that this logic analyzer can do analog sampling. Each of the Logic 8 test leads can perform analog sampling. The device can sample faster if you’re only using one analog channel: one channel can sample at 10M samples/second, while running all 8 will sample at 2.5M samples/second. According to the literature, frequencies above the Nyquist frequency of the sample rate get filtered out before hitting the onboard ADC. If you’re anything like me, most of your electronics tinkering doesn’t require looking at signals above this sampling rate, and I could see using the oscilloscope less and using the Logic 8 for some analog signal work too.
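To put those rates in perspective, the usable analog bandwidth tops out at half the sample rate (the Nyquist limit). Here’s a quick sketch of the arithmetic for the two configurations quoted above:

# The highest frequency a sample rate can faithfully capture is half
# that rate (the Nyquist limit). Using the two rates quoted above:
configs = [
    ('1 analog channel', 10e6),    # 10M samples/second
    ('8 analog channels', 2.5e6),  # 2.5M samples/second
]

for name, rate in configs:
    print("%s: %.1fM samples/s -> usable up to %.2f MHz"
          % (name, rate / 1e6, rate / 2 / 1e6))

# 1 analog channel: 10.0M samples/s -> usable up to 5.00 MHz
# 8 analog channels: 2.5M samples/s -> usable up to 1.25 MHz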

Underside of Logic 8

Underside of Logic 8

The Software:
The Logic 8 software is available (freeware, closed source) on the website and will simulate inputs if there’s no device connected, so you can get a pretty good feel for how the actual device will work. It was largely hassle-free, although I did have to unpack it in /opt because it wasn’t packaged. Overall, it was pretty intuitive to configure the sampling, set up triggers, and test my circuit. The look and feel of the software was much better than a lot of other electronics tools I’ve used.

Trying it out:
I was working on a pretty simple circuit that takes a sensor input and outputs to a single 7-segment display. It’s composed of a BCD-to-7-segment decoder chip and an ATtiny13 (easy enough to program with the Ubuntu packages ‘avrdude’ and ‘gcc-avr’).

circuit

Circuit Under Test (ATtiny13, a light sensor, and a BCD to 7 segment decoder)

It’s not electrically isolated from the circuit, but I would expect that at this price point. Just make sure that you don’t have any ground loops between your computer and the circuit under test. I don’t typically build circuits that really need an earth ground, so I don’t see that being much of an issue.

So, for my first run, I connected it to the GPIO pins on the AVR and varied the voltage from 0-2.5V on the ADC pin.

ADC to 4bit digital

ADC to 4bit digital

Yay, my circuit (and avr program) was working.
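The AVR program itself boils down to a simple mapping from ADC counts to a 4-bit value for the decoder. Here’s a Python illustration of the idea (the mapping is my assumption for illustration only; the real firmware is C on the ATtiny13):

# The ATtiny13 ADC is 10-bit (0-1023); scale a reading down to a
# single digit 0-9 for the BCD-to-7-segment decoder to display.
# (Assumed mapping, purely for illustration.)
def adc_to_digit(reading, full_scale=1023):
    return reading * 10 // (full_scale + 1)

for reading in (0, 256, 512, 768, 1023):
    print("ADC %4d -> digit %d" % (reading, adc_to_digit(reading)))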

I am pleased with the Logic 8, and am even more excited to have a hassle free way to measure logic and analog signals in Ubuntu!

by Kevin at January 17, 2015 08:53 PM

January 14, 2015

Jono Bacon

Your new Community Manager Hire: 5 Areas to Focus on

So, you have just hired that new community manager into your organization. Their remit is simple: build a community that wraps around your product/technology/service. You have an idea of what success looks like, but you are also not entirely sure exactly what this new hire will be doing at a tactical level.

Lots of people are in this position. Here are five things you should focus on to help ensure they are successful.

1. Think carefully about the reporting line

When a new community manager joins a company the question is where they report. In many cases they report into Marketing, in some cases (particularly for technology companies) they report to Engineering. In some cases they report to the COO.

Much of this depends on what the community manager is doing. If they are managing social media and forums, marketing may be a good fit. If they are building a developer community, engineering may be a good fit.

If however they are building a full community with infrastructure, processes, governance, and more, they are going to be working cross-team in your organization. As such, having them report into a single team such as Marketing may not be a good idea: it may restrict their cross-functional capabilities and executive buy-in.

Also, and how do I say this delicately…there is often a philosophical difference between traditional marketing/brand managers and community managers. Think carefully about how open to community success your marketing manager is…if they are not very open, they may end up squashing the creativity of your new hire.

2. Build a strategic plan

A key part of success is setting expectations. With rare exceptions, right out of the gate the difference in expectations between senior execs and a new community manager is likely to be pretty significant. We want to reduce that gap.

To do this you need to gather stakeholder requirements, define crisp goals for what the community should look like and map out an annual strategic plan that outlines what the community manager will achieve to meet those goals as well as crisp success criteria. Summarize this into a simple deck to review with the exec team and other key leaders in the organization.

The community manager should make a point of socializing the strategy with the majority of the organization: it will help to smooth the path to success.

3. Provide mentoring

There is a huge variance in what community managers actually do. Some take care of social media, some respond on forums, some fly to conferences to speak, and some build entire environments with infrastructure, process, governance, and on-ramps to help the community be successful.

I believe the latter is the true definition of a community manager. A community manager should have a vision for a community and be able to put all the infrastructure, process, and resources in place to achieve it.

This is tough. It requires balancing lots of different teams and resources, and your new hire may feel they are drowning. Find a good community manager who gets this kind of stuff and ask them for help. Encourage an environment and culture of learning: help them to help themselves to be successful.

4. Have an “essential travel only” policy

I see the same thing over and over again: a new community manager joins a company and the company spends thousands flying them to every conceivable conference to speak and hang out with attendees. This is usually with the rationale of “spreading the word”.

Here’s the deal. Every minute your community manager is on the road, at conferences, preparing talks, and mingling with people, they are not working on the wider community vision, they are working on the scope of that event. Travel is incredibly disruptive and conferences are very distracting, and at the beginning of a new community, you really want your community manager putting the foundations of your community in place, which typically means them being sat at a computer and drinking plenty of coffee.

Now, don’t get me wrong, conferences and travel are critical for community success. My point is that you should pick conferences that match closely with the strategy you have defined. This keeps your costs lower, keeps your new hire more focused, and helps get things up and running more quickly.

5. Train the rest of your employees

The word “community” means radically different things to different people. For some a community is a customer-base, for some it is engineering, for some it is a support function, for others it may be social media.

When your new community manager joins, your other staff will have their own interpretation of what “community” means. You should help to align the community manager’s focus and goals with the rest of the organization.

In many companies, the formation of a community is a key strategic change. It is often a new direction that is breaking some ground. In these cases, this step is particularly important. We want to ensure the wider team knows the organizational significance of a community, but also to get them bought into the value and opportunity it brings.

I hope this helps. If anyone has any questions I can help with, feel free to get in touch.

by jono at January 14, 2015 04:24 PM

January 13, 2015

Jono Bacon

Discourse: Saving forums from themselves

Many of us are familiar with discussion forums: webpages filled with chronologically ordered messages, each with a little avatar and varying degrees of cruft surrounding the content.

Forums are a common choice for community leaders and prove to be popular, largely due to their simplicity. The largest forum in the world, Gaia Online, an Anime community, has 27 million users and over 2,200,000,000 posts. They are not alone: it is common for forums to have millions of posts and hundreds of thousands of users.

So, they are a handy tool in the armory of the community leader.

The thing is, I don’t particularly like them.

While they are simple to use, most forums I have seen look like 1998 vomited into your web browser. They are often ugly, slow to navigate, have suboptimal categorization, and reward users based on the number of posts as opposed to the quality of content. They are commonly targeted by spammers and as they grow in size they invariably grow in clutter and decrease in usefulness.

I have been involved with and run many forums and while some are better, most are just similar incarnations of the same dated norms of online communication.

So…yes…not a fan. :-)

Enter Discourse

Fortunately a new forum is on the block and it is really very good: Discourse.

Created by Jeff Atwood, co-founder of Stack Overflow and the Stack Exchange Network, Discourse takes a familiar but uprooted approach to forums. They have re-thought through everything that is normal in forums and improved online communication significantly.

If you want to see it in action, see the XPRIZE Community, Bad Voltage Community, and Community Leadership Forum forums that I have set up.

Discourse is neat for a few reasons.

Firstly, it is simple to use and read. It presents a simple list of discussions with suitable categories, as opposed to cluttered sub-forums that divide discussions. It provides an easy and effective way to highlight and pin topics and identify active discussions. Users can even hide certain categories they are not interested in.

Creating and replying to topics is a beautiful experience. The editor supports Markdown as well as GUI controls and includes a built-in preview where you can embed videos, images, tweets, quotes, code, and more. It supports multiple headings, formatting styles, and more. I find that posts really come to life with Discourse as opposed to the limited fragments of text shown on other forums.

Discourse is also clever in how it encourages good behavior. It has a range of trust levels that reward users for good and regular participation in the forum. This is gamified with badges which encourages users to progress, but more importantly from a community leadership perspective, it provides a simple at-a-glance view of who the rock stars in the forum are. This provides a list of people I can now encourage and engage to be leaders. Now, before you get too excited, this is based on forum usage, not content, but I find the higher trust level people are generally better contributors anyway.

Discourse also makes identity pleasant. Users can configure their profiles in a similar way to Twitter with multiple types of imagery and details about who they are. Likewise, referencing other users is simple by pressing @ and then their username. This makes replies easier to spot in the notifications indicator and therefore keeps the discussion flowing.

Administrating and running the site is also simple. User and content management is a breeze, configuring the look and feel of most aspects of the forum is simple, and Discourse supports multiple login providers.

What’s more, you can install Discourse easily with Docker and there are many hosting providers. While Jeff Atwood’s company has their own commercial service, I ended up using DiscourseHosting, who are excellent and pretty cheap.

To top things off, the Discourse community are responsive, polite, and incredibly enthusiastic about their work. Everything is Open Source and everything works like clockwork. I have never, not once, seen a bug impact a stable release.

All in all Discourse makes online discussions in a browser just better. It is better than previous forums I have used in pretty much every conceivable way. If you are running a community, I strongly suggest you check Discourse out; there simply is no competition.

by jono at January 13, 2015 05:31 AM

January 12, 2015

Jono Bacon

Announcing the Community Leadership Summit 2015!

I am delighted to announce the Community Leadership Summit 2015, now in its seventh year! This year it takes place on the 18th and 19th July 2015, the weekend before OSCON, at the Oregon Convention Center. Thanks again to O’Reilly for providing the venue.

For those of you who are unfamiliar with the CLS, it is an entirely free event designed to bring together community leaders and managers and the projects and organizations that are interested in growing and empowering a strong community. The event provides an unconference-style schedule in which attendees can discuss, debate and explore topics. This is augmented with a range of scheduled talks, panel discussions, networking opportunities and more.

The heart of CLS is an event driven by the attendees, for the attendees.

The event provides an opportunity to bring together the leading minds in the field with new community builders to discuss topics such as governance, creating collaborative environments, conflict resolution, transparency, open infrastructure, social networking, commercial investment in community, engineering vs. marketing approaches to community leadership and much more.

The previous events have been hugely successful and a great way to connect people from different community backgrounds, share best practices, and make community management an art and science better understood and shared by us all.

I will be providing more details about the event closer to the time, but in the meantime be sure to register!

Mixing Things Up

For those who have been to CLS before, I want to ask your help.

This year I want to explore new ideas and methods of squeezing as much value out of CLS for everyone. As such, I am looking for your input on areas in which we can improve, refine, and optimize CLS.

I ask that you head over to the Community Leadership Forum and share your feedback. Thanks!

by jono at January 12, 2015 06:27 PM

January 11, 2015

Grant Bowman

Next Billion Connected

Dell’s Next Billion video would inspire me more if Dell genuinely supported Linux to consumers instead of actively promoting Windows almost everywhere I see Dell.


by grantbow at January 11, 2015 10:24 PM

January 08, 2015

Jono Bacon

Bad Voltage and Ubuntu

I know many of my readers here are Ubuntu fans and I wanted to let you know of something neat.

For just over a year now I have been doing a podcast with Stuart Langridge, Bryan Lunduke, and Jeremy Garcia. It is a fun, loose, but informative show about Open Source and technology. It is called Bad Voltage.

Anyway, in the show that was released today, we did an interview with Michael Hall, a community manager over at Canonical (who used to work for me when I was there).

It is a fun and interesting interview about Ubuntu and phones, release dates, and even sets a challenge to convince Lunduke about the value of scopes on the Bad Voltage Forum.

Go and listen to or download the show here and be sure to share your thoughts on the show in the community discussion.

The show also discusses the Soylent super-food, has predictions for 2015 (one of which involves Canonical), and more!

Finally, Bad Voltage will be doing our first live performance at SCALE in Los Angeles on Fri 20th Feb 2015. We hope to see you there!

by jono at January 08, 2015 11:18 PM

Akkana Peck

Accessing image metadata: storing tags inside the image file

A recent Slashdot discussion on image tagging and organization a while back got me thinking about putting image tags inside each image, in its metadata.

Currently, I use my MetaPho image tagger to update a file named Tags in the same directory as the images I'm tagging. Then I have a script called fotogr that searches for combinations of tags in these Tags files.
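The search side is easy to sketch. Here's a simplified stand-in for fotogr, which assumes, purely for illustration, that each line of a Tags file looks like "tagname: file1 file2 ..." (the real format may differ):

import os
import sys

def find_tagged(topdir, wanted):
    """Walk topdir looking for Tags files, and yield the path of every
       image carrying all of the wanted tags. Assumes each Tags line
       is 'tagname: file1 file2 ...' -- purely for illustration."""
    wanted = set(wanted)
    for dirpath, dirnames, filenames in os.walk(topdir):
        if 'Tags' not in filenames:
            continue
        tagged = {}    # image filename -> set of its tags
        with open(os.path.join(dirpath, 'Tags')) as fp:
            for line in fp:
                if ':' not in line:
                    continue
                tag, names = line.split(':', 1)
                for name in names.split():
                    tagged.setdefault(name, set()).add(tag.strip())
        for name, tags in tagged.items():
            if wanted <= tags:    # every wanted tag is present
                yield os.path.join(dirpath, name)

# e.g.: python findtags.py ~/Images beach sunset
for match in find_tagged(sys.argv[1], sys.argv[2:]):
    print(match)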

That works fine. But I have occasionally wondered if I should also be saving tags inside the images themselves, in case I ever want compatibility with other programs. I decided I should at least figure out how that would work, in case I want to add it to MetaPho.

I thought it would be simple -- add some sort of key in the image's EXIF tags. But no -- EXIF has no provision for tags or keywords. But JPEG (and some other formats) supports lots of tags besides EXIF. Was it one of the XMP tags?

Web searching only increased my confusion; it seems that there is no standard for this, but there have been lots of pseudo-standards over the years. It's not clear what tag most programs read, but my impression is that the most common is the "Keywords" IPTC tag.

Okay. So how would I read or change that from a Python program?

Lots of Python libraries can read EXIF tags, including Python's own PIL library -- I even wrote a few years ago about reading EXIF from PIL. But writing it is another story.

Nearly everybody points to pyexiv2, a fairly mature library that even has a well-written pyexiv2 tutorial. Great! The only problem with it is that the pyexiv2 front page has a big red Deprecation warning saying that it's being replaced by GExiv2. With a link that goes to a nonexistent page; and Debian doesn't seem to have a package for GExiv2, nor could I find a tutorial on it anywhere.

Sigh. I have to say that pyexiv2 sounds like a much better bet for now even if it is supposedly deprecated.

Following the tutorial, I was able to whip up a little proof of concept that can look for an IPTC Keywords tag in an existing image, print out its value, add new tags to it and write it back to the file.

import sys
import pyexiv2

# Expect an image filename, optionally followed by new keywords to add.
if len(sys.argv) < 2:
    print "Usage:", sys.argv[0], "imagename.jpg [tag ...]"
    sys.exit(1)

# Read all of the image's existing metadata.
metadata = pyexiv2.ImageMetadata(sys.argv[1])
metadata.read()

newkeywords = sys.argv[2:]

keyword_tag = 'Iptc.Application2.Keywords'
if keyword_tag in metadata.iptc_keys:
    # The image already has keywords: print them, then append any new ones.
    tag = metadata[keyword_tag]
    oldkeywords = tag.value
    print "Existing keywords:", oldkeywords
    if not newkeywords:
        sys.exit(0)
    for newkey in newkeywords:
        oldkeywords.append(newkey)
    tag.value = oldkeywords
else:
    # No keywords yet: create the IPTC tag from scratch.
    print "No IPTC keywords set yet"
    if not newkeywords:
        sys.exit(0)
    metadata[keyword_tag] = pyexiv2.IptcTag(keyword_tag, newkeywords)

# Show the result and write the modified metadata back to the image file.
tag = metadata[keyword_tag]
print "New keywords:", tag.value

metadata.write()

Does that mean I'm immediately adding it to MetaPho? No. To be honest, I'm not sure I care very much, since I don't have any other software that uses that IPTC field and no other MetaPho user has ever asked for it. But it's nice to know that if I ever have a reason to add it, I can.

January 08, 2015 05:28 PM

January 06, 2015

Grant Bowman

Scheduling Algorithms

One learns a lot about scheduling when working with many different schedules hoping for a harmonious and predictable result. My background in mathematics and my study of project management scheduling algorithms have prepared me well. I apologize in advance for some of the specific vagueness.

Very inflexible schedule elements make everything feel much more difficult. Whether the inflexible elements are imposed in a seemingly arbitrary way or arise from contingencies along a chain of events, the consequences of a missed segment can be disappointing. Sometimes the bad effects cascade. Opportunity costs must be weighed against the measures taken to make things work as planned.
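The cascade is easy to see even in a toy model. Here is a small sketch of a critical-path calculation (my own illustration, not any particular schedule): each task's earliest finish depends on its prerequisites, so a slip anywhere along the longest chain pushes the final date out with it.

# Toy critical-path model: each task maps to (duration, prerequisites).
tasks = {
    'design': (3, []),
    'order':  (2, ['design']),
    'build':  (5, ['design', 'order']),
    'test':   (2, ['build']),
}

memo = {}

def earliest_finish(name):
    """Earliest finish = own duration + the latest finish among
    prerequisites, so a delay on the longest chain cascades."""
    if name not in memo:
        duration, deps = tasks[name]
        memo[name] = duration + max([earliest_finish(d) for d in deps] or [0])
    return memo[name]

print(max(earliest_finish(t) for t in tasks))  # 12: design -> order -> build -> test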

Sometimes the result is simply too unpredictable, so the system must be treated as a black box. The cost of planning can actually exceed the time saved. Best efforts and a willingness to adapt or pivot are required. Results must be accepted for what they are: the best result possible given the amount of variability in the system. If that is good enough, spend your energy where it will yield a better return on investment.

I consider myself a patient person, yet recent exercises have given me a new appreciation for the patience and planning required when dealing with these particular complex systems.

P. S. Thanks for reading. I have been focusing on writing in other places but I intend to write more frequently on this blog now.


by grantbow at January 06, 2015 02:58 AM

Elizabeth Krumbach

New year, Snowpiercer, Roads of Arabia and projects

I’ve been home for a week now, and strongly resisted the temptation to go complete hermit and stay home to work furiously on my personal projects as the holidays brought several Day Job days off. On New Year’s Eve MJ and I went over to ring in the new year with my friend Mark and his beautiful kitty. On Friday I met up with my friend mct to see Snowpiercer over at the Castro Theater. I’m a huge fan of that theater, but until now had only ever gone to see hybrid live+screen shows related to MST3K, first a Rifftrax show and then a Cinematic Titanic show. It was a nice theater to see a movie in: they make it very dark and then slowly bring up the lights at the end through the credits to welcome you back to the world. A gentle welcome was much needed for Snowpiercer. It was very good but very intense, and after watching it I didn’t have it in me to stick around for the double feature (particularly not another one with a train!).

I have watched even less TV than usual lately, lacking time and patience for it (getting bored easily). But MJ and I did start watching Star Trek: Voyager. It turns out that it’s very Classic Trek feeling (meet new aliens every episode!) and I’m enjoying it a lot. I know people really love Deep Space Nine, and I did enjoy it too, but it was always a bit too dark and serious for my old fashioned Trek taste. Voyager is a nice journey back to the Trek style I love, plus, Captain Janeway is totally my hero.

This past Saturday MJ and I had a relaxing day, the highlight of which was the Roads of Arabia exhibit at the Asian Art Museum. It’s one of my favorite museums in the city, and I was really excited to see a full exhibit focused on the Middle East, particularly with my trip to Oman on the horizon. It also gave me ideas for that trip: I’d been advised that it’s common to buy frankincense while in Oman, but I already have what seems like a lifetime supply, so now I’m thinking I might try to find a pretty incense burner.


No photos allowed of the exhibit, with the exception of this statue, where they encouraged selfies

Our wedding photos have finally gotten some attention. It’s been over a year and a half and the preview of photos has been limited to what our photographer shared on Facebook. Sorry everyone. I’ve mostly gone through them now and just need to take some time to put together a website for them. Maybe over this weekend.

My book has also seen progress, but sometimes I also like to write on paper. While going through my huge collection of pens-from-conferences I decided that I write notes enough to treat myself to a nicer pen than these freebies. Through my explorations of pens on the internet, I came across the Preppy Plaisir fountain pen. I’d never used a fountain pen before, so I figured I’d give it a shot. Now, I won’t forsake all other pens moving forward, but I do have to admit that I quite like this pen.


Naturally, I got the pink one

I did manage to catch up on some personal “work” things. Got a fair amount of Ubuntu project work done, including securing a venue and sponsorship for an upcoming Ubuntu Global Jam here in San Francisco and working out travel for a couple upcoming conferences. Also have almost completed the migration of websites and services from one of my old servers to a bigger, cheaper one, and satisfied the prerequisite of re-configuring my monitoring and backups of all my servers in preparation for the new one. Now I’m just waiting for some final name propagation and holding out in case I forgot something on the old server. At least I have backups.

Work on Partimus has been very quiet in recent months. There’s been a movement locally to deploy Chromebooks in classrooms rather than traditional systems with full operating systems. These work well as many tools that teachers use, including some of the standardized testing tools, have moved online. This is something we noticed back when we were still deploying larger labs of Ubuntu-based systems, as we worked hard to tune the systems for optimal performance of Firefox with the latest Java and Flash. Our focus now has turned to education-focused community centers and groups who are seeking computers for more application- and programming-focused tasks, and I hope to have news about our newest projects in the works soon. I did have the opportunity last week to meet up with an accountant, working pro bono, to go over our books; I was thankful for his time and his ability to confirm we’re doing everything correctly. I’m not fired as Treasurer, hooray!

by pleia2 at January 06, 2015 02:48 AM

January 05, 2015

Elizabeth Krumbach

Ubuntu California in 2014

Inspired by the post by Riccardo Padovani about the awesome year that Ubuntu Italy had, I welcome you to a similar one for Ubuntu California for events I participated in.

The year kicked off with our annual support of the Southern California Linux Expo with SCaLE12x. The long weekend began with an Ubucon on Friday, and then a team booth on Saturday and Sunday in the expo hall. There were a lot of great presentations at Ubucon and a streamlined look to the Ubuntu booth with a great fleet of volunteers. I wrote about the Ubuntu-specific bits of SCaLE12x here. Unfortunately I have a scheduling conflict, but you can look for the team again at SCaLE this February with an Ubucon and Ubuntu booth in the main expo hall.


Ubuntu booth at SCaLE12x

In April, Ubuntu 14.04 LTS was released with much fanfare in San Francisco as we hosted a release party at a local company called AdRoll, which uses Ubuntu in their day to day operations. Attendees were treated with demos of a variety of flavors of Ubuntu, a couple Nexus 7s with Ubuntu on them, book giveaways, a short presentation about the features of 14.04 and a pile of pizza and cookies, courtesy of Ubuntu Community Donations Funding.


Ubuntu release party in San Francisco

More details and photos from that party here.

In May, carrying the Ubuntu California mantle, I did a pair of presentations about 14.04 for a couple of local groups, (basic slides here). The first was a bit of a drive down to Felton, California where I was greeted at the firehouse by the always welcoming FeltonLUG members. In addition to my presentation, I was able to bring along several laptops running Ubuntu, Xubuntu and Lubuntu and a Nexus 7 tablet running Ubuntu for attendees to check out.


Ubuntu at FeltonLUG

Back up in San Francisco, I presented at Bay Area Linux Users Group and once again had the opportunity to show off my now well-traveled bag of 14.04 laptops and tablet.


Ubuntu at BALUG

As the year continued, my travel schedule picked up and I mostly worked on hosting regular Ubuntu Hours in San Francisco.

Some featuring a Unicorn…

And an Ubuntu Hour in December finally featuring a Vervet!

December 31st marked my last day as a member of the Ubuntu California leadership trio. I took on this role back in 2010 and in that time have seen a lot of maturity come out of our team and events, from commitment of team members to host regular events to the refinement of our booths each year at the Southern California Linux Expo. I’m excited to see 2015 kick off with the election of an entirely new leadership trio, announced on January 1st, comprised of: Nathan Haines (nhaines), Melissa Draper (elky) and Brendan Perrine (ianorlin). Congratulations! I know you’ll all do a wonderful job. In spite of clearing out to make room for the new leadership team, I’ll still be active in the LoCo, with regular Ubuntu Hours in San Francisco and an Ubuntu Global Jam event coming up on February 8th, details here.

by pleia2 at January 05, 2015 02:08 AM

December 31, 2014

Elizabeth Krumbach

The Ubuntu Weekly Newsletter and other ways to contribute to Ubuntu

Today, the last day of 2014, I’ve taken some time to look back on some of my biggest accomplishments. There have been the big flashy things, lots of travel, lots of talks and the release of The Official Ubuntu Book, 8th Edition. What a great year!

Then there is the day to day stuff, one of which is the Ubuntu Weekly Newsletter.

Every week we work to collect news from around our community and the Internet to bring together a snapshot of that week in Ubuntu. I’ve used the Newsletter archive to glimpse into where we were 6 years ago, and many folks depend on the newsletter each week to get the latest dose of collected news.

In 2014 we released 49 issues. Each one of these issues is the result of a team of contributors who collect links for our newsletter (typically Paul White and myself) and then a weekend of writing summaries for many of these collected articles, where again Paul White has been an exceptional contributor, with several others pitching in here and there for a few issues. We then do some editorial review. Release takes place on Monday, where we post to several community resources (forums, discourse, mailing lists, fridge) and across our social media outlets (Twitter, Facebook, Google+); this is usually done by myself or José Antonio Rey. In all, I’d estimate that creating a newsletter takes about 6-8 hours of people time each week. Not a small investment! And one that is shared largely on a week by week basis between the core of three contributors.

We need your help.

Plus, kicking off the new year by contributing to open source is a great way to start!

We specifically need folks to help write summaries over the weekend. All links and summaries are stored in a Google Doc, so you don’t need to learn any special documentation formatting or revision control software to participate. Plus, everyone who participates is encouraged to add their name to the credits.

Summary writers. Summary writers receive an email every Friday evening (or early Saturday) with a link to the collaborative news links document for the past week which lists all the articles that need 2-3 sentence summaries. These people are vitally important to the newsletter. The time commitment is limited and it is easy to get started with from the first weekend you volunteer. No need to be shy about your writing skills, we have style guidelines to help you on your way and all summaries are reviewed before publishing so it’s easy to improve as you go on.

Interested? Email editor.ubuntu.news@ubuntu.com and we’ll get you added to the list of folks who are emailed each week.

Finally, I grepped through our archives and want to thank the following people who’ve contributed this year:

  • Paul White
  • José Antonio Rey
  • Jim Connett
  • Emily Gonyer
  • Gim H
  • John Kim
  • Esther Schindler
  • Nathan Dyer
  • David Morfin
  • Tiago Carrondo
  • Diego Turcios
  • Penelope Stowe
  • Neil Oosthuizen
  • John Mahoney
  • Aaron Honeycutt
  • Mathias Hellsten
  • Stephen Michael Kellat
  • Sascha Manns
  • Walter Lapchynski

Thank you all!

Looking for some other way to contribute? I was fortunate in 2014 to speak at two Ubucons in the United States, at the Southern California Linux Expo and then at Fossetcon in Florida. At both of these events I gave presentations on how to contribute to Ubuntu without any programming experience required, which I dove into more thoroughly here in my blog:

Want more? Explore community.ubuntu.com for a variety of other opportunities to contribute to the Ubuntu community.

by pleia2 at December 31, 2014 07:41 PM

The adventures of 2014

I had a great year in 2013, highlighted by getting married to MJ and starting a new job that I continue to be really happy with. 2014 ended up being characterized by how much travel I’ve done, and a detour into having the first surgery of my life.

Travel-wise I broke the 100k in-flight miles barrier, with a total of 101,170 in air miles. I traveled at least once a month and was able to add Australia to my continents list this year. The beaches in Perth were beautiful, and with January in the middle of their summer, it was certainly beach weather when I went!

Visiting family didn’t take a back seat this year, I spent a week up in Maine staying with my sister Annette, my nephew Xavier and visiting with my mother and her kitties. Plus, got a nice dose of snow along with it! Very enjoyable when I’m in a warm home and don’t have to drive.

I also was able to stick family visits onto a couple Florida trips, and we went to the weddings of MJ’s cousin and sister in the fall. But much of my travel was for work, with a variety of conferences this year:


Really enjoyed walking the streets of friendly Zagreb, Croatia

First time out of an airport in Germany during my visit to Darmstadt!

My first time in Paris, need I say more?

Jamaica was beautiful and relaxing

But it wasn’t all traveling to conferences, I did HP booth duty at the Open Business Conference in May (wrap-up post) and presented at PuppetConf in September (wrap-up post), both here in San Francisco. I also did some personal conference geekery with my friend Danita Fries by attending Google I/O for the first time in June (wrap-up post).

I also gave a number of talks, sometimes double or tripling up during a conference. I learned that doing 3 talks at a conference is 1-2 talks too many.


Thanks to Vedran Papeš for this photo from DORS/CLUC in Croatia, source

Plus, I had my first book published over the summer! Working with Matthew Helmke and José Antonio Rey, the Official Ubuntu Book, 8th Edition was released in July.


The Official Ubuntu Book, 8th Edition, July 2014

2014 also made me one organ lighter as of July with the removal of my gallbladder after a few months of diagnostics and pain. It certainly complicated some of my travel, making me spend the shortest amount of time possible in both Croatia and Germany, both countries I wish I could have explored more during my trips.

So far for 2015 I believe I’ll have a slightly less busy year travel-wise, but my first two trips are international: the first to Brussels for my first FOSDEM, and then in February off to Oman for FOSSC Oman. Looking forward to a great, and healthier, year in 2015!

by pleia2 at December 31, 2014 04:24 PM

December 30, 2014

Elizabeth Krumbach

Tourist in St. Louis

This past long weekend I decided to take one final trip of the year. I admit, part of the reason for having any year-end trip was to hit 100k flight miles this year. This was purely about actually hitting that number; it doesn’t help me get any kind of status, since my miles are split between 2 alliances due to the USAirways split from Star Alliance during their American Airlines merger.

So I had a look at a map. Where have I never been, have friends I can crash with and is at least 1200 miles away? St. Louis!

I flew in on Christmas and met up with my friend Ryan, who I’d be staying with. Food options were limited on the holiday, but we were able to snag some tickets at an AMC Dine-In theater, so I could see the final Hobbit movie and get dinner at the same time.

Friday Ryan had work, so I met up with my friend Eric and his wife Kristin for a day at the St. Louis Zoo. It’s routinely ranked among the top five in the country, so I was pretty excited to go. The zoo is also free, and the weather was exceptionally nice for the end of December in Missouri, with highs around 57 degrees. Perfect day for the zoo!


Eric and I with the giraffes

Some of the exhibits were closed for renovation (penguins!) but I really enjoyed the big cats and the primate house and herpetarium. They also did something really clever with their underwater exhibit: rather than having a shark tunnel that you can walk under, it’s a sea lion tunnel. The cool thing about sea lions is that they’re interactive, so zoo goers learned that if you throw a ball (or baseball cap) around in the tunnel, the sea lions will chase it. So cute and fun!

More photos from the zoo here: https://www.flickr.com/photos/pleia2/sets/72157649609657700/

Saturday and Sunday I hung out with Ryan. First stop on Saturday was the City Museum. The website doesn’t really do the insanity of this place justice; “big playground” doesn’t really do it either. We started off by going through the museum’s “caves”, where you walk and climb through all kinds of man-made caves, with dragons and other creatures carved into the walls. Once you get through those, you find yourself going up a series of metal stairways and landings with one goal: to get to the 10-story slide. I did it, and managed not to throw up afterwards (though it was a bit touch and go for a couple minutes!).


In the City Museum caves

The museum also features all kinds of eclectic collections, from massive carved stone pieces to doorknobs to every Lego train set ever made. For an additional fee, the second floor has the World Aquarium with a variety of animals, aquatic and not. I’d probably skip the aquarium next time; the exhibits were cramped and I wasn’t too impressed with the cage and tank sizes for most of the animals.

Finally, there’s the outdoor MonstroCity, described as: “A captivating collision of old and new, architectural castoffs and post-apocalyptic chaos, MonstroCity is at once interactive sculpture and playground. Comprised of wrought iron slinkies, fire trucks, stone turrets, airplane fuselages, slides of all sizes and shapes” – yep, that’s about right. Like the caves, there were all kinds of places to climb through, with adults having as much fun as the kids. The structures were quite wet and I was feeling very old at this point, so I kept my own explorations pretty conservative, taking walkways and stairways everywhere I went, including to both of the airplane fuselages. Even so, there were some scary moments as parts of the structure move slightly as you walk on them. I was a bit sore after my City Museum trip with all the climbing and head bumping (low ceilings!), but it really was a lot of fun.


MonstroCity

Also, pro-tip: I enjoyed taking photos throughout my visit, but holding on to a camera and phone while climbing everywhere was quite a challenge at times, even with my hoodie pockets. If you can do without having a photographic record of your visit, it may be more fun to leave the electronics in the car. More photos from City Museum here: https://www.flickr.com/photos/pleia2/sets/72157647687508774/

After City Museum, we headed over to Schlafly Bottleworks for a brewery tour. As a big beer fan, I was excited to see one of the several craft breweries that sit in the shadow of giant Anheuser-Busch, which also calls St. Louis home. The tour was fun, and was followed by a tasting. They make some great ales, but I was particularly impressed with their Tasmanian IPA, which uses Australian Topaz and Tasmanian Galaxy hops for a nice, complex taste. We skipped lunch at the brewery to head over to Imo's Pizza; with its super thin crust and Provel cheese, this St. Louis classic was a must. Yum!

Our evening was spent at the Three Sixty rooftop bar downtown, with a great view of the Arch, and then over to Bailey's Chocolate Bar for some fantastic dessert.

Sunday! This being a short trip, I packed in as much as possible, so Sunday began with 10:30AM tickets for the Gateway Arch. Getting there early was a good move; by the time we left around 11:15 the line for security into the facility was quite long, and with the weather taking a turn for the colder (around 32 degrees!) it was nice not to have to wait in such a long line in the cold. The trip up to the top of the arch began with a ride in their little super 1960s-style trams:

At the top, 630 ft (63 stories) up, there are small 7″x27″ windows where you can see the Mississippi river and the city of St. Louis. Going to the top was definitely a must, but we didn’t stay up too long because it was quite busy and the views were limited with such small windows.

Also a must, an arch photo:

And an arch tourist in this, my St. Louis blog post:

We had lunch over in Ballpark Village at the Budweiser Brew House before heading off on our last adventure of the trip: the Anheuser-Busch Brewery Tour. Now, I'm not actually a fan of Budweiser. It's a rice-based beer and I don't care for lagers in general, being more of an ale fan (and fan of hops!). But I was in St. Louis, I am a beer fan, and the idea of visiting the home of the biggest beer company in the world was compelling. We did the Day Fresh Brewery Tour. We got a couple of samples throughout the tour, and I have to admit I was pleasantly surprised by the Michelob AmberBock; while still quite mild for a bock, it was smooth and didn't have any unpleasant aftertaste. The tour gave us an opportunity to visit the Budweiser Clydesdales, who have a pretty amazing building to live in (beautiful, heated, wood paneling, nicer than most human houses!). From there we toured the Brew House and Clock Tower and finally the BEVO Packaging Plant, where the canning and bottling lines run 24/7. But my favorite thing on the whole tour? The hop vine chandeliers in the historical brew house. Our tour guide told us the company bought them from the 1904 World's Fair.

More photos from the brewery tour here: https://www.flickr.com/photos/pleia2/sets/72157647696859563/

And with that, my trip wound down. We snagged some roast beef sandwiches to enjoy with a movie before I went to bed early to be up at 4AM for my 5:50AM flight back home via Denver. Huge thanks to Ryan for putting me up in his guest room for the long weekend and driving me around town as we did our whirlwind tour of his city!

More generic St. Louis photos collected in this album: https://www.flickr.com/photos/pleia2/sets/72157647695764113/

by pleia2 at December 30, 2014 07:44 PM

December 24, 2014

Elizabeth Krumbach

I think I’ll go for a… oh bother

Last December I wrote about taking up running. I had some fantastic weeks: I was gaining stamina and finding actual value in my newfound ability to run (late to the train? I can run!). I never really grew to like it, and as I got up to 25 minutes of solid (even if slow) running I really had to push myself, but things were going well.

Then, in April, I got sick. This kicked off my whole gallbladder ordeal: almost 4 months of constant pain, and changes in my diet to avoid triggering more of it. Running was out entirely; anything that bounced me around that much was not tolerable. The diet changes tended toward carbs, and away from meats and fats. The increased carbs and the death of my exercise routine were a disaster for me weight-wise. Add on the busiest travel year of my life and all the stress and poor eating choices that come with travel, and I've managed to put on 30lbs this year, landing me at the heaviest I've ever been.

I don’t feel good about this.

By September I was recovered enough to start running again, but sneaking exercise discipline into my travel schedule proved tricky. They also don't tell you how much harder it is to exercise when you're heavy – all that extra weight to carry around! Soreness in my feet has been my key issue as I run, where previously I'd only had trouble with joint (knee) pain here and there. I picked up running again for a couple weeks in late November, but then the rain started in San Francisco. December has been unusually soggy. One of the reasons I picked running as my exercise of choice was that we actually have nice weather most of the time, so this was quite the disappointment.

But I haven’t given up! I did start Couch to 5k over, but so far it’s not nearly as hard as the first time around, so I didn’t lose all the ground I gained earlier in the year. Here’s to 2015 being a healthier year for me.

by pleia2 at December 24, 2014 02:05 AM

Simcoe’s December 2014 Checkup

Simcoe was diagnosed with Chronic Renal Failure (CRF) back in December of 2011, so it’s been a full three years since her diagnosis!

Still, she doesn’t enjoy the quarterly vet visits. We took her in on December 6th and she was determined to stay in her carrier and not look at me.


“I’m mad at you”

We’re keeping up with subcutaneous fluid injections every other day to keep her hydrated, and it has been keeping her pretty stable. This latest round of tests did show a slight decrease in her weight from 9.94lbs to 9.74lbs.

Weight

Her BUN level remained steady, and CRE rose a bit from 3.8 to 4.2.

BUN: 59 (normal range: 14-36)
CRE: 4.2 (normal range: .6-2.4)

Her calcium levels also came back a little high, so we scheduled some fasted blood work for this past weekend. We took the opportunity to also bring Caligula in for his annual exam.

Caligula is doing well. He just turned 11 years old, and our only concern was some staining on his iris, which the vet took a look at and confirmed was just a pigmentation change that's common with aging. His blood work looks good, though it also shows some slightly elevated calcium levels.


Simcoe was taken in the back with the carrier, Caligula got the leash

We still have one follow-up call with Simcoe’s vet to chat about the calcium levels, but the vet on duty who delivered the results didn’t seem concerned since they’ve been elevated for some time and are just slightly above normal.

The only other current struggle is supplies. Following some quality control issues with one of the manufacturers, the Lactated Ringer's solution we give subcutaneously went through a period of severe shortage (article here). The market seems to be recovering, but we're now navigating a world of different bag manufacturers and canceled out-of-stock orders from our pharmacy. Hoping 2015 will be a better year with regard to this shortage; it wasn't only kitties who were impacted by this problem!

by pleia2 at December 24, 2014 01:21 AM

Jono Bacon

Happy Holidays

Just a quick note to wish all of you a happy, restful, and peaceful holidays, however and whoever you spend them with. Take care, folks, and I look forward to seeing you in 2015!

by jono at December 24, 2014 12:24 AM

December 22, 2014

Akkana Peck

Passwordless ssh with a key: the part most tutorials skip

I'm working on my Raspberry Pi crittercam again. I got a battery, so it can be a standalone box -- it was such a hassle to set it up with two power cords dangling from it at all times -- and set it up to run automatically at boot time.

But there was one aspect of the camera that wasn't automated: if it's close enough to the house to see the wi-fi router, I want it to mount a filesystem from our server and store its image files there. That makes it a lot easier to check on its progress, and also saves wear on the Pi's SD card.

Only one problem: I was using sshfs to mount the disk remotely, and ssh always prompts me for a password.

Now, there are a gazillion tutorials on how to set up an ssh key. Just do a web search for ssh key or passwordless ssh key. They vary a bit in their details, but they're all the same in the important aspects. They're all the same in one other detail: none of them work for me. I generate a new key (various types) with no pass phrase, I copy it to the server's authorized keys file (several different ways, two possible filenames), I try to ssh -- and I'm prompted for a password.

After much flailing I finally found out what was missing. In addition to those two steps, you need to modify your .ssh/config file to tell it which key to use. This is especially critical if you have multiple keys on the client machine, or if you've named the file anything but the default id_dsa or id_rsa.

So here are the real steps for making an ssh key. Assume the server, the machine to which you want to ssh, is named "myserver". But these steps are all run on the client machine, the one from which you want to run ssh.

ssh-keygen -t rsa -C "Comment"
When it prompts you for a filename, give it a full pathname, e.g. ~/.ssh/id_rsa_myserver. Type in a pass phrase, or hit return twice if you want to be able to ssh without a password.
ssh-copy-id -i .ssh/id_rsa_myserver user@myserver
You can omit the user@ if you're using the same username on both machines. You'll have to type in your password on myserver.
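If your system doesn't have ssh-copy-id, you can append the public key by hand. A minimal sketch, assuming the key filename from the previous step:

cat ~/.ssh/id_rsa_myserver.pub | ssh user@myserver 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys'

Either way, only the public .pub half of the key ever leaves the client machine; the private key file stays put.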

Then edit ~/.ssh/config, and add an entry like this:

Host myserver
  User my_username
  IdentityFile ~/.ssh/id_rsa_myserver
The User line is optional, and refers to your username on myserver if it's different from the one on the client. For instance, on the Raspberry Pi, everything has to run as root because most of the hardware and camera libraries can't work any other way. But I want it using my user ID on the server side, not root.
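Once the config entry is in place, a plain ssh myserver should log you in without a password, and anything built on top of ssh picks up the same settings. For instance, the sshfs mount that started all this might look something like (the paths here are hypothetical):

sshfs myserver:/data/crittercam /mnt/crittercam

Since sshfs runs ssh underneath, it reads ~/.ssh/config and finds the IdentityFile line, so the mount happens without a password prompt.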

Eliminating strict host key checking

Of course, you can use this to go the other way too, and ssh to your Pi without needing to type a password every time. If you do that, and if you have several Pis, Beaglebones, plug computers or other little Linux gizmos which sometimes share the same IP address, you may run into the annoying whine ssh is prone to:

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@    WARNING: REMOTE HOST IDENTIFICATION HAS CHANGED!     @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
IT IS POSSIBLE THAT SOMEONE IS DOING SOMETHING NASTY!
The only way to get around this once it happens is by editing ~/.ssh/known_hosts, finding the line corresponding to the pi, and removing it (or just removing the whole file).

You're supposed to be able to turn off this check with StrictHostKeyChecking no, but it doesn't work. Fortunately, there's a trick I discovered several years ago and discussed in Three SSH tips. Here's how the Pi entry ends up looking in my desktop's ~/.ssh/config:

Host pipi
  HostName pi
  User pi
  StrictHostKeyChecking no
  UserKnownHostsFile /dev/null
  IdentityFile ~/.ssh/id_pi
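For a quick one-off connection you can pass the same options on the ssh command line instead of adding a config entry:

ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null pi@pi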

December 22, 2014 11:25 PM

December 18, 2014

Akkana Peck

Firefox deprecates flash. How to get it back (on Debian).

Recently Firefox started refusing to run flash, including youtube videos (about the only flash I run). A bar would appear at the top of the page saying "This plug-in is vulnerable and should be upgraded". Apparently Adobe had another security bug. There's an "Update now" button in the Firefox bar, but it's a chimera: Firefox has never known how to install plug-ins for Linux (there are longstanding bugs filed on why it claims to be able to but can't), and it certainly doesn't know how to update a Debian package.

I use a Firefox downloaded from Mozilla.org, but flash from Debian's flashplugin-nonfree package. So I figured updating Debian -- apt-get update; apt-get dist-upgrade -- would fix it. Nope. I still got the same message.

A little googling found several pages recommending update-flashplugin-nonfree --install; I tried that but it didn't help either. It seemed to download a tarball, but as far as I could tell it never unpacked or installed the tarball it downloaded.

What finally did the trick was

apt-get install --reinstall flashplugin-nonfree
That downloaded a new tarball, AND unpacked and installed it. After restarting Firefox, I was able to view the video I'd been trying to watch.
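If you want to double-check what got installed before restarting the browser, the Debian helper script should be able to report on itself (I believe --status is a supported option, but check the man page on your system):

update-flashplugin-nonfree --status

That reports the installed and available Flash versions, so you can tell at a glance whether the reinstall actually picked up the new release.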

December 18, 2014 10:21 PM

December 17, 2014

Jono Bacon

The Impact of One Person

I am 35 years old and people never cease to surprise me. My trip home from Los Angeles today was a good example of this.

It was a tortuous affair that should have been a quick hop from LA to Oakland, popping on BART, and then getting home for a cup of tea and an episode of The Daily Show.

It didn’t work out like that.

My flight was delayed. Then we sat on the tarmac for an hour. Then the new AirBART train was delayed. Then I was delayed at the BART station in Oakland for 30 minutes. Throughout this I was tired, it was raining, and my patience was wearing thin.

Through the duration of this chain of minor annoyances, I was reading about the horrifying school attack in Pakistan. As I read more, related articles were linked with other stories of violence, aggression, and rape, perpetrated by the dregs of our species.

As anyone who knows me will likely testify, I am a generally pretty positive guy who sees the good in people. I have based my entire philosophy in life and focus in my career upon the core belief that people are good and that the solutions to our problems and the doors to opportunity are created by good people.

On some days though, even the strongest sense of belief in people can be tested when reading about events such as this dreadful act of violence in Pakistan. My seemingly normal trip home from the office in LA just left me disappointed in people.

While stood at the BART station I decided I had had enough and called an Uber. I just wanted to get home and see my family. This is when my mood changed entirely.

Gerald

A few minutes later, my Uber arrived, and I was picked up by an older gentleman called Gerald. He put my suitcase in the trunk of his car and off we went.

We started talking about the Pakistan shooting. We both shared a desperate sense of disbelief at all those innocent children slaughtered. We questioned how anyone with any sense of humanity and emotion could even think about doing that, let alone going through with it. With a somber air filling the car, Gerald switched gears and started talking about his family.

He told me about his two kids, both of whom are in their mid-thirties. He doted on their accomplishments in their careers, their sense of balance and integrity as people, and his three beautiful grandchildren.

He proudly shared that he had shipped his grandkids’ Christmas presents off to them today (they are on the East Coast) so he didn’t miss the big day. He was excited about the joy he hoped the gifts would bring to them. His tone and sentiment was one of happiness and pride.

We exchanged stories about our families, our plans for Christmas, and how lucky we both felt to love and be loved.

While we were generations apart…our age, our experiences, and our differences didn’t matter. We were just proud husbands and fathers who were cherishing the moments in life that were so important to both of us.

We arrived at my home and I told Gerald that until I stepped in his car I was having a pretty shitty trip home and he completely changed that. We shook hands, shared Christmas best wishes, and parted ways.

Good People

What I was expecting to be a typical Uber ride home with me exchanging a few pleasantries and then doing email on my phone, instead really illuminated what is important in life.

We live in a complex world. We live on a planet with a rich tapestry of people and perspectives.

Evil people do exist. I am not referring to a specific religious or spiritual definition of evil, but instead the extreme inverse of the good we see in others.

There are people who can hurt others, who can so violently shatter innocence and bring pain to hundreds, so brutally, and so unnecessarily. I can’t even imagine what the parents of those kids are going through right now.

It can be easy to focus on these tragedies and to think that our world is getting worse; to look at the full gamut of negative humanity, from the inconsequential, such as the miserable lady yelling at the staff at the airport, to the hateful, such as the violence directed at innocent children. It is easy to assume that our species is rotting from the inside out, to see poison in the well, and to fear that the rot is spreading.

While it is easy to lose faith in people, I believe our wider humanity keeps us on the right path.

While there is evil in the world, there is an abundance of good. For every evil person screaming there is a choir of good people who drown them out. These good people create good things, they create beautiful things that help others to also create good things and be good people too.

Like many of you, I am fortunate to see many of these things every day. I see people helping the elderly in their local communities, many donating toys to orphaned kids over the holidays, others creating technology and educational resources that help people to create new content, art, music, businesses, and more. Every day millions devote hours to helping and inspiring others to create a brighter future.

What is most important about all of this is that every individual, every person, every one of you reading this, has the opportunity to have this impact. These opportunities may be small and localized, or they may be large and international, but we can all leave this planet a little better than when we arrived on it.

The simplest way of doing this is to share our humanity with others and to cherish the good in the face of evil. The louder our choir, the weaker theirs.

Gerald did exactly that tonight. He shared happiness and opportunity with a random guy he picked up in his car and I felt I should pass that spirit on to you folks too. Now it is your turn.

Thanks for reading.

by jono at December 17, 2014 07:35 AM

December 16, 2014

Elizabeth Krumbach

Recent time between travel

This year has pretty much been consumed by travel and events. I’ll dive into that more in a wrap-up post in a couple weeks, but for now I’ll just note that it’s been tiring and I’ve worked to value my time at home as much as possible.

It's been uncharacteristically wet here in San Francisco since coming home from Jamaica. We're fortunate to have the rain since we're currently undergoing a pretty massive drought here in California, but I would have been happier if it hadn't all come at once! There was some flooding in our basement garage at the beginning (fortunately a leak was found and fixed) and we had possibly the first power outage since I moved here almost five years ago. Internet has had outages too, which could be a bit tedious work-wise even with a backup connection. All because of a few inches of rain that we'd not think anything of back in Pennsylvania, let alone during the kinds of winter storms I grew up with in Maine.

On Thanksgiving I got ambitious about my time at home and decided to actually make a full dinner. We'd typically either gone out or picked up prepared food somewhere, so this was quite a change from the norm. I skipped the full turkey and went with cutlets I prepared in a pan; the rest of the menu included the usual suspects: gravy, stuffing, mashed potatoes, cranberry sauce, green beans and rolls. I had leftovers for days. I also made MJ suffer with me through a Mystery Science Theater 3000 Turkey Day marathon, hah!

I've spent a lot of time catching up with project work in the past few weeks, following up on a number of my Xubuntu tasks and working through my Partimus backlog. Xubuntu-wise we're working on a few contributor incentives, so I'm receiving a box of Xubuntu stickers in the mail soon, courtesy of UnixStickers.com, which I'll be sending out to select QA contributors in the coming months. We're also working on a couple of polls that can give us a better idea of who our user base is and how to serve them better. I also spent an afternoon in Alameda recently to meet with an organization that Partimus may partner with, and met up with the Executive Director this past weekend for a board meeting where we identified some organizational work for the next quarter.

At home I've been organizing the condo, and I'm happy to report that the boxes are gone, though working from home means I still have too much stuff around all the time. MJ took some time to set up our shiny new PlayStation 4 and several antennas, so our TV now has channels and we can get AM and FM radio. I'll finally be able to watch baseball at home! I also got holiday cards sent out and some Hanukkah lights put up, so it's feeling quite comfortable here.

Having time at home has also meant I’ve been able to make time for friends who’ve come into town to visit lately. Laura Czajkowski, who I’ve worked with for years in the Ubuntu community, was recently in town and we met up for dinner. I also recently had dinner with my friend BJ, who I know from the Linux scene back in Philadelphia, though we’ve both moved since. Now I just need to make more time for my local friends.

The holiday season has afforded us some time to dress up and go out, like to a recent holiday party by MJ’s employer.

Plus I've had the typical things to keep me busy outside of work: an Ubuntu Hour and Debian Dinner last week, and the Ubuntu Weekly Newsletter, which will hit issue 400 early next year. There's also work on my book, which I wish were going faster, but it is coming along.

I have one more trip coming this year, off to St. Louis late next week. I'll be spending a few days visiting with friends and traveling around a city I've never been to! This trip will put me over 100k miles for the calendar year, which is a pretty big milestone for me, and one I'm not sure I'll reach again. Plans are still firming up for how my travel schedule will look next year, but I do have a couple big international trips on the horizon that I'm excited about.

by pleia2 at December 16, 2014 05:24 AM

December 12, 2014

Eric Hammond

Exploring The AWS Lambda Runtime Environment

In the AWS Lambda Shell Hack article, I present a crude hack that lets me run shell commands in the AWS Lambda environment to explore what might be available to Lambda functions running there.

I’ve added a wrapper that lets me type commands on my laptop and see the output of the command run in the Lambda function. This is not production quality software, but you can take a look at it in the alestic/lambdash GitHub repo.

For the curious, here are some results. Please note that this is running on a preview and is in no way a guaranteed part of the environment of a Lambda function. Amazon could change any of it at any time, so don’t build production code using this information.

The version of Amazon Linux:

$ lambdash cat /etc/issue
Amazon Linux AMI release 2014.03
Kernel \r on an \m

The kernel version:

$ lambdash uname -a
Linux ip-10-0-168-157 3.14.19-17.43.amzn1.x86_64 #1 SMP Wed Sep 17 22:14:52 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux

The working directory of the Lambda function:

$ lambdash pwd
/var/task

which contains the unzipped contents of the Lambda function I uploaded:

$ lambdash ls -l
total 12
-rw-rw-r-- 1 slicer 497 5195 Nov 18 05:52 lambdash.js
drwxrwxr-x 5 slicer 497 4096 Nov 18 05:52 node_modules

The user running the Lambda function:

$ lambdash id
uid=495(sbx_user1052) gid=494 groups=494

which is one of one hundred sbx_userNNNN users in /etc/passwd. “sbx_user” presumably stands for “sandbox user”.

The environment variables (in a shell subprocess). This appears to be how AWS Lambda is passing the AWS credentials to the Lambda function.

$ lambdash env
AWS_SESSION_TOKEN=[ELIDED]
LAMBDA_TASK_ROOT=/var/task
LAMBDA_CONSOLE_SOCKET=14
PATH=/usr/local/bin:/usr/bin:/bin
PWD=/var/task
AWS_SECRET_ACCESS_KEY=[ELIDED]
NODE_PATH=/var/runtime:/var/task:/var/runtime/node_modules
AWS_ACCESS_KEY_ID=[ELIDED]
SHLVL=1
LAMBDA_CONTROL_SOCKET=11
_=/usr/bin/env

The versions of various pre-installed software:

$ lambdash perl -v
This is perl 5, version 16, subversion 3 (v5.16.3) built for x86_64-linux-thread-multi
[...]

$ lambdash python --version
Python 2.6.9

$ lambdash node -v
v0.10.32

Running processes:

$ lambdash ps axu
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
493          1  0.2  0.7 1035300 27080 ?       Ssl  14:26   0:00 node --max-old-space-size=0 --max-new-space-size=0 --max-executable-size=0 /var/runtime/node_modules/.bin/awslambda
493         13  0.0  0.0  13444  1084 ?        R    14:29   0:00 ps axu

The entire file system: 2.5 MB download

$ lambdash ls -laiR /
[click link above to download]

Kernel ring buffer: 34K download

$ lambdash dmesg
[click link above to download]

CPU info:

$ lambdash cat /proc/cpuinfo
processor   : 0
vendor_id   : GenuineIntel
cpu family  : 6
model       : 62
model name  : Intel(R) Xeon(R) CPU E5-2680 v2 @ 2.80GHz
stepping    : 4
microcode   : 0x416
cpu MHz     : 2800.110
cache size  : 25600 KB
physical id : 0
siblings    : 2
core id     : 0
cpu cores   : 1
apicid      : 0
initial apicid  : 0
fpu     : yes
fpu_exception   : yes
cpuid level : 13
wp      : yes
flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx rdtscp lm constant_tsc rep_good nopl xtopology eagerfpu pni pclmulqdq ssse3 cx16 pcid sse4_1 sse4_2 x2apic popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm xsaveopt fsgsbase smep erms
bogomips    : 5600.22
clflush size    : 64
cache_alignment : 64
address sizes   : 46 bits physical, 48 bits virtual
power management:

processor   : 1
vendor_id   : GenuineIntel
[...]

Installed nodejs modules:

$ dirs=$(lambdash 'echo $NODE_PATH' | tr ':' '\n' | sort)
$ echo $dirs
/var/runtime /var/runtime/node_modules /var/task

$ lambdash 'for dir in '$dirs'; do echo $dir; ls -1 $dir; echo; done'
/var/runtime
node_modules

/var/runtime/node_modules
aws-sdk
awslambda
dynamodb-doc
imagemagick

/var/task # Uploaded in Lambda function ZIP file
lambdash.js
node_modules

[Update 2014-12-03]

We’re probably not on a bare EC2 instance. The standard EC2 instance metadata service is not accessible through HTTP:

$ lambdash curl -sS http://169.254.169.254:8000/latest/meta-data/instance-type
curl: (7) Failed to connect to 169.254.169.254 port 8000: Connection refused

Browsing the AWS Lambda environment source code turns up some nice hints about where the product might be heading. I won’t paste the copyrighted code here, but you can download into an “awslambda” subdirectory with:

$ lambdash 'cd /var/runtime/node_modules;tar c awslambda' | tar xv

[Update 2014-12-11]

There's a half gig of writable disk space available under /tmp (when run with 256MB of RAM; does this scale up with memory?)

$ lambdash 'df -h 2>/dev/null'
Filesystem      Size  Used Avail Use% Mounted on
/dev/xvda1       30G  1.9G   28G   7% /
devtmpfs         30G  1.9G   28G   7% /dev
/dev/xvda1       30G  1.9G   28G   7% /
/dev/loop0      526M  832K  514M   1% /tmp

Anything else you’d like to see? Suggest commands in the comments on this article.

Original article: http://alestic.com/2014/11/aws-lambda-environment

by Eric Hammond at December 12, 2014 05:07 AM

December 10, 2014

Akkana Peck

Not exponential after all

We're saved! From the embarrassing slogan "Live exponentially", that is.

Last night the Los Alamos city council voted to bow to public opinion and reconsider the contract to spend $50,000 on a logo and brand strategy based around the slogan "Live Exponentially." Though nearly all the councilors (besides Pete Sheehey) said they still liked the slogan, and made it clear that the slogan isn't for residents but for people in distant states who might consider visiting as tourists, they now felt that basing a campaign around a theme nearly all of the residents revile was not the best idea.

There were quite a few public comments (mine included); everyone was civil and sensible and stuck well under the recommended 3-minute time limit.

Instead, the plan is to go ahead with the contract, but ask the ad agency (Atlas Services) to choose two of the alternate straplines from the initial list of eight that North Star Research had originally provided.

Wait -- eight options? How come none of the previous press or the previous meeting mentioned that there were options? Even in the 364 page Agenda Packets PDF provided for this meeting, there was no hint of that report or of any alternate straplines.

But when they displayed the list of eight on the board, it became a little clearer why they didn't want to make the report public: they were embarrassed to have paid for work of this quality. Check out the list:

  • Where Everything is Elevated
  • High Intelligence in the High Desert
  • Think Bigger. Live Brighter.
  • Great. Beyond.
  • Live Exponentially
  • Absolutely Brilliant
  • Get to a Higher Plane
  • Never Stop Questioning What's Possible

I mean, really. Great Beyond? Are we all dead? High Intelligence in the High Desert? That'll certainly help with people who think this might be a bunch of snobbish intellectuals.

It was also revealed that at no point during the planning process was there ever any sort of focus group study or other testing to see how anyone reacted to any of these slogans.

Anyway, after a complex series of motions and amendments and counter-motions and amendments and amendments to the amendments, they finally decided to ask Atlas to take the above list, minus "Live Exponentially"; add the slogan currently displayed on the rocks as you drive into town, "Where Discoveries are Made" (which came out of a community contest years ago and is very popular among residents); and ask Atlas to choose two from the list to make logos, plus one logo that has no slogan at all attached to it.

If we're lucky, Atlas will pick Discoveries as one of the slogans, or maybe even come up with something decent of their own.

The chicken ordinance discussion went well, too. They amended the ordinance to allow ten chickens (instead of six) and to try to allow people in duplexes and quads to keep chickens if there's enough space between the chickens and their neighbors. One commenter asked for the "non-commercial" clause to be struck because his kids sell eggs from a stand, like lemonade, which sounded like a very reasonable request (nobody's going to run a large commercial egg ranch with ten chickens); but it turned out there's a state law requiring permits and inspections to sell eggs.

So, folks can have chickens, and we won't have to live exponentially. I'm sure everyone's breathing a little more easily now.

December 10, 2014 11:27 PM

December 09, 2014

Eric Hammond

Persistence Of The AWS Lambda Environment Between Function Invocations

AWS Lambda functions are run inside of an Amazon Linux environment (presumably a container of some sort). Sequential calls to the same Lambda function could hit the same or different instantiations of the environment.

If you hit the same copy (I don’t want to say “instance”) of the Lambda function, then stuff you left in the environment from a previous run might still be available.

This could be useful (think caching) or hurtful (if your code incorrectly expects a fresh start every run).

Here’s an example using lambdash, a hack I wrote that sends shell commands to a Lambda function to be run in the AWS Lambda environment, with stdout/stderr being sent back through S3 and displayed locally.

$ lambdash 'echo a $(date) >> /tmp/run.log; cat /tmp/run.log'
a Tue Dec 9 13:54:50 PST 2014

$ lambdash 'echo b $(date) >> /tmp/run.log; cat /tmp/run.log'
a Tue Dec 9 13:54:50 PST 2014
b Tue Dec 9 13:55:00 PST 2014

$ lambdash 'echo c $(date) >> /tmp/run.log; cat /tmp/run.log'
a Tue Dec 9 13:54:50 PST 2014
b Tue Dec 9 13:55:00 PST 2014
c Tue Dec 9 13:55:20 PST 2014

As you can see in this example, the file in /tmp contains content from previous runs.
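You could exploit this deliberately for opportunistic caching. A crude sketch through lambdash (the data file and URL here are made up):

$ lambdash '[ -f /tmp/lookup.dat ] && echo cache hit || { curl -sS -o /tmp/lookup.dat http://example.com/lookup.dat && echo cache miss; }'

On a reused environment, later invocations print "cache hit" and skip the download; on a fresh environment you pay the fetch again. That's why this pattern only works as an optimization, never for correctness.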

These tests are being run in AWS Lambda Preview, and should not be depended on for long term or production plans. Amazon could change how AWS Lambda works at any time for any reason, especially when the behaviors are not documented as part of the interface. For example, Amazon could decide to clear out writable file system areas like /tmp after each run.

If you want to have a dependable storage that can be shared among multiple copies of an AWS Lambda function, consider using standard AWS services like DynamoDB, RDS, ElastiCache, S3, etc.

Original article: http://alestic.com/2014/12/aws-lambda-persistence

by Eric Hammond at December 09, 2014 10:07 PM

December 08, 2014

Eric Hammond

AWS Lambda: Pay The Same Price For Faster Execution

multiply the speed of compute-intensive Lambda functions without (much) increase in cost

Given:

  • AWS Lambda duration charges are proportional to the requested memory.

  • The CPU power, network, and disk are proportional to the requested memory.

One could conclude that the charges are proportional to the CPU power available to the Lambda function. If the function completion time is inversely proportional to the CPU power allocated (not entirely true), then the cost remains roughly fixed as you dial up power to make it faster.

If your Lambda function is primarily CPU bound and takes at least several hundred ms to execute, then you may find that you can simply allocate more CPU by allocating more memory, and get the same functionality completed in a shorter time period for about the same cost.

For example, if you allocate 128 MB of memory and your Lambda function takes 10 seconds to run, then you might be able to allocate 640 MB and see it complete in about 2 seconds.

At current AWS Lambda pricing, both of these would cost about $0.02 per thousand invocations, but the second one completes five times faster.
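To make the arithmetic concrete, here's a quick sketch of the cost math using awk; the $0.00001667 per GB-second figure is the published duration price as I write this, so verify current pricing before relying on the output:

$ awk 'BEGIN {
  price = 0.00001667                # USD per GB-second of duration
  for (mem = 128; mem <= 1024; mem *= 2) {
    secs = 10 * 128 / mem           # assume run time scales inversely with memory
    gbs = mem / 1024 * secs         # GB-seconds billed per invocation
    printf "%4d MB x %5.2f s = %.2f GB-s -> $%.4f per 1000 runs\n", mem, secs, gbs, gbs * price * 1000
  }
}'
 128 MB x 10.00 s = 1.25 GB-s -> $0.0208 per 1000 runs
 256 MB x  5.00 s = 1.25 GB-s -> $0.0208 per 1000 runs
 512 MB x  2.50 s = 1.25 GB-s -> $0.0208 per 1000 runs
1024 MB x  1.25 s = 1.25 GB-s -> $0.0208 per 1000 runs

Under the idealized inverse-scaling assumption, every row bills the same 1.25 GB-seconds: more memory buys speed, not cost. (The flat per-request charge is the same in every case and omitted here.)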

Things that would cause the higher memory/CPU option to cost more in total include:

  • Time chunks are rounded up to the nearest 100 ms. If your Lambda function already runs near or under that at a lower memory setting, then increasing the CPU allocated will make it return faster, but the rounding up will make the result more expensive.

  • Doubling the CPU allocated to a Lambda function does not necessarily cut the run time in half. The code might be accessing external resources (e.g., calling S3 APIs) or interacting with disk. If you double the requested CPU, then those fixed time actions will end up costing twice as much.

If you have a slow Lambda function, and it seems that most of its time is probably spent in CPU activities, then it might be worth testing an increase in requested memory to see if you can get it to complete much faster without increasing the cost by much.

I’d love to hear what practical test results people find when comparing different memory/CPU allocation values for the same Lambda function.

Original article: http://alestic.com/2014/11/aws-lambda-speed

by Eric Hammond at December 08, 2014 05:54 PM

Elizabeth Krumbach

My father passed away 10 years ago

It’s December 7th, which marks 10 years since my father passed away. In the past decade I’ve had much to reflect on about his life.

When he passed away I was 23 and had bought a house in the suburbs of Philadelphia. I had just transitioned from doing web development contract work to working various temp jobs to pay the bills. It was one of those temp jobs that I went to the morning after I learned my father had passed, because I didn't know what else to do; I quickly learned why people tend to take a few days off when they have such a loss. The distance from home made it challenging to work through the loss, as can be seen in my blog post from the week it happened; I felt pretty rudderless.

My father had been an inspiration for me. He was always making things, and had a wood workshop where he'd build dollhouses, model planes, and even a stable for my My Little Ponies. He was also a devout Tolkien fan, making The Hobbit a more familiar story for me growing up than Noah's Ark. I first saw and fell in love with Star Wars because he was a big scifi fan. My passion for technology was sparked when his brother at IBM shipped us our first computer and he told me stories about talking to people from around the world on his HAM radios. He was also an artist, with his drawings of horses being among my favorites growing up. Quite the Renaissance man. Just this year, when my grandmother passed, I was honored to receive several of his favorite things that she had kept, including a painting that hung in our house growing up, a video of his time at college and photos that highlighted his love of travel.

He was also very hard on me. Every time I excelled, he pushed harder. Unfortunately it felt like I could never do well enough, when in fact I now believe he pushed me for my own good; I could usually take it and I'm ultimately better for it. I know he was also supremely disappointed that I never went to college, something that was very important to him. This all took me some time to reconcile, but deep down I know my father loved my sisters and me very much, and regardless of what we accomplished I'm sure he'd be proud of all of us.

And he struggled with alcoholism. It's something I've tended to gloss over in most public discussions about him because it's so painful. It's had a major impact on my life; I'm pretty much as textbook an example of "eldest child of an alcoholic" as you can get. It also tore apart my family and inevitably led to my father's death from cirrhosis of the liver. For a long time I was angry with him. Why couldn't he give it up for his family? Not even to save his own life? I've since come to understand that alcoholism is a terrible, destructive thing and for many people it's a lifelong battle that requires a tremendous amount of support from family and community. While I may have gotten the genetic fun bag of dyslexia, migraines and seizures from my father, I'm routinely thankful I didn't inherit the predisposition toward alcoholism.

And so, on this sad anniversary, I won't be having a drink to his life. Instead I think I'll honor his memory by spending the evening working on one of the many projects that his legacy inspired and that bring me so much joy. I love you, Daddy.

by pleia2 at December 08, 2014 01:49 AM

Akkana Peck

My Letter to the Editor: Make Your Voice Heard On 'Live Exponentially'

More on the Los Alamos "Live Exponentially" slogan saga: There's been a flurry of letters, all opposed to the proposed slogan, in the Los Alamos Daily Post these last few weeks.

And now the issue is back on the council agenda; apparently they're willing to reconsider the October vote to spend another $50,000 on the slogan.

But considering that only two people showed up to that October meeting, I wrote a letter to the Post urging people to speak before the council: Letter to the Editor: Attend Tuesday's Council Meeting To Make Your Voice Heard On 'Live Exponentially'.

I'll be there. I've never actually spoken at a council meeting before, but hey, confidence in public speaking situations is what Toastmasters is all about, right?

(Even though it means I'll have to miss an interesting sounding talk on bats that conflicts with the council meeting. Darn it!)

A few followup details that I had no easy way to put into the Post letter:

The page with the links to Council meeting agendas and packets is here: Los Alamos County Calendar.

There, you can get the short Agenda for Tuesday's meeting, or the full 364 page Agenda Packets PDF.

The branding section covers pages 93 - 287. But the graphics the council apparently found so compelling, which swayed several of them from initially not liking the slogan to deciding to spend a quarter million dollars on it, are in the final presentation from the marketing company, starting on p. 221 of the PDF.

In particular, a series of images like this one, with the snappy slogan:

Breathtaking raised to the power of you
LIVE EXPONENTIALLY

That's right: the advertising graphics that were so compelling they swayed most of the council are even dumber than the slogan by itself. Love the superscript on the you that makes it into an exponent. Get it ... exponentially? Oh, now it all makes sense!

There's also a sadly funny "Written Concept" section just before the graphics (pages 242- in the PDF) where they bend over backward to work in scientific-sounding words, in bold each time.

But there you go. Hopefully some of those Post letter writers will come to the meeting and let the council know what they think.

The council will also be discussing the much debated proposed chicken ordinance; that discussion runs from page 57 to 92 of the PDF. It's a non-issue for Dave and me since we're in a rural zone that already allows chickens, but I hope they vote to allow them everywhere.

December 08, 2014 01:05 AM

December 03, 2014

Elizabeth Krumbach

December 2014 OpenStack Infrastructure User Manual Sprint

Back in April, the OpenStack Infrastructure project created the Infrastructure User Manual. This manual sought to consolidate our existing documentation for Developers, Core Reviewers and Project Drivers, which was spread across wiki pages, project-specific documentation files and general institutional knowledge that was mostly just in our brains.

Books

In July, at our mid-cycle sprint, Anita Kuno drove a push to start getting this document populated. There was some success here; we had a couple of new contributors. Unfortunately, after the mid-cycle, reviews only trickled in and vast segments of the manual remained empty.

At the summit, we had a session to plan out how to change this and announced an online sprint in the new #openstack-sprint channel (see here for scheduling: https://wiki.openstack.org/wiki/VirtualSprints). We hosted the sprint on Monday and Tuesday of this week.

Over these 2 days we collaborated on an etherpad so no one was duplicating work and we all did a lot of reviewing. Contributors worked to flesh out missing pieces of the guide and added a Project Creator’s section to the manual.

We're now happy to report that, with the exception of the Third Party section of the manual (to be worked on collaboratively with the broader Third Party community at a later date), our manual is looking great!

The following are some stats about our sprint gleaned from Gerrit and Stackalytics:

Sprint start:

  • Patches open for review: 10
  • Patches merged in total repo history: 13

Sprint end:

  • Patches open for review: 3, plus 2 WIP (source)
  • Patches merged during sprint: 30 (source)
  • Reviews: Over 200 (source)

We also have 16 patches for documentation in flight that were initiated or reviewed elsewhere in the openstack-infra project during this sprint, including the important reorganization of the git-review documentation (source).

Finally, thanks to the participants who joined me for this sprint, sorted chronologically by reviews: Andreas Jaeger, James E. Blair, Anita Kuno, Clark Boylan, Spencer Krum, Jeremy Stanley, Doug Hellmann, Khai Do, Antoine Musso, Stefano Maffulli, Thierry Carrez and Yolanda Robla.

by pleia2 at December 03, 2014 04:30 PM