Planet Ubuntu California

May 23, 2017

Akkana Peck

Python help from the shell -- greppable and saveable

I'm working on a project involving PyQt5 (on which, more later). One of the problems is that there's not much online documentation, and it's hard to find out details like what signals (events) each widget offers.

Like most Python packages, PyQt5 has inline help in the source, which means that in the Python console you can say something like

>>> from PyQt5.QtWebEngineWidgets import QWebEngineView
>>> help(QWebEngineView)

The problem is that it's ordered alphabetically; if you want a list of signals, you need to read through all the objects and methods the class offers to look for a few one-liners that include "unbound PYQT_SIGNAL".

If only there was a way to take help(CLASSNAME) and pipe it through grep!

A web search revealed that plenty of other people have wished for this, but I didn't see any solutions. But when I tried running python -c "help(list)" it worked fine -- help isn't dependent on the console.

That means that you should be able to do something like

python -c "from sys import exit; help(exit)"

Sure enough, that worked too.
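
And since it's printing to standard output like any other command, you can pipe it through grep; for instance, to pick out the append-related entries in list's help:

python -c "help(list)" | grep append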

From there it was only a matter of setting up a zsh function to save on complicated typing. I set up separate aliases for python2, python3 and whatever the default python is. You can get help on builtins (pythonhelp list) or on objects in modules (pythonhelp sys.exit). The zsh suffixes :r (remove extension) and :e (extension) came in handy for separating the module name, before the last dot, and the class name, after the dot.
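
Here's a quick demonstration of those modifiers, which work on any string containing a dot:

f=PyQt5.QtWebEngineWidgets.QWebEngineView
echo $f:r    # prints PyQt5.QtWebEngineWidgets -- everything before the last dot
echo $f:e    # prints QWebEngineView -- the part after the last dot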

#############################################################
# Python help functions. Get help on a Python class in a
# format that can be piped through grep, redirected to a file, etc.
# Usage: pythonhelp [module.]class [module.]class ...
pythonXhelp() {
    python=$1       # which Python interpreter to run (python, python2, python3...)
    shift
    for f in "$@"; do
        if [[ $f =~ '.*\..*' ]]; then
            # module.object: split on the last dot, import, then ask for help
            module=$f:r
            obj=$f:e
            s="from ${module} import ${obj}; help($obj)"
        else
            # no dot: assume a builtin, no import needed
            module=''
            obj=$f
            s="help($obj)"
        fi
        $python -c $s
    done
}
alias pythonhelp="pythonXhelp python"
alias python2help="pythonXhelp python2"
alias python3help="pythonXhelp python3"

So now I can type

python3help PyQt5.QtWebEngineWidgets.QWebEngineView | grep PYQT_SIGNAL
and get that list of signals I wanted.

May 23, 2017 08:12 PM

May 21, 2017

Elizabeth Krumbach

DevOps Days in Salt Lake City 2017

I was in Salt Lake City for OpenStack Days Mountain West back in December; it was the first time I’d ever been to SLC and I certainly didn’t expect to return so quickly. Still, back in early March one of the organizers for Salt Lake City DevOps Days reached out to me and asked if I’d be interested in giving a keynote for the event. After some brain-racking as to an appropriate topic, I happily agreed to join them to talk about “The Open Sourcing of Infrastructure”, which is part history lesson and part learn-from-history lesson.

But before I talk about that, let me say a few words about Salt Lake City. When I was there in December I didn’t have a great opportunity to really take in how beautiful it was there. I walked around Temple Square and admired the Christmas lights and buildings, acknowledged the mountains, but my heart was elsewhere as I worked through a difficult time. This time I was in a better place. As I rode past the city and into South Jordan, UT where the conference was being held, I really got to check out the scenery. The whole area is surrounded by mountains, which were snow-capped even in May. It’s really something to wake up to, and be reminded of every time you look out the window. Beautiful mountains, right there!

The conference itself was held at Noah’s Event Venue, a great space that easily accommodated the 400 attendees, with a large auditorium on the ground floor, and several rooms throughout the space for open spaces and workshops in the afternoon. The sponsor room could have been bigger, it was a bit overwhelming crowd-wise when I ventured in a couple times and the sponsors were squished in pretty close to each other. Everything else went well though, the lunch lines moved quickly, the outdoor-ish space where we ate gave us a lovely view of the mountains (and was even better when they brought in some heaters the evening of the second day!).

This is the second year of this conference, and last year they established a tradition of having a stuffed animal mascot. Last year it was a unicorn and this year it was a Yak (a la yak shaving). Obviously I had to get my picture taken with the both of them. They also sat up there on the lectern during my talk, hooray!

Talk-wise, there were a few that stood out for me. The first was the opening keynote for the event. They brought in Ross Clanton, formerly of Target, but now at Verizon. I had the opportunity to meet and chat with him and the closing keynote speaker, Gwen Dobson, at the speaker dinner prior to the event. As we figuratively compared prep-for-our-keynote notes prior to the conference, I was certainly eager to hear from both of them.

Ross began his talk by giving some DevOps methodology background, but the meat of what was interesting to me was the strategies used inside of Target and Verizon to really drive the DevOps model. Executive buy-in was essential, but from there you also need management to take training seriously, in several forms. You don’t teach an organization to adopt DevOps by reading a book and expecting an overnight transformation. Instead, you need varied methods of moving the organization forward and celebrating wins; he suggests:

  • Encourage collaborative learning environments where peers teach peers as much as instructors do, and it’s OK to fail and ask questions
  • Run internal DevOps days, bring in a couple outside speakers but also internal folks who have expertise and stories to share
  • Host six-week immersion engagements (“Learning Dojos”) where teams work on their actual backlog using DevOps strategies and have the freedom to learn and ask questions, while solving real problems, not examples created by instructors
  • Gamification of team progress, where teams get points for various DevOps skills and capabilities they’ve started incorporating into their work and are rewarded (Verizon has the DevOps Cup, like the Stanley Cup, awarded each year!)
  • Even if you aren’t winning a DevOps cup, make sure management knows how important it is that they acknowledge and celebrate any positive progress made toward the adoption of DevOps principles
  • Don’t fight people who resist change in your organization, instead do awesome things with your allies, make progress, and most of the nay-sayers will join you eventually

Later that morning we heard from Rob Richardson on “CI/CD on the Microsoft Stack.” This was interesting to me because in spite of my own aversion to proprietary software, I do understand that CI is important for the entire software industry, and I had been remiss in never looking into what is available for developers on Windows doing .NET programming. His talk walked the audience through setting up a CI/CD pipeline using TeamCity for CI hooked into Octopus Deploy for CD (note: both proprietary), tools that support Microsoft-focused environments, and specifically .NET in the case of Octopus Deploy.

Now, I won’t say that this is immediately valuable to me in a practical sense, since I don’t use any of these tools and am uncomfortable building infrastructure tooling around proprietary solutions anyway, but I was appreciative of the broadening of horizons. I learned that there are easy CI/CD options for folks working in the Microsoft world, and adoption of them by people outside of my open source bubble will make the software world better for all of us.

That evening I met up with a couple colleagues from Mesosphere who were attending the conference! Sam Agnew works in sales and joined us from his home base of Denver to meet with folks at the conference and Tim Harper works in engineering on Marathon remotely from a city just south of SLC. After the evening social at the event venue, we all went out to enjoy a nice meal of Mexican food and some drinks. They’re both super friendly and easy to talk to, so it was fun to get to know them a bit. I also found great value in chatting with them about Mesosphere and DC/OS, they believe in the company and products as much as I do, but don’t have the Silicon Valley slant on their opinions and observations about where we’re going.

The second day I gave the opening keynote. During this talk I guided the audience through the past couple decades of infrastructure with an eye on the shift from proprietary to open source software. From there I focused on what we’re open sourcing on the operations side today, and things to consider as we once again become dependent on proprietary technologies, even if they are “in the cloud” this time.

I stopped short of flat out telling people not to use proprietary tooling, or to never consider building their applications into proprietary, hosted APIs and tool kits. A lot of companies successfully do this, and a lot of the sponsors at the event make their money by providing hosted products that make sense for them. Instead I implored them to consider their choices carefully, and provided a list of things to think about, including the risk of vendor lock-in, price increases, security and reliability concerns, and understanding if/how your (and your customers’!) data will be used by the vendor. Looking back, these were the same things we asked ourselves a decade ago when we shifted to using Linux as the infrastructure platform of choice. Slides from my talk are up here (PDF).


Thanks to Sam Agnew for taking a picture during my talk!

The final talk that really stood out for me came from Rob Treat who spoke on “Production Testing through Monitoring.” There is a lot of focus in the DevOps world around CI with testing, but the truth is you’ll never find all bugs through testing. He shared a handful of funny (but serious!) examples where once in production, users did things that the developers never thought of that caused serious production issues. This wasn’t because they weren’t testing, but instead because our imagination will simply never come up with every potential use, or misuse, of the software we’re building. This is where monitoring and metrics become essential.

As someone with an operations background who really likes monitoring (I run Nagios at home), this seemed obvious to me, but he took it one step further to make it something worth noting: You don’t just monitor basic things like CPU heat, processes running and return codes (in fact, you might be tracking too much of this kind of stuff), you also track things that make sense for your particular business. This returned me to the talk by Jeffery Smith at the DevOps Days in Seattle where he stressed the importance of IT actually understanding the business.

Rob demonstrated by walking us through an example of using metrics as they tried to figure out why traffic and sales were lower than normal for a couple days. After looking through a bunch of technical reasons, they finally overlaid email bounce statistics over the data and learned that for a couple days, bounces were higher than normal. Since much of the company’s sales traffic is driven by these emails, that caused a clear problem on those days. Having the data to draw that conclusion was vital, but they wouldn’t have known to collect that data if they hadn’t been tuned into how the company drives sales and the fact that tracking something like email bounces would be valuable.

Huge thanks to the organizers of this event. They did a great job making us feel welcome and making sure we had everything we needed. As speakers we also got amazing Utah-themed gift baskets which they graciously offered to ship to us (couldn’t bring it on the plane due to liquids involved, and I didn’t check a bag). The attendees were great too, everyone I spoke to was very friendly, even after they found out what strong feelings I have about using open source and open standards, hah!

More photos from this event here: https://www.flickr.com/photos/pleia2/albums/72157681808549041

by pleia2 at May 21, 2017 07:05 PM

May 19, 2017

Elizabeth Krumbach

Outdoor Caligula, trains, MST3K and eateries

Back when I lived in a house in Schwenksville, Pennsylvania, I would often bring Caligula outside with me in the warmer months to work in the garden or just generally relax outside. He had a 50 foot lead that allowed him to explore, but not get close to the road or into the poison ivy-ridden woods. He enjoyed these visits to the outdoors, chasing chipmunks and laying in the grass in the sun. Simcoe was less interested in outdoor time, in spite of numerous attempts, she was always a bit too afraid and didn’t like wearing a harness.


Young Caligula, gardening in Pennsylvania

Fast forward to today. Caligula has been living in a high rise in downtown San Francisco for over seven years! We haven’t brought him out during all this time. I’d loosely mention taking Caligula out to a park here and there, but Simcoe didn’t like being left alone and she’d often react badly when we brought Caligula home from the vet (hissing, growling, for days!). And I figured she still wouldn’t be interested in coming along for the outdoor adventures. Now that we have just Caligula, it was time to revisit outdoor adventure plans. This past weekend we brought him to Golden Gate Park, where we found a quiet patch of grass not too close to anyone else and enjoyed some food (picked up from a Mexican food truck) as Caligula wandered around on a short leash.

We weren’t sure what to expect. I’d never brought him to a public park before, and I’m sure the car ride over wasn’t his favorite thing, but he loved it. My often lazy cat spent the hour and a half there wandering around our blanket, and then dragging me around so he could explore further.


Caligula in Golden Gate Park

Eventually we rounded off our day as the wind picked up and it got a bit cooler, but I’m really happy that he had such a nice time. I know I’ve been pretty down since losing Simcoe, and I think he’s really missed having his snuggle buddy. It was a good way to cheer all of us up.

I’ve mentioned that 2017 has been a tricky year for me, but I’ve started to feel better. Instead of spending so much non-work, non-traveling time watching TV, I’ve transitioned back into reading. My interest in other hobbies has picked up too; I’ve started moving away from so much computer work and decided to get more serious about my interest in model trains. When I was in Philadelphia last time I picked up a starter train set at a toy show, and I’ve now started to refresh my memory on some of the other basics. I subscribed to Model Railroader magazine, and am now somewhat overwhelmed with how much opportunity there is to learn and explore. I’m also struck by the fact that hobby-wise I’ve mostly focused on digital and outward-focused projects. This will be one of the first that gets me back to hardware, but it quickly occurred to me that it can be pulled into a bunch of the electronics projects I’ve idly wondered about over the years. Arduinos and sound-activated controls for a model railroad set? It’s totally going to be a thing!

Increasing the scale, we decided to go back to Philadelphia over the week of Memorial Day. As we were musing about travel, my interest in trains distracted me into talking about cross-country railroad trips and MJ seriously suggested we finally do it for this trip. After geeking out over routes for a couple hours, MJ secured tickets for us on the California Zephyr, which we’ll take for its entire length, from Emeryville to Chicago, in one of the bedroom compartments. From there we’re taking the Capitol Limited to Washington DC in a Roomette and then the Northeast Regional to Philadelphia in Business Class seats. How long does this trip take, you ask? We’re leaving from San Francisco’s temporary TransBay Terminal at 7:50AM on Friday the 26th and arriving in Philadelphia at 5:15PM on Monday the 29th. From there we’re taking the SEPTA regional rail from 30th Street Station in Philadelphia up to Trevose, where the train drops us just over a mile from our townhouse. So it takes a long time, and the train is not cheaper. Traveling how we are, in the bedroom and roomette, is actually considerably more expensive than flying. For us, it’s all about the experience. I’ve not seen much of the center of the country, and there are beautiful places I’m missing out on. Taking a train through it over the course of a few days is a pretty exciting proposal; I’m really looking forward to it.

With all this train stuff, I realized over the past year how much more adventurous I’ve gotten with rail-based public transit. I’m slowly starting to default to it where it makes sense time-wise, and sad about missed opportunities to take it in the past.

I also recently finished reading Train by Tom Zoellner. He takes several journeys on train lines all around the world, and weaves a tale that blends his experience on these routes, conversations he has with fellow train passengers, and a hefty dose of history about each line, and those which are naturally related to it in some way. It was a beautifully written book, and made me even more excited about our cross-country journey! I recently finished the audiobook for Ringworld. I read the book years ago, but never really got into the series. I decided this time around to buy the series as audiobooks and start making my way through them. I got an audiobook of If the Oceans Were Ink: An Unlikely Friendship and a Journey to the Heart of the Quran which has so far been incredibly engaging. Back to the pages, I’ve been reading Madeleine L’Engle’s The Arm of the Starfish and my second book by Brene Brown, I Thought It Was Just Me (But It Isn’t).

But OK, I’m not just spending lots of wholesome time reading. The new season of Mystery Science Theater 3000 (MST3K) came out several weeks ago and I’ve been doing my best not to binge watch. I slowly made my way up to 1105, the episode that has my name in the credits because of the Kickstarter campaign. I then went through the next few pretty quickly, they’re just so good! And MST3K has been an important part of my life since I discovered it in the late 90s on the SciFi channel. I don’t remember how I found it, I must have just stumbled upon it in my general watching of the SciFi channel. It’s what made me join my first IRC server to chat with fellow fans. It was there that I met my ex-husband, who introduced me to Linux, and I dove into IRC client scripting and creating websites. Later I helped a pile of fellow fans run an MST3K fan site, which was tricky after the show stopped airing, but gave me my first experience scouring the internet for stories, which I later used in my work on the Ubuntu Weekly Newsletter.

I had my doubts about a reboot of the series, on the one hand we had many of the original cast and crew members participating, but on the other they suddenly had big names and cameos being announced as part of the project, and there was a real risk of the show getting more serious than I would enjoy. Thankfully, my fears were not realized. The show is just as silly and campy as it ever was. They didn’t let a budget or big names go to their head, it has the feel and jokes that I came to expect from MST3K.

At home things are chugging along. As I write this on an early Friday morning before work Caligula is in super snuggle mode and is curled up against me. He’s been like this since we lost Simcoe. We think he’s lonely, as my trip to SLC this week didn’t leave him the happiest (MJ was at work all day). There is a temptation to get him a new kitten friend, but every time I think about it I get sad and realize I’m not ready for it. Plus with all my travel lately I don’t really have the time to train a new kitten, who will have claws.

Speaking locally, this past month we’ve seen the closing of two Italian establishments in our area. A.G. Ferrari has closed all Bay Area locations. It’s a shame, that was my go-to spot for fresh Parmesan cheese and Italian bread. Umbria, my favorite Italian restaurant in the city, and conveniently on our block, has closed. We made our way down there on their final night, finding ourselves in the midst of other random diners, as well as family and friends wishing the owner a fond farewell. There were speeches, stories, hugs, and tears, which we were included in. Thankfully this is not the end of the story for them! They’re moving up to Glen Ellen in Sonoma, with progress being tracked on their #WheresGiulio website. We’ll have to visit when they finally open, but I’ll really miss having such a great local place.

We’ve also been carving out bits of our weekend to actually catch up on boring adult things. Our dining area has always been a den of chaos, and I’ve finally started tackling that by picking up a new piece of Ikea furniture so we have a place to pack things into. The chaos still mostly exists, but it’s starting to be tamed and some things are now put away, hooray!

I think this weekend will be a stay in one. I have a ton to do here before I depart for two weeks. And a busy work week is on the horizon with attendance at DevXCon on Monday and a journey (ferry + car service) up to Napa on Wednesday to speak at a conference on Thursday. Then the rise-with-the-sun trek over to the TransBay terminal Friday morning to catch that train across the country. It’s all exciting stuff though, I wouldn’t trade next week for a boring one even if I could.

by pleia2 at May 19, 2017 03:47 PM

Jono Bacon

Google Home: An Insight Into a 4-Year-Old’s Mind

Recently I got a Google Home (thanks to Google for the kind gift). Last week I recorded a quick video of me interrogating it:

Well, tonight I nipped upstairs quickly to go and grab something and as I came back downstairs I heard Jack, our vivacious 4-year-old asking it questions, seemingly not expecting daddy to be listening.

This is when I discovered the wild curiosity in a 4-year-old’s mind. It included such size and weight questions as…

OK Google, how big is the biggest teddy bear?

…and…

OK Google, how much does my foot weigh?

…to curiosities about physics…

OK Google, can chickens fly faster than space rockets?

…to queries about his family…

OK Google, is daddy a mommy?

…and while asking this question he interrupted Google’s denial of an answer with…

OK Google, has daddy eaten a giant football?

Jack then switched gears a little bit, out of likely frustration that Google seemed to “not have an answer for that yet” to all of his questions, and figured the more confusing the question, the more likely that talking thing in the kitchen would work:

OK Google, does a guitar make a hggghghghgghgghggghghg sound?

Google was predictably stumped with the answer. So, in classic Jack fashion, the retort was:

OK Google, banana peel.

While this may seem random, it isn’t:

I would love to see the world through his eyes, it must be glorious.

by Jono Bacon at May 19, 2017 03:27 AM

Akkana Peck

Pot Sherd

Wandering the yard chasing invasive weeds, Dave noticed an area that had been disturbed recently by some animal -- probably a deer, but there were no clear prints so we couldn't be sure.

But among the churned soil, he noticed something that looked different from the other rocks.

[Pot sherd we found in the yard] A pot sherd, with quite a nice pattern on it!

(I'm informed that fragments of ancient pots are properly called "sherds"; a "shard" is a fragment of anything other than a pot.)

Our sherd is fairly large as such things go: the longest dimension is about two inches.

Of course, we wanted to know how old it was, and whether it was "real". We've done a lot of "archaeology" in our yard since we moved in, digging up artifacts ranging from bits of 1970s ceramic and plastic dinnerware to old tent pegs to hundreds of feet of old rotting irrigation tubing and black plastic sheeting. We even found a small fragment of obsidian that looked like it had been worked (and had clearly been brought here: we're on basalt, with the nearest obsidian source at least fifteen miles away). We've also eyed some of the rock rings and other formations in the yard with some suspicion, though there's no way to prove how long ago rocks were moved. But we never thought we'd find anything older than the 1970s when the house was built, or possibly the 1940s when White Rock was a construction camp for the young Los Alamos lab.

So we asked a friend who's an expert in such matters. She tells us it's a Santa Fe black-on-white, probably dating to somewhere between 1200 and 1300 AD. Santa Fe black-on-white comes in many different designs, and is apparently the most common type of pottery found in the Los Alamos/Santa Fe area. We're not disappointed by that; we're excited to find that our pot sherd is "real", and that we could find something that old in the yard of a house that's been occupied since 1975.

It's not entirely a surprise that the area was used 700 years ago, or even earlier. We live in a community called La Senda, meaning "The Path". A longtime resident told us the name came from a traditional route that used to wind down Pajarito Canyon to the site of the current Red Dot trail, which descends to the Rio Grande passing many ancient petroglyphs along the way. So perhaps we live on a path that was commonly used when migrating between the farmland along the Rio and the cliff houses higher up in the canyons.

What fun! Of course we'll be keeping our eyes open for more sherds and other artifacts.

May 19, 2017 02:16 AM

May 15, 2017

Elizabeth Krumbach

DevOpsDays Seattle 2017

At the end of April I made my way up to Seattle for DevOpsDays Seattle. It occurred to me upon arrival that while I’ve spent the past several years very close to DevOps circles and methodologies, this was my very first DevOpsDays! The crew organizing the Seattle event definitely made for a great introduction; in spite of the gender ratio that always plagues these events attendee-wise, I felt safe and welcome at this event. They also had a diverse selection of speakers without sacrificing quality (something I tell people all the time is totally doable! Here’s the proof!).

Bonus: My walk to the event both days gave me a great view of the Space Needle. So pretty.

The two day event had the format of a single track all morning, a talk just after lunch, and then Ignite-style talks (5 minutes, 20 auto-advancing slides). From there attendees had the option of one last talk in the main auditorium, or to join fellow attendees in a more interactive series of open spaces (unconference). Put together by the attendees, unconference topics were whatever people had proposed earlier in the day and wanted to have round table discussions about with their peers at the conference. The open spaces then continued through the end of the day.

I won’t give an overview of all the talks, but I do want to highlight a handful that stood out for me.

The first day we heard a talk from Suzie Prince titled “Continuous Integration: A Bittersweet Love Story”. I wasn’t sure what to expect from this talk, but I was eager to hear from her since CI is so near and dear to my heart. She began by discussing two of the most important things about CI: collaborating on master/trunk (rather than your own branches) and committing code daily (or more!). Coming from the OpenStack world, this wasn’t news to me; yeah, this is how we did CI! Great!

The big reveal for this talk was that’s not how everyone does it. In fact, based on some research she did last year, most people do CI wrong and suffer in ways they really shouldn’t if they were doing CI properly. The research asked a variety of questions about what people knew about CI and what the pain points are. It was quite astonishing for me to hear some of the results, it sounds like we’ve done a poor job as a community of explaining CI and making sure organizations are implementing it correctly. A blog post about their findings is up here: No One Agrees How to Define CI or CD.

Full video of the talk is available on YouTube, here. I recommend watching it if you’re interested in this topic, her presentation and slides do more justice to the topic than my summary!

My talk was that afternoon. It was my first time giving a Day 2 Ops talk, and I had spent a lot of time while preparing the talk to communicate the right message without being patronizing. Essentially, things get complicated when looking at cloud-native systems where you have an underlying platform (whether it be bare metal or a cloud provider), then whatever you’re running your application in (container?) and then your app itself. You need to be able to get metrics about what all the layers are doing, maintain some kind of monitoring system that understands the setup and can dynamically adjust as your system grows, have a way to access logs and troubleshoot problems down all the layers and have a system for maintaining everything. Plus, you want to give the appropriate access to everyone in your organization based on what they are working on, developers want access to their applications, operators of the physical cluster may need access to the infrastructure but need to know less about the applications.

I had some good talks with folks after this talk, several admitted their organizations accepted the turn-key offering of easily running apps and really got into trouble when things went sideways and they had to debug the actual issue down the stack. No one cares about metrics, logging and troubleshooting until something goes wrong, but more care should be put here in the planning stages, since it does take time and attention, and ultimately it’s all pretty important.

Slides from my talk are up here (PDF) and the video is on YouTube here. I’d like to give this talk again; based on feedback from folks who have seen it, I could use a more formal checklist of things to consider when building a cloud-native system. Plus, I’ll add some talk about integration with existing platforms; we all run complicated things with many moving pieces, and no one wants yet-another-tech-specific-dashboard or non-standard tooling that only works when it’s assumed to be working in isolation.

The second day opened with a talk from Jez Humble on “Continuous Delivery Sounds Great But It Won’t Work Here”. This was a really fun and inspiring talk (though I had heard some of the examples before). He began by going over the top reasons people claim they can’t do CD in their org:

  • We’re regulated
  • We’re not building websites
  • Too much legacy
  • Our people are too stupid

His general premise was that these “excuses” for not doing CD in an organization are surmountable with the right culture, and he walked the audience through examples that proved this. These included:

  • Checks for compliance that can be put into your CI pipeline
  • The fact that HP’s printer division wasn’t building websites either, but saw significant improvements once it adopted CD methodologies
  • The idea that legacy applications should never hold the rest of the org back, and that new things should be built to meet new goals (like CD!)
  • A car production line example that showed how the same employees did higher quality work once their culture changed

Super interesting stuff. Video of his talk is available here.

I also want to highlight a talk by Jeffery Smith on “How to Elevate Your Contributions as an Ops Engineer”. He very correctly pointed out that IT teams are often very insular and so focused on the tech of the infrastructure that they don’t poke their heads out to really understand the business, or the specific value they’re providing. He walked through several examples of engineers in a company taking a broader view of the company and what it needed, and being able to make a direct impact on the bottom line since they understood where things were going. Plus, this helps you too. He suggested that specific technologies come and go; they get automated or commoditized, and suddenly knowing how to configure something is not as valuable. You bring value by understanding the industry and helping people outside your specific sphere get their work done too, and proving that up the chain. He’s a great speaker so I recommend watching the talk for yourself! It’s up here.

Then there were the Ignite-like talks! There were a bunch of great ones, but two really stood out for me, and since they’re only 5 minutes each and really fun, you should just go watch them:

Finally, huge thanks to the organizers of DevOpsDays Seattle. They were really friendly, and I got a kick out of my name being on the back of the conference t-shirts. Usually that’s where conferences put the sponsors! But sponsors get their names on plenty of things, this was a great way to make the speakers feel like rock stars :)

All the videos are up on a YouTube playlist here and more photos from DevOps Day Seattle 2017 that I took are here: https://www.flickr.com/photos/pleia2/albums/72157680011962503

I’m now about to get on a plane to attend and speak at my second DevOpsDays. This time I’m headed off to Salt Lake City! Here I’ll be speaking on “The Open Sourcing of Infrastructure” on Wednesday morning.

by pleia2 at May 15, 2017 07:04 PM

My magical smartpen

If you’ve ever seen me in a talk at a conference, you know I take notes, it gives me a record to blog from later and physically writing notes helps me with memory retention. I also carry around a paper notebook in my purse to jot down random stuff (cat’s weight at the vet, space measurements for when I go to Ikea, to do lists created when we’re having brunch and planning out our afternoon). The problem with this is that the contents of these notebooks aren’t captured anywhere digitally. I’m not going to transcribe this stuff after I use it, “but it would be handy to know the size of that space next to the counter, but darn it I left that notebook in my other purse!” or “I’d like to finish that blog post at work today, but I left my conference notebook at home.”

Enter the smartpen.

You write with this magical pen in a special paper notebook and suddenly you have paper notes AND they sync to an app on your phone. From there you can read and transcribe the notes, export them in various free formats, and auto-sync them with a handful of proprietary services.

A bunch of people have asked about my experience, so welcome to the rare blog post I’m writing about a product. I’m not being given an incentive by the companies I mention to write about it, and I probably wouldn’t write about it if I was.

My journey began when a colleague of mine clued me in to the existence of the Moleskine Smart Writing Set back in February while we were at Spark Summit East in Boston. From then on, I had a bit of a bug in my ear about it. I wandered over to the Moleskine shop nearby a few weeks later to try it out, and ended up semi-impulsively buying it there. I say semi-impulsively since I didn’t do as much research as I normally would have for such a thing, and in retrospect I could have gotten individual pieces (pen, notebook) for slightly less elsewhere. But it wasn’t much cheaper, and I did try the product out in their brick and mortar store, which was a valuable pre-buying experience and I want to see stores stick around, so I don’t mind spending my money there.

Regardless, I had it, and they had a no-return policy once I opened it, which of course I did as soon as I got home. It comes with the following things:

  • 176 page paper notebook with the special dots needed to work with the pen
  • The Moleskine-branded Neo smartpen N2
  • 1 pen tip ink refill
  • USB charging cable

I set up the Moleskine app, jotted down a few notes, and immediately realized I had made a mistake. You see, the pen is just a branded Neo smartpen and if you use the Neo smartpen app, you can use notebooks that aren’t made by Moleskine! Now, while I’d be happy to use just the lovely Moleskine paper pads (in spite of the tremendous price tag, they are nice), right now they only make them in the large size. Not awesome for my purse. Neo directly has lots of notebooks! Including the super cute N professional mini, which now lives in my purse. Oh, and the apps are nearly identical, Moleskine just branded theirs.


Moleskine Paper Tablet N°1 that came with the kit, Neo smartpen and Neo N professional mini

Now, the playing around was behind me and I had everything all set up, time to take this show on the road!

My first conference

I spoke at an Apache Flink conference in early April, and that was my first opportunity to use my shiny new smartpen. I charged it before hopping on the bus to the conference. I took a bunch of notes and it worked quite well.

The weight and size of the pen weren’t a problem for me, I didn’t really notice I wasn’t writing with a normal pen, though I admit I don’t have small hands. I was able to open up the app on my phone and watch writing happen, cool! Or just write a bunch and let it sync up later. The pen claims to store 1000 pages of writing, so syncing frequently doesn’t seem to be something that’s required unless you want to, but it does sync all the pending stuff for all notebooks when you do go to sync it.

I was pretty happy with this trial run, but it did immediately make me realize a few things about the pen that I wasn’t too keen on.

What I don’t like about it

The first three things I don’t like, but I think I can live with or work around:

  • The app isn’t great, it’s kind of confusing
  • All the auto-save options are proprietary (Evernote, Adobe Creative Cloud, Microsoft OneNote)
  • The notebooks are expensive, $30 for the large Moleskine, $14 for the little Neo notebook in my purse

In spite of the app being a bit of a mess, it is basically usable. I’m not sure I figured out how to properly get the backups going to Google Drive (I think I did…?). I’m somewhat worried about data transfer if I get a new phone and have to move content over from the app. The documentation isn’t great on the Neo smartpen website, so far I’ve noticed that it’s not always updated to reflect the latest version of the app. There are also a few little wizards that pop up to explain how to do things, they’re annoying until you realize you actually need them to use the app effectively, which is even more annoying.

In spite of not liking using a proprietary platform for auto-save I don’t have a practical problem with using them now and then, after all, I do use G Suite quite a bit. Practical concessions can be made.


All proprietary auto-save options :(

Plus, even if auto-save is going to a proprietary place, it’s not the only export option. You can export individual pages as PNG, PDF, SVG or TXT (it gets the OCR treatment) and then email them to yourself, upload them to Google Drive, or a few other places (depends on the apps you have installed).

The cost. Eh. I don’t go through these very often, so I can stomach the price of the notebooks once a year or so. Plus, they are really nice.

I could see any of the next three being a problem for me that causes me to stop using it:

  • I have to remember to CHARGE my pen (“what are you doing?” “charging my pen” “uh, ok, that’s a thing now”)
  • I have to remember to BRING my pen, and the special notebook
  • I can’t just use random cute notebooks, I have to buy expensive Neo smartpen notebooks

One of the reasons I attached myself to a paper and pen is because it’s simple and doesn’t require any technology. And I get free notebooks at conferences pretty frequently, it’s fun to use the various sizes and formats they come in, changing to a new notebook when I fill the last one up is fun. The complexity of now making sure I charge and have yet-another-device, and a specific notebook, is a challenge, particularly since I use the pen with both my conference and purse notebooks. If I leave the pen in the wrong bag? No notes for that day!

Finally, there are a few unknowns. What happens if my pen dries up in the middle of a conference? I can’t just grab another pen! I do have a spare tip, and you can order more, but I haven’t yet started carrying them with me. What happens if Neo smartpen goes away as a company? Or stops supporting my device? I can make backups, but it puts me in a tough spot for long-term support of my shiny new system. I also don’t know how well this all works if you have multiple pens, if I did decide to throw down another $150-170 for a second pen that only lives in my purse, can the app cope with two pens being linked? I don’t know! Can I switch which pen is going to which notebook? I don’t know! The inflexibility and confusing-ness of the app is quite a concern here, I’m somewhat worried that doing something unexpected will cause me to lose notes, or have a disjointed experience in the long run with notebooks being digitally split up.

General usage

That’s a lot to complain about, and I’m honestly not sure about this all long term, but the geek in me is in love. I love gadgets and it’s really cool to finally have a digital record of the copious notes I take at conferences. No more are they just stashed in a drawer, never to be seen again once I’ve completed a notebook!

It’s also so great to be able to leave my paper notebook in my conference backpack and not slog it back and forth to my desk or the office when I want to write a blog post that references them. I just load up the app in my phone to browse my notes, or have a peek via Evernote on my desktop. This also means that my conference notebook pretty much lives in my conference backpack, less risk of forgetting it. Also, if I lose it I’ll still have a digital archive.

I’ve now used it at Flink Forward, DevPulseCon and DevOpsDays Seattle. I can’t speak strongly to the battery life, since it’s been pretty reasonable so far and I didn’t charge it between the second two conferences, it lit up when I needed it to and still had 80% charge at the end. I do also usually carry a little battery with me for emergencies for my phone, noise-cancelling headphones and other random devices anyway.

The automatic transcription is pretty decent, I have tried to be a bit less sloppy with my writing, but it’s confused by industry terms. It’s good enough to correct after the fact though, so it gets most of the job done and I just need to pop in for edits. This will be very useful if I do decide I want to formally transcribe anything I write.

In all, the experiment has gone decently well and I’m looking forward to skipping off to Salt Lake City tomorrow for conference number four with my shiny new pen and notebooks!

by pleia2 at May 15, 2017 01:55 AM

May 09, 2017

Elizabeth Krumbach

Quince and Hamilton

I love musicals. As a youth I started off with Disney full-length animated features, buying and becoming obsessed with the soundtracks. I then graduated to Rodgers and Hammerstein via the classic movies: South Pacific, The King and I and The Sound of Music… When Hamilton started picking up steam, I was right there to lend my ear to the original Broadway cast recording. Over and over again. When the Hamilton Mixtape came out in December I was thrilled. So good.

Then MJ surprised me with tickets to see it in San Francisco. I was over the moon! We went with a couple we’re friends with on Saturday.

Prior to the show, we had reservations at Quince for dinner. It’s the newest San Francisco inductee into the Michelin three star club, but it had been on the list with fewer stars for a few years. Now, we had just been to a Michelin-starred restaurant the weekend before, but this is highly unusual. We might go to one per year, it’s an expensive meal and I like taking the weeks afterwards to enjoy the memory. I wasn’t going to say no to an amazing meal with some friends though ;)

In order to get to the show in time, we secured a 5PM reservation and let them know about our time constraint, shortening the typical 3 – 3 1/2 hour meal window to just 2 1/2 hours, which they were able to accommodate. We also learned that there was a nearby table with the same plans.

The meal was a multi-course set tasting menu, advertised as “Contemporary Californian and Italian” cuisine. The focus was on seasonal and local, with a handful of delicate pasta dishes and a couple featuring asparagus.

With a show ahead of me, I skipped the wine pairing and just had a single glass of Riesling to accompany my meal. It was a nice, sweet choice that went well with the dishes, all of which were as exceptional as expected. The caviar dish was probably my favorite, but my love for pasta made the whole meal quite enjoyable.

We made our exit just after 7:30 and got to the Orpheum Theatre just in time to get to our seats for the 8PM show. We had great seats, nearly centered on the stage: aisle seats in the front row of the Mezzanine.

I may have teared up when the show opened. And several other times throughout the show. It was everything I was hoping it would be! Satisfied left me Helpless. I really enjoyed the actor who portrayed Aaron Burr, he kind of stole the show for me.

Since it was my first time seeing the production played out (not just listening to the soundtrack), I was also able to catch a bunch of things, like how hilarious King George is, and the very opinionated portrayal of Thomas Jefferson, which landed him in “bad guy” territory.

We had a great night, I’m so glad we went.

It is possible to get tickets for showings at the Orpheum now, with a handful of available seats here and there. They also are still running the next-day lottery for the chance to win a pair of $10 tickets.

There are some more photos from the evening here: https://www.flickr.com/photos/pleia2/albums/72157680336767594

by pleia2 at May 09, 2017 04:11 PM

May 08, 2017

Elizabeth Krumbach

DevPulseCon 2017

Back on April 20th I had the pleasure of attending and speaking at my first DevPulseCon, put on by CodeChix. I’ve worked with CodeChix before, back in 2013 I did an OpenStack tutorial in Palo Alto. Then in 2014 I went with them on the road to help with the PiDoorbell workshop at PyCon in Montreal. These experiences were all very fulfilling. CodeChix founder Rupa Dachere has a great vision for all the events she works on and always manages to bring a great team together to execute them.

This conference took place over two days, the first made up of talks and panels, where I was participating, and a training day on the second. I was invited to give a tech talk on “Using DC/OS for Continuous Delivery” and to join an afternoon panel on “Getting Your Next Job – Groundwork You Need To Do Before You Start Interviewing.”

DevPulseCon 2017 was held in the upstairs event space at the Computer History Museum in Mountain View. Rupa did the event introduction, explaining that the event was made up of female engineers from various companies around the bay area. I go to women in tech-targeted events infrequently enough that I find myself really enjoying the environment. Walking into a whole room of highly skilled women who I can geek out with about infrastructure and tooling is quite the departure from what I’m used to at tech events.

The first talk of the morning was by Mansi Narula, Senior Data Architect at eBay, who spoke about NoSQL Database Platforms. She gave a high level overview of Mongo, Cassandra, Couchbase and Hbase and the basic rules around how they are all used at eBay. It was interesting to learn that internally they have a database selection tool that helps developers select which database platform works best for whatever they’re working on based on criteria they present, like speed, reliability and purpose of the data store.

My talk was up next. I began with a basic introduction to DC/OS and what it brings to the Continuous Delivery equation by simplifying a lot of the underlying infrastructure. Jenkins has an Apache Mesos plugin, but in spite of my own background using Jenkins in past roles, preparing for this talk was my first time really getting a close look at that particular plugin. The demo I did used a Python script to bring up a simple pipeline of changes being made to a repository, uploaded, tested, and deployed on a web server. I customized it some for the event, having it publish a “Hello world” type post specifically for DevPulseCon attendees. I concluded the talk by talking about some of the DC/OS 1.9 features I felt were particularly applicable to folks interested in running an infrastructure platform, including strides made with metrics and logging. I uploaded the slides here (PDF) and they include links to some other resources and the demo I showed.


Thanks to Nithya Ruff for the photos of my presentation (source)

The final tech talk was given by Gloria W., titled “IoT: Yes You Can!” where she broadly outlined the space of DIY internet of things and then dove into some details about how you might get started. She started by talking about the constant struggle of anyone developing in the IoT space around making sure devices are provided with power and some way to communicate. From there she spoke about some of the specific tooling available today, trending toward recommending open source solutions wherever possible. She talked about using Arduinos with sensors, and I was interested to learn about the MATRIX Voice, “an open-source VOICE RECOGNITION platform consisting of a 3.14-inches in diameter dev board, with a radial array of 7 MEMS microphones connected to a Xilinx Spartan6 FPGA & 64 Mbit SDRAM with 18 RGBW LED’s & 64 GPIO pins.” How cool! Kit-wise, she advised attendees to try to steer clear of proprietary development kits since they try to push you onto their platform, and instead select ones that lean toward using open source and open standards. The talk concluded with a raffle where she gave away some of the devices she had brought along.

The afternoon was spent with a series of panels:

  • Getting Your Next Job – Groundwork You Need To Do Before You Start Interviewing
  • Company culture that works for YOU (not just the men in your team) – AKA “work/life balance”
  • Promotions, Visibility, toxic environments and how to deal with them

I can’t share details about these sessions since they did a really novel thing with them: they asked everyone to put down their social media devices and not share what was said in these panels outside the conference. It allowed panelists and audience members alike to be really honest about their experiences, solutions and advice without risking that they’d be quoted somewhere. Huge thanks to the event for providing a safe space for these kinds of discussions, it was helpful and I think we sometimes suffer from not having enough of this in our industry.

The day concluded with a small after party in the lobby sponsored by Facebook. I am often shy at social events like this, but being a speaker helps, people came up to me to chat about CI/CD and the work we’re doing on DC/OS. I also met an attendee who I chatted about OpenStack with for a while. It was also nice to connect with some of the folks who I already knew at the event, like Nithya who I frequently fail to connect with at events and at home – both homes! She spends time in Philadelphia with her new role and yet our trips back east seem to rarely overlap. I was also amused that when I went to get a beer from the bar and declined a glass they said “the men want glasses and women want the bottle, it’s usually the opposite!” Oh yes, I was in the right place at this event.

by pleia2 at May 08, 2017 09:19 PM

Apache Mesos, and big, streaming data events

Over the past several months I’ve been getting into the swing of things with my new role as a Developer Advocate at Mesosphere. This began by attending Spark Summit East back in February, and really got going when I spoke with my colleague Ravi Yadav at Flink Forward in San Francisco early last month.

These very specific technology conferences are somewhat new for me. It’s true that I’ve been going to Ubuntu and OpenStack conferences for nearly a decade, but those projects are huge, with dozens of different projects inside them and various teams, companies and volunteers with varying motivations. It’s a whole different feel when you have a small concentration of folks working on a very specific technology directly and together. It’s also a great learning environment, since your attention is not split across a massive community and you can focus on learning how other people are doing things like deployments, scaling and whatever else is specific to that technology.

I wrote about the specific Flink Forward talk Ravi and I gave in the post on the DC/OS blog, but even more generally it was great to meet community members operating in that space and talk shop about the technologies that surround our work. Professional photos from the event are here and I have my own album of pictures I took here. And in case you’re curious, a video of our talk is now online here and slides can be found here.


Ravi shows off a demo between my bits of speaking at Flink Forward, cc-by-sa 2.0 Silke Briel (source)

I’ve also been starting to help run some of the meetups that we’re hosting here at the office. Back in March I attended and MCed my first Apache Mesos meetup, Running Production Containers and Big Data Services Gets Even Better. The meeting was great for me since I’m still getting up to speed with all our projects, and it covered some features in the new releases. First up was Gilbert Song talking about “Mesos 1.2 Updates and Universal Container Runtime” and then a DC/OS 1.9 features talk by Sebastien Pahl. The event concluded with a presentation about Instana, a multi-layer monitoring platform geared toward container-based architectures where your environment is, by design, constantly changing (it is a paid product, but a 14 day trial is offered). A video from the event is up on YouTube.

The opportunity also arose to host a Women in Big Data meetup here at the office where Amita Ekbote and Susan Huynh introduced Apache Mesos and DC/OS and gave a live demonstration of the IoT Pipeline. Suzanne Scala posted a write up of the event, including the slide decks and other links on the Women in Big Data blog, here: Big Data on DC/OS. I attend a lot of tech conferences and events, and they tend to be male-dominated, so I really enjoy these events where I can meet other women doing cool technical stuff. Plus, big data in particular is a space where people are doing some really interesting work.

I’m looking forward to helping out with more local meetups in the coming months here at the office, but also to be speaking at some of my own, I’m aiming for some east coast events in early June that I’m pretty excited about.

by pleia2 at May 08, 2017 06:45 PM

Eric Hammond

Rewriting TimerCheck.io In Python 3.6 On AWS Lambda With Chalice

If you are using and depending on the TimerCheck.io service, please be aware that the entire code base will be swapped out and replaced with new code before the end of May, 2017.

Ideally, consumers of the TimerCheck.io API will notice no changes, but if you are concerned, you can test out the new implementation using this temporary endpoint: https://new.timercheck.io/

For example:

https://new.timercheck.io/YOURTIMERNAME/60

and

https://new.timercheck.io/YOURTIMERNAME

This new endpoint uses the same timer database, so all timers can be queried and set using either endpoint.

At some point before the end of May, the new code will be activated by the standard https://timercheck.io endpoint.

Rationale

When the TimerCheck.io service was built two years ago, the only language supported by AWS Lambda was NodeJS 0.10. The API Gateway service was console only, and quite painful to set up.

It is two years later, and Amazon is retiring NodeJS 0.10. AWS Lambda functions written with this language version will stop working at the end of May (a 1 month extension from the original April deadline).

Though AWS Lambda now supports NodeJS 6.10, I decided to completely rewrite the code for TimerCheck.io in Python 3.6, for which support was just announced.

I’ve also been wanting to try out chalice for a long time now. Since TimerCheck.io uses API Gateway and AWS Lambda, this was the perfect opportunity, especially since chalice now also supports Python 3.6.

Though I ran into a few issues trying to get chalice stages and environment variables to work, the project went far easier than the initial implementation, and I am happy with the result.

Results

The chalice software makes creating APIs with Python a pleasure.

These four lines of code are an example of how easy it is to define an API with chalice. This is how the timer set API is defined.

from chalice import Chalice
app = Chalice(app_name='timercheck')

@app.route('/{timer}/{count}')
def set_timer(timer, count):
    [...]
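
For context, here is a hedged sketch of what a complete handler might look like behind that route. The DynamoDB table name, key schema, and returned fields below are illustrative assumptions, not the actual TimerCheck.io implementation:

import time

import boto3
from chalice import Chalice

app = Chalice(app_name='timercheck')
table = boto3.resource('dynamodb').Table('timers')  # hypothetical table name

@app.route('/{timer}/{count}')
def set_timer(timer, count):
    # Record the timer's start time and duration, keyed by timer name.
    now = int(time.time())
    table.put_item(Item={'timer': timer, 'start': now, 'seconds': int(count)})
    # Chalice serializes a returned dict into a JSON response body.
    return {'timer': timer, 'status': 'ok', 'start_time': now}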

The biggest benefit is that chalice takes care of all of the API Gateway hassles.

After a chalice deploy, all I had to do to make this production-worthy was the following (sketched in code after the list):

  • Create an ACM certificate

  • Point an API Gateway custom domain at the chalice-created API Gateway stage, using the certificate.

  • Add the host record to DNS in Route 53 for the resulting API Gateway CloudFront distribution.
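
These three steps can also be scripted. Here is a minimal, hedged sketch using boto3; every domain name, ARN, ID, and hosted zone below is a placeholder, not a value from the actual deployment:

import boto3

# Placeholders -- substitute your own values.
DOMAIN = 'timercheck.example.com'
CERT_ARN = 'arn:aws:acm:us-east-1:123456789012:certificate/EXAMPLE'
REST_API_ID = 'abc123'   # ID of the chalice-created REST API
STAGE = 'api'            # the chalice deployment stage
ZONE_ID = 'ZEXAMPLE'     # Route 53 hosted zone for example.com

apigw = boto3.client('apigateway')
route53 = boto3.client('route53')

# 1. Create the custom domain, backed by the ACM certificate.
domain = apigw.create_domain_name(domainName=DOMAIN, certificateArn=CERT_ARN)

# 2. Map the custom domain to the chalice-created stage.
apigw.create_base_path_mapping(domainName=DOMAIN, restApiId=REST_API_ID, stage=STAGE)

# 3. Alias the DNS name to the resulting CloudFront distribution.
route53.change_resource_record_sets(
    HostedZoneId=ZONE_ID,
    ChangeBatch={'Changes': [{
        'Action': 'UPSERT',
        'ResourceRecordSet': {
            'Name': DOMAIN,
            'Type': 'A',
            'AliasTarget': {
                'DNSName': domain['distributionDomainName'],
                'HostedZoneId': 'Z2FDTNDATAQYW2',  # CloudFront's fixed zone ID
                'EvaluateTargetHealth': False,
            },
        },
    }]},
)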

The entire new source for the TimerCheck.io service is available in the timercheck repository on GitHub.

Original article and comments: https://alestic.com/2017/05/timercheck-aws-chalice/

May 08, 2017 10:00 AM

May 07, 2017

Elizabeth Krumbach

The underground and monorail in Seattle

I was in Seattle just over a week ago for a couple days for DevOps Days Seattle 2017. I’ll write about that later, but the early evenings while I was there and my journey back to the airport allowed me some time to explore the city a bit more than I have in the couple times I’ve visited previously.

The first full day I was there I forwent the event social in favor of the Seattle Underground Tour. MJ went on this tour a year ago with some friends when he was spending a lot of time working up in Seattle. He knew I’d enjoy it, a perfect mix of tourist-y and interesting history.

I learned that the area around Pioneer Square in Seattle was raised by a story or two over 100 years ago, following a fire that gave the city an opportunity to rebuild. As a result there’s a whole underground containing the former street-level buildings, now deteriorating, which were used while the work was being done. The tour guide shared history, puns and anecdotes as we spent an hour traipsing through various sections of the underground, seeing old entrances to hotels and banks, along with early toilets and plumbing systems, and even an old Klondike gold rush era (1897-1899) bank vault.

At one point you’re particularly reminded that you’re walking under the current sidewalk, as we walked through a section that had natural light, coming from fogged glass insets in the sidewalk that we’d just walked on above ground a few minutes before.

Almost everyone going on the tour was from out of town, and the guide made jokes about the rare locals who were dragged along by visiting friends. But you know me, I do all the tourist things, even in my own town. After the underground tour I took a bit of a walk to the nearby piers, which appear to exist purely for tourists, with a highway passing overhead nearby. I suddenly realized how similar it must be to how the Embarcadero in San Francisco was before they tore down the freeway. Thank goodness for that. The drizzle that had been coming down then switched to a steady pour, so I caught a car back to my hotel, and had dinner at the nearby Crow restaurant, which had counter seating, making it slightly less awkward than usual to eat out alone.

More photos from the underground tour, and wandering around Seattle here: https://www.flickr.com/photos/pleia2/albums/72157680012009863

Since I was staying up near the Space Needle, had some time to get to the airport at the end of the last day of the conference, and Seattle was enjoying a beautiful sunny afternoon, I decided to take the leisurely public transit route. That meant starting with the monorail! I made a point to take the monorail the first time I was in Seattle, during the chilly winter of 2013 when I started at HP. It’s an incredibly short ride, but fun because of all the 1960’s future-view paraphernalia at the stations. From there I caught the Link to the airport, and concluded my Seattle adventure this time. I wish I’d taken time to visit with some folks while I was in town, but I’ve been pretty tired with everything going on, so I was glad I at least took time to do the tourist things for a few hours. Next time I hope to be more social!

Upon checking in at the airport I learned I could take a slightly earlier flight home than I was expecting. Being able to meet MJ for dinner and get tucked in at home before midnight was a pleasant surprise and a lovely conclusion to this quick trip.

by pleia2 at May 07, 2017 11:13 PM

4th Wedding Anniversary at Coi

On April 28th MJ and I celebrated our 4th wedding anniversary. April and May tend to be busy travel months for me, and though I do try to be in town for our anniversary, I wasn’t last year, which caused us to do a ridiculous amount of postponing when it came to celebrating it.

This year I was home, and MJ snagged us reservations at the amazing Coi Restaurant in North Beach!

Their focus is seafood and like many of these super fancy restaurants in San Francisco there’s also a focus on seasonal selections from local farms in the dishes.

The menu itself is a set nine course tasting menu with a couple optional drink pairings. They offer a tea pairing, as well as two options for wine pairings. The tea pairing was tempting since it was unusual, but I decided to go with the more expensive of the two wine pairings, and MJ had a glass of the 1998 Krug “Brut” Champagne from the same menu, along with tastes of mine throughout the meal.

I’m quite the seafood fan, so I was really happy with the theme and the selections were really nicely executed. The wines were amazing, especially the 2007 Château Pape Clément Blanc. So good.

The menus they gave us at the end of the meal had been customized to show the wines we enjoyed throughout the meal, as well as the fact that we were excluding pork from the selections, which was a really nice touch.

More photos from our dinner here: https://www.flickr.com/photos/pleia2/albums/72157680300381264

With four years under our belt you’d think I’d have the wedding photos online somewhere by now, but I don’t. Hah! I have some more work to do on the site I’m putting together, I’ll try to bump the priority on it since there are so many great photos I’d like to share.

by pleia2 at May 07, 2017 10:17 PM

May 06, 2017

Elizabeth Krumbach

Local movies, sights and Sharks

It’s been two months since I’ve done a proper general life update post. This is in part because I’ve been struggling with everything going on this year and primarily focusing on work and blog posts over there about events. I would like to catch up over here though, since a healthy part of me staying on top of my mood is writing about the exciting things we’re doing to keep moving forward.

So, first exciting thing, we got our washing machine fixed! Maybe not so exciting, but I did have to walk to the laundromat once and send out wash with a service that had a hefty turnaround time while it was broken, so I was pretty happy when I could finally do laundry at home again. I forgot how much I take that for granted. I’ve also tried to do some spring cleaning around here. Our condo has an open floor plan, but there’s a “dining room” area that has forever been a bit of a box land. I’m trying to solve that so MJ has more space to work on projects. I made a considerable amount of progress when I began this work, also cleaning out some of what we had in the hallway closet while I was at it so I could make room for some of what I was moving out of the condo proper. It’s a bit stalled at the moment, but at least I now have a better idea of what is over there so I can pick up where we left off when we decide to make time over some weekend.

I spent some time over the past couple months going out with a friend of mine. First it was over to see the film Love and Taxes at The Roxie in the Mission. I wasn’t sure what to expect, and I’m not much of a comedy fan, but it was nicely done, if a bit too real. The Roxie is also one of those fun old theaters with big neon lights outside; we saw the film in the smaller theater, so I’ll have to get into the big one some time. We also recently saw The Sense of an Ending, which I was less thrilled with. As a British film it’s slower than I tend to like, and I found myself wondering if it ever managed to go anywhere interesting. Still, we had a nice evening grabbing some drinks and cheeses nearby afterwards. Back in mainstream movie land, I mentioned in my post about the last trip to Philly last month that I saw Beauty and the Beast for a second time. The first time was when I got an opening night ticket on my own, and immediately had a couple awesome friends pile on and invite themselves along so I wouldn’t have to go alone. Sab and Mark, you rock.

Too much TV has been happening lately too. I’m not proud of it, but I did have a pretty extensive scifi and fantasy TV queue from times when I’ve been more productive, so I’m trying not to be too hard on myself, and already my amount of reading is once again starting to overtake the amount of time I spend watching TV. But in the midst of my TV binge time, the new season of Mystery Science Theater 3000 came out. MST3K is kind of a big deal for me. It’s the show that brought me to my first IRC server, it was a major point of bonding for my first husband and me, and I met a lot of friends through an MST3K IRC channel and fan site I used to help run (hello #deep13 buddies!). When the kickstarter was launched last year for a revival I signed up quickly and expensively. My name is in the credits of the new episode 5 (1105) and the kickstarter rewards have been trickling in. Looking forward to the coffee table book! I’m trying to bide my time on the episodes though, restricting myself to one or so per week so I don’t have a major dry spell once they’re gone and I’m forced to wait for a new season, which we’re all hoping will happen.

With all this laziness and TV watching, I’m not running as much as I’d like, but I have found myself walking more. Even if it’s just to go sit by the Ferry Building with a hot chocolate, or take a random wander down to Pier 39 to watch street cars drive down the Embarcadero and then visit the sea lions. I live in a beautiful place, and taking time to enjoy that while listening to some music or an audio book is incredibly relaxing.

A recent walk around the zoo was also therapeutic, but I specifically went a few weekends ago when they had cards up to share memories and love for Uulu, the last polar bear who lived at the San Francisco Zoo and recently passed away. Losing her was quite sad, but the visit to the zoo was a nice one otherwise. I was able to see the family of guanacos closer than usual, a bounding wolverine and one of the red pandas running all around their enclosure. Plus it was a beautifully sunny day, of which we’ve had plenty as we migrate out of the soggy winter we had.

More photos from my visit to the zoo here: https://www.flickr.com/photos/pleia2/albums/72157680629307391

And it wasn’t just walking. MJ and I made our way to Golden Gate Park recently as we were preparing for Passover by eating a whole bunch of bagels while we still could (haha!). We stopped by House of Bagels for some sandwiches and then made our way over to the park.

Work-wise I’ve been keeping super busy with events and continuing to learn more about our product, but I decided to start a new experiment so I can limit my workaholic tendencies: I now have 2 cellphones. My Nexus 6 on Project Fi is now used for work and international travel, and I have my mother-in-law’s old (but quite new…) cellphone on Verizon that I’m using personally. During the weekdays I often carry them both, but I can choose to leave my work phone behind when I want to physically separate myself from the temptation to check work email and notifications. It’s definitely a cumbersome arrangement, and making sure I charge (and shut off for takeoff!) two phones has taken some getting used to, but I think I’m developing a healthier relationship with my day job this way. Evenings and weekends I can really spend on other work and projects without getting too distracted by work work.

Speaking of work, I recently went on my first bay cruise that took me under the Golden Gate Bridge! Mesosphere recently celebrated its 4th anniversary with an unconference during the day, followed by a boat cruise. It was cold on deck, but that’s how we got to really experience going under the bridge before the boat turned around. We also got to see dolphins and sea lions in the bay. Chatting more casually with some of my colleagues was nice too, though I do tend to be on the quieter side during social events like this.

Some more photos from the boat and our cruise around the bay here: https://www.flickr.com/photos/pleia2/albums/72157680726304672

After work on April 18th MJ and I met up at the Marines’ Memorial Theatre, which he hadn’t been to before (I’d been to a Long Now seminar there). We were there for an interview with Krista Tippett about her new book, Becoming Wise. I’ve been a long-time listener to her On Being radio show, going back to when it was called “Speaking of Faith”. As a fan of the incredible interviews she’s done with scientists, religious scholars and key thinkers of today, it was a real honor to see her be interviewed live and then meet her and get a copy of the book signed! I finished reading the book last night, and have already sent another copy off to an aunt who I thought would particularly appreciate it. The evening concluded with a tourist-esque visit to Sears Fine Food right near the theater.


Photo by Anna Bryan, album on Facebook

A couple weeks ago I also found myself loosely following the NHL playoffs. With the San Jose Sharks making it all the way to the Stanley Cup last year, I knew there was a chance there’d be some playoff games this year. We drove down to San Jose on April 22nd to have dinner and enjoy the game. Unfortunately it ended up being the last game of the season, with our Sharks losing to the Edmonton Oilers 1-3 and bringing the series to a close with the Oilers on top.

Still, it was a lot of fun to go down and enjoy a final game of the season with the Sharks. More photos from the game here: https://www.flickr.com/photos/pleia2/albums/72157679697721774

I still have much to write about our 4th wedding anniversary, and my recent trip up to Seattle, but there’s only so much catching up I can do in a single post! Plus, I’m on my way for dinner and to see Hamilton (so excited!). Next time.

by pleia2 at May 06, 2017 11:44 PM

The life of Simcoe

Rounding out over ten years of a wonderful life, and over five years of treatment for Chronic Renal Failure (CRF), Simcoe passed away on April 9th.

I picked up tiny baby Simcoe on a snowy night in January of 2007. The litter of kittens she came from had two girls, and I picked the one I saw first. Her whiskers were a bit mangled due to some rough play with a sibling, but she was the least skittish of a crew of naturally skittish Siamese. And she was so small! Her introduction to Caligula was tense at first, but within just a couple days he grew accustomed to her.

Simcoe was named after a hop. I was really into local breweries and homebrew in the mid 2000s, even having grown nugget hops in our back yard in Pennsylvania. The hop was distinctive and I enjoyed it, and the name was great. Though it did cause a lot of people to mix up her gender over the years since the name was not obviously gendered.

As a kitten, she was a bit of a terror. Climbing so many places where she shouldn’t be, taking her claws to a pair of speakers and some furniture. She also loved attacking toilet paper, which I eventually trained her out of. She never quite settled down like Caligula did into full adult cat mode, but she did calm out of her kitten phase eventually after she learned some rules and would at least abide by them while we were watching. She also stopped the furious little kitten destruction-of-things and as an adult was quite kind and careful.

Simcoe was with me through a divorce and total up-ending of my life as I left the Philadelphia area where I spent much of my 20s and moved to San Francisco. When I moved to San Francisco in 2010 there were months of unpacking as I struggled to juggle work and my new life out west. During this time she developed a habit of hopping on top of one of the larger boxes when MJ came home so he could pet her. I was thrilled with how quickly they bonded. She also took to city life quickly, and the beautiful weather we often enjoy here in northern California.


“Helping” me pack for the move to California

She was a bouncy kitty whose boundless affection touched everyone who spent much time with her. She was playful and funny, making for endless photos of her escapades I could share. There were mornings when I’d wake up with cat toys in the bed, a clear indication that it was time to play with her whenever I decided to get myself out of bed. She also spent a lot of time snuggling Caligula, following him around to figure out where he was sleeping so she could join him, and had a funny habit of trying to nurse from him. I tried to capture much of this in this post, which I wrote the day after she passed away.

On November 22nd of 2011 our Simcoe turned five years old. She’d always been a small cat, but in the weeks following her birthday we realized she had lost a considerable amount of weight. We brought her in to the vet on December 10th and learned that she had dropped from about ten pounds to just over six. I wrote about that first week of learning she was sick here, but suffice to say we learned that she had CRF, an incurable condition that we could treat, but was ultimately fatal. It was devastating. I cried a lot during the 72 hours they worked to get her stable at the vet through constant fluids and a watchful eye.

After a few days, she came home to us. The next several weeks were spent learning how to care for her. Thankfully MJ had experience giving cats subcutaneous (SubQ) fluids, as it seemed so daunting when the vet explained that we had to put a needle under the skin of her neck to administer fluids. We even brought her up to a veterinary hospital over an hour north of San Francisco to visit a feline kidney transplant doctor. We also completed various preliminary tests to see if she was a good candidate for transplant. Ultimately after discussion with all the vets involved, we all decided to hold off on the transplant because we wanted to try management of her condition once her values evened out at the end of December, showing that she was in the early stages of the disease.

While doing all these tests related to the disease and transplant eligibility, we also learned that she had an infected tooth that was causing trouble. In mid-January we took the calculated risk of having her put under to get the tooth taken care of. The infection was considerably worse than they thought, so the procedure took longer and there was a dip in blood pressure they had to control, but overall she was ok. By February we were getting into the swing of things care-wise. In April we had to make our first trip where we’d leave our newly diagnosed kitty in the care of a pet sitter who came to the house daily, but we found one and it worked out fine. We continued to use the same pet sitter over the next five years. Her weight bounced back, gaining steadily throughout 2012 and bringing her back up to a healthy 9.5 pounds by the end of the year, where she remained while she was doing well.

Plus, she was an awesome cat! And we spent a lot of time together since I work from home.

Check-ups every three months over the next few years became the norm. I wrote about them each time, finding the tracking and writing to be therapeutic and always hoped that sharing our journey would be helpful to others. The full listing, for reading in detail about her progression:

We decided in the course of this to discontinue our plans for a renal transplant. The vet up north had retired and his practice ceased doing the procedure. We also learned that there hadn’t been any major improvements in the procedure in the years that had elapsed since her diagnosis, it was still expensive and risky, with a high level of care required after the transplant, which I thought we might struggle with.

She was responding incredibly well to treatment; in addition to a healthy weight, her BUN and Creatinine levels stayed reasonable for her condition. We adjusted some additional supplements and changed up her food from time to time as it made sense for her treatment. She and Caligula did end up swapping colds over the years, but after seeing the vet a few times for it, they said it was just the way some pairs of cats are with these things, and aside from sniffles and sneezing, it didn’t seem to make much of an impact on their general well-being.


Box of fluids, IV sets and needles

In 2014 she became immortalized in a software project I work on. With the Xubuntu 14.04 release, the login greeter gained the ability to show a personalized picture next to your login name. The team flipped through some options for the default picture in the installer, and decided upon the striking image of beautiful Simcoe.

This wasn’t the full extent of her internet fame though. I shared pictures of her all the time on social media, so everyone who knew me, knew my cat. It did make me a bit of a cat lady, but that’s totally fair, it’s tricky to pull my identity away from my beloved critters.

Alas, CRF is a disease that progresses, and late 2015 is when things started shifting. First her weight began to drop. Then she started breaking out with sores around her eyes and nose, which were first treated, probably unsuccessfully, with antibiotics. Then, after a large sore on the base of the underside of her tail developed along with the other sores we took her to a dermatologist. We learned that she had allergies which were causing the breakouts. The doctor didn’t believe it was related to the CRF directly, but did say that her weakened immune system could be making it so that the sores resulting from the breakouts failed to heal quickly, risking infection. A small dose of daily anti-allergy medication cleared it up nicely and there were no further incidents. Her health was declining though.

Through 2016 her BUN and Creatinine levels continued to rise and her weight continued to drop. Her damaged kidneys were incredibly small and it surprised the vet that they were functioning at all. We increased her SubQ fluids to 100 ml daily. She was put on a couple more medications to manage her calcium levels and other things that had started getting out of whack with the progression of the disease. In general she was still acting fine though, in spite of the stage four renal failure diagnosis she ended up with last year. Everyone was surprised at how well she was handling it.

She did develop severe constipation though, which caused her visible distress and made it so she’d sometimes find a more comfortable place than her litter box to do her number two business, often our bed. In addition to covering the bed more aggressively, this led us to various attempts to give her more fiber, with varying success throughout the year: mixing fiber-for-humans in with wet food, or giving her some mixed with water directly. At the end of the year we decided the positives outweighed the negatives and switched her to a daily medication which, while safe, wasn’t quite optimal for a CRF cat, though it did start to relieve the constipation.

By the end of 2016 her weight had dropped below eight pounds and our concern was growing. She then rapidly dropped below seven pounds over the first couple months of 2017, and in March her energy took an unexpected dip. She wasn’t as playful, slept more, and when she was awake she would often rest in a somewhat hunched position. Her meowing got more frequent, especially at night, and she came to my lap to snuggle much more frequently than normal. Her appetite had decreased a lot as well.

With her steady decline we looked into transplants again, somewhat out of desperation and a desire not to lose her. I went as far as having a call with UPenn’s feline renal transplant department and talking to our local vet about preparatory and follow-up care here. But she was really not doing well; the trip across the country for the transplant, in spite of having a place there where she could begin recovery, would have been challenging. Life post-surgery would have also been difficult for all of us: two anti-rejection medications per day, and a need to stay away from infections since her immune system would be compromised. We made the heartbreaking decision not to move forward with the transplant.


Skinny Simcoe toward the end on a heated blanket with Caligula

The week of April 3rd showed further behavior changes. Her breathing was sometimes labored and scratchy, as it seemed like she was going through another bout of respiratory issues. She was climbing to unusual places, and being excessively heat-seeking. I’d find her on top of the toaster oven, or sitting on our computer equipment. She had also almost completely stopped eating. We could get her to eat a little cold cut turkey (her favorite!) but even that she lost interest in quickly, and vomited much of it up.

On April 7th Simcoe had a urine accident in our bed, which had never happened before. She was clearly distressed by the situation, which, paired with her general lethargy and change in behavior over the preceding weeks, made us decide it was time to let her go. We made an appointment with an in-home vet for Sunday morning. Early Saturday morning I found her on top of our computer rack in my office; when I picked her up she was soaked with urine, having seemingly had an accident again, and yet she hadn’t moved from where she was. She barely fought me when I cleaned her up in the bathtub.

We spent the rest of the day with her. Playing with her favorite toys (string! seagull!) and cat tents, she even took a few bites of cold cut turkey. Saturday night MJ and I took shifts to stay up all night with her. He stayed up until the early morning, waking me up around 5AM so I could spend some final hours with her. She spent much of those final hours sleeping on me, or near me on a heated blanket with Caligula. In the late morning before the vet showed up she perked up and played a bit. I immediately thought we were making a mistake in having her put to sleep, but all the other evidence of her decline outweighed that final playtime.

The vet arrived on time and walked us through the procedure. He was super compassionate and friendly, and remarked at how beautiful she looked. Indeed, many cats are unable to groom themselves effectively when they get to where she was, but she’d always been a super groomer, so she was beautiful even at the end. MJ held her as we sat together on the couch while the vet administered the shots, and we felt her slip away as we petted her. We brought Caligula over so he could say goodbye, though he didn’t seem to understand. The vet took her away in a little basket and she just looked like she was sleeping. He took care of the cremation details and explained that her ashes would be returned to us in a couple weeks.

It’s now been almost four weeks since her passing, and this is still incredibly painful to write. Losing her has been one of those really hard losses. We have her ashes back now, in a beautiful lotus urn. We haven’t yet had the discussion of where to spread them, but I’ve kept it in my thoughts should inspiration arise.

These past few weeks I’ve realized how much we’d adjusted our lives to handle her care. She had a history of vomiting, so we’d been diligent about covering the bed with a waterproof cover as soon as we got up in the morning, and were careful not to leave out papers she could potentially get sick on. Laptops had to remain closed so she wouldn’t sit on the keyboards. There was also the daily care. Every night we made time to give her medication and SubQ fluids. And all the vet visits, at least quarterly. The pet sitter who had to be familiar with administration of SubQ and medications.

But we’d do it all again without a second thought. It was worth it to have her part of our lives for so many wonderful years. It seems like a daunting amount of work, but it really just becomes routine and not too scary.

I learned soon after her diagnosis that many vets don’t work with owners to recommend treatment and instead recommend euthanasia upon diagnosis. I would never judge the vets or owners who choose that path, since treatment does take some work and expense, but I want people to know there’s another way. Simcoe was diagnosed when she was just five years old, and she had another good five years in her beyond her diagnosis. There are great websites, communities and vets who can help CRF cats who are otherwise in good health, especially if they’re on the younger side.

Throughout the disease we kept personal records of her weight and important levels, graphing and sharing them with each blog post.

I’ve put a copy of the spreadsheet with exact levels up here, the BUN and CRE tabs include what the normal values are from the lab (these vary between labs).

We’re super thankful for the staff of All Pets Hospital, particularly Dr. Barr and Dr. Gillespie, who showed so much love and care for her early in her condition, and, as her condition worsened, the welcoming staff at VCA San Francisco Veterinary Specialists and Dr. Maretzki, who walked us through the end stages, changing up her medication regularly and helping us determine the next steps throughout. Our pet-sitter Elaine was also wonderful through all our travel, as she went to great lengths to make sure Simcoe got all her medications, and was also able to take away and donate foods that Simcoe wouldn’t eat throughout her pickier phases.

And much gratitude to friends who understand how painful this has been for me.

The following are some resources we consulted and used throughout her condition:

Tanya’s Comprehensive Guide to Feline Chronic Renal Failure Disease

This website is a treasure trove of information that we consulted on a regular basis throughout her CRF life. They talk about treatment for various symptoms that can crop up, preferred medications for CRF cats, shared stories and gave a lot of information about what you could expect in various situations. The site also included a whole chart of over the counter foods that contained tolerable nutrient balance for a CRF cat in case your cat won’t accept the prescribed renal-specific diet or you were seeking to supplement it. I would often print out sections of these charts to bring to the pet store, but the site author just published a hard copy of the US foods, which I would have bought if it had been available when I was using it. Food-wise we got lucky that Simcoe mostly ate the K/D prescription diet from Hills, but for a while we were adding in some OTC wet food to further increase the amount of liquids she was getting.

Feline Chronic Renal Failure Information Center

This is the first CRF website I found, and while Tanya’s is far more comprehensive, this one was a little less overwhelming as I started out learning about things.

Mailing lists/Yahoo! groups: Chronic Renal Failure Cats and Feline Assisted Feeding

I used these for reference and posted a few times, always receiving kind and thoughtful replies. However, I could never keep up with the volume of emails. Part of that was that it was so sad to read about the struggles people were posting about, which I feared were eventually inevitable for us, and the crushing pain when people would post about the death of their cat. So I mostly let the emails pile up and used these lists when I needed to.

Finally, more photos of Simcoe can be found in this album and her website remains up at simcoec.at.

We love you and miss you, Simcoe. Caligula sends his snuggly love too.

by pleia2 at May 06, 2017 08:48 PM

May 05, 2017

Akkana Peck

Moon Talk at PEEC tonight

Late notice, but Dave and I are giving a talk on the moon tonight at PEEC. It's called Moonlight Sonata, and starts at 7pm. Admission: $6/adult, $4/child (we both prefer giving free talks, but PEEC likes to charge for their Friday planetarium shows, and it all goes to support PEEC, a good cause).

We'll bring a small telescope in case anyone wants to do any actual lunar observing outside afterward, though usually planetarium audiences don't seem very interested in that.

If you're local but can't make it this time, don't worry; the moon isn't a one-time event, so I'm sure we'll give the moon show again at some point.

May 05, 2017 09:26 PM

Eric Hammond

AWS Community Day San Francisco, June 15, 2017

Register today for this free conference with content organized and presented by AWS community experts!

The AWS user community in the Bay Area and (US) West Coast is getting together in San Francisco for a full day of technical content, food, drink, and mixing with other users in the AWS community.

AWS Community Day San Francisco

At AWS Community Day San Francisco, the content is selected and planned by leaders in the AWS community, and all of the speakers are AWS experts from the West Coast community of AWS users, including some AWS User Group leaders and AWS Community Heroes.

Amazon is generously footing the bill for the event space, food, multi-media management, event site hosting, registration, and all the little details that make a big event like this go smoothly, but the content is organized and presented by AWS community leaders and experts, instead of Amazon employees.

“When we learn from the community we learn from others like us. We have a shared perspective as users of AWS. AWS Community Day will offer us a unique opportunity that we cannot get at AWS re:Invent or AWS Summit.”
John Varghese, Leader, Bay Area AWS User Group

This free event is being held at the Marriott Marquis San Francisco on Thursday, June 15, 2017.

Breakfast, lunch, snacks, and happy hour are provided to keep the community fueled for learning and networking. Thanks, Amazon!

“I’m excited to spend a day learning and sharing with the rest of the AWS community. The sessions will be a great mix of practical tips about the services we use constantly, and showcases of new technologies that we aren’t so familiar with. It’s great to hear about best practices from Amazon, but it’ll be even better to learn from our peers’ real world results.”
Ryan Park, Engineering Manager, Slack

There are eight sessions planned in two tracks, with plenty of valuable information for all types of AWS users, and lots of opportunities to meet others in the community and chat about your AWS experiences. The current agenda includes:

  • Security for Complex Networks on AWS (Teri Radichel)
  • And You Thought You Knew EC2 (Ben Whaley)
  • And the CFO Wept: AWS Cost Control (Corey Quinn)
  • Learning AWS the Hard Way (Valentino Volonghi)
  • Lessons Learned After One Year with AWS Lambda (Matt Billock)
  • Open Source: What Works and What Doesn’t (Mike Barrett)
  • Amazon Athena Deep Dive (Kevin Epstein)

Sessions will cover some of the most popular and cutting-edge AWS technologies being used today including: Lambda, Kinesis, DynamoDB, Athena, QuickSight, Systems Manager, Application Load Balancers, and more.

“I’m excited to see how the community is leveraging the tools that Amazon provides, especially the creative uses Amazon didn’t expect!”
Jeremy Edberg, Founder, MinOps Inc

At this conference, there will be no vendor talks, no partner talks, no sponsored talks. The presenters are all AWS users and AWS experts with real life experience in the AWS topics they are presenting.

Who doesn’t want to hear the important lessons learned by an AWS expert who deals with 4 trillion data points per day?

“It’s one thing to learn about AWS from Amazon employees at AWS re:Invent, the (quite comprehensive) docs, or the ocean of online material. It’s something else entirely to learn from folks who have used AWS to solve real world problems for coming up on a decade.”
Ben Whaley, Founder, WhaleTech

You can learn more about, and register for, AWS Community Day San Francisco at the official web site, generously designed and hosted by Amazon:

AWS Community Day San Francisco

Check out the agenda, read the FAQs, then Register Today and join your fellow West Coast AWS community members on June 15 for a great AWS Community Day!

Original article and comments: https://alestic.com/2017/05/aws-community-day-san-francisco/

May 05, 2017 06:30 PM

May 01, 2017

Jono Bacon

Open Community Conference CFP Closes This Week

This coming weekend is the Community Leadership Summit in Austin (I hope to see you all there!), but there is another event I am running which you need to know about.

The Open Community Conference is one of the four main events that is part of the Open Source Summit in Los Angeles from 11th – 13th September 2017. A little while ago the Linux Foundation and I started exploring running a new conference as part of their Open Source Summit, and this has culminated in the Open Community Conference.

The goals of the event are simple: to provide presentations, panels, and networking that share community management and leadership best practice in a practical and applicable way. My main goal is to provide a set of speakers who have built great communities, either internally or externally, and have them share their insights.

This is different to the Community Leadership Summit: CLS is a set of discussions designed for community managers. The Open Community Conference is a traditional conference with presentations, panels, and more, in which methods, approaches, case studies, and more are shared, and it is designed for a broader audience.

Call For Papers

If you have built community in the open source world, or have built internal communities using open source methodologies, I would love to have you submit a paper by clicking here. The call for papers closes THIS WEEK on 6th May 2017, so get them in!

The post Open Community Conference CFP Closes This Week appeared first on Jono Bacon.

by Jono Bacon at May 01, 2017 10:11 PM

April 28, 2017

Jono Bacon

Anonymous Open Source Projects

Today Solomon asked an interesting question on Twitter:

He made it clear he is not advocating for this view, just a thought experiment. I had, well, a few thoughts on this.

I tend to think of open source projects in three broad buckets.

Firstly, we have the overall workflow in which the community works together to build things. This is your code review processes, issue management, translations workflow, event strategy, governance, and other pieces.

Secondly, there are the individual contributions. This is how we assess what we want to build, what quality looks like, how we build modularity, and other elements.

Thirdly, there is identity which covers the identity of the project and the individuals who contribute to it. Solomon taps into this third component.

Identity

While the first two components, workflow and contributions, are clearly important in defining what you want to work on and how you build it, identity is more subtle.

I think identity plays a few different roles at the individual level.

Firstly, it helps to build reputation. Open source communities are at a core level meritocracies: contributions are assessed on their value, and the overall value of the contributor is based on their merits. Now, yes, I know some of you will balk at whether open source communities are actually meritocracies. The thing is, too many people treat “meritocracy” as a framework or model: it isn’t. It is more of a philosophy…a star that we move towards.

It is impossible to build a meritocracy without some form of identity attached to the contribution. We need to have a mapping between each contribution and the same identity that delivered it: this helps that individual build their reputation as they deliver more and more contributions. This also helps them flow from being a new contributor, to a regular, and then to a leader.

This leads to my second point. Identity is also critical for accountability. Now, when I say accountability we tend to think of someone being responsible for their actions. Sure, this is the case, but accountability also plays an important and healthy role in people receiving peer feedback on their work.

According to Google Images search, “accountability” requires lots of fist bumps.

Open source communities are kinda weird places to be. It is easy to forget that (a) joining a community, (b) making a contribution, (c) asking for help, (d) having your contribution critically reviewed, and (e) solving problems, all happens out in the open, for all to see. This can be remarkably weird and stressful for people new to or unfamiliar with open source, and it plays on the cornucopia of human insecurities about looking stupid, embarrassing yourself, etc. While I have never been to one (honest), I imagine this is what it must be like going to a nudist colony: everything out on display, both good and bad.

All of this rolls up to identity playing an important role for building the fabric of a community, for effective peer review, and the overall growth of individual participants (and thus the network effect of the community).

Real Names vs. Handles

If we therefore presume identity is important, do we require that identity to be a real name (e.g. “Jono Bacon”) or a handle (e.g. “MetalDude666”)? – not my actual handle, btw.

We have all said this at some point.

In terms of the areas I presented above such as building reputation, accountability, and peer review, this can all be accomplished if people use handles, under the prerequisite that there is some way of knowing that “MetalDude666” is the same person each time. Many gaming communities have players who build remarkable reputations and accountability and no one knows who they really are, just their handles.

Where things get trickier is assuring the same quality of community experience for those who use real names and those who use handles in the same community. On core infrastructure (such as code hosting, communication channels, websites, etc) this can typically be assured. It can get trickier with areas such as real-world events. For example, if the community has an in-person event, the folks with the handles may not feel comfortable attending so as to preserve their anonymity. Given how key these kinds of events can be to building relationships, it can therefore result in a social/collaborative delta between those with real names and those with handles.

So, in answer to Solomon’s question, I do think identity is critical, but it could be all handles if required. What is key is to either (a) require only handles/real names (which is tough), or (b) provide very careful community strategy and execution to reduce the delta of experience between those with real names and handles (tough, but easier).

So, what do you think, folks? Do you agree with me, or am I speaking nonsense? Can you share great examples of anonymous open source communities? Are there elements I missed in my assessment here? Share them in the comments below!

The post Anonymous Open Source Projects appeared first on Jono Bacon.

by Jono Bacon at April 28, 2017 06:55 PM

April 25, 2017

Akkana Peck

Typing Greek letters

I'm taking a MOOC that includes equations involving Greek letters like epsilon. I'm taking notes online, in Emacs, using the iimage mode tricks for taking MOOC class notes in emacs that I worked out a few years back.

Iimage mode works fine for taking screenshots of the blackboard in the videos, but sometimes I'd prefer to just put the equations inline in my file. At first I was typing out things like E = epsilon * sigma * T^4 but that's silly, and of course the professor isn't spelling out the Greek letters like that when he writes the equations on the blackboard. There's got to be a way to type Greek letters on this US keyboard.

I know how to type things like accented characters using the "Multi key" or "Compose key". In /etc/default/keyboard I have XKBOPTIONS="ctrl:nocaps,compose:menu,terminate:ctrl_alt_bksp" which, among other things, sets the compose key to be my "Menu" key, which I never used otherwise. And there's a file, /usr/share/X11/locale/en_US.UTF-8/Compose, that includes all the built-in compose key sequences. I have a shell function in my .zshrc,

composekey() {
  # Search the system Compose file for sequences matching the given pattern
  grep -i $1 /usr/share/X11/locale/en_US.UTF-8/Compose
}
so I can type something like composekey epsilon and find out how to type specific codes. But that didn't work so well for Greek letters. It turns out this is how you type them:
<dead_greek> <A>            : "Α"   U0391    # GREEK CAPITAL LETTER ALPHA
<dead_greek> <a>            : "α"   U03B1    # GREEK SMALL LETTER ALPHA
<dead_greek> <B>            : "Β"   U0392    # GREEK CAPITAL LETTER BETA
<dead_greek> <b>            : "β"   U03B2    # GREEK SMALL LETTER BETA
<dead_greek> <D>            : "Δ"   U0394    # GREEK CAPITAL LETTER DELTA
<dead_greek> <d>            : "δ"   U03B4    # GREEK SMALL LETTER DELTA
<dead_greek> <E>            : "Ε"   U0395    # GREEK CAPITAL LETTER EPSILON
<dead_greek> <e>            : "ε"   U03B5    # GREEK SMALL LETTER EPSILON
... and so forth. And this <dead_greek> key isn't actually defined in most US/English keyboard layouts: you can check whether it's defined for you with: xmodmap -pke | grep dead_greek

Of course you can use xmodmap to define a key to be <dead_greek>. I stared at my keyboard for a bit, and decided that, considering how seldom I actually need to type Greek characters, I didn't see the point of losing a key for that purpose (though if you want to, here's a thread on how to map <dead_greek> with xmodmap).

I decided it would make much more sense to map it to the compose key with a prefix, like 'g', that I don't need otherwise. I can do that in ~/.XCompose like this:

<Multi_key> <g> <A>            : "Α"   U0391    # GREEK CAPITAL LETTER ALPHA
<Multi_key> <g> <a>            : "α"   U03B1    # GREEK SMALL LETTER ALPHA
<Multi_key> <g> <B>            : "Β"   U0392    # GREEK CAPITAL LETTER BETA
<Multi_key> <g> <b>            : "β"   U03B2    # GREEK SMALL LETTER BETA
<Multi_key> <g> <D>            : "Δ"   U0394    # GREEK CAPITAL LETTER DELTA
<Multi_key> <g> <d>            : "δ"   U03B4    # GREEK SMALL LETTER DELTA
<Multi_key> <g> <E>            : "Ε"   U0395    # GREEK CAPITAL LETTER EPSILON
<Multi_key> <g> <e>            : "ε"   U03B5    # GREEK SMALL LETTER EPSILON
... and so forth.

And now I can type [MENU] g e and a lovely ε appears, at least in any app that supports Greek fonts, which is most of them nowadays.

April 25, 2017 06:57 PM

April 21, 2017

Akkana Peck

Comb Ridge and Cedar Mesa Trip

[House on Fire ruin, Mule Canyon UT] Last week, my hiking group had its annual trip, which this year was to Bluff, Utah, near Comb Ridge and Cedar Mesa, an area particularly known for its Anasazi ruins and petroglyphs.

(I'm aware that "Anasazi" is considered a politically incorrect term these days, though it still seems to be in common use in Utah; it isn't in New Mexico. My view is that I can understand why Pueblo people dislike hearing their ancestors referred to by a term that means something like "ancient enemies" in Navajo; but if they want everyone to switch from using a mellifluous and easy to pronounce word like "Anasazi", they ought to come up with a better, and shorter, replacement than "Ancestral Puebloans." I mean, really.)

The photo at right is probably the most photogenic of the ruins I saw. It's in Mule Canyon, on Cedar Mesa, and it's called "House on Fire" because of the colors in the rock when the light is right.

The light was not right when we encountered it, in late morning around 10 am; but fortunately, we were doing an out-and-back hike. Someone in our group had said that the best light came when sunlight reflected off the red rock below the ruin up onto the rock above it, an effect I've seen in other places, most notably Bryce Canyon, where the hoodoos look positively radiant when seen backlit, because that's when the most reflected light adds to the reds and oranges in the rock.

Sure enough, when we got back to House on Fire at 1:30 pm, the light was much better. It wasn't completely obvious to the eye, but comparing the photos afterward, the difference is impressive: Changing light on House on Fire Ruin.

[Brain man? petroglyph at Sand Island] The weather was almost perfect for our trip, except for one overly hot afternoon on Wednesday. And the hikes were fairly perfect, too -- fantastic ruins you can see up close, huge petroglyph panels with hundreds of different creatures and patterns (and some that could only have been science fiction, like brain-man at left), sweeping views of canyons and slickrock, and the geology of Comb Ridge and the Monument Upwarp.

And in case you read my last article, on translucent windows, and are wondering how those generated waypoints worked: they were terrific, and in some cases made the difference between finding a ruin and wandering lost on the slickrock. I wish I'd had that years ago.

Most of what I have to say about the trip are already in the comments to the photos, so I'll just link to the photo page:

Photos: Bluff trip, 2017.

April 21, 2017 01:28 AM

April 20, 2017

kdub

5 years of Test Driven Development, Visualized

Here’s a 15 minute video covering nearly 5 years of active Mir development in C++.

We (the original Mir team) started the project by reading “Growing Object-Oriented Software, Guided by Tests” by Steve Freeman and Nat Pryce (book site). We followed the philosophy of test-driven development closely after that, really throughout the whole project.

This is a video generated by the program ‘gource’. What you see is that every file or directory is a node. Little avatars, one for every contributor, zoom around the files and zap them when a change is made. If you watch closely, some contributors will fixate on a few files for a bit (bug hunting, maybe?) and sometimes the avatars zap a lot of files in close succession, during feature additions and refactoring.

A few things to point out about TDD in the visualization: at the end, the 3 biggest branches are headers, source code and tests. The relative size of those components remained roughly the same throughout the whole project, which shows that we were adding a test, and then fleshing out the source code (and supporting headers) from there. When an avatar zaps a lot of files at once, there’s always a corresponding zapping of the tests tree and the source tree. Write the test first! Very cool to see how things come together with TDD on a large, sustained, and high quality code base.


by kdub at April 20, 2017 05:03 PM

April 11, 2017

Elizabeth Krumbach

After the last snow of the season

Just over a week ago I returned to San Francisco from a visit back east. I imagine trips back to Philadelphia will grow less noteworthy as we grow more accustomed to having property there as well. Visits may cease having specific reasons and instead just be a part of our lives. This wasn’t one of those trips though; I had a specific agenda going in: to spend the week going through and organizing my mother-in-law’s belongings.

When we laid her to rest back in February things were a bit of a whirlwind. Very little time was spent going through her belongings as we quickly packed up her apartment and dropped all the boxes in our den at the townhouse before getting back to funeral arrangements, other immediate end of life tasks, and time spent with family during our visit. When I arrived in Philadelphia for this trip, I had my work cut out for me.

The last snow of the season was also in recent memory for the area. Though much of the naturally fallen snow had melted, piles remained all over my neighborhood. As the week progressed, a series of rain storms coming through the area and warmer weather meant that by the time my trip came to a close much of the snow was gone. I was fortunate weather-wise with many of my plans though; even the ones that took me outside largely landed on the dry parts of my visit.

I spent Monday-Friday working nominally 9-5 (earlier or later depending on the day and the meetings scheduled). It was a great test of my ability to interact with the team remotely during a normal work week. Fortunately the team is used to being distributed and I have been working from home often even when I’m in town, so it hasn’t been a huge culture shift for any of us. It was also good to get comfortable working in that space, having breakfast and lunches at the townhouse and starting to develop a normal life routine out there instead of feeling like I’m on a trip.


DC/OS Office Hours at work taken upstairs!

Since I was working, it took me the whole trip to get through her 20 or so boxes (excluding clothes), but it wasn’t just about time. I knew this work would be difficult. The loss was still somewhat fresh, and though MJ was just a call or text away, it was still hard going through all of her things on my own. There’s also no denying the personal impact of seeing someone’s life packed into 20 boxes. How many boxes would I end up with? What would my family surviving me do with it all? What is sentimental to me but would be confusing or unimportant to almost anyone else? What makes me happy today but will be a burden-of-stuff to those who come after me? All of this led to a great amount of care and respect as I went through to catalog and repack all of her things, and decide what few items here and there could be donated, which ended up being almost exclusively clothes and linens we had no use for.

While I was there, regular updates also came in from MJ about Simcoe’s rapidly declining health. Not all communication was sad though. I was getting pictures and updates about what was going on in San Francisco, and was able to loop MJ in whenever I had questions or comments about things. I ended up having to bring a few piles of paperwork home with me, but staying in touch was really nice.

To balance the difficulty presented by all this, I also spent time with friends and family. The Sunday following my arrival I took advantage of the nearby train station for the second time since moving into the townhouse to head to downtown Philly. When I lived in the area previously I’d never lived near a rail line, and my use of public transit was rare. As a result, proximity to a regional rail line was not an intended goal of where we ended up buying, but it’s quickly turning into something I value considerably. Living in the bay area has turned me into quite the rail and public transit fan. In the past six months the amount of time I’ve spent on Philly public transit has rivaled what I experienced while living in the area. City life here in San Francisco has also reduced my apprehension about driving in cities, but I’m still not super keen on dealing with traffic or parking once I get near my destination downtown, and I actively enjoy train rides. The line I take down to the city runs hourly and takes 40 minutes to drop me off at Market East station, nearly matching what I’d get driving once traffic and parking are considered.

The end of my train ride brought me to lunch with my friend Tom at Bareburger. They had a surprisingly option-filled menu for a burger place; I think I’d go back for their milkshakes alone, and I don’t think I’d be able to resist adding duck bacon to my burger. It was a pleasure to catch up. Tom was one of those friends who I met through LiveJournal well over a decade ago, and in spite of living in proximity to each other for years, our in-person time was quite limited.

This trip also afforded me the opportunity to have dinner with my friend Jace. We hadn’t seen each other in probably eight years, but he lives not too far from the townhouse and we’ve kept in touch online. He’s also the designer who came up with the last two iterations of the main page of princessleia.com and we’ve both published books in the past year, leading to piles of options for discussion. Given his proximity to our new place I hope we can make more time to hang out in the coming years, it was nice to reconnect.

Some move-in work progressed on the townhouse as well, as we customize it to our liking. My brother-in-law came over to do some wall excavation off of our garage to see if a closet could be put in under the stairs. Success! The void we speculated about does indeed exist, and we’ll be working with him on a quote to do the formal build-out work in the coming months. After the wall work, I joined him, his mother and my father-in-law for a wonderful dinner at the nearby Uzbekistan Restaurant on Bustleton Ave in Philadelphia. It was my first time there, but after enjoying their culinary delights, it won’t be my last.

On Wednesday I met up with my friends Crissi and Nita to see Beauty and the Beast for what was the second time for all three of us. We went to one of the new theaters that serves dinner along with the movie due to time constraints with Nita’s pre-surgical eating schedule, and then met afterwards for dessert elsewhere so we could catch up and actually talk.

And pre-surgical? Nita was having a procedure done the following morning. In spite of her living thirty minutes from my place, fate would have it that the center she went to for the procedure was just a couple miles from the townhouse. On Thursday I headed over right after work to spend a few hours with her and several folks who dropped in to visit her throughout the evening. When she was discharged the following afternoon, she and her sister came over to my place to spend the night so she wouldn't have the thirty minute drive home so soon after surgery. I really enjoyed the company, making the first proper meal for more than just myself with our new pots and pans for dinner (spaghetti!), and a spread of omelettes the following morning. We also engaged in a Pirates of the Caribbean marathon, making our way through three of the movies, since I'd somehow neglected to see any of them.

Time constraints got in the way of plans to visit some of my friends in New Jersey on Saturday, which I'm disappointed about, but it couldn't be avoided. My final day in town was Sunday, spent with yet another friend, making our way down to Delaware for a vintage toy show, and then taking some time before my flight for a walk in a local park where we could enjoy the weather and talk. It was the most beautiful day of my trip, and though it wasn't particularly warm, the temperatures in the 50s made for a San Francisco-like feel that I have come to really enjoy taking walks in.

I haven’t had the easiest time over the past few months, and this visit definitely continued in the vein of complicated emotions. Rebuilding the in-person relationships that had largely shifted to being online since I moved away has brought some peace to what has been a difficult time. I’m incredibly grateful for the wonderful people I have in my life, and am reminded that of those finally organized boxes that my mother-in-law left behind, 20% of what I went through was photos. Photos of friends, family, various moments in her life that she held on to through the years. In my often work-focused life, it was a good reminder of what is important besides what we accomplish professionally. I need the people I love and care for to really make me feel whole.

And upon returning home, MJ met me at the airport with roses. I love and am loved, so much to still be grateful for.

by pleia2 at April 11, 2017 02:07 AM

April 10, 2017

Elizabeth Krumbach

Simcoe loved

Yesterday we had to let go of our precious Simcoe. She was almost ten and a half years old, and had spent the past five and a half years undergoing treatment for her Chronic Renal Failure (CRF).

I’ll be doing a final medical post that has details about her care over the years and how her levels looked as the disease progressed, but these very painful past twenty-four hours have reminded me of so many of the things our little kitty loved, the things that made her the sweet, loving, fun critter she was. So this post is just a simple one.

Simcoe loved her seagull on a stick, I had to covertly buy an identical one when her old one broke

Simcoe loved being a country cat, hunting bugs and watching chipmunks at the house in Pennsylvania

Simcoe loved being a city cat, staring down at cars and people from the high rise window sill in San Francisco

Simcoe loved little cat tents and houses

Simcoe loved sitting on our laps

Simcoe loved cold cut turkey

Simcoe loved bringing toys on to the bed so we would play with her

Simcoe loved snuggling Caligula

Simcoe loved being inside boxes

Simcoe loved sitting in the sun

Simcoe loved popcorn

Simcoe loved sitting on books and magazines I was trying to read

Simcoe loved string

Simcoe loved having a perfectly groomed coat of fur

Simcoe loved sitting on suitcases

Simcoe loved her Millennium Falcon on a stick

Simcoe loved climbing on top of boxes

Simcoe loved paper bags

Simcoe loved talking

Simcoe loved sleeping on our warm laptops if we left them open

Simcoe loved living, which made this all so much harder

Simcoe loved us, and we loved her so very much

by pleia2 at April 10, 2017 07:25 PM

April 06, 2017

Akkana Peck

Clicking through a translucent window: using X11 input shapes

It happened again: someone sent me a JPEG file with an image of a topo map, with a hiking trail and interesting stopping points drawn on it. Better than nothing. But what I really want on a hike is GPX waypoints that I can load into OsmAnd, so I can see whether I'm still on the trail and how to get to each point from where I am now.

My PyTopo program lets you view the coordinates of any point, so you can make a waypoint from that. But for adding lots of waypoints, that's too much work, so I added an "Add Waypoint" context menu item -- that was easy, took maybe twenty minutes. PyTopo already had the ability to save its existing tracks and waypoints as a GPX file, so no problem there.

[transparent image viewer overlaid on top of topo map] But how do you locate the waypoints you want? You can do it the hard way: show the JPEG in one window, PyTopo in the other, and do the "let's see, the road bends left then right, and the point is off to the northwest just above the right bend and about two and a half times as far away as the distance through both road bends". Ugh. It takes forever and it's terribly inaccurate.

More than once, I've wished for a way to put up a translucent image overlay that would let me click through it. So I could see the image, line it up with the map in PyTopo (resizing as needed), then click exactly where I wanted waypoints.

I needed two features beyond what normal image viewers offer: translucency, and the ability to pass mouse clicks through to the window underneath.

A translucent image viewer, in Python

The first part, translucency, turned out to be trivial. In a class inheriting from my Python ImageViewerWindow, I just needed to add this line to the constructor:

    self.set_opacity(.5)
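
If you want to try translucency outside of an image viewer, here's a minimal standalone sketch (my assumption: PyGTK 2.x, the same toolkit the rest of this code uses):

    import gtk

    # A bare window that should render half-transparent,
    # provided a compositor is running (see below):
    win = gtk.Window()
    win.set_title("translucency test")
    win.set_opacity(.5)
    win.add(gtk.Label("Half-transparent window"))
    win.connect("destroy", gtk.main_quit)
    win.show_all()
    gtk.main()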

Plus one more step. The window was translucent now, but it didn't look translucent, because I'm running a simple window manager (Openbox) that doesn't have a compositor built in. Turns out you can run a compositor on top of Openbox. There are lots of compositors; the first one I found, which worked fine, was xcompmgr -c -t-6 -l-6 -o.1

The -c specifies client-side compositing. -t and -l specify top and left offsets for window shadows (negative so they go on the bottom right). -o.1 sets the opacity of window shadows. In the long run, -o0 is probably best (no shadows at all) since the shadow interferes a bit with seeing the window under the translucent one. But having a subtle .1 shadow was useful while I was debugging.

That's all I needed: voilà, translucent windows. Now on to the (much) harder part.

A click-through window, in C

X11 has something called the SHAPE extension, which I experimented with once before to make a silly program called moonroot. It's also used for the familiar "xeyes" program. It's used to make windows that aren't square, by passing a shape mask telling X what shape you want your window to be. In theory, I knew I could do something like make a mask where every other pixel was transparent, which would simulate a translucent image, and I'd at least be able to pass clicks through on half the pixels.

But fortunately, first I asked the estimable Openbox guru Mikael Magnusson, who tipped me off that the SHAPE extension also allows for an "input shape" that does exactly what I wanted: lets you catch events on only part of the window and pass them through on the rest, regardless of which parts of the window are visible.

Knowing that was great. Making it work was another matter. Input shapes turn out to be something hardly anyone uses, and there's very little documentation.

In both C and Python, I struggled with drawing onto a pixmap and using it to set the input shape. Finally I realized that there's a call to set the input shape from an X region. It's much easier to build a region out of rectangles than to draw onto a pixmap.

I got a C demo working first. The essence of it was this:

    if (!XShapeQueryExtension(dpy, &shape_event_base, &shape_error_base)) {
        printf("No SHAPE extension\n");
        return;
    }

    /* Make a shaped window, a rectangle smaller than the total
     * size of the window. The rest will be transparent.
     */
    region = CreateRegion(outerBound, outerBound,
                          XWinSize-outerBound*2, YWinSize-outerBound*2);
    XShapeCombineRegion(dpy, win, ShapeBounding, 0, 0, region, ShapeSet);
    XDestroyRegion(region);

    /* Make a frame region.
     * So in the outer frame, we get input, but inside it, it passes through.
     */
    region = CreateFrameRegion(innerBound);
    XShapeCombineRegion(dpy, win, ShapeInput, 0, 0, region, ShapeSet);
    XDestroyRegion(region);

CreateRegion sets up rectangle boundaries, then creates a region from those boundaries:

Region CreateRegion(int x, int y, int w, int h) {
    Region region = XCreateRegion();
    XRectangle rectangle;
    rectangle.x = x;
    rectangle.y = y;
    rectangle.width = w;
    rectangle.height = h;
    XUnionRectWithRegion(&rectangle, region, region);

    return region;
}

CreateFrameRegion() is similar but a little longer. Rather than post it all here, I've created a GIST: transregion.c, demonstrating X11 shaped input.

Next problem: once I had shaped input working, I could no longer move or resize the window, because the window manager passed events through the window's titlebar and decorations as well as through the rest of the window. That's why you'll see that CreateFrameRegion call in the gist -- I had a theory that if I omitted the outer part of the window from the input shape, and handled input normally around the outside, maybe that would extend to the window manager decorations. But the problem turned out to be a minor Openbox bug, which Mikael quickly tracked down (in openbox/frame.c, in the XShapeCombineRectangles call on line 321, change ShapeBounding to kind). Openbox developers are the greatest!

Input Shapes in Python

Okay, now I had a proof of concept: X input shapes definitely can work, at least in C. How about in Python?

There's a set of python-xlib bindings, and they even support the SHAPE extension, but they have no documentation and didn't seem to include input shapes. I filed a GitHub issue and traded a few notes with the maintainer of the project. It turned out the newest version of python-xlib had been completely rewritten, and supposedly does support input shapes. But the API is completely different from the C API, and after wasting about half a day tweaking the demo program trying to reverse engineer it, I gave up.

Fortunately, it turns out there's a much easier way. Python-gtk has shape support, even including input shapes. And if you use regions instead of pixmaps, it's this simple:

    if self.is_composited():
        region = gtk.gdk.region_rectangle(gtk.gdk.Rectangle(0, 0, 1, 1))
        self.window.input_shape_combine_region(region, 0, 0)

My transimageviewer.py came out nice and simple, inheriting from imageviewer.py and adding only translucency and the input shape.
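
In case it helps to see the two pieces together, here's a rough sketch of how such a subclass might look -- hypothetical names, modeled on the snippets above rather than copied from the actual file:

    import gtk

    class TransImageViewerWindow(ImageViewerWindow):
        def __init__(self, *args, **kwargs):
            ImageViewerWindow.__init__(self, *args, **kwargs)
            # Translucency:
            self.set_opacity(.5)
            # The input shape needs a GDK window, so wait until we're mapped:
            self.connect("map-event", self.make_click_through)

        def make_click_through(self, widget, event):
            if self.is_composited():
                # A 1x1 input region: clicks land on the window
                # below almost everywhere.
                region = gtk.gdk.region_rectangle(gtk.gdk.Rectangle(0, 0, 1, 1))
                self.window.input_shape_combine_region(region, 0, 0)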

If you want to define an input shape based on pixmaps instead of regions, it's a bit harder and you need to use the Cairo drawing API. I never got as far as working code, but I believe it should go something like this:

    # Warning: untested code!
    # (Also requires: import math, cairo)
    bitmap = gtk.gdk.Pixmap(None, self.width, self.height, 1)
    cr = bitmap.cairo_create()
    # Clear the whole mask first, so everything passes input through:
    cr.rectangle(0, 0, self.width, self.height)
    cr.set_operator(cairo.OPERATOR_CLEAR)
    cr.fill()

    # Draw a white filled circle: the only area that will catch events
    cr.arc(self.width / 2, self.height / 2, self.width / 4,
           0, 2 * math.pi)
    cr.set_operator(cairo.OPERATOR_OVER)
    cr.fill()

    self.window.input_shape_combine_mask(bitmap, 0, 0)

The translucent image viewer worked just as I'd hoped. I was able to take a JPG of a trailmap, overlay it on top of a PyTopo window, scale the JPG using the normal Openbox window manager handles, then right-click on top of trail markers to set waypoints. When I was done, a "Save as GPX" in PyTopo and I had a file ready to take with me on my phone.

April 06, 2017 11:08 PM

April 05, 2017

Jono Bacon

Canonical Refocus

I wrote this on G+, but it seemed appropriate to share it here too:

So, today Canonical decided to refocus their business and move away from convergence and devices. This means that the Ubuntu desktop will move back to GNOME.

I have seen various responses to this news. Some sad that it is the end of an era, and a non-zero amount of “we told you so” smugness.

While Unity didn’t pan out, and there were many good steps and missteps along the way, I am proud that Canonical tried to innovate. Innovation is tough and fraught with risk. The Linux desktop has always been a tough nut to crack, and one filled with an army of voices, but I am proud Canonical gave it a shot even if it didn’t achieve its ultimate goals. That spirit of experimentation is at the epicenter of open source, and I hope everyone involved here takes a good look at how they contributed to and exacerbated this innovation. I know I have looked inwards at this.

Much as some critics may deny, everyone I know who worked on Unity and Mir, across engineering, product, community, design, translations, QA, and beyond did so with big hearts and open minds. I just hope we see that talent and passion continue to thrive and we continue to see Ubuntu as a powerful driver for the Linux desktop. I am excited to see how this work manifests in GNOME, which has been doing some awesome work in recent years.

And, Mark, Jane, I know this will have been a tough decision to come to, and this will be a tough day for the different teams affected. Hang in there: Ubuntu has had such a profound impact on open source and while the future path may be a little different, I am certain it will be fruitful.

The post Canonical Refocus appeared first on Jono Bacon.

by Jono Bacon at April 05, 2017 10:12 PM

March 31, 2017

Akkana Peck

Show mounted filesystems

Used to be that you could see your mounted filesystems by typing mount or df. But with modern Linux kernels, all sorts of things are implemented as virtual filesystems -- proc, /run, /sys/kernel/security, /dev/shm, /run/lock, /sys/fs/cgroup -- I have no idea what most of these things are, except that they make it much more difficult to answer questions like "Where did that ebook reader mount, and did I already unmount it so it's safe to unplug it?" Neither mount nor df has a simple option to get rid of all the extraneous virtual filesystems and only show real filesystems.

http://unix.stackexchange.com/questions/177014/showing-only-interesting-mount-points-filtering-non-interesting-types had some suggestions that got me started:

mount -t ext3,ext4,cifs,nfs,nfs4,zfs
mount | grep -E --color=never  '^(/|[[:alnum:]\.-]*:/)'

Another answer there says it's better to use findmnt --df, but that still shows all the tmpfs entries (findmnt --df | grep -v tmpfs might do the job).

And real mounts are always mounted on a filesystem path starting with /, so you can do mount | grep '^/'.

But it also turns out that mount will accept a blacklist of types as well as a whitelist: -t notype1,notype2... I prefer the idea of excluding a blacklist of filesystem types versus restricting it to a whitelist; that way if I mount something unusual like curlftpfs that I forgot to add to the whitelist, or I mount a USB stick with a filesystem type I don't use very often (ntfs?), I'll see it.

On my system, this was the list of types I had to disable (sheesh!):

mount -t nosysfs,nodevtmpfs,nocgroup,nomqueue,notmpfs,noproc,nopstore,nohugetlbfs,nodebugfs,nodevpts,noautofs,nosecurityfs,nofusectl

df is easier: like findmnt, it excludes most of those filesystem types to begin with, so there are only a few you need to exclude:

df -hTx tmpfs -x devtmpfs -x rootfs

Obviously I don't want to have to type either of those commands every time I want to check my mount list. So I put this in my .zshrc. If you call mount or df with no args, it applies the filters; otherwise it passes your arguments through. Of course, you could make a similar alias for findmnt.

# Mount and df are no longer useful to show mounted filesystems,
# since they show so much irrelevant crap now.
# Here are ways to clean them up:
mount() {
    if [[ $# -ne 0 ]]; then
        /bin/mount $*
        return
    fi

    # Else called with no arguments: we want to list mounted filesystems.
    /bin/mount -t nosysfs,nodevtmpfs,nocgroup,nomqueue,notmpfs,noproc,nopstore,nohugetlbfs,nodebugfs,nodevpts,noautofs,nosecurityfs,nofusectl
}

df() {
    if [[ $# -ne 0 ]]; then
        /bin/df $*
        return
    fi

    # Else called with no arguments: we want to list mounted filesystems.
    /bin/df -hTx tmpfs -x devtmpfs -x rootfs
}

Update: Chris X Edwards suggests lsblk or lsblk -o 'NAME,MOUNTPOINT'. It wouldn't have solved my problem because it only shows /dev devices, not virtual filesystems like sshfs, but it's still a command worth knowing about.

March 31, 2017 06:25 PM

March 26, 2017

Nathan Haines

Winners of the Ubuntu 17.04 Free Culture Showcase

Spring is here and the release of Ubuntu 17.04 is just around the corner. I've been using it for two weeks and I can't say I'm disappointed! But one new feature that never disappoints me is the appearance of the community wallpapers that were selected from the Ubuntu Free Culture Showcase!

Every cycle, talented artists around the world create media and release it under licenses that encourage sharing and adaptation. For Ubuntu 17.04, 96 images were submitted to the Ubuntu 17.04 Free Culture Showcase photo pool on Flickr, where all eligible submissions can be found.

But now the results are in: the top choices were voted on by certain members of the Ubuntu community, and I'm proud to announce the winning images that will be included in Ubuntu 17.04:

A big congratulations to the winners, and thanks to everyone who submitted a wallpaper. You can find these wallpapers (along with dozens of other stunning wallpapers) today at the links above, or in your desktop wallpaper list after you upgrade or install Ubuntu 17.04 on April 13th.

March 26, 2017 08:35 AM

March 25, 2017

Akkana Peck

Reading keypresses in Python

As part of preparation for Everyone Does IT, I was working on a silly hack to my Python script that plays notes and chords: I wanted to use the computer keyboard like a music keyboard, and play different notes when I press different keys. Obviously, in a case like that I don't want line buffering -- I want the program to play notes as soon as I press a key, not wait until I hit Enter and then play the whole line at once. In Unix that's called "cbreak mode".

There are a few ways to do this in Python. The most straightforward way is to use the curses library, which is designed for console based user interfaces and games. But importing curses is overkill just to do key reading.

Years ago, I found a guide on the official Python Library and Extension FAQ: Python: How do I get a single keypress at a time?. I'd even used it once, for a one-off Raspberry Pi project that I didn't end up using much. I hadn't done much testing of it at the time, but trying it now, I found a big problem: it doesn't block.

Blocking is whether the read() waits for input or returns immediately. If I read a character with c = sys.stdin.read(1) but there's been no character typed yet, a non-blocking read will throw an IOError exception, while a blocking read will wait, not returning until the user types a character.

In the code on that Python FAQ page, blocking looks like it should be optional. This line:

fcntl.fcntl(fd, fcntl.F_SETFL, oldflags | os.O_NONBLOCK)
is the part that requests non-blocking reads. Skipping that should let me read characters one at a time, blocking until each character is typed. But in practice, it doesn't work. If I omit the O_NONBLOCK flag, reads never return, not even if I hit Enter; if I set O_NONBLOCK, the read immediately raises an IOError. So I have to call read() over and over, spinning the CPU at 100% while I wait for the user to type something.

The way this is supposed to work is documented in the termios man page. Part of what tcgetattr returns is something called the cc structure, which includes two members called Vmin and Vtime. man termios is very clear on how they're supposed to work: for blocking, single character reads, you set Vmin to 1 (that's the number of characters you want it to batch up before returning), and Vtime to 0 (return immediately after getting that one character). But setting them in Python with tcsetattr doesn't make any difference.
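
For reference, here is roughly how those fields map onto Python's termios API -- a sketch of what, according to the man page, should give blocking single-character reads, though as I said, it made no difference in practice:

    import sys, termios

    fd = sys.stdin.fileno()
    attrs = termios.tcgetattr(fd)
    # attrs[3] is the lflag word; attrs[6] is the cc array:
    attrs[3] = attrs[3] & ~(termios.ICANON | termios.ECHO)
    attrs[6][termios.VMIN] = 1     # return after at least one character
    attrs[6][termios.VTIME] = 0    # no inter-character timeout
    termios.tcsetattr(fd, termios.TCSANOW, attrs)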

(Python also has a module called tty that's supposed to simplify this stuff, and you should be able to call tty.setcbreak(fd). But that didn't work any better than termios: I suspect it just calls termios under the hood.)

But after a few hours of fiddling and googling, I realized that even if Python's termios can't block, there are other ways of blocking on input. The select system call lets you wait on any file descriptor until it has input. So I should be able to set stdin to be non-blocking, then do my own blocking by waiting for it with select.

And that worked. Here's a minimal example:

import sys, os
import termios, fcntl
import select

fd = sys.stdin.fileno()

# Save the original terminal settings first, so they can be restored on exit:
oldterm = termios.tcgetattr(fd)
oldflags = fcntl.fcntl(fd, fcntl.F_GETFL)

# Turn off canonical (line-buffered) mode and echo:
newattr = termios.tcgetattr(fd)
newattr[3] = newattr[3] & ~termios.ICANON
newattr[3] = newattr[3] & ~termios.ECHO
termios.tcsetattr(fd, termios.TCSANOW, newattr)

# Make stdin non-blocking; select() will do the blocking for us:
fcntl.fcntl(fd, fcntl.F_SETFL, oldflags | os.O_NONBLOCK)

print "Type some stuff"
while True:
    # Block until stdin has input, then read whatever is there:
    inp, outp, err = select.select([sys.stdin], [], [])
    c = sys.stdin.read()
    if c == 'q':
        break
    print "-", c

# Reset the terminal:
termios.tcsetattr(fd, termios.TCSAFLUSH, oldterm)
fcntl.fcntl(fd, fcntl.F_SETFL, oldflags)

A less minimal example: keyreader.py, a class to read characters, with blocking and echo optional. It also cleans up after itself on exit, though most of the time that seems to happen automatically when I exit the Python script.
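
If you'd rather roll your own, here's a sketch of how the minimal example above might be wrapped up in a class -- hypothetical, and the real keyreader.py's API may differ in the details:

    import sys, os, termios, fcntl, select

    class KeyReader(object):
        def __init__(self, echo=False):
            self.fd = sys.stdin.fileno()
            # Save state so reset() can restore it:
            self.oldterm = termios.tcgetattr(self.fd)
            self.oldflags = fcntl.fcntl(self.fd, fcntl.F_GETFL)
            newattr = termios.tcgetattr(self.fd)
            newattr[3] = newattr[3] & ~termios.ICANON
            if not echo:
                newattr[3] = newattr[3] & ~termios.ECHO
            termios.tcsetattr(self.fd, termios.TCSANOW, newattr)
            fcntl.fcntl(self.fd, fcntl.F_SETFL,
                        self.oldflags | os.O_NONBLOCK)

        def getch(self):
            # Block with select, then read whatever is waiting:
            select.select([sys.stdin], [], [])
            return sys.stdin.read()

        def reset(self):
            termios.tcsetattr(self.fd, termios.TCSAFLUSH, self.oldterm)
            fcntl.fcntl(self.fd, fcntl.F_SETFL, self.oldflags)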

March 25, 2017 06:42 PM

March 24, 2017

Jono Bacon

My Move to ProsperWorks CRM and Feature Requests

As some of you will know, I am a consultant that helps companies build internal and external communities, collaborative workflow, and teams. Like any consultant, I have different leads that I need to manage, I convert those leads into opportunities, and then I need to follow up and convert them into clients.

Managing my time is one of the most critical elements of what I do. I want to maximize my time to be as valuable as possible, so optimizing this process is key. Thus, the choice of CRM has been an important one. I started with Insightly, but it lacked a key requirement: integration.

I hate duplicating effort. I spend the majority of my day living in email, so when a conversation kicks off as a lead or opportunity, I want to magically move that from my email to the CRM. I want to be able to see and associate conversations from my email in the CRM. I want to be able to see calendar events in my CRM. Most importantly, I don’t want to be duplicating content from one place to another. Sure, it might not take much time, but the reality is that I am just going to end up not doing it.

Evaluations

So, I evaluated a few different platforms, with a strong bias to SalesforceIQ. The main attraction there was the tight integration with my email. The problem with SalesforceIQ is that it is expensive, it has limited integration beyond email, and it gets significantly more expensive when you want more control over your pipeline and reporting. SalesforceIQ has the notion of “lists” where each is kind of like a filtered spreadsheet view. For the basic plan you get one list; beyond that you have to move up a plan, which gets you more lists but also gets much more expensive.

As I courted different solutions I stumbled across ProsperWorks. I had never heard of it, but there were a number of features that I was attracted to.

ProsperWorks

Firstly, ProsperWorks really focuses on tight integration with Google services. Now, a big chunk of my business is using Google services. ProsperWorks integrates with Gmail, but also Google Calendar, Google Docs, and other services.

They ship a Gmail plugin which makes it simple to click on a contact and add them to ProsperWorks. You can then create an opportunity from that contact with a single click. Again, this is from my email: this immediately offers an advantage to me.

ProsperWorks CRM

Yep, that’s not my Inbox. It is an image yanked off the Internet.

When viewing each opportunity, ProsperWorks will then show associated Google Calendar events and I can easily attach Google Docs documents or other documents there too. The opportunity is presented as a timeline with email conversations listed there, but then integrated note-taking for meetings, and other elements. It makes it easy to summarize the scope of the deal, add the value, and add all related material. Also, adding additional parties to the deal is simple because ProsperWorks knows about your contacts as it sucks it up from your Gmail.

While the contact management piece is less important to me, it is also nice that it brings in related accounts for each contact automatically such as Twitter, LinkedIn, pictures, and more. Again, this all reduces the time I need to spend screwing around in a CRM.

Managing opportunities across the pipeline is simple too. I can define my own stages and then it basically operates like Trello and you just drag cards from one stage to another. I love this. No more selecting drop down boxes and having to save contacts. I like how ProsperWorks just gets out of my way and lets me focus on action.

…also not my pipeline view. Thanks again Google Images!

I also love that I can order these stages based on “inactivity”. Because ProsperWorks integrates email into each opportunity, it knows how many inactive days there have been since I engaged with an opportunity. This means I can (a) sort my opportunities based on inactivity so I can keep on top of them easily, and (b) set reminders to add tasks when I need to follow up.

ProsperWorks CRM

The focus on inactivity is hugely helpful when managing lots of concurrent opportunities.

As I was evaluating ProsperWorks, there was one additional element that really clinched it for me: the design.

ProsperWorks looks and feels like a Google application. It uses material design, and it is sleek and modern. It doesn’t just look good, but it is smartly designed in terms of user interaction. It is abundantly clear that whoever does the interaction and UX design at ProsperWorks is doing an awesome job, and I hope someone there cuts this paragraph out and shows it to them. If they do, you kick ass!

Of course, ProsperWorks does a load of other stuff that is helpful for teams, but I am primarily assessing this from a sole consultant’s perspective. In the end, I pulled the trigger and subscribed, and I am delighted that I did. It provides a great service, is more cost efficient than the alternatives, provides an integrated solution, and the company looks like they are doing neat things.

Feature Requests

While I dig ProsperWorks, there are some things I would love to encourage the company to focus on. So, ProsperWorks folks, if you are reading this, I would love to see you focus on the following. If some of these already exist, let me know and I will update this post. Consider me a resource here: happy to talk to you about these ideas if it helps.

Wider Google Calendar integration

Currently the gcal integration is pretty neat. One limitation though is that it depends on a gmail.com domain. As such, calendar events where someone invites my jonobacon.com email don’t automatically get added to the opportunity (and dashboard). It would be great to be able to associate another email address with an account (e.g. a gmail.com and jonobacon.com address) so when calendar events have either or both of those addresses they are sucked into opportunities. It would also be nice to select which calendars are viewed: I use different calendars for different things (e.g. one calendar for booked work, one for prospect meetings, one for personal, etc). Feature Request Link

It would also be great to have ProsperWorks be able to ease scheduling calendar meetings in available slots. I want to be able to talk to a client about scheduling a call, click a button in the opportunity, and ProsperWorks will tell me four different options for call times, I can select which ones I am interested in, and then offer these times to the client, where they can pick one. ProsperWorks knows my calendar, this should be doable, and would be hugely helpful. Feature Request Link

Improve the project management capabilities

I have a dream. I want my CRM to also offer simple project management capabilities. ProsperWorks does have a ‘projects’ view, but I am unclear on the point of it.

What I would love to see is simple project tracking which integrates (a) the ability to set milestones with deadlines and key deliverables, and (b) Objective Key Results. This would be huge: I could agree on a set of work complete with deliverables as part of an opportunity, and then with a single click be able to turn this into a project where the milestones would be added and I could assign tasks, track notes, and even display a burndown chart to see how on track I am within a project. Feature Request Link

This doesn’t need to be a huge project management system, just a simple way of adding milestones, their child tasks, tracking deliverables, and managing work that leads up to those deliverables. Even if ProsperWorks just adds simple Evernote functionality where I can attach a bunch of notes to a client, this would be hugely helpful.

Optimize or Integrate Task Tracking

Tracking tasks is an important part of my work. The gold standard for task tracking is Wunderlist. It makes it simple to add tasks (not all tasks need deadlines), and I can access them from anywhere.

I would love ProsperWorks to either offer that simplicity of task tracking (hit a key, whack in a title for a task, and optionally add a deadline instead of picking an arbitrary deadline that it nags me about later), or integrate with Wunderlist directly. Feature Request Link

Dashboard Configurability

I want my CRM dashboard to be something I look at every day. I want it to tell me what calendar events I have today, which opportunities I need to follow up with, what tasks I need to complete, and how my overall pipeline is doing. ProsperWorks does some of this, but doesn’t allow me to configure this view. For example, I can’t get rid of the ‘Invite Team Members’ box, which is entirely irrelevant to me as an individual consultant. Feature Request Link

So, all in all, nice work, ProsperWorks! I love what you are doing, and I love how you are innovating in this space. Consider me a resource: I want to see you succeed!

UPDATE: Updated with feature request links.

The post My Move to ProsperWorks CRM and Feature Requests appeared first on Jono Bacon.

by Jono Bacon at March 24, 2017 05:13 PM

March 23, 2017

Jono Bacon

Community Leadership Summit 2017: 6th – 7th May in Austin

The Community Leadership Summit is taking place on the 6th – 7th May 2017 in Austin, USA.

The event brings together community managers and leaders, projects, and initiatives to share and learn how we build strong, engaging, and productive communities. The event takes place the weekend before OSCON in the same venue, the Austin Convention Center. It is entirely FREE to attend and welcomes everyone, whether you are a community veteran or just starting out on your journey!

The event is broken into three key components.

Firstly, we have an awesome set of keynotes this year:

Secondly, the bulk of the event is an unconference where the attendees volunteer session ideas and run them. Each session is a discussion where the topic is explored and debated, and we reach final conclusions. This results in a hugely diverse range of sessions covering topics such as event management, outreach, social media, governance, collaboration, diversity, building contributor programs, and more. These discussions are incredible for exploring and learning new ideas, meeting interesting people, building a network, and developing friendships.

Finally, we have social events on both evenings where you can meet and network with other attendees. Food and drinks are provided by data.world and Mattermost. Thanks to both for their awesome support!

Join Us

The Community Leadership Summit is entirely FREE to attend. If you would like to join, we would appreciate it if you could register (this helps us with expected numbers). I look forward to seeing you there in Austin on the 6th – 7th May 2017!

The post Community Leadership Summit 2017: 6th – 7th May in Austin appeared first on Jono Bacon.

by Jono Bacon at March 23, 2017 04:40 PM

March 22, 2017

Elizabeth Krumbach

Your own Zesty Zapus

As we quickly approach the release of Ubuntu 17.04, Zesty Zapus, coming up on April 13th, you may be thinking of how you can mark this release.

Well, thanks to Tom Macfarlane of the Canonical Design Team you have one more goodie in your toolkit, the SVG of the official Zapus! It’s now been added to the Animal SVGs section of the Official Artwork page on the Ubuntu wiki.

Zesty Zapus

Download the SVG version for printing or using in any other release-related activities from the wiki page or directly here.

Over here, I’m also all ready with the little “zapus” I picked up on Amazon.

Zesty Zapus toy

by pleia2 at March 22, 2017 04:01 AM

March 21, 2017

Elizabeth Krumbach

SCaLE 15x

At the beginning of March I had the pleasure of heading down to Pasadena, California for SCaLE 15x. Just like last year, MJ also came down for work so it was fun syncing up with him here and there between going off to our respective meetings and meals.

I arrived on the evening of March 1st and went out with my co-organizer of the Open Source Infrastructure Day to pick up some supplies for the event. I hope to write up a toolkit for running one of these days based on our experiences and what we needed to buy, but that will have to wait for another day.

March 2nd is when things began properly and we got busy! I spent most of my day running the Open Source Infrastructure day, which I wrote about here on opensource.com: How to grow healthy open source project infrastructures.

I also spent an hour over at the UbuCon Summit giving a talk on Xubuntu which I already blogged about here. Throughout the day I also handled the Twitter accounts for both @OpenSourceInfra and @ubuntu_us_ca. This was deceptively exhausting; by Thursday night I was ready to crash, but we had a dinner to go to! Speakers, organizers and other key folks who were part of our Open Source Infrastructure day were treated to a meal by IBM.


Photo thanks to SpamapS (source)

Keynotes for the conference on Saturday and Sunday were both thoughtful, future-thinking talks about the importance of open source software, culture and methodologies in our world today. On Saturday we heard from Astrophysicist Christine Corbett Moran, who among her varied accomplishments has done research in Antarctica and led security-focused development of the now wildly popular Signal app for iOS. She spoke on the relationships between our development of software and the communities we’re building in the open. There is much to learn and appreciate in this process, but also more that we can learn from other communities. Slides from her talk, amusingly constructed as a series of tweets (some real, most not), are available as a pdf on the talk page.


Christine Corbett Moran on “Open Source Software as Activism”

In Karen Sandler’s keynote she looked at much of what is going on in the United States today and seriously questioned her devotion to free software when it seems like there are so many other important causes out there to fight for. She came back to free software though, since it’s such an important part of every aspect of our lives. As technologists, it’s important for us to continue our commitment to open source and support organizations fighting for it. A video of her talk is already available on YouTube at SCaLE 15x Keynote: Karen Sandler – In the Scheme of Things, How Important is Software Freedom?

A few other talks really stood out for me, Amanda Folson spoke on “10 Things I Hate About Your API” where she drew from her vast experience with hosted APIs to give advice to organizations who are working to build their customer and developer communities around a product. She scrutinized things like sign-up time and availability and complexity of code examples. She covered tooling problems, documentation, reliability and consistency, along with making sure the API is actually written for the user, listening to feedback from users to maintain and improve it, and abiding by best practices. Best of all, she also offered helpful advice and solutions for all these problems! The great slides from her talk are available on the talk page.


Amanda Folson

I also really appreciated VM Brasseur’s talk, “Passing the Baton: Succession planning for FOSS leadership”. I’ll admit right up front that I’m not great at succession planning. I tend to take on a lot in projects and then lack the time to actually train other people because I’m so overwhelmed. I’m not alone in this; succession planning is a relatively new topic in open source projects and only a handful have taken a serious look at it from a high, project-wide level. Key points she made centered around making sure skills for important roles are documented and passed around, and she suggested term limits for key roles. She walked attendees through a process of identifying critical roles and responsibilities in their community, refactoring roles that are really too large for individual contributors, and procedures and processes for knowledge transfer. I think one of the most important things about this talk was less the “bus factor” worry of losing major contributors unexpectedly, and more how documenting roles makes your project more welcoming to new, and more diverse, contributors. Work is well-scoped, so it’s easy for someone new to come in and help on a small part, and the project has support built in for that.


VM Brasseur

For my part, I gave a talk on “Listening to the Needs of Your Global Open Source Community” where I had a small audience (last talk of the day, against a popular speaker) but an engaged one that had great questions. It’s sometimes nice to have a smaller crowd that allows you to talk to almost all of them after the talk; I even arranged a follow-up lunch meeting with a woman I met who is doing some work similar to what I did for the i18n team in the OpenStack community. Slides from my talk are here (7.4M PDF).

I heard a talk from AB Periasamy of Minio, the open source alternative to AWS S3 that we’re using at Mesosphere for some of our DC/OS demos that need object storage. My friend and open source colleague Nathan Handler also gave a very work-applicable talk on PaaSTA, the framework built by Yelp to support their Apache Mesos-driven infrastructure. I cover both of these talks in more depth in a blog post coming out soon on the dcos.io blog. Edit: The post on the DC/OS blog is now up: Reflecting on SCaLE 15x.

SCaLE 15x remains one of my favorite conferences. Lots of great talks, key people from various segments of open source communities I participate in and great pacing so that you can find time to socialize and learn. Huge thanks to Ilan Rabinovitch who I worked with a fair amount during this event to make sure the Open Source Infrastructure day came together, and to the fleet of volunteers who make this happen every year.

More photos from SCaLE 15x here: https://www.flickr.com/photos/pleia2/albums/72157681016586816

by pleia2 at March 21, 2017 07:53 PM

March 20, 2017

Akkana Peck

Everyone Does IT (and some Raspberry Pi gotchas)

I've been quiet for a while, partly because I've been busy preparing for a booth at the upcoming Everyone Does IT event at PEEC, organized by LANL.

In addition to booths from quite a few LANL and community groups, they'll show the movie "CODE: Debugging the Gender Gap" in the planetarium. I checked out the movie last week (our library has it) and it's a good overview of the problem of diversity, and especially the problems women face in programming jobs.

I'll be at the Los Alamos Makers/Coder Dojo booth, where we'll be showing an assortment of Raspberry Pi and Arduino based projects. We've asked the Coder Dojo kids to come by and show off some of their projects. I'll have my RPi crittercam there (such as it is) as well as another Pi running motioneyeos, for comparison. (Motioneyeos turned out to be remarkably difficult to install and configure, and doesn't seem to do any better than my lightweight scripts at detecting motion without false positives. But it does offer streaming video, which might be nice for a booth.) I'll also be demonstrating cellular automata and the Game of Life (especially since the CODE movie uses Life as a background in quite a few scenes), music playing in Python, a couple of Arduino-driven NeoPixel LED light strings, and possibly an arm-waving penguin I built a few years ago for GetSET, if I can get it working again: the servos aren't behaving reliably, but I'm not sure yet whether it's a problem with the servos and their wiring or a power supply problem.

The music playing script turned up an interesting Raspberry Pi problem. The Pi has a headphone output, and initially when I plugged a powered speaker into it, the program worked fine. But then later, it didn't. After much debugging, it turned out that the difference was that I'd made myself a user so I could have my normal shell environment. I'd added my user to the audio group and all the other groups the default "pi" user is in, but the Pi's pulseaudio is set up to allow audio only from users root and pi, and it ignores groups. Nobody seems to have found a way around that, but sudo apt-get purge pulseaudio solved the problem nicely.

I also hit a minor snag attempting to upgrade some of my older Raspbian installs: lightdm can't upgrade itself (Errors were encountered while processing: lightdm). Lots of people on the web have hit this, and nobody has found a way around it; the only solution seems to be to abandon the old installation and download a new Raspbian image.

But I think I have all my Raspbian cards installed and working now; pulseaudio is gone, music plays, the Arduino light shows run. Now to play around with servo power supplies and see if I can get my penguin's arms waving again when someone steps in front of him. Should be fun, and I can't wait to see the demos the other booths will have.

If you're in northern New Mexico, come by Everyone Does IT this Tuesday night! It's 5:30-7:30 at PEEC, the Los Alamos Nature Center, and everybody's welcome.

March 20, 2017 06:29 PM

March 19, 2017

Elizabeth Krumbach

Simcoe’s January and March 2017 Checkups

Simcoe has had a few checkups since I last wrote in October. First was a regular checkup in mid-December, where I brought her in on my own and had to start thinking about how we’re going to keep her weight up. The next step will be a feeding tube, and we really don’t want to go down that path with a cat who has never even been able to tolerate a collar. Getting her to take fiber was getting to be stressful for all of us, so the doctor prescribed Lactulose to be taken daily to handle constipation. Medication for a kitty facing renal failure is always a dicey option, but the constipation was clearly painful for her and causing her to vomit more. We got going with that slowly. We skipped the blood work with this visit since we were aiming to get it done again in January.

On January 7th she was not doing well and was brought in for an emergency visit to make sure she didn’t pass into crisis with her renal failure. Blood work was done then and we had to get more serious about making sure she stays regular and keeps eating. Still, her weight started falling more dramatically at this point, with her dropping below 8 lbs for the first time since she was diagnosed in 2011, landing her at a worrying 7.58. Her BUN level had gone from 100 in October to 141, CRE from 7.0 to 7.9.

At the end of January she went in for her regular checkup. We skipped the blood work since it had just been done a couple weeks before. We got a new, more concentrated formulation of Mirtazapine to stimulate her appetite, since MJ had discovered that putting the liquid dosage into a capsule that she could swallow without tasting any of it was the only possible way we could get her to take it. The Calcitriol she was taking daily was also reformulated. We had to leave town unexpectedly for a week in early February, which she wasn’t at all happy with, but since then I’ve been home with her most of the time, so she seems to have perked up a bit, and after dipping in weight she seems to be doing tolerably well.

When we brought her into the vet on March 11th her weight came in at a low 6.83 lbs. The lowest weight she’d ever had was 6.09 when she was first diagnosed and not being treated at all, so she wasn’t down to her all-time low. Still, dropping below 7 pounds is troubling, especially since it has happened so rapidly.

The exam went well though, the vet continues to be surprised at how well she’s doing outwardly in spite of her weight and blood work. Apparently some cats just handle the condition better than others. Simcoe is a lucky, tough kitty.


Evidence of the blood draw!

I spoke with the vet this morning now that blood work has come back. Her phosphorus and calcium levels are not at all where we want them to be. Her CRE is up from 7.9 to 10.5, BUN went from 141 to 157. Sadly, these are pretty awful levels; her daily 100 ml Subcutaneous fluids are really what is keeping her going at this point.

With this in mind, as of today we’ve suspended use of the Calcitriol and switched the Atopica she’s taking for allergies to every other day. We’re only continuing with the Mirtazapine, Lactulose and Subcutaneous fluids. I’m hoping that the reduction in medications she’s taking each day will stress her body and mind less, leading to a happier kitty even as her kidneys continue in their decline. I hope she’s not in a lot of pain day to day; she does still vomit a couple times a week, I know her constipation isn’t fully addressed by the medication, and she is still quite thirsty all the time. We can’t increase her fluids dosage since there’s only so much she can absorb in a day, and it would put stress on her heart (she has a slight heart murmur). Keeping her weight up remains incredibly important, with the vet pretty much writing off dietary restrictions and saying she can eat as much of whatever she likes (turkey prepared for humans? Oh yes!).

Still, mostly day to day we’re having a fun cat life over here. We sent our laundry out while the washer was broken recently, and the clothes came back bundled in strings that Simcoe had a whole evening of fun over. I picked up a laser pointer recently that she played with for a bit before figuring it out; she just stares at my hand now when I use it, but at least Caligula still enjoys it! And in the evenings when I carve out some time to read or watch TV, it’s pretty common for her to camp out on my lap.

by pleia2 at March 19, 2017 10:09 PM

March 13, 2017

Eric Hammond

Incompatible: Static S3 Website With CloudFront Forwarding All Headers

a small lesson learned in setting up a static web site with S3 and CloudFront

I created a static web site hosted in an S3 bucket named www.example.com (not the real name) and enabled accessing it as a website. I wanted delivery to be fast to everybody around the world, so I created a CloudFront distribution in front of the S3 bucket.

I wanted S3 to automatically add “index.html” to URLs ending in a slash (CloudFront can’t do this), so I configured the CloudFront distribution to access the S3 bucket as a web site using www.example.com.s3-website-us-east-1.amazonaws.com as the origin server.

Before sending all of the www.example.com traffic to the new setup, I wanted to test it, so I added test.example.com to the list of CNAMEs in the CloudFront distribution.

After setting up Route53 so that DNS lookups for test.example.com would resolve to the new CloudFront endpoint, I loaded it in my browser and got the following error:

404 Not Found

Code: NoSuchBucket
Message: The specified bucket does not exist
BucketName: test.example.com
RequestId: [short string]
HostId: [long string]

Why would AWS be trying to find an S3 bucket named test.example.com? That was pointing at the CloudFront distribution endpoint, and CloudFront was configured to get the content from www.example.com.s3-website-us-east-1.amazonaws.com.

After debugging, I found out that the problem was that I had configured the CloudFront distribution to forward “all” HTTP headers. I thought that this would be a sneaky way to turn off caching in CloudFront so that I could keep updating the content in S3 and not have to wait to see the latest changes.

However, this also means that CloudFront was forwarding the HTTP Host header from my browser to the S3 website handler. When S3 saw that I was requesting the host of test.example.com it looked for a bucket of the same name and didn’t find it, resulting in the above error.
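
One way to see this in action is to hit the S3 website endpoint directly and supply the Host header yourself. A sketch (using the hypothetical hostnames above, and not tested against a real bucket):

    import urllib2

    # Ask the S3 website endpoint for the site, but present the test
    # hostname in the Host header, just as CloudFront was doing:
    url = "http://www.example.com.s3-website-us-east-1.amazonaws.com/"
    req = urllib2.Request(url, headers={"Host": "test.example.com"})
    try:
        print urllib2.urlopen(req).read()
    except urllib2.HTTPError as e:
        # Expect a 404 with Code: NoSuchBucket for test.example.com
        print e.code
        print e.read()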

When I turned off forwarding all HTTP headers in CloudFront, it then started sending through the correct header:

Host: www.example.com.s3-website-us-east-1.amazonaws.com

which S3 correctly interpreted as accessing the correct S3 bucket www.example.com in the website mode (adding index.html after trailing slashes).

It makes sense for CloudFront to support forwarding the Host header from the browser, especially when your origin server is a dynamic web site that can act on the original hostname. You can set up a wildcard *.example.com DNS entry pointing at your CloudFront distribution, and have the back end server return different results depending on what host the browser requested.

However, passing the Host header doesn’t work so well for an origin server S3 bucket in website mode. Lesson learned and lesson passed on.

Original article and comments: https://alestic.com/2017/03/cloudfront-s3-host-header/

March 13, 2017 09:30 PM

March 10, 2017

Akkana Peck

At last! A roadrunner!

We live in what seems like wonderful roadrunner territory. For the three years we've lived here, we've hoped to see a roadrunner, and have seen them a few times at neighbors' places, but never in our own yard.

Until this morning. Dave happened to be looking out the window at just the right time, and spotted it in the garden. I grabbed the camera, and we watched it as it came out from behind a bush and went into stalk mode.

[Roadrunner stalking]

And it caught something!

[close-up, Roadrunner with fence lizard] We could see something large in its bill as it triumphantly perched on the edge of the garden wall, before hopping off and making a beeline for a nearby juniper thicket.

It wasn't until I uploaded the photo that I discovered what it had caught: a fence lizard. Our lizards only started to come out of hibernation about a week ago, so the roadrunner picked the perfect time to show up.

I hope our roadrunner decides this is a good place to hang around.

March 10, 2017 09:33 PM

Elizabeth Krumbach

Ubuntu at SCaLE15x

On Thursday, March 2nd I spent most of the day running an Open Source Infrastructure Day, but across the way my Ubuntu friends were kicking off the first day of the second annual UbuCon Summit at SCaLE. The first day included a keynote by Carl Richell of System76 where they made product announcements, including their new Galago Pro laptop and their Starling Pro ARM server. The following talk came from Nextcloud, with the day continuing with Aaron Atchison and Karl Fezer talking about the Mycroft AI, José Antonio Rey on Getting to know Juju: From zero to deployed in minutes and Amber Graner sharing the wisdom that You don’t need permission to contribute to your own destiny.

I ducked out of the Open Infrastructure Day in the mid-afternoon to give my talk, 10 Years of Xubuntu. This is a talk I’d been thinking about for some time, and I began by walking folks through the history of the Xubuntu project. From there I spoke about where it sits in the Ubuntu community as a recognized flavor, and then on to the specific strategies the team has employed to motivate the completely volunteer-driven team.

When it came to social media accounts, we didn’t create them all ourselves, instead relying upon existing accounts on Facebook, G+ and LinkedIn that we promoted to being official ones, keeping the original volunteers in place and just giving access to a core Xubuntu team member in case they couldn’t continue running them. It worked out for all of us: we had solid contributors passionate about their specific platforms and excited to be made official, and as long as they kept the accounts running we didn’t need to expend core team resources. We’ve also worked to collect user stories in order to motivate current contributors, since it means a lot to see their work being used by others. I’ve also placed a great deal of value on the Xubuntu Strategy Document, which has set the guiding principles of the project and allowed us to steer the ship through difficult decisions in the project. Slides from the talk are available here: 10_years_of_Xubuntu_UbuCon_Summit.pdf (1.9M).

Thursday evening I met with my open source infrastructure friends for dinner, but afterwards swung by Porto Alegre to catch some folks for evening drinks and snacks. I had a really nice chat with Nathan Haines, who co-organized the UbuCon Summit.

On Friday I was able to attend the first keynote! Michael Hall gave a talk titled Sponsored by Canonical where he dove deep into Ubuntu history to highlight Canonical’s role in the support of the project from the early focus on desktop Linux, to the move into devices and the cloud. His talk was followed by one from Sergio Schvezov on Snaps. The afternoon was spent as an unconference, with the Ubuntu booth starting up in the expo hall on 2PM.

The weekend was all about the Ubuntu booth. Several volunteers staffed it Friday through Sunday.

They spent the event showing off the Ubuntu Phone, Mycroft AI, and several laptops.

It was also great to once again meet up with one of my co-authors for the 9th edition of The Official Ubuntu Book, José Antonio Rey. Our publisher sent a pile of books to give out at the event, some of which we gave out during our talks, and a couple more at the booth.

by pleia2 at March 10, 2017 05:39 AM

Work, wine, open source and… survival

So far 2017 has proven to be quite the challenge, but let’s hold off on all that until the end.

As I’ve mentioned in a couple of recent posts, I started a new job in January, joining Mesosphere to move up the stack to work on containers and focus on application deployments. It’s the first time I’ve worked for a San Francisco startup, and so far I’ve been having a lot of fun working with really smart people who are doing interesting work that’s on the cutting edge of what companies are doing today. Aside from travel for work, I’ve spent most of my time these first couple months in the office getting to know everyone. Now, we all know that offices aren’t my thing, but I have enjoyed the catered breakfasts and lunches, the dog-friendly environment and the ability to meet with colleagues in person as I get started.

I’ve now started going in mostly just for meetings, with productivity much higher when I can work from home like I have for the past decade. My team is working on outreach and defining open source strategies, helping with slide decks, guides and software demos. All stuff I’m finding great value in. As I start digging deeper into the tech I’m finding myself once again excited about work I’m doing and building things that people are using.

Switching gears into the open source work I still do for fun, I’ve started to increase my participation with Xubuntu again, just recently wrapping up the #LoveXubuntu Competition. At SCaLE15x last week I gave a Xubuntu presentation, which I’ll write about in a later post. Though I’ve stepped away from the Ubuntu Weekly Newsletter just recently, I did follow through with ordering and shipping stickers off to winners of our issue 500 competition.

I’ve also put a nice chunk of my free time into promoting Open Source Infrastructure work. In addition to a website that now has a huge list of infras thanks to various contributors submitting merge proposals via GitLab, I worked with a colleague from IBM to run a whole open source infra event at SCaLE15x. Though we went into it with a lot of uncertainty, we came out the other end having had a really successful event and excitement from a pile of new people.

It hasn’t been all work though. In spite of a mounting to do list, sometimes you just need to slow down.

At the beginning of February MJ and I spent a Saturday over at the California Historical Society to see their Vintage: Wine, Beer, and Spirits Labels from the Kemble Collections on Western Printing and Publishing exhibit. It’s just around the corner from us, so it allowed for a lovely hour of taking a break after a Saturday brunch to peruse various labels spanning wine, beer and spirits from a designer and printer in California during the first half of the 20th century. The collection was of mass-production labels, with nothing artisanal about them and no artists signing their names, but it did capture a place in time, and I’m a sucker for early 20th century design. It was a fascinating collection, beautifully curated like their exhibits always are, and I’m glad we made time to see it.

More photos from the exhibit are up here: https://www.flickr.com/photos/pleia2/albums/72157676346542394

At the end of February we noted our need to pick up our quarterly wine club subscription at Rutherford Hill. In what was probably our shortest trip up to Napa, we enjoyed a noontime brunch at Harvest Table in St. Helena. We picked up some Charbay hop-flavored whiskey, stopped by the Heitz Cellar tasting room where we picked up a bottle of my favorite Zinfandel and then made our way to Rutherford Hill to satisfy the real goal of our trip. Upon arrival we were pleased to learn that a members’ wine-tasting event was being held in the caves, where they had a whole array of wines to sample along with snacks and cheeses. Our wine adventures ended with this stop and we made a relatively early trek south, in the direction of home.

A few more photos from our winery jaunt are here: https://www.flickr.com/photos/pleia2/albums/72157677743529104

Challenge-wise, here we go. Starting a new job means a lot of time spent learning, while I’ve also had to hit the ground running. We worked our way through a death in the family last month. I’ve been away from home a lot, and generally we’ve been doing a lot of running around to complete all the adult things related to life. Our refrigerator was replaced in December, and in January I broke one of the shelves, resulting in a spectacular display of tomato sauce all over the floor. Weeks later our washing machine started acting up and overflowed (thankfully no damage done in our condo); we have our third repair visit booked and hopefully it’ll be properly fixed on Monday.

I spent the better part of January recovering from a severe bout of bronchitis that had lasted three months, surviving antibiotics, steroids and two types of inhalers. MJ is continuing to nurse a broken bone in his foot, transitioning from air cast to shoe-based aids, but there’s still pain and uncertainty around whether it’ll heal properly without surgery. Simcoe is not doing well, she is well into the final stages of renal failure. We’re doing the best we can to keep her weight up and make sure she’s happy, but I fear the end is rapidly approaching and I’m not sure how I’ll cope with it. I also lurked in the valley of depression for a while in February.

We’re also living in a very uncertain political climate here in the United States. I’ve been seeing people I care about being placed in vulnerable situations. I’m finding myself deeply worried every time I browse the news or social media for too long. I never thought that in 2017 I’d be reading from a cousin who was evacuated from a Jewish center due to a bomb threat, or have to check to make sure the cemetery in Philadelphia that was desecrated wasn’t one that my relatives were in. A country I’ve loved and been proud of for my whole life, through so many progressive changes in recent years, has been transformed into something I don’t recognize. I have friends and colleagues overseas cancelling trips and moves here because they’re afraid of being turned away or otherwise made to feel unwelcome. I’m thankful for my fellow citizens who are standing up against it and organizations like the ACLU who have vowed to keep fighting; I just can’t muster the strength for it right now.

Right now we have a lot going on, and though we’re both stressed out and tired, we aren’t actively handling any crisis at the moment. I feel like I finally have a tiny bit of breathing room. These next two weekends will be spent catching up on tasks and paperwork. I’m planning on going back to Philadelphia for a week at the end of the month to start sorting through my mother-in-law’s belongings and hopefully wrap up sorting of things that belonged to MJ’s grandparents. I know a fair amount of heartache awaits me in these tasks, but we’ll be in a much better place to move forward once I’ve finished. Plus, though I’ll be working each day, I will be making time to visit with friends while I’m there and that always lifts my spirits.

by pleia2 at March 10, 2017 02:35 AM

March 05, 2017

Akkana Peck

The Curious Incident of the Junco in the Night-Time

Dave called from an upstairs bedroom. "You'll probably want to see this."

He had gone up after dinner to get something, turned the light on, and been surprised by an agitated junco, chirping and fluttering on the sill outside the window. It evidently was trying to fly through the window and into the room. Occasionally it would flutter backward to the balcony rail, but no further.

There's a piñon tree whose branches extend to within a few feet of the balcony, but the junco ignored the tree and seemed bent on getting inside the room.

As we watched, hoping the bird would calm down, instead it became increasingly more desperate and stressed. I remembered how, a few months earlier, I opened the door to a deck at night and surprised a large bird, maybe a dove, that had been roosting there under the eaves. The bird startled and flew off in a panic toward the nearest tree. I had wondered what happened to it -- whether it had managed to find a perch in the thick of a tree in the dark of night. (Unlike San Jose, White Rock gets very dark at night.)

And that thought solved the problem of our agitated junco. "Turn the porch light on", I suggested. Dave flipped a switch, and the porch light over the deck illuminated not only the deck where the junco was, but the nearest branches of the nearby piñon.

Sure enough, now that it could see the branches of the tree, the junco immediately turned around and flew to a safe perch. We turned the porch light back off, and we heard no more from our nocturnal junco.

March 05, 2017 06:27 PM

February 27, 2017

Nathan Haines

UbuCon Summit Comes to Pasadena this Week!

UbuCon SCALE 14x group photo

Once again, UbuCon Summit will be hosted by the Southern California Linux Expo in Pasadena, California on March 2nd and 3rd. UbuCon Summit is two days that celebrate Ubuntu and the community, and this year has some excitement in store.

Thursday's keynote will feature Carl Richell, the CEO and founder of System 76, a premium source of Ubuntu desktop and laptop computers. In his keynote, entitled "Acrylic, Aluminum, Thumb Screws, and Heavy Machinery at System 76," he will share how System 76 is reinventing what it means to be a computer manufacturer, and talk about how they are changing the relationship between users and their devices. Don't miss this fascinating peek behind the scenes of a computer manufacturer that focuses on Ubuntu, and keep your ears open because they are announcing new products during the keynote!

We also have community member Amber Graner who will share her inspiring advice on how to forge a path to success with her talk "You Don't Need Permission to Contribute to Your Own Destiny," and Elizabeth Joseph who will talk about her 10 years in the Xubuntu community.

Thursday will wrap up with our traditional open Ubuntu Q&A panel where you can ask us your burning questions about Ubuntu, and Friday will see a talk from Michael Hall, "Sponsored by Canonical" where he describes the relationship between Canonical and Ubuntu and how it's changed, and Sergio Schvezov will describe Ubuntu's next-generation packaging format in "From Source to Snaps." After a short break for lunch and the expo floor, we'll be back for four unconference sessions, where attendees will come together to discuss the Ubuntu topics that matter most to them.

Ubuntu will be at booth 605 during the Southern California Linux Expo's exhibition floor hours from Friday through Sunday. You'll be able to see the latest version of Ubuntu, see how it works with touchscreens, laptops, phones, and embedded devices, and get questions answered by both community and Canonical volunteers at the booth.

Come for UbuCon, stay for SCALE! This is a weekend not to be missed!

SCALE 15x, the 15th Annual Southern California Linux Expo, is the largest community-run Linux/FOSS showcase event in North America. It will be held from March 2-5 at the Pasadena Convention Center in Pasadena, California. For more information on the expo, visit https://www.socallinuxexpo.org

February 27, 2017 01:24 AM

February 26, 2017

Elizabeth Krumbach

My mother-in-law

On Monday, February 6th MJ’s mother passed away.

She had been ill over the holidays and we had the opportunity to visit with her in the hospital a couple times while we were in Philadelphia in December. Still, with her move to a physical rehabilitation center in recent weeks I thought she was on the mend. Learning of her passing was quite the shock, and it hasn’t been easy. No arrangements had been made for her passing, so for the few hours following her death we notified family members and scrambled to select a cemetery and funeral home. Given the distance and our situations at work (I was about to leave for a conference and MJ had commitments as well) we decided to meet in Philadelphia at the end of the week and take things from there.

MJ and I met at the townhouse in Philadelphia on Saturday and began the week of work we needed to do to put her to rest. Selecting a plot in the cemetery, organizing her funeral, selecting a headstone. A lot of this was new for both of us. While we both have experienced loss in our families, most of these arrangements had already been made for the passing of our other family members. Thankfully everyone we worked with was kind and compassionate, and even when we weren’t sure of specifics, they had answers to fill in the blanks. We also spent time that week moving out her apartment and started the process of handling her estate. Her brother flew into town and stayed in the guest room of our town house, which we were suddenly grateful we had made time to set up on a previous trip.

We held her funeral on February 15th and she was laid to rest surrounded by a gathering of close family and friends. We had clear, beautiful weather as we gathered graveside to say goodbye. Her obituary can be found here.

There’s still a lot to do to finish handling her affairs and it’s been hard for me, but I’m incredibly thankful for friends, family and colleagues who have been so understanding as we’ve worked through this. We’re very grateful for the time we were able to spend with her. When she was well, we enjoyed countless dinners together and of course she joined us to celebrate at our wedding back in 2013. Even recently over the holidays in spite of her condition it was nice to have some time together. She will be missed.

by pleia2 at February 26, 2017 05:49 PM

February 25, 2017

Elizabeth Krumbach

Moving on from the Ubuntu Weekly Newsletter

Somewhere around 2010 I started getting involved with the Ubuntu News Team. My early involvement was limited to the Ubuntu Fridge team, where I would help post announcements from various teams, including the Ubuntu Community Council that I was part of. With Amber Graner at the helm of the Ubuntu Weekly Newsletter (UWN) I focused my energy elsewhere since I knew how much work the UWN editor position was at the time.

Ubuntu Weekly Newsletter

At the end of 2010 Amber stepped down from the team to pursue other interests, and with no one to fill her giant shoes the team entered a five month period of no newsletters. Finally in June, after being contacted numerous times about the fate of the newsletter, I worked with Nathan Handler to revive it so we could release issue 220. Our first job was to do an analysis of the newsletter as a whole. What was valuable about the newsletter and what could we do away with to save time? What could we automate? We decided to make some changes to reduce the amount of manual work put into it.

To this end, we ceased to include monthly reports inline and started linking to upcoming meeting and event details rather than sharing them inline in the newsletter itself. There was also a considerable amount of automation done thanks to Nathan’s work on scripts. No more would we be generating any of the release formats by hand; they’d all be generated with a single command, ready to be cut and pasted. Release time every week went from over two hours to about 20 minutes in the hands of an experienced editor. Our next editor would have considerably less work than those who came before them. From then on I’d say I’ve been pretty heavily involved.

500

The 500th issue lands on February 27th; this is an exceptional milestone for the team and the Ubuntu community. It is deserving of celebration, and we’ve worked behind the scenes to arrange a contest and a simple way for folks to say “thanks” to the team. We’ve also reached out to a handful of major players in the community to tell us what they get from the newsletter.

With the landing of this issue, I will have been involved with over 280 issues over 8 years. Almost every week in that time (I did skip a couple weeks for my honeymoon!) I’ve worked to collect Ubuntu news from around the community and internet, prepare it for our team of summary writers, move content to the wiki for our editors, and spend time on Monday doing the release. Over these years I’ve worked with several great contributors to keep the team going, rewarding contributors with all the thanks I could muster and even a run of UWN stickers specifically made for contributors. I’ve met and worked with some great people during this time, and I’m incredibly proud of what we’ve accomplished over these years and the quality we’ve been able to maintain with article selection and timely releases.

But all good things must come to an end. Several months ago as I was working on finding the next step in my career with a new position, I realized how much my life and the world of open source had changed since I first started working on the newsletter. Today there are considerable demands on my time, and while I hung on to the newsletter, I realized that I was letting other exciting projects and volunteer opportunities pass me by. At the end of October I sent a private email to several of the key contributors letting them know I’d conclude my participation with issue 500. That didn’t quite happen, but I am looking to actively wind down my participation starting with this issue and hope that others in the community can pick up where I’m leaving off.

UWN stickers

I’ll still be around the community, largely focusing my efforts on Xubuntu directly. Folks can reach out to me as they need help moving forward, but the awesome UWN team will need more contributors. Contributors collect news, write summaries and do editing, you can learn more about joining here. If you have questions about contributing, you can join #ubuntu-news on freenode and say hello or drop an email to our team mailing list (public archives).

by pleia2 at February 25, 2017 02:57 AM

February 24, 2017

Akkana Peck

Coder Dojo: Kids Teaching Themselves Programming

We have a terrific new program going on at Los Alamos Makers: a weekly Coder Dojo for kids, from 6 to 7 on Tuesday nights.

Coder Dojo is a worldwide movement, and our local dojo is based on their ideas. Kids work on programming projects to earn colored USB wristbelts, with the requirements for belts getting progressively harder. Volunteer mentors are on hand to help, but we're not lecturing or teaching, just coaching.

Despite not much advertising, word has gotten around and we typically have 5-7 kids on Dojo nights, enough that all the makerspace's Raspberry Pi workstations are filled and we sometimes have to scrounge for more machines for the kids who don't bring their own laptops.

A fun moment early on came when we had a mentor meeting, and Neil, our head organizer (who deserves most of the credit for making this program work so well), looked around and said "One thing that might be good at some point is to get more men involved." Sure enough -- he was the only man in the room! For whatever reason, most of the programmers who have gotten involved have been women. A refreshing change from the usual programming group. (Come to think of it, the PEEC web development team is three women. A girl could get a skewed idea of gender demographics, living here.) The kids who come to program are about 40% girls.

I wondered at the beginning how it would work, with no lectures or formal programs. Would the kids just sit passively, waiting to be spoon fed? How would they get concepts like loops and conditionals and functions without someone actively teaching them?

It wasn't a problem. A few kids have some prior programming practice, and they help the others. Kids as young as 9 with no previous programming experience walk in, sit down at a Raspberry Pi station, and after five minutes of being shown how to bring up a Python console and use Python's turtle graphics module to draw a line and turn a corner, they're happily typing away, experimenting and making Python draw great colorful shapes.

Python-turtle turns out to be a wonderful way for beginners to learn. It's easy to get started, it makes pretty pictures, and yet, since it's Python, it's not just training wheels: kids are using a real programming language from the start, and they can search the web and find lots of helpful examples when they're trying to figure out how to do something new (just like professional programmers do. :-)
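
A first turtle session along those lines might look something like this (a hypothetical sketch of the kind of thing the kids type, not a transcript from the Dojo):

import turtle

turtle.color("purple")   # pick a pen color
turtle.forward(100)      # draw a line
turtle.right(90)         # turn a corner
turtle.forward(100)
turtle.done()            # keep the window open when run as a script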

Initially we set easy requirements for the first (white) belt: attend for three weeks, learn the names of other Dojo members. We didn't require any actual programming until the second (yellow) belt, which required writing a program with two of three elements: a conditional, a loop, a function.

That plan went out the window at the end of the first evening, when two kids had already fulfilled the yellow belt requirements ... even though they were still two weeks away from the attendance requirement for the white belt. One of them had never programmed before. We've since scrapped the attendance belt, and now the white belt has the conditional/loop/function requirement that used to be the yellow belt.

The program has been going for a bit over three months now. We've awarded lots of white belts and a handful of yellows (three new ones just this week). Although most of the kids are working in Python, there are also several playing music or running LED strips using Arduino/C++, writing games and web pages in Javascript, writing adventure games in Scratch, or just working through Khan Academy lectures.

When someone is ready for a belt, they present their program to everyone in the room and people ask questions about it: what does that line do? Which part of the program does that? How did you figure out that part? Then the mentors review the code over the next week, and they get the belt the following week.

For all but the first belt, helping newer members is a requirement, though I suspect even without that they'd be helping each other. Sit a first-timer next to someone who's typing away at a Python program and watch the magic happen. Sometimes it feels almost superfluous being a mentor. We chat with the kids and each other, work on our own projects, shoulder-surf, and wait for someone to ask for help with harder problems.

Overall, a terrific program, and our only problems now are getting funding for more belts and more workstations as the word spreads and our Dojo nights get more crowded. I've had several adults ask me if there was a comparable program for adults. Maybe some day (I hope).

February 24, 2017 08:46 PM

February 20, 2017

Elizabeth Krumbach

Adventures in Tasmania

Last month I attended my third Linux.conf.au, this time in Hobart, Tasmania; I wrote about the conference here and here. In an effort to be somewhat recovered from jet lag for the conference and take advantage of the trip to see the sights, I flew in a couple days early.

I arrived in Hobart after a trio of flights on Friday afternoon. It was incredibly windy, so much so that they warned people when deplaning onto the tarmac (no jet ways at the little Hobart airport) to hold tightly on to their belongings. But speaking of the weather for a moment, January is the middle of summer in the southern hemisphere. I prepare for brutal heat when I visit Australia at this time. But Hobart? They were enjoying beautiful, San Francisco-esque weather. Sunny and comfortably in the 60s every day. The sun was still brutal though, thinner ozone that far south means that I burned after being in the sun for a couple days, even after applying strong sunblock.


Beautiful view from my hotel room

On Saturday I didn’t make any solid plans, just in case there was a problem with my flights or I was too tired to go out. I lucked out though, and took the advice of many who suggested I visit Mona – Museum of Old and New Art. In spite of being tired, the good reviews of the museum, plus learning that you could take a ferry directly there and that a nearby brewery featured its beers at the eateries around the museum, encouraged me to go.

I walked to the ferry terminal from the hotel, which was just over a mile with some surprising hills along the way as I took the scenic route along the bay and through some older neighborhoods. I also walked past the Salamanca Market, which is set up every Saturday. I passed on the wallaby burritos and made my way to the ferry terminal. There it was quick and easy to buy my ferry and museum tickets.

Ferry rides are one of my favorite things, and the views on this one made the journey to the museum a lot of fun.

The ferry drops you off at a dock specifically for the museum. Since it was nearly noon and I was in need of nourishment, I walked up past the museum and explored the areas around the wine bar. They had little bars set up that opened at noon and allowed you to get a bottle of wine or some beers and enjoy the beautiful weather on chairs and bean bags placed around a large grassy area. On my own for this adventure, I skipped drinking on the grass and went up to enjoy lunch at the wonderful restaurant on site, The Source. I had a couple beers and discovered Tasmanian oysters. Wow. These wouldn’t be the last ones on my trip.

After lunch it was incredibly tempting to spend the entire afternoon snacking on cheese and wine, but I had museum tickets! So it was down to the museum to spend a couple hours exploring.

I’m not the biggest fan of modern art, so a museum mixing old and new art was an interesting choice for me. As I began to walk through the exhibits, I realized that it would have been great to have MJ there with me. He does enjoy newer art, so the museum would have had a little bit for each of us. There were a few modern exhibits that I did enjoy though, including Artifact which I took a video of: “Artifact” at the Museum of Old and New Art, Hobart (warning: strobe lights!).

Outside the museum I also walked around past a vineyard on site, as well as some beautiful outdoor structures. I took a bunch more photos before the ferry took me back to downtown Hobart. More photos from Mona here: https://www.flickr.com/photos/pleia2/albums/72157679331777806

It was late afternoon when I returned to the Salamanca area of Hobart and though the Market was closing down, I was able to take some time to visit a few shops. I picked up a small pen case for my fountain pens made of Tasmanian Huon Pine and a native Sassafras. That evening I spent some time in my room relaxing and getting some writing done before dinner with a couple open source colleagues who had just gotten into town. I turned in early that night to catch up on some sleep I missed during flights.

And then it was Sunday! As fun as the museum adventure was, my number one goal with this trip was actually to pet a new Australian critter. Last year I attended the conference in Geelong, not too far from Melbourne, and did a similar tourist trip. On that trip I got to feed kangaroos, pet a koala and see hundreds of fairy penguins return to their nests from the ocean at dusk. Topping that day wasn’t actually possible, but I wanted to do my best in Tasmania. I booked a private tour with a guide for the Sunday to take me up to the Bonorong Wildlife Sanctuary.

My tour guide was a friendly woman who owns a local tour company with her husband. She was super accommodating, plus she was similar in age to me, making for a comfortable journey. The tour included a few stops, but started with Bonorong. We had about an hour there to visit the standing exhibits before the pet-wombats tour began. All the enclosures were populated by rescued wildlife that were either being rehabilitated or were too permanently injured for release. I had my first glimpse at Tasmanian devils running around (I’d seen some in Melbourne, but they were all sleeping!). I also got to see a tawny frogmouth, which is a bird that looks a bit like an owl, and the three-legged Randall the echidna, a spiky creature that is one of the few egg-laying mammals. I also took some time to commune with kangaroos and wallabies, picking up a handful of food to feed my new, bouncy friends.


Feeding a kangaroo, tiny wombat drinking from a bottle, pair of wombats, Tasmanian devil

And then there were the baby wombats. I saw my first wombat at the Perth Zoo four years ago and was surprised at how big they are. Growing to be a meter in length in Tasmania, wombats are hefty creatures and I got to pet one! At 11:30 they did a keeper talk and then allowed folks gathered to give one of the babies (about 9 months old) a quick pat. In a country of animals that have fur that’s more wiry and wool-like than you might expect (on kangaroos, koalas), the baby wombats are surprisingly soft.


Wombat petting mission accomplished.

The keeper talks continued with opportunities to pet a koala and visit some Tasmanian devils, but having already done these things I hit the gift shop for some post cards and then went to the nearby Richmond Village.

More photos from Bonorong Wildlife Sanctuary, Tasmania here: https://www.flickr.com/photos/pleia2/albums/72157679331734466

I enjoyed a meat pie lunch in the cute downtown of Richmond before continuing our journey to visit the oldest continuously operating Catholic church in all of Australia (not just Tasmania!), St John’s, built in 1836. We also got to visit Australia’s oldest bridge, just a tad older, built in 1823. The bridge is surrounded by a beautiful park, making for a popular picnic and play area on days like the beautiful one we had while there. On the way back, we stopped at the Wicked Cheese Co. where I got to sample a variety of cheeses and pick up some Whiskey Cheddar to enjoy later in the week. A final stop at Rosny Hill rounded out the tour. It gave some really spectacular views of the bay and across to Hobart; I could see my hotel from there!

Sunday evening I met up with a gaggle of OpenStack friends for some Indian food back in the main shopping district of Hobart.

That wrapped up the real touristy part of my trip, as the week continued with the conference. However, there were still some treats to be enjoyed! I had a whole bunch of Tasmanian cider throughout the week and, as I had promised myself, more oysters! The thing about the oysters in Tasmania is that they’re creamy and they’re big. A mouthful of delicious.

I loved Tasmania, I hope I can make it back some day. More photos from my trip here: https://www.flickr.com/photos/pleia2/albums/72157677692771201

by pleia2 at February 20, 2017 06:47 PM

February 18, 2017

Akkana Peck

Highlight and remove extraneous whitespace in emacs

I recently got annoyed with all the trailing whitespace I saw in files edited by Windows and Mac users, and in code snippets pasted from sites like StackOverflow. I already had my emacs set up to indent with only spaces:

(setq-default indent-tabs-mode nil)
(setq tabify nil)
and I knew about M-x delete-trailing-whitespace ... but after seeing someone else who had an editor set up to show trailing spaces, and tabs that ought to be spaces, I wanted that too.

To show trailing spaces is easy, but it took me some digging to find a way to control the color emacs used:

;; Highlight trailing whitespace.
(setq-default show-trailing-whitespace t)
(set-face-background 'trailing-whitespace "yellow")

I also wanted to show tabs, since code indented with a mixture of tabs and spaces, especially if it's Python, can cause problems. That was a little harder, but I eventually found it on the EmacsWiki: Show whitespace:

;; Also show tabs.
(defface extra-whitespace-face
  '((t (:background "pale green")))
  "Color for tabs and such.")

(defvar bad-whitespace
  '(("\t" . 'extra-whitespace-face)))

While I was figuring this out, I got some useful advice related to emacs faces on the #emacs IRC channel: if you want to know why something is displayed in a particular color, put the cursor on it and type C-u C-x = (the command what-cursor-position with a prefix argument), which displays lots of information about whatever's under the cursor, including its current face.

Once I had my colors set up, I found that a surprising number of files I'd edited with vim had trailing whitespace. I would have expected vim to be better behaved than that! But it turns out that to eliminate trailing whitespace, you have to program it yourself. For instance, here are some recipes to Remove unwanted spaces automatically with vim.
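
One common recipe, for instance, strips trailing whitespace every time a file is written (a sketch for a vimrc; the e flag keeps the substitution quiet when there's nothing to strip):

" Strip trailing whitespace from every line before writing any file.
autocmd BufWritePre * :%s/\s\+$//e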

February 18, 2017 11:41 PM

February 17, 2017

Elizabeth Krumbach

Spark Summit East 2017

“Do you want to go to Boston in February?”

So began my journey to Boston to attend the recent Spark Summit East 2017, joining my colleagues Kim, Jörg and Kapil to participate in the conference and meet attendees at our Mesosphere booth. I’ve only been to a handful of single-technology events over the years, so it was an interesting experience for me.


Selfie with Jörg!

The conference began with a keynote by Matei Zaharia which covered some of the major successes in the Apache Spark world in 2016, from the release of version 2.0 with structured streaming, to the growth in community-driven meetups. As the keynotes continued, two trends came into clear focus:

  1. Increased use of Apache Spark with streaming data
  2. Strong desire to do data processing for artificial intelligence (AI) and machine learning

It was really fascinating to hear about all the AI and machine learning work being done, from companies like Salesforce developing customized products, to genetic data analysis by way of the Hail project that will ultimately improve and save lives. Work is even being done by Intel to improve hardware and open source tooling around deep learning (see their BigDL project on GitHub).

In perhaps my favorite keynote of the conference, we heard from Mike Gualtieri of Forrester, who presented the new “age of the customer” with a look toward very personalized, AI-driven learning about customer behavior, intent and more. He went on to use the term “pragmatic AI” to describe what we’re aiming for with an intelligence that’s good enough to succeed at what it’s put to. However, his main push for this talk was how much opportunity there is in this space. Companies and individuals skilled in processing massive amounts of data, AI and deep and machine learning can make a real impact in a variety of industries. Video and slides from this keynote are available here.


Mike Gualtieri on types of AI we’re looking at today

I was also impressed by how strong the open source assumption was at this conference. All of these universities, corporations, hardware manufacturers and more are working together to build platforms to do all of this data processing work, and they’re open sourcing them.

While at the event, Jörg gave a talk on Powering Predictive Mapping at Scale with Spark, Kafka, and Elastic Search (slides and videos at that link). In it he used DC/OS to give a demo based on NYC cab data.

At the booth the interest in open source was also strong. I’m working on DC/OS in my new role, and the fact that folks could hit the ground running with our open source version, and get help on mailing lists and Slack was in sync with their expectations. We were able to show off demos on our laptops and in spite of only having just over a month at the company under my belt, I was able to answer most of the questions that came my way and learned a lot from my colleagues.


The Mesosphere booth included DC/OS hand warmers!

We had a bit of non-conference fun at the conference as well, Kapil took us out Wednesday night to the L.A. Burdick chocolate shop to get some hot chocolate… on ice. So good. Thursday the city was hit with a major snow storm, dumping 10 inches on us throughout the day as we spent our time inside the conference venue. Flights were cancelled after noon that day, but thankfully I had no trouble getting out on my Friday flight after lunch with my friend Deb who lives nearby.

More photos from the event here: https://www.flickr.com/photos/pleia2/albums/72157680153926395

by pleia2 at February 17, 2017 10:29 PM

February 15, 2017

Elizabeth Krumbach

Highlights from LCA 2017 in Hobart

Earlier this month I attended my first event while working as a DC/OS Developer Advocate over at Mesosphere. My talk on Listening to the needs of your global open source community was accepted before I joined the company, but this kind of listening is precisely what I need to be doing in this new role, so it fit nicely.

Work also gave me some goodies to bring along! So I was able to hand some out as I chatted with people about my new role, and left piles of stickers and squishy darts on the swag table throughout the week.

The topic of the conference this year was the future of open source. It led to an interesting series of keynotes, ranging from the hopeful and world-changing words from Pia Waugh about how technologists could really make a difference in her talk, Choose Your Own Adventure, Please!, to the Keeping Linux Great talk by Robert M. “r0ml” Lefkowitz that ended up imploring the audience to examine their values around the traditional open source model.

Pia’s keynote was a lot of fun, walking us through human history to demonstrate that our values, tools and assumptions are entirely of our own making, and able to be changed (indeed, they have been!). She asked us to continually challenge our assumptions about the world around us and what we could change. She encouraged thinking beyond our own spaces, like how 3D printers could solve scarcity problems in developing nations or what faster travel would do to transform the world. For a room of open source enthusiasts who make small changes to change the world every day, the creators and innovators of the world, there’s always more we can do and strive for: curing the illness rather than just scratching the itch, for systematic change. I really loved the positive message of this talk, and I think a lot of attendees walked out feeling empowered and hopeful. Plus, she had a brilliant human change.log that demonstrated how we as humans have made some significant changes in our assumptions through the millennia.


Pia Waugh’s human change.log

The keynote by Dan Callahan on Wednesday morning on Designing for Failure explored the failure of Mozilla’s Persona project and the things he learned from it. He walked through some key lessons:

  1. Free licenses are not enough, your code can’t be tied to proprietary infrastructure
  2. Bits rot more quickly online, an out of date desktop application is usually at much lower risk, and endangers fewer people, than a service running on the web
  3. Complexity limits agency, people need to be able to have the resources, system and time to try out and run your software

He went on to give tips about what to do to prolong project life, including making sure you have metrics and are measuring the right things for your project, explicitly defining your scope so the team doesn’t get spread too thin or try to solve the wrong problems, and ruthlessly opposing complexity, since that makes it harder to maintain and for others to get involved.

Finally, he had some excellent points for how to assist the survival of your users when a project does finally fail:

  1. If you know your project is dead (funding pulled, etc), say so, don’t draw things out
  2. Make sure your users can recover without your involvement (have a way to extract data, give them an escape path infrastructure-wise)
  3. Use standard data formats to minimize the migration harm when organizations have to move on

It was really great hearing lessons from this, I know how painful it is to see a project you’ve put a lot of work into die, the ability to not only move on in a healthy way but bring those lessons to a whole community during a keynote like this was commendable.

Thursday’s keynote by Nadia Eghbal was an interesting one that I haven’t seen a lot of public discussion around, Consider the Maintainer. In it she talked about the work that goes into being a maintainer of a project, which she defined as someone who is doing the work of keeping a project going: looking at the contributions coming in, actively responding to bug reports and handling any other interactions. This is a discussion that came up from time to time on some projects I’ve recently worked on where we were striving to prevent scope creep. How can we manage the needs of our maintainers who are sticking around, with the desire for new contributors to add features that benefit them? It’s a very important question that I was thrilled to see her talk about. To help address this, she proposed a twist on the The Four Essential Freedoms of Software as defined by the FSF, The Four Freedoms of Open Source Producers. They were:

  • The freedom to decide who participates in your community
  • The freedom to say no to contributions or requests
  • The freedom to define the priorities and policies of the project
  • The freedom to step down or move on from a project, temporarily or permanently

The speaker dinner was beautiful and delicious, taking us up to Frogmore Creek Winery. There was a radio telescope in the background and the sunset over the vineyard was breathtaking. Plus, great company.

Other talks I went to trended toward fun and community-focused topics. On Monday there was a WOOTConf; the entire playlist from the event is here. I caught a nice handful of talks, starting with Human-driven development, where aurynn shaw spoke about some of the toxic behaviors in our technical spaces, primarily how everyone is expected to know everything and how asking questions is not always acceptable. She implored us to make asking questions easier and more accepted, and to work toward asking your team questions about what they need.

I learned about a couple of websites in a talk by Kate Andrews on Seeing the big picture – using open source images: TinEye Reverse Image Search to help find the source of an image so you can give credit, and sites like Unsplash where you can find freely licensed photos, in addition to various creative commons searches. Brenda Wallace’s Let’s put wifi in everything was a lot of fun, as she walked through various pieces of inexpensive hardware and open source tooling to build sensors to automate all kinds of little things around the house. I also enjoyed the talk by Kris Howard, Knit One, Compute One, where very strong comparisons were made between computer programming and knitting patterns, and a talk by Grace Nolan on the Condensed History of Lock Picking.

For my part, I gave a talk on Listening to the Needs of Your Global Open Source Community. This is similar to the talk I gave at FOSSCON back in August, where I walked through experiences I had in Ubuntu and OpenStack projects, along with in person LUGs and meetups. I had some great questions at the end, and I was excited to learn VM Brasseur was tweeting throughout and created a storify about it! The slides from the talk are available as a PDF here.


Thanks to VM Brasseur for the photo during my talk, source

The day concluded with Rikki Endsley’s Mamas Don’t Let Your Babies Grow Up to Be Rock Star Developers, which I really loved. She talked about the tendency to put “rock star” in job descriptions for developers, but when going through the traits of rock stars these weren’t actually what you want on your team. The call was for more Willie Nelson developers, and we were treated to a quick biography of Willie Nelson. In it she explained how he helped others, was always learning new skills, made himself available to his fans, and would innovate and lead. I also enjoyed that he actively worked to collaborate with a diverse mix of people and groups.

As the conference continued, I learned about the great work being done at Whare Hauora from Brenda Wallace and Amber Craig, and heard from Josh Simmons about building communities outside of major metropolitan areas, where he advocated for multidisciplinary meetups. Allison Randal spoke about the ways that open source accelerates innovation, and Karen Sandler dove into what happens to our software when we die, in a presentation punctuated by pictures of baby Tasmanian Devils to cheer us up. I also heard Chris Lamb give us the status of the Reproducible Builds project, and then Hamish Coleman on the work he’s done replacing ThinkPad keyboards and reverse engineering the tooling.

The final day wound down with a talk by VM (Vicky) Brasseur on working inside a company to support open source projects, where she talked about types of communities, the importance of having a solid open source plan, and quickly covered some of the most common pitfalls within companies.

This conference remains one of my favorite open source conferences in the world, and I’m very glad I was able to attend again. It’s great meeting up with all my Australian and New Zealand open source colleagues, along with some of the usual suspects who attend many of the same conferences I do. Huge thanks to the organizers for making it such a great conference.

All the videos from the conference were uploaded very quickly to YouTube and are available here: https://www.youtube.com/user/linuxconfau2017/videos

More photos from the conference at https://www.flickr.com/photos/pleia2/sets/72157679331149816/

by pleia2 at February 15, 2017 01:09 AM

February 13, 2017

Akkana Peck

Emacs: Initializing code files with a template

Part of being a programmer is having an urge to automate repetitive tasks.

Every new HTML file I create should include some boilerplate HTML, like <html><head></head><body></body></html>. Every new Python file I create should start with #!/usr/bin/env python, and most of them should end with an if __name__ == "__main__": clause. I get tired of typing all that, especially the dunderscores and slash-greater-thans.

Long ago, I wrote an emacs function called newhtml to insert the boilerplate code:

(defun newhtml ()
  "Insert a template for an empty HTML page"
  (interactive)
  (insert "<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01 Transitional//EN\">\n"
          "<html>\n"
          "<head>\n"
          "<title></title>\n"
          "</head>\n\n"
          "<body>\n\n"
          "<h1></h1>\n\n"
          "<p>\n\n"
          "</body>\n"
          "</html>\n")
  (forward-line -11)
  (forward-char 7)
  )

The motion commands at the end move the cursor back to a point in between the <title> and </title>, so I'm ready to type the page title. (I should probably have it prompt me, so it can insert the same string in the title and h1, which is almost always what I want.)
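
That prompting version might look something like this (a hypothetical sketch, not the function I actually use):

(defun newhtml ()
  "Insert a template for an empty HTML page, prompting for the title."
  (interactive)
  ;; Ask for the title once, then use it in both <title> and <h1>.
  (let ((title (read-string "Page title: ")))
    (insert "<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.01 Transitional//EN\">\n"
            "<html>\n"
            "<head>\n"
            "<title>" title "</title>\n"
            "</head>\n\n"
            "<body>\n\n"
            "<h1>" title "</h1>\n\n"
            "<p>\n\n"
            "</body>\n"
            "</html>\n")))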

The original function has worked for quite a while. But when I decided it was time to write the same function for python:

(defun newpython ()
  "Insert a template for an empty Python script"
  (interactive)
  (insert "#!/usr/bin/env python\n"
          "\n"
          "\n"
          "\n"
          "if __name__ == '__main__':\n"
          "\n"
          )
  (forward-line -4)
  )
... I realized that I wanted to be even more lazy than that. Emacs knows what sort of file it's editing -- it switches to html-mode or python-mode as appropriate. Why not have it insert the template automatically?

My first thought was to have emacs run the function upon loading a file. There's a function with-eval-after-load which supposedly can act based on file suffix, so something like (with-eval-after-load ".py" (newpython)) is documented to work. But I found that it was never called, and couldn't find an example that actually worked.

But then I realized that I have mode hooks for all the programming modes anyway, to set up things like indentation preferences. Inserting some text at the end of the mode hook seems perfectly simple:

(add-hook 'python-mode-hook
          (lambda ()
            (electric-indent-local-mode -1)
            (font-lock-add-keywords nil bad-whitespace)
            (if (= (buffer-size) 0)
                (newpython))
            (message "python hook")
            ))

The (= (buffer-size) 0) test ensures this only happens if I open a new file. Obviously I don't want to be auto-inserting code inside existing programs!

HTML mode was a little more complicated. I edit some files, like blog posts, that use HTML formatting, and hence need html-mode, but they aren't standalone HTML files that need the usual HTML template inserted. For blog posts, I use a different file extension, so I can use the elisp string-suffix-p to test for that:

  ;; string-suffix-p is like Python's endswith
  (if (and (= (buffer-size) 0)
           (string-suffix-p ".html" (buffer-file-name)))
      (newhtml) )
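
Wired into the mode hook the same way as the Python version, that might look like this (a minimal sketch, assuming no other html-mode customizations):

(add-hook 'html-mode-hook
          (lambda ()
            ;; Only insert the template into new, empty .html files.
            (if (and (= (buffer-size) 0)
                     (string-suffix-p ".html" (buffer-file-name)))
                (newhtml))))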

I may eventually find other files that don't need the template; if I need to, it's easy to add other tests, like the directory where the new file will live.

A nice timesaver: open a new file and have a template automatically inserted.

February 13, 2017 04:52 PM

February 09, 2017

Jono Bacon

HackerOne Professional, Free for Open Source Projects

For some time now I have been working with HackerOne to help them shape and grow their hacker community. It has been a pleasure working with the team: they are doing great work, have fantastic leadership (including my friend, Mårten Mickos), are seeing consistent growth, and recently closed a $40 million round of funding. It is all systems go.

For those of you unfamiliar with HackerOne, they provide a powerful vulnerability coordination platform and a global community of hackers. Put simply, a company or project (such as Starbucks, Uber, GitHub, the US Army, etc) invite hackers to hack their products/services to find security issues, and HackerOne provides a platform for the submission, coordination, dupe detection, and triage of these issues, and other related functionality.

You can think of HackerOne in two pieces: a powerful platform for managing security vulnerabilities and a global community of hackers who use the platform to make the Internet safer and, in many cases, make money. This effectively crowd-sources security using the same “given enough eyeballs, all bugs are shallow” principle from open source: with enough eyeballs, all security issues are shallow too.

HackerOne and Open Source

HackerOne unsurprisingly are big fans of open source. The CEO, Mårten Mickos, has led a number of successful open source companies including MySQL and Eucalyptus. The platform itself is built on top of chunks of open source, and HackerOne is a key participant in the Internet Bug Bounty program that helps to ensure core pieces of technology that power the Internet are kept secure.

One of the goals I have had in my work with HackerOne is to build an even closer bridge between HackerOne and the open source community. I am delighted to share the next iteration of this.

HackerOne for Open Source Projects

While not formally announced yet (this is coming soon), I am pleased to share the availability of HackerOne Community Edition.

Put simply, HackerOne is providing their HackerOne Professional service for free to open source projects.

This provides features such as a security page, vulnerability submission/coordination, duplicate detection, hacker reputation, a comprehensive API, analytics, CVEs, and more.

This not only provides a great platform for open source projects to gather vulnerability reports and manage them, but also opens your project up to thousands of security researchers who can help identify security issues and make your code more secure.

Which projects are eligible?

To be eligible for this free service projects need to meet the following criteria:

  1. Open Source projects – projects in scope must only be Open Source projects that are covered by an OSI license.
  2. Be ready – projects must be active and at least 3 months old (age is defined by shipped releases/code contributions).
  3. Create a policy – you add a SECURITY.md in your project root that provides details for how to submit vulnerabilities (example).
  4. Advertise your program – display a link to your HackerOne profile from either the primary or secondary navigation on your project’s website.
  5. Be active – you maintain an initial response time of less than a week for new reports.

If you meet these criteria and would like to apply, just see the HackerOne Community Edition page and click the button to apply.

Of course, let me know if you have any questions!

The post HackerOne Professional, Free for Open Source Projects appeared first on Jono Bacon.

by Jono Bacon at February 09, 2017 10:20 PM

February 06, 2017

Elizabeth Krumbach

Rogue One and Carrie Fisher

Back in December I wasn’t home in San Francisco very much. Most of my month was spent back east at our townhouse in Philadelphia and I spent a few days in Salt Lake City for a conference, but the one week I was in town was the week that Rogue One: A Star Wars Story came out! I was traveling to Europe when tickets went on sale, but fortunately for me our local theater swapped most of its screens over to showing the film on opening night. I was able to snag tickets once I realized they were on sale.

And that’s how I continued my tradition of seeing all the new films (1-3, 7) opening night! MJ and I popped over to the Metreon, just a short walk from home, to see it. For this showing I didn’t do IMAX or 3D or anything fancy, just a modern AMC theater and a late night showing.

The movie was great. They did a really nice job of looping the story in with the past films and preserving the feel of Star Wars for me, a feel that was absent in the prequels George Lucas made: clunky technology, the good guys achieving victories in the face of incredible odds and yet, quite a bit of heartbreak. Naturally, I saw it a second time later in the month while staying in Philadelphia for the holidays. It was great the second time too!

My hope is that the quality of the films will remain high while in the hands of Disney, and I’m really looking forward to The Last Jedi coming out at the end of this year.

Alas, the year wasn’t all good for a Star Wars fan like me. Back in August we lost Kenny Baker, the man behind my beloved R2-D2. Then on December 23rd we learned that Carrie Fisher had a heart attack on a flight from London. On December 27th she passed away.

Now, I am typically not one to write about the death of a celebrity on my blog. It’s pretty rare that I’m upset about the death of a celebrity at all. But this was Carrie Fisher. She was not on my radar for passing (only 60!) and she is the actress who played one of my all-time favorite characters, in case it wasn’t obvious from the domain name this blog is on.

The character of Princess Leia impacted my life in many ways, and at age 17 caused me to choose PrincessLeia2 (PrincessLeia was taken), and later pleia2, as my online handle. She was a princess of a mysterious world that was destroyed. She was a strong character who didn’t let people get in her way as she covertly assisted, then openly joined the rebel alliance because of what she believed in. She was also a character who showed considerable kindness and compassion. In the Star Wars universe, and in the 1980s when I was a kid, she was often a shining beacon of what I aspired to. Her reprise of the character in Episode VII, returning as General Leia Organa, brought me to tears. I have a figure of her on my desk.


Halloween 2005, Leia costume!

The character she played aside, she was also a champion of de-stigmatizing mental illness. I have suffered from depression for over 20 years and have worked to treat my condition with over a dozen doctors, from primary care to neurologists and psychiatrists. Still, I haven’t found an effective medication-driven treatment that won’t conflict with my other atypical neurological conditions (migraines and seizures). Her outspokenness on the topic of both mental illness and the difficulty in treating it even when you have access to resources was transformational for me. I had a guilt lifted from me about not being “better” in spite of my access to treatment, and was generally more inclined to tackle the topic of mental illness in public.

Her passing was hard for me.

I was contacted by BBC Radio 5 Live on the day she passed away and interviewed by Chris Warburton for their show that would air the following morning. They reached out to me as a known fan, asking me about what her role as Leia Organa meant to me growing up, her critical view of the celebrity world and then on to her work in the space of mental illness. It meant a lot that they reached out to me, but I was also pained by what it brought up; it turns out that the day of her passing was the one day in my life I didn’t feel like talking about her work and legacy.

It’s easier today as I reflect upon her impact. I’m appreciative of the character she brought to life for me. Appreciative of the woman she became and shared in so many memorable, funny and self-deprecating books, which line my shelves. Thank you, Carrie Fisher, for being such an inspiration and an advocate.

by pleia2 at February 06, 2017 08:17 AM

Akkana Peck

Rosy Finches

Los Alamos is having an influx of rare rosy-finches (which apparently are supposed to be hyphenated: they're rosy-finches, not finches that are rosy).

[Rosy-finches] They're normally birds of the snowy high altitudes, like the top of Sandia Crest, and quite unusual in Los Alamos. They're even rarer in White Rock, and although I've been keeping my eyes open I haven't seen any here at home; but a few days ago I was lucky enough to be invited to the home of a birder in town who's been seeing great flocks of rosy-finches at his feeders.

There are four types, of which three have ever been seen locally, and we saw all three. Most of the flock was brown-capped rosy-finches, with two each of black rosy-finches and gray-capped rosy-finches. The upper bird at right, I believe, is one of the blacks, but it might be a gray-capped. They're a bit hard to tell apart. In any case, pretty birds, sparrow-sized with nice head markings and a hint of pink under the wing, and it was fun to get to see them.

[Roadrunner] The local roadrunner also made a brief appearance, and we marveled at the combination of high-altitude snowbirds and a desert bird here at the same place and time. White Rock seems like much better roadrunner territory, and indeed they're sometimes seen here (though not, so far, at my house), but they're just as common up in the forests of Los Alamos. Our host said he only sees them in winter; in spring, just as they start singing, they leave and go somewhere else. How odd!

Speaking of birds and spring, we have a juniper titmouse determinedly singing his ray-gun song, a few house sparrows are singing sporadically, and we're starting to see cranes flying north. They started a few days ago, and I counted several hundred of them today, enjoying the sunny and relatively warm weather as they made their way north. Ironically, just two weeks ago I saw a group of about sixty cranes flying south -- very late migrants, who must have arrived at the Bosque del Apache just in time to see the first northbound migrants leave. "Hey, what's up, we just got here, where ya all going?"

A few more photos: Rosy-finches (and a few other nice birds).

We also have a mule deer buck frequenting our yard, sometimes hanging out in the garden just outside the house to drink from the heated birdbath while everything else is frozen. (We haven't seen him in a few days, with the warmer weather and most of the ice melted.) We know it's the same buck coming back: he's easy to recognize because he's missing a couple of tines on one antler.

The buck is a welcome guest now, but in a month or so when the trees start leafing out I may regret that as I try to find ways of keeping him from stripping all the foliage off my baby apple tree, like some deer did last spring. I'm told it helps to put smelly soap shavings, like Irish Spring, in a bag and hang it from the branches, and deer will avoid the smell. I will try the soap trick but will probably combine it with other measures, like a temporary fence.

February 06, 2017 02:39 AM

January 28, 2017

Nathan Haines

We're looking for Ubuntu 17.04 wallpapers right now!

We're looking for Ubuntu 17.04 wallpapers right now!

Ubuntu is a testament to the power of sharing, and we use the default selection of desktop wallpapers in each release as a way to celebrate the larger Free Culture movement. Talented artists across the globe create media and release it under licenses that don't simply allow, but cheerfully encourage sharing and adaptation. This cycle's Free Culture Showcase for Ubuntu 17.04 is now underway!

We're halfway to the next LTS, and we're looking for beautiful wallpaper images that will literally set the backdrop for new users as they use Ubuntu 17.04 every day. Whether on the desktop, phone, or tablet, your photo or illustration can be the first thing Ubuntu users see whenever they are greeted by the ubiquitous Ubuntu welcome screen or access their desktop.

Submissions will be handled via Flickr at the Ubuntu 17.04 Free Culture Showcase - Wallpapers group, and the submission window begins now and ends on March 5th.

More information about the Free Culture Showcase is available on the Ubuntu wiki at https://wiki.ubuntu.com/UbuntuFreeCultureShowcase.

I'm looking forward to seeing the 10 photos and 2 illustrations that will ship on all graphical Ubuntu 17.04-based systems and devices on April 13th!

January 28, 2017 08:08 AM

January 27, 2017

Akkana Peck

Making aliases for broken fonts

A web page I maintain (originally designed by someone else) specifies Times font. On all my Linux systems, Times displays impossibly tiny, at least two sizes smaller than any other font that's ostensibly the same size. So the page is hard to read. I'm forever tempted to get rid of that font specifier, but I have to assume that other people in the organization like the professional look of Times, and that this pathologic smallness of Times and Times New Roman is just a Linux font quirk.

In that case, a better solution is to alias it, so that pages that use Times will choose some larger, more readable font on my system. How to do that was in this excellent, clear post: How To Set Default Fonts and Font Aliases on Linux .

It turned out Times came from the gsfonts package, while Times New Roman came from msttcorefonts:

$ fc-match Times
n021003l.pfb: "Nimbus Roman No9 L" "Regular"
$ dpkg -S n021003l.pfb
gsfonts: /usr/share/fonts/type1/gsfonts/n021003l.pfb
$ fc-match "Times New Roman"
Times_New_Roman.ttf: "Times New Roman" "Normal"
$ dpkg -S Times_New_Roman.ttf
dpkg-query: no path found matching pattern *Times_New_Roman.ttf*
$ locate Times_New_Roman.ttf
/usr/share/fonts/truetype/msttcorefonts/Times_New_Roman.ttf
(dpkg -S doesn't find the file because msttcorefonts is a package that downloads a bunch of common fonts from Microsoft. Debian can't distribute the font files directly due to licensing restrictions.)

Removing the gsfonts fonts isn't an option; aside from some documents and web pages possibly not working right (if they specify Times or Times New Roman and don't provide a fallback), removing gsfonts takes gnumeric and abiword with it, and I do occasionally use gnumeric. And I like having the msttcorefonts installed (hey, gotta have Comic Sans! :-) ). So aliasing the font is a better bet.

Following Chuan Ji's page, linked above, I edited ~/.config/fontconfig/fonts.conf (I already had one, specifying fonts for the fantasy and cursive web families), and added these stanzas:

    <match>
        <test name="family"><string>Times New Roman</string></test>
        <edit name="family" mode="assign" binding="strong">
            <string>DejaVu Serif</string>
        </edit>
    </match>
    <match>
        <test name="family"><string>Times</string></test>
        <edit name="family" mode="assign" binding="strong">
            <string>DejaVu Serif</string>
        </edit>
    </match>

The page says to log out and back in, but I found that restarting firefox was enough. Now I could load up a page that specified Times or Times New Roman and the text was easily readable.
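
To verify that the alias took effect, fc-match comes in handy again. A quick check (the exact font file and style reported will vary by system):

$ fc-match Times
DejaVuSerif.ttf: "DejaVu Serif" "Book"
$ fc-match "Times New Roman"
DejaVuSerif.ttf: "DejaVu Serif" "Book"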

January 27, 2017 09:47 PM

January 26, 2017

Elizabeth Krumbach

CLSx at LCA 2017

Last week I was in Hobart, Tasmania for LCA 2017. I’ll write a broader blog post about the whole event soon, but I wanted to take some time to write this focused post about the CLSx (Community Leadership Summit X) event organized by VM Brasseur. I’d been to the original CLS event at OSCON a couple of times, first in 2013 and again in 2015. This was the first time I was attending a satellite event, but with VM Brasseur at the helm and a glance at the community leadership talent in the room, I knew we’d have a productive event.


VM Brasseur introduces CLSx

The event began with an introduction to the format and the schedule. As an unconference, CLSx has its topics brainstormed and its schedule organized by the attendees. It started with people in the room sharing topics they’d be interested in, and then we worked through the list to combine related suggestions and reduce it to just 9 topics:

  • Non-violent communication for diffusing charged situations
  • Practical strategies for fundraising
  • Rewarding community members
  • Reworking old communities
  • Increasing diversity: multi-factor
  • Recruiting a core
  • Community metrics
  • Community cohesion: retention
  • How to Participate When You Work for a Corporate Vendor

Or, if you’d rather, the whiteboard of topics!

The afternoon was split into four sessions, three of which were used to discuss the topics, with three topics being covered simultaneously by separate groups in each session slot. The final session of the day was reserved for the wrap-up of the event where participants shared summaries of each topic that was discussed.

The first session I participated in was the one I proposed, on Rewarding Community Members. The first question I asked the group was whether we should reward community members at all, just to make sure we were all starting with the same ideas. This quickly transitioned into what counts as a reward: were we talking physical gifts like stickers and t-shirts, or recognition in the community? Some communities “reward” community members by giving them free or discounted entrance to conferences related to the project, or discounts on services with partners.

Simple recognition of work was a big topic for this session. We spent some time talking about how we welcome community members. Does your community have a mechanism for welcoming, even if it’s automated? Or is there a more personal touch to reaching out? We also covered whether projects have a path to go from new contributor to trusted committer, or the “internal circle” of a project, noting that if that path doesn’t exist, it could be discouraging to new contributors. Gamification was touched upon as a possible way to recognize contributors in a more automated fashion, but it was clear that you want to reward certain positive behaviors and not focus so strictly on statistics that can be cheated without bringing any actual value to the project or community.

What I found most valuable in this session was learning some of the really successful tips for rewards. It was interesting how far the personal touch goes when sending physical rewards to contributors, like including a personalized note along with stickers. It was also clear that metrics are not the full story: in every community the leaders, evangelists and advocates need to be very involved so they can identify contributors in a more qualitative way in order to recognize or reward them; maybe someone is particularly helpful and friendly, or is making contributions in ways that are not easily tracked by solid metrics. The one warning here was to avoid personal bias: make sure you aren’t being more critical of contributions from minorities in your community, and that you aren't ignoring folks who don’t boast about their contributions; this happens a lot.

Full notes from Rewarding Contributors, thanks go to Deirdré Straughan for taking notes during the session.

The next session brought me to a gathering to discuss Community Building, Cohesion and Retention. I’ve worked in very large open source communities for over a decade now, and as I embark on my new role at Mesosphere where the DC/OS community is largely driven by paid contributors from a single company today, I’m very much interested in making sure we work to attract more outside contributors.

One of the big topics of this session was the fragmentation of resources across platforms (mailing lists, Facebook, IRC, Slack, etc) and how we have very little control over this. Pulling from my own experience, we saw this in the Xubuntu user community, where people would create unofficial channels on various resources, so as an outreach team we had to seek these users out and begin engaging with them “where they lived” on these platforms. One of the things I learned from my work here was that we could reduce our own burden by making some of these “unofficial” resources into official ones, thus having an official presence but leaving the folks who were passionate about the platform and community there in control, though we did ask for admin credentials for one person on the Xubuntu team to help with the bus factor.

Some other tips to building cohesion were making sure introductions were done during meetings and in person gatherings so that newcomers felt welcome, or offering a specific newcomer track so that no one felt like they were the only new person in the room, which can be very isolating. Similarly, making sure there were communication channels available before in-person events could be helpful to getting people comfortable with a community before meeting. One of the interesting proposals was also making sure there was a more official, announce-focused channel for communication so that people who were loosely interested could subscribe to that and not be burdened with an overly chatty communication channel if they’re only interested in important news from the community.

Full notes from Community building, cohesion and retention, with thanks to Josh Simmons for taking notes during this session.


Thanks to VM Brasseur for this photo of our building, cohesion and retention session (source)

The last session of the day I attended was around Community Metrics, and it held particular interest for me as the team I’m on at Mesosphere starts drilling down into community statistics for our young community. One of the early comments in this session was that metrics can help demonstrate your team's value both within a company and in the project. You should make sure you’re collecting metrics and that you’re measuring the right things. It’s easy for those of us who are more technically inclined to “geek out” over numbers and statistics, which can lead to gathering too much data and drawing conclusions that may not necessarily be accurate.

There was value found in surveys of community members by some attendees, which was interesting for me to learn. I haven’t had great luck with surveys, but it was suggested that people are more inclined to participate when they know why they should spend their time replying and how the information they share will be used to improve things. It was also suggested to have staggered surveys targeted at specific contributors: perhaps one survey for newcomers, and another targeted at people who have succeeded in becoming core contributors about the process challenges they’ve faced. Surveys also help gather some of the more qualitative data that is essential for properly tracking the health of a community. It’s not just numbers.

Specifically drilling down into value to the community, the following beyond surveys were found to be helpful:

  • Less focus on individuals and specific metrics in a silo, instead looking at trends and aggregations
  • Visitor count to the web pages on your site and specific blog posts
  • Metrics about community diversity in terms of number of organizations contributing, geographic distribution and human metrics (gender, race, age, etc) since all these types of diversity have proven to be indicators of project and team success.
  • Recruitment numbers linked to contributions, whether it’s how many people your company hires from the community or that companies in general do if the project has many companies involved (recruitment is expensive, you can bring real value here)

The consensus in the group was that it was difficult to correlate metrics like retweets, GitHub stars and other social media metrics to sales, so even though there may be value with regard to branding and excitement about your community, they may not help much to justify the existence of your team within a company. We didn’t talk much about metrics gathering tools, but I was OK with this, since it was nice to get a more general view into what we should be collecting rather than how.

Full notes from Community Metrics, which we can thank Andy Wingo for.

The event concluded with the note-taker from each group giving a five minute summary of what we talked about in each group. This was the only recorded portion of the event; you can watch it on YouTube here: Community Leadership Summit Summary.

Discussion notes from all the sessions can be found here: https://linux.conf.au/wiki/conference/miniconfs/clsx_at_lca/#wiki-toc-group-discussion-notes.

I really got a lot out of this event, and I hope others gained from my experience and perspectives as well. Huge thanks to the organizers and everyone who participated.

by pleia2 at January 26, 2017 02:58 AM

January 24, 2017

Jono Bacon

Endless Code and Mission Hardware Demo

Recently, I have had the pleasure of working with a fantastic company called Endless who are building a range of computers and a Linux-based operating system called Endless OS.

My work with them has primarily involved the community and product development of an initiative in which they are integrating functionality into the operating system that teaches you how to code. This provides a powerful environment where you can learn to code and easily hack on applications in the platform.

If this sounds interesting to you, I created a short video demo where I show off their Mission hardware as well as run through a demo of Endless Code in action. You can see it below:

I would love to hear what you think and how Endless Code can be improved in the comments below.

The post Endless Code and Mission Hardware Demo appeared first on Jono Bacon.

by Jono Bacon at January 24, 2017 12:35 PM

January 23, 2017

Akkana Peck

Testing a GitHub Pull Request

Several times recently I've come across someone with a useful fix to a program on GitHub, for which they'd filed a GitHub pull request.

The problem is that GitHub doesn't give you any link on the pull request to let you download the code in that pull request. You can get a list of the checkins inside it, or a list of the changed files so you can view the differences graphically. But if you want the code on your own computer, so you can test it, or use your own editors and diff tools to inspect it, it's not obvious how. That this is a problem is easily seen with a web search for something like download github pull request -- there are huge numbers of people asking how, and most of the answers are vague and unclear.

That's a shame, because it turns out it's easy to pull a pull request. You can fetch it directly with git into a new branch as long as you have the pull request ID. That's the ID shown on the GitHub pull request page:

[GitHub pull request screenshot]

Once you have the pull request ID, choose a new name for your branch, then fetch it:

git fetch origin pull/PULL-REQUEST_ID/head:NEW-BRANCH-NAME
git checkout NEW-BRANCH-NAME

Then you can view diffs with something like git difftool NEW-BRANCH-NAME..master
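
For example, to test a hypothetical pull request numbered 1234:

git fetch origin pull/1234/head:testing-pr-1234
git checkout testing-pr-1234
# ... build, test, inspect ...
git checkout master
git branch -D testing-pr-1234   # discard the test branch when done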

Easy! GitHub should give a hint of that on its pull request pages.

Fetching a Pull Request diff to apply it to another tree

But shortly after I learned how to apply a pull request, I had a related but different problem in another project. There was a pull request for an older repository, but the part it applied to had since been split off into a separate project. (It was an old pull request that had fallen through the cracks, and as a new developer on the project, I wanted to see if I could help test it in the new repository.)

You can't pull a pull request that's for a whole different repository. But what you can do is go to the pull request's page on GitHub. There are 3 tabs: Conversation, Commits, and Files changed. Click on Files changed to see the diffs visually.

That works if the changes are small and only affect a few files (which fortunately was the case this time). It's not so great if there are a lot of changes or a lot of files affected. I couldn't find any "Raw" or "download" button that would give me a diff I could actually apply. You can select all and then paste the diffs into a local file, but you have to do that separately for each file affected. It might be, if you have a lot of files, that the best solution is to check out the original repo, apply the pull request, generate a diff locally with git diff, then apply that diff to the new repo. Rather circuitous. But with any luck that situation won't arise very often.

Update: thanks very much to Houz for the solution! (In the comments, below.) Just append .diff or .patch to the pull request URL, e.g. https://github.com/OWNER/REPO/pull/REQUEST-ID.diff which you can view in a browser or fetch with wget or curl.
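
For instance, to grab that diff and test whether it applies cleanly (OWNER, REPO and REQUEST-ID are placeholders, as above):

curl -L https://github.com/OWNER/REPO/pull/REQUEST-ID.diff -o pr.diff
git apply --check pr.diff   # dry run: report whether the diff would apply cleanly
git apply pr.diff           # actually apply it to the working tree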

January 23, 2017 09:34 PM

January 19, 2017

Akkana Peck

Plotting Shapes with Python Basemap without Shapefiles

In my article on Plotting election (and other county-level) data with Python Basemap, I used ESRI shapefiles for both states and counties.

But one of the election data files I found, OpenDataSoft's USA 2016 Presidential Election by county had embedded county shapes, available either as CSV or as GeoJSON. (I used the CSV version, but inside the CSV the geo data are encoded as JSON so you'll need JSON decoding either way. But that's no problem.)

Just about all the documentation I found on coloring shapes in Basemap assumed that the shapes were defined as ESRI shapefiles. How do you draw shapes if you have latitude/longitude data in a more open format?

As it turns out, it's quite easy, but it took a fair amount of poking around inside Basemap to figure out how it worked.

In the loop over counties in the US in the previous article, the end goal was to create a matplotlib Polygon and use that to add a Basemap patch. But matplotlib's Polygon wants map coordinates, not latitude/longitude.

If m is your basemap (i.e. you created the map with m = Basemap( ... )), you can translate coordinates like this:

    (mapx, mapy) = m(longitude, latitude)

So once you have a region as a list of (longitude, latitude) coordinate pairs, you can create a colored, shaped patch like this:

    # Convert each coordinate pair from (longitude, latitude)
    # to map (x, y) coordinates, in place:
    for coord_pair in region:
        coord_pair[0], coord_pair[1] = m(coord_pair[0], coord_pair[1])
    # Polygon here is matplotlib.patches.Polygon:
    poly = Polygon(region, facecolor=color, edgecolor=color)
    ax.add_patch(poly)

Working with the OpenDataSoft data file was actually a little harder than that, because the list of coordinates was JSON-encoded inside the CSV file, so I had to decode it with json.loads(county["Geo Shape"]). Once decoded, it had some counties as a Polygon (a list of lists, allowing for discontiguous outlines), and others as a MultiPolygon (a list of lists of lists; I'm not sure why, since the Polygon format already allows for discontiguous boundaries).
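
Here's a minimal sketch of handling both cases (the "Geo Shape" column name comes from the OpenDataSoft CSV as described above; the filename is a placeholder):

    import csv, json

    def rings_from_geoshape(geo):
        # GeoJSON "Polygon" coordinates are a list of rings, while
        # "MultiPolygon" coordinates are a list of polygons, each a
        # list of rings. Flatten both into a single list of rings.
        if geo["type"] == "Polygon":
            return geo["coordinates"]
        if geo["type"] == "MultiPolygon":
            return [ring for poly in geo["coordinates"] for ring in poly]
        return []

    fp = open("election_by_county.csv")   # placeholder filename
    for county in csv.DictReader(fp):
        geo = json.loads(county["Geo Shape"])
        for ring in rings_from_geoshape(geo):
            # Each ring is a list of [longitude, latitude] pairs,
            # ready to convert with m() as shown above.
            pass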

[Blue-red-purple 2016 election map]

And a few counties were missing, so there were blanks on the map, which show up as white patches in this screenshot. The counties missing data either have inconsistent formatting in their coordinate lists, or they have only one coordinate pair, and they include Washington, Virginia; Roane, Tennessee; Schley, Georgia; Terrell, Georgia; Marshall, Alabama; Williamsburg, Virginia; and Pike, Georgia; plus Oglala Lakota (which is clearly meant to be Oglala, South Dakota), and all of Alaska.

One thing about crunching data files from the internet is that there are always a few special cases you have to code around. I could have gotten those coordinates from the census shapefiles; but since I needed the census shapefiles anyway, why use the CSV shapes at all? In this particular case, it makes more sense to use the shapefiles from the Census.

Still, I'm glad to have learned how to use arbitrary coordinates as shapes, freeing me from the proprietary and annoying ESRI shapefile format.

The code: Blue-red map using CSV with embedded county shapes

January 19, 2017 04:36 PM

Nathan Haines

UbuCon Summit at SCALE 15x Call for Papers

UbuCons are a remarkable achievement from the Ubuntu community: a network of conferences across the globe, organized by volunteers passionate about Open Source and about collaborating, contributing, and socializing around Ubuntu. UbuCon Summit at SCALE 15x is the next in the impressive series of conferences.

UbuCon Summit at SCALE 15x takes place in Pasadena, California on March 2nd and 3rd during the first two days of SCALE 15x. Ubuntu will also have a booth at SCALE's expo floor from March 3rd through 5th.

We are putting together the conference schedule and are announcing a call for papers. While we have some amazing speakers and an always-vibrant unconference schedule planned, it is the community, as always, who make UbuCon what it is—just as the community sets Ubuntu apart.

Interested speakers who have Ubuntu-related topics can submit their talk to the SCALE call for papers site. UbuCon Summit has a wide range of both developers and enthusiasts, so any interesting topic is welcome, no matter how casual or technical. The SCALE CFP form is available here:

http://www.socallinuxexpo.org/scale/15x/cfp

Over the next few weeks we’ll be sharing more details about the Summit, revamping the global UbuCon site and updating the SCALE schedule with all relevant information.

http://www.ubucon.org/

About SCaLE:

SCALE 15x, the 15th Annual Southern California Linux Expo, is the largest community-run Linux/FOSS showcase event in North America. It will be held from March 2-5 at the Pasadena Convention Center in Pasadena, California. For more information on the expo, visit https://www.socallinuxexpo.org

January 19, 2017 10:12 AM

January 14, 2017

Akkana Peck

Plotting election (and other county-level) data with Python Basemap

After my arduous search for open 2016 election data by county, as a first test I wanted one of those red-blue-purple charts of how Democratic or Republican each county's vote was.

I used the Basemap package for plotting. It used to be part of matplotlib, but it's been split off into its own toolkit, grouped under mpl_toolkits: on Debian, it's available as python-mpltoolkits.basemap, or you can find Basemap on GitHub.

It's easiest to start with the fillstates.py example that shows how to draw a US map with different states colored differently. You'll need the three shapefiles (because of ESRI's silly shapefile format): st99_d00.dbf, st99_d00.shp and st99_d00.shx, available in the same examples directory.

Of course, to plot counties, you need county shapefiles as well. The US Census has county shapefiles at several different resolutions (I used the 500k version). Then you can plot state and counties outlines like this:

from mpl_toolkits.basemap import Basemap
import matplotlib.pyplot as plt

def draw_us_map():
    # Set the lower left and upper right limits of the bounding box:
    lllon = -119
    urlon = -64
    lllat = 22.0
    urlat = 50.5
    # and calculate a centerpoint, needed for the projection:
    centerlon = float(lllon + urlon) / 2.0
    centerlat = float(lllat + urlat) / 2.0

    m = Basemap(resolution='i',  # crude, low, intermediate, high, full
                llcrnrlon = lllon, urcrnrlon = urlon,
                lon_0 = centerlon,
                llcrnrlat = lllat, urcrnrlat = urlat,
                lat_0 = centerlat,
                projection='tmerc')

    # Read state boundaries.
    shp_info = m.readshapefile('st99_d00', 'states',
                               drawbounds=True, color='lightgrey')

    # Read county boundaries
    shp_info = m.readshapefile('cb_2015_us_county_500k',
                               'counties',
                               drawbounds=True)

if __name__ == "__main__":
    draw_us_map()
    plt.title('US Counties')
    # Get rid of some of the extraneous whitespace matplotlib loves to use.
    plt.tight_layout(pad=0, w_pad=0, h_pad=0)
    plt.show()
[Simple map of US county borders]

Accessing the state and county data after reading shapefiles

Great. Now that we've plotted all the states and counties, how do we get a list of them, so that when I read out "Santa Clara, CA" from the data I'm trying to plot, I know which map object to color?

After calling readshapefile('st99_d00', 'states'), m has two new members, both lists: m.states and m.states_info.

m.states_info[] is a list of dicts mirroring what was in the shapefile. For the Census state list, the useful keys are NAME, AREA, and PERIMETER. There's also STATE, which is an integer (not restricted to 1 through 50) but I'll get to that.

If you want the shape for, say, California, iterate through m.states_info[] looking for the one where m.states_info[i]["NAME"] == "California". Note the index i; the shape coordinates will be in m.states[i] (in basemap map coordinates, not latitude/longitude).
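
In code, that lookup might look something like this (a minimal sketch; "California" is just an example):

    for i, sinfo in enumerate(m.states_info):
        if sinfo["NAME"] == "California":
            calif_shape = m.states[i]   # map coordinates, not lat/lon
            break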

Correlating states and counties in Census shapefiles

County data is similar, with county names in m.counties_info[i]["NAME"]. Remember that STATE integer? Each county has a STATEFP, m.counties_info[i]["STATEFP"], that matches some state's m.states_info[i]["STATE"].

But doing that search every time would be slow. So right after calling readshapefile for the states, I make a table of states. Empirically, STATE in the state list goes up to 72. Why 72? Shrug.

    MAXSTATEFP = 73
    states = [None] * MAXSTATEFP
    for state in m.states_info:
        statefp = int(state["STATE"])
        # Many states have multiple entries in m.states (because of islands).
        # Only add it once.
        if not states[statefp]:
            states[statefp] = state["NAME"]

That'll make it easy to look up a county's state name quickly when we're looping through all the counties.

Calculating colors for each county

Time to figure out the colors from the Deleetdk election results CSV file. Reading lines from the CSV file into a dictionary is superficially easy enough:

    fp = open("tidy_data.csv")
    reader = csv.DictReader(fp)

    # Make a dictionary of all "county, state" and their colors.
    county_colors = {}
    for county in reader:
        # What color is this county?
        pop = float(county["votes"])
        blue = float(county["results.clintonh"])/pop
        red = float(county["results.trumpd"])/pop
        county_colors["%s, %s" % (county["name"], county["State"])] \
            = (red, 0, blue)

But in practice, that wasn't good enough, because the county names in the Deleetdk data didn't always match the official Census county names.

Fuzzy matches

For instance, the CSV file had no results for Alaska or Puerto Rico, so I had to skip those. Non-ASCII characters were a problem: "Doña Ana" county in the census data was "Dona Ana" in the CSV. I had to strip off " County", " Borough" and similar terms: "St Louis" in the census data was "St. Louis County" in the CSV. Some names were capitalized differently, like PLYMOUTH vs. Plymouth, or Lac Qui Parle vs. Lac qui Parle. And some names were just different, like "Jeff Davis" vs. "Jefferson Davis".

To get around that I used SequenceMatcher to look for fuzzy matches when I couldn't find an exact match:

def fuzzy_find(s, slist):
    '''Try to find a fuzzy match for s in slist.
    '''
    best_ratio = -1
    best_match = None

    ls = s.lower()
    for ss in slist:
        r = SequenceMatcher(None, ls, ss.lower()).ratio()
        if r > best_ratio:
            best_ratio = r
            best_match = ss
    if best_ratio > .75:
        return best_match
    return None

Correlate the county names from the two datasets

It's finally time to loop through the counties in the map to color and plot them.

Remember STATE vs. STATEFP? It turns out there are a few counties in the census county shapefile with a STATEFP that doesn't match any STATE in the state shapefile. Mostly they're in the Virgin Islands and I don't have election data for them anyway, so I skipped them for now. I also skipped Puerto Rico and Alaska (no results in the election data) and counties that had no corresponding state: I'll omit that code here, but you can see it in the final script, linked at the end.

    for i, county in enumerate(m.counties_info):
        countyname = county["NAME"]
        try:
            statename = states[int(county["STATEFP"])]
        except IndexError:
            print countyname, "has out-of-index statefp of", county["STATEFP"]
            continue

        countystate = "%s, %s" % (countyname, statename)
        try:
            ccolor = county_colors[countystate]
        except KeyError:
            # No exact match; try for a fuzzy match
            fuzzyname = fuzzy_find(countystate, county_colors.keys())
            if fuzzyname:
                ccolor = county_colors[fuzzyname]
                county_colors[countystate] = ccolor
            else:
                print "No match for", countystate
                continue

        countyseg = m.counties[i]
        poly = Polygon(countyseg, facecolor=ccolor)  # edgecolor="white"
        ax.add_patch(poly)

Moving Hawaii

Finally, although the CSV didn't have results for Alaska, it did have Hawaii. To display it, you can move it when creating the patches:

    countyseg = m.counties[i]
    if statename == 'Hawaii':
        countyseg = list(map(lambda (x,y): (x + 5750000, y-1400000), countyseg))
    poly = Polygon(countyseg, facecolor=ccolor)
    ax.add_patch(poly)

The offsets are in map coordinates and are empirical; I fiddled with them until Hawaii showed up at a reasonable place.

[Blue-red-purple 2016 election map]

Well, that was a much longer article than I intended. Turns out it takes a fair amount of code to correlate several datasets and turn them into a map. But a lot of the work will be applicable to other datasets.

Full script on GitHub: Blue-red map using Census county shapefile

January 14, 2017 10:10 PM