Planet Ubuntu California

May 01, 2016

Elizabeth Krumbach

OpenStack Summit Days 1-2

This past week I attended my sixth OpenStack Summit. This one took us to Austin, Texas. I was last in Austin in 2014 when I quickly stopped by to give a talk at the Texas LinuxFest, but I wasn’t able to stay long during that trip. This trip gave me a chance (well, several) to finally have some local BBQ!

I arrived Sunday afternoon and took the opportunity to meet up with Chris Aedo and Paul Belanger, who I’d be on the stage with on Monday morning. We met in person for the first time and did a final pass through our slides to make sure they had all the updates we wanted and that we were clear on where the transitions were. Gathering at the convention center also allowed us to pick up our badges before the mad rush that would come with the opening of the conference itself on Monday morning.

With Austin being the Live Music Capital of the World, we were greeted in the morning by live music from the band Soul Track Mind. I really enjoyed the vibe it brought to the morning, and we had a show to watch as we settled in and waited for the keynotes.

Jonathan Bryce and Lauren Sell of the OpenStack Foundation opened the conference and gave us a tour of numbers. The first OpenStack summit was held in Austin just under six years ago with 75 people, and they were proud to announce that this summit had over 7,500. It’s been quite the ride that I’m proud to have been part of since the beginning of 2013. In Jonathan’s keynote we got a glimpse into the real users of OpenStack, with highlights including the fact that 65% of respondents to the recent OpenStack User Survey are using OpenStack in production and that half of the Fortune 100 companies are using OpenStack in some capacity. It was also interesting to learn how important standard APIs for interacting with clouds were to companies, a fact that I always hoped would shine through as this open source cloud was being adopted. The video from his keynote is here: Embracing Datacenter Diversity.

As the keynotes continued, the ones that really stood out for me were by AT&T (video: AT&T’s Cloud Journey with OpenStack) and Volkswagen Group (video: Driving the Future of IT Infrastructure at Volkswagen Group).

The AT&T keynote was interesting from a technical perspective. It’s clear that the rise of mobile devices and the Internet of Things has put pressure on telecoms to grow much more quickly than they have in the past to handle this new mobile infrastructure. Their keynote shared that they expect this to grow another tenfold by 2020. To meet this need, the networking aspects of technologies like OpenStack are important to their strategy as they move away from “black box” hardware from networking vendors and toward more software-driven infrastructure that can grow more quickly to fit their needs. We learned that they’re currently using 10 OpenStack projects in their infrastructure, with plans to add 3 more in the near future, and about their in-house AT&T Integrated Cloud (AIC) tooling for managing OpenStack. When the morning concluded, all their work was rewarded with a Super User award, which they wrote about here.

The Volkswagen Group keynote was a lot of fun. As the world of electric and automated cars quickly approaches, they have recognized the need to innovate more quickly and use technology to get there. They still seem to be in the early days of OpenStack deployments, but have committed a portion of one of their new data centers to just OpenStack. Ultimately they see a hybrid cloud future, leveraging both public and private hosting.

The keynote sessions concluded with the announcement of the 2017 OpenStack Summit locations: Boston and Sydney!

Directly after the keynote I had to meet Paul and Chris for our talk on OpenStack Infrastructure for Beginners (video, slides). We had a packed room. I led off the presentation by covering an overview of our work and giving a high-level tour of the OpenStack project infrastructure. Chris picked up by speaking to how things work from a developer perspective, tying that back into how and why we set things up the way we did. Paul rounded out the presentation by diving into more of the specifics around Zuul and Jenkins, including how our testing jobs are defined and run. I think the talk went well; we certainly had a lot of fun as we went into lunch chatting with folks about specific components that they were looking either to get involved with or to replicate in their own continuous integration systems.


Chris Aedo presenting, photo by Donnie Ham (source)

After a delicious lunch at Cooper’s BBQ, I went over to a talk on “OpenStack Stable: What It Actually Means to Maintain Stable Branches” by Matt Riedemann, Matthew Treinish and Ihar Hrachyshka in the Upstream Development track of the conference. This was a new track for this summit, and it was great to see how well-attended the sessions ended up being. The goal of this talk was to inform members of the community what exactly is involved in managing stable releases, which has a lot more moving pieces than most people tend to expect. Video from the session is up here. It was then over to “From Upstream Documentation To Downstream Product Knowledge Base” by Stefano Maffulli and Caleb Boylan of DreamHost. They’ve been taking OpenStack documentation and adjusting it to be easier and more targeted for consumption by their customers. They talked about the toolchain that gets it from raw source in the OpenStack upstream into the proprietary knowledge base at DreamHost. It’ll be interesting to see how this scales long term through releases and documentation changes; video here.

My day concluded by participating in a series of Lightning Talks. My talk was first, during which I spent 5 minutes giving a tour of status.openstack.org. I was inspired to give this talk after realizing that even though the links are right there, most people are completely unaware of what things like Reviewday (the “Reviews” link) are. It also gave me the opportunity to take a closer, current look at OpenStack Health prior to my presentation; I had intended to go to “OpenStack-Health Dashboard and Dealing with Data from the Gate” (video) but it conflicted with the talk we were giving in the morning. The lightning talks continued with talks by Paul Belanger on Grafyaml, James E. Blair on Gertty and Andreas Jaeger on the steps for adding a project to OpenStack. From there, the lightning talks drifted away from Infrastructure and into more general upstream development. Video of all the lightning talks here.

Day two of the summit began with live music again! It was nice to see that it wasn’t a single-day event. This time Mark Collier of the OpenStack Foundation kicked things off by talking about the explosion of growth in infrastructure needed to support the growing Internet of Things. Of particular interest was learning how operators are seeking seamless integration of virtual machines, containers and bare metal, and how OpenStack meets that need today as a sort of integration engine; video here.

The highlights of the morning for me included a presentation from tcp cloud in the Czech Republic. They’re developing a Smart City in the small Czech city of Písek. Their presenter gave an overview of the devices they were using and presented a diagram demonstrating how all the data they collect from around the city gets piped into an OpenStack cloud that they run. He concluded his presentation by revealing that they’d turned the summit itself into a mini city by placing devices around the venue to track temperature and CO2 levels throughout the rooms, very cool. Video of the presentation here.


tcp cloud presentation

I also enjoyed seeing Dean Troyer on stage to talk about improving user experience (UX) with OpenStackClient (OSC). As someone who has put a lot of work into converting the documented commands in my book to use OSC rather than the individual project clients, I certainly appreciate his dedication to this project. The video from the talk is here. It was also great to hear from OVH, an ISP and cloud hosting provider that currently donates OpenStack instances to our infrastructure team for running CI testing.

Tuesday also marked the beginning of the Design Summit. This is when I split off from the user conference and spend the rest of my time in the development space. This time the Design Summit was held across the street from the convention center in the Hilton, where I was staying. This area of the summit takes us away from presentation-style sessions and into discussions and work sessions. The first day focused on cross-project sessions.

This was the lightest day of the week for me, as I had a much stronger commitment to the infrastructure sessions happening later in the week. Still, I went to several sessions, starting off with one led by Doug Hellmann about how to improve the situation around global requirements. The session actually seemed to be an attempt to define the issues around requirements, recruit more contributors to help with requirements project review, and chat about improvements to tests. We’d really like to see requirements changes have a lower chance of breaking things, so finding folks to sign up for this test-writing work is really important.

I had lunch with my book-writing co-conspirator Matt Fischer to chat about some of the final touches we’re working on before it’s all turned in. We ended up with a meaty lunch again at Moonshine Grill just across the street from the convention center, after which I went into a “Stable Branch End of Life Policy” session led by Thierry Carrez and Matt Riedemann. The stable situation is a tough one. Many operators want stable releases with longer lifespans, but the commitment from companies to put engineers on it is extremely limited. This session explored the resources required to continue supporting releases for longer (infra, QA, etc.) and there were musings about extending the support period to up to 24 months (from 18) for projects meeting certain requirements. Ultimately, by the end of the summit it did seem that 18 months will continue to be the release lifespan for them all.

I then went over to the Textile building across from the conference center where my employer, HPE, had set up their headquarters. I had a great on-camera chat with Stephen Spector about how open source has evolved from hobbyist to corporate since I became involved in 2001. I then followed some of the marketing folks outside to shoot some snippets for a video later.

The day of sessions continued with a “Brainstorm format for design summit split event” session that talked a lot about dates. As a starting point, Thierry Carrez had written a couple of blog posts about the proposal to split the design summit from the user summit.

With these insightful blog posts in mind, the discussion moved forward on the assumption that the events would be split, focusing on how to handle the timing. When in the cycle should each event happen for maximum benefit to our entire community? The first blog post had a graphic with a proposed timeline, which the discussions mostly stuck to, but we dove deeper into what goes on during each week of a release cycle and when the best time would be for developers to gather to start planning the next release. While there was good discussion on the topic, it was clear that there continues to be apprehension around travel for some contributors. There are fears that they would struggle to get funding to attend multiple events, especially when questions arose around whether mid-cycle events would still be needed. Change is tough, but I’m on board with the plan to split out these events. Even as I write this blog post, I notice that the themes and feel of the different parts of our current summit are very different.

My session day concluded with a session about cross-project specifications for work, led by Shamail Tahir and Carol Barrett from the Product Working Group. I didn’t know much about OpenStack user stories, so this session was informative for seeing how those should be used in specs. In general, planning work in a collaborative way, especially across different projects with diverse communities, is tricky. Having some standards in place for these specs, so teams are on the same page and have the same expectations for format, seems like a good idea.

Tuesday evening meant it was time for the StackCity Community Party. Instead of individual companies throwing big, expensive parties, a street was rented out and companies were able to sponsor the bars and eateries in order to throw their branded events in them. Given my dietary restrictions this week, I wasn’t able to partake in much of the food being offered, so I only spent about an hour there before joining a friend with similar dietary restrictions over at Iron Works BBQ. But not before I picked up a dinosaur with a succulent in it from Canonical.

I called it an early night after dinner, and I’m glad I did. Wednesday through Friday were some busy days! But those days are for another post.

More photos from the summit here: https://www.flickr.com/photos/pleia2/albums/72157667572682751

by pleia2 at May 01, 2016 04:28 PM

April 29, 2016

Akkana Peck

Vermillion Cliffs trip, and other distractions

[Red Toadstool, in the Paria Rimrocks] [Cobra Arch, in the Vermillion Cliffs] I haven't posted in a while. Partly I was busy preparing for, enjoying, then recovering from, a hiking trip to the Vermillion Cliffs, on the Colorado River near the Arizona/Utah border. We had no internet access there (no wi-fi at the hotel, and no data on the cellphone). But we had some great hikes, and I saw my first California Condors (they have a site where they release captive-bred birds). Photos (from the hikes, not the condors, which were too far away): Vermillion Cliffs trip.

I've also been having fun welding more critters, including a roadrunner, a puppy and a rattlesnake. I'm learning how to weld small items, like nail legs on spark plug dragonflies and scorpions, which tend to melt at the MIG welder's lowest setting.

[ Welded puppy ] [ Welded Roadrunner ] [ Welded rattlesnake ]

New Mexico's weather is being charmingly erratic (which is fairly usual): we went for a hike exploring some unmapped cavate ruins, shivering in the cold wind and occasionally getting lightly snowed upon. Then the next day was a gloriously sunny hike out Deer Trap Mesa with clear long-distance views of the mountains and mesas in all directions. Today we had graupel -- someone recently introduced me to that term for what Dave and I have been calling "snail" or "how" since it's a combination of snow and hail, soft balls of hail like tiny snowballs. They turned the back yard white for ten or fifteen minutes, but then the sun came out for a bit and melted all the little snowballs.

But since it looks like much of today will be cloudy, it's a perfect day to use up that leftover pork roast and fill the house with good smells by making a batch of slow-cooker green chile posole.

April 29, 2016 06:28 PM

April 23, 2016

Elizabeth Krumbach

FOSSASIA 2016

A few weeks ago I had the pleasure of flying to Singapore to participate in FOSSASIA 2016, which is billed as Asia’s Premier Open Technology Event. I was able to spend a little time prior to the event doing some touristing, but Friday morning came quickly and I met up with a colleague to make our way to the conference. We took the Singapore MRT (Mass Rapid Transit, rails!) from the station near our hotel to the Science Centre Singapore, where the conference was being held. I was really pleased with how fast, frequent, clean and easy to navigate the MRT is, even during rush hour. Though the trains did tend to fill up, we had very easy rides to and from the venue each day.

This was my second open source conference in a science museum, and I really like the association. As conference attendees we were free to visit the museum (photos here). It was quite an honor to be welcomed to the center by Lim Tit Meng, the museum’s Chief Executive, during the keynotes on Friday morning. That morning I also had the pleasure of meeting FOSSASIA founder Hong Phuc, who I had been exchanging emails with leading up to the event; it was very clear that she has continued to be very hands-on with the organization of the conference since its founding.

The theme of the conference this year centered around the Internet of Things, so the Friday morning keynotes drew from a diverse group of people and organizations. I was particularly impressed that they didn’t just call upon open source developers to give presentations. Keynotes came from folks working on hardware, design and fascinating programs that used IoT devices.

Highlights of the morning included a talk by Bunnie Huang who made electronic, lighted badges for Burning Man that changed their light patterns based on how they “mated” with other badges to change their blinkome (think genome). Talks continued with a really fun one from Bernard Leong of the Singapore Post who explained how they’ve been experimenting with drones for small package delivery, particularly to remote areas, using Pulau Ubin as an example in the demonstration run.

I was then really delighted to hear about UNESCO’s YouthMobile program from Davide Storti and ITO Misako. YouthMobile is encouraging children to shift from being mere users of mobile devices to actually developing applications for them. I find this project to be particularly important as I know I wouldn’t be the technologist I am today without being able to fiddle with my early computers. We need to grow that next generation of tinkerers, but increasingly kids tend to only have access to mobile rather than the big old desktops that I grew up on. I believe projects aimed at inspiring the tinkerer in children on these new devices will grow in importance as we move into the future. It was also nice to hear that the project hasn’t just been creating its own curriculum to accomplish its goals; they’ve been partnering with existing initiatives and programs. Kudos to them for doing it right.


Davide Storti and ITO Misako on YouthMobile

Cat Allman continued keynotes as she talked about the work Google has started to do in the Maker and Science space. Their work includes Google Summer of Code accepting more science-focused programs, support of Maker events and “road trips” with students to science museums. The final keynote came from Jan Nikolai Nelles who spoke on the The Other Nefertiti, where a team visited a German museum and created a not-strictly-authorized 3D rendering of a famous Nefertiti bust. It was a valuable thing unto itself, and interesting for raising awareness about how museums share data about artifacts, or don’t, as the case may be.

The conference continued as I went to a talk titled “Why are we (Still) wasting food? How technology can help,” which sounded interesting, but the presenter didn’t seem to understand his audience or what the conference was about. The talk was pretty much a sales pitch about the success of their product in saving food in restaurant and other industry kitchens. A noble effort, and it was fun to brainstorm how some of the components he talked about could be used in other open source projects. I visited their website during the talk and was perplexed to be unable to find a link to their source code. During the Q&A I specifically asked whether the software was actually open source. The presenter struggled to answer my question; he claimed that it was, but that he is not a developer so he wasn’t sure which parts or where I could find it. He gave me his business card so I could send him an email about it after the conference. My email follow-up received this response:

“We are not using any open source code. Everything is developed in house.”

How disappointing! I’m not sure how their talk ended up at a Free and Open Source Software conference, though their selection of a non-technical presenter who couldn’t answer a simple question that strikes at the core of what the conference is about does hint at their obliviousness. I certainly didn’t appreciate being tricked into attending a sales talk about a suite of proprietary software. Thankfully, the conference improved after this.

I attended a talk by U-Zyn Chua about how he reverse engineered an API in a taxi app for his Singapore Taxi data project. His talk was fascinating for two reasons. First, he walked us through the work that had to be done to use an undocumented API. Second, the data about taxis that he collected was fascinating: high-traffic areas, times of day when taxis were busy. Plus, between this talk and the Singapore Post talk I learned a lot about the geography and population centers of Singapore.

Official Group Photograph - FOSSASIA 2016
Official Group Photograph – FOSSASIA 2016 by Michael Cannon

The conference continued the next day, and I made sure to make time to attend Sayan Chowdhury’s “Dive deep into Fedora Infra” talk. Fedora was an early project on my open source infra list and it’s always exciting to chat with their engineers and swap stories about running infra in the open. Sayan’s talk gave an overview of several of the key services that they’ve developed and deployed, including projects like the Fedora Infrastructure Message Bus (fedmsg), which was also deployed by the open source infra team for the Debian project. Unfortunately I had to depart quickly from that talk in order to make it over to my own just after.

I gave a talk on “Code Review for DevOps,” which I had a lot of fun modifying for the 20-minute slot and for a devops rather than systems administration audience. I put a firmer emphasis on the development of tooling in our team and was able to tighten up the presentation a lot to deliver a whirlwind tour of how we do almost everything through a code review system and with testing. Slides from the presentation are here (PDF).


Photo of my presentation by Dong (Vincent) Ma (source)

I mentioned that my talk was 20 minutes long, and that makes this a good time to pause and reflect on that format. Almost all the talks at this conference were 20-minute slots, which is about half the length I’m accustomed to. I really like this length. If a talk is not interesting, at least it’s short. If it is interesting, 20 minutes does actually give enough time for a good presentation. The schedule also allowed for 10 minutes between sessions so that people could get to their next room. In reality, all this timing could have used a bit more policing. Q&As, and even talks themselves by speakers used to longer slots, frequently overflowed beyond their 20-minute window, making it difficult to finish watching one talk and get to the next. For a volunteer-run event, they did a good job overall of sticking to at least the schedule of when talks started in each room, so if I planned accordingly I rarely missed the beginning of a talk in an alternate track because the schedule had drifted.

Saturday afternoon I spent some time going to lightning talks, including one about “Continuous Integration and Continuous Deployment (CI/CD) for Open Source and Free Software Development” by my colleague Dong Ma. With only 5 minutes, he was quickly able to contrast some of the features of the FOSSology open source CI/CD workflow with that of the model the OpenStack community has developed.


Dong Ma on open source CI/CD

I was then off to Sundeep Anand’s presentation, “Using Python Client to talk with Zanata Server.” Last autumn we launched translate.openstack.org running on Zanata and have been using the Java client along with a series of scripts to handle manipulation of the translations in the OpenStack project. It was interesting to learn about his strides with the Python client, which is making its way up to feature parity with the Java one. Since OpenStack itself is written in Python, switching to this Python client may make sense for us at some point, as it would make it easier for developers on our project to contribute to it. During his talk he also gave a demonstration of Zanata itself as he walked through the use of the client.

These talks were all very practical for me and applicable to my work, but that doesn’t mean I didn’t go off and have fun too. Later that afternoon I attended a talk on “A trip to Pluto with OpenSpace,” where the team developing OpenSpace took public images of the Pluto flyby and gave us a fascinating, animated demonstration of how their software works. I also got to learn about the New Palmyra project, where people are getting together to create 3D models of famous monuments in Syria that have been or are at risk of being destroyed by the ongoing military conflict in the region. I enjoyed learning about the passion that everyone on that team brings to the project, and I have a lot of respect for and interest in their goals of preserving history.

On Sunday the first talk I attended was by François Cartegnie on the newest features of the popular, cross-platform VLC software project. As a user of multiple platforms (Linux and Android), it was nice to hear that with the 3.0 release they’re aiming to standardize on that version number across platforms, as the differing version numbers have been confusing. He also spent a great deal of time explaining the challenges they continually overcome to be the best player on the market, including not just supporting encoding standards, but also handling cases where those standards are poorly or improperly implemented. This can’t be an easy task. I was also interested to learn that UPnP support has been revamped and should be working better these days.

My colleague and tourism buddy for the week, Matthew Treinish, spoke next on “QA in the Open.” Drawing from his experience as the QA project lead for OpenStack for several cycles, he talked about the plugin-driven model that OpenStack QA has adopted. This model has helped individual projects take ownership of their testing requirements and has helped the very small core QA team scale across the thousand-plus repositories and dozens of projects that make up OpenStack.


Matthew Treinish on QA in the Open

Sunday afternoon had a talk that was one of the conference highlights for me: “Reproducible Builds – fulfilling the original promise of free software” by Chris Lamb. I had an interest in the topic before joining the session, but it was one of those talks where I was really pulled in and became even more interested in the topic. The idea on the surface seems pretty simple: you want to be able to exactly replicate builds over time and space. But there are a number of challenges to this when it comes to actually doing it, which he outlined:

  • Timestamps
  • Timezones and locales
  • Different versions of libraries
  • Non-deterministic file ordering
  • Users, groups, umask, environment variables
  • Random behavior (e.g. hash ordering)
  • Build paths

Chris Lamb on Reproducible Builds

As soon as he enumerated these things it was obvious that they all would be problems, and still surprising that it would be so difficult. From this talk I learned about the reproducible-builds.org project which seeks to document and discuss these issues and find solutions for all of them. Additionally, Chris himself is a participant in Debian and he was able to share statistics about how most Debian packages are now being created in a way that adheres to the reproducible model. Very cool stuff, I hope to learn more about it.
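The timestamp problem from that list is easy to reproduce on a small scale. Here is a toy sketch in Python (my own, not from the talk; the build_artifact function is invented for illustration) showing how an embedded build time breaks bit-for-bit reproducibility, and how the SOURCE_DATE_EPOCH convention promoted by reproducible-builds.org restores it:

```python
import hashlib
import os
import time

def build_artifact(source: bytes) -> bytes:
    """Toy "build": bundle the source with a build timestamp.

    Per the reproducible-builds.org convention, when SOURCE_DATE_EPOCH
    is set in the environment, tools should use it as the timestamp
    instead of the current clock time.
    """
    epoch = os.environ.get("SOURCE_DATE_EPOCH")
    build_time = int(epoch) if epoch else int(time.time())
    return source + b"\nbuilt-at: %d" % build_time

src = b"print('hello')"

# Without SOURCE_DATE_EPOCH, two builds a moment apart differ:
a = build_artifact(src)
time.sleep(1.1)
b = build_artifact(src)
print(hashlib.sha256(a).hexdigest() == hashlib.sha256(b).hexdigest())  # False

# Pinning the timestamp makes the output byte-identical every time:
os.environ["SOURCE_DATE_EPOCH"] = "1461456000"
c = build_artifact(src)
d = build_artifact(src)
print(hashlib.sha256(c).hexdigest() == hashlib.sha256(d).hexdigest())  # True
```

The same pattern generalizes to the other items on the list: each source of nondeterminism either has to be eliminated from the build or pinned to a value recorded alongside the source.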

My afternoon continued by attending a talk about btrfs by Anand Jain. His focus was on the basics and then upcoming features in development. The talk may have convinced me to start using it in a basic way on one of my systems soon, as the support for the core components is actually quite stable these days. I then went to an Asciidoc talk, where presenter George Goh compiled his presentation from Asciidoc just before he began presenting, nicely done! He stressed the importance of documentation and making it easy to keep updated, with automated updates of references to things like figures that live inline in the text. He also explored the use of template systems in Asciidoc to easily export portions of your document to different projects and organizations while preserving the appropriate branding for each.

In what seemed much too soon, the conference conclusion came on Sunday evening. There were thanks and words from several of the organizers. Various attendees also spoke, my favorites being young (middle school, by my US reckoning) students visiting from Saudi Arabia. Several had feared that the conference would be boring and too technical for their level, but they expressed excitement about how much fun they had and how many presenters had succeeded in presenting topics that they could understand. It was thrilling to hear this from these students; I want the architects of our future to start young, be exposed to free and open source software and be excited by the possibilities.

More of my photos from the event here: https://www.flickr.com/photos/pleia2/albums/72157666299641355

Thanks to all the organizers and volunteers for putting this conference together. I had a wonderful time and hope to participate again in the future!

by pleia2 at April 23, 2016 05:50 PM

April 21, 2016

Jono Bacon

Dan Ariely on Building More Human Technology, Data, Artificial Intelligence, and More

Behavioral economics is an exciting skeleton on which to build human systems such as technology and communities.

One of the leading minds in behavioral economics is Dan Ariely, New York Times best-selling author of Predictably Irrational, The Upside Of Irrationality, and frequent TED speaker.

I recently interviewed Dan for my Forbes column to explore how behavioral economics is playing a role in technology, data, artificial intelligence, and preventing online abuse. Predictably, his insight was irrationally interesting. OK, that was a stretch.

Read the piece here

by Jono Bacon at April 21, 2016 08:59 PM

Nathan Haines

Ubuntu 16.04 LTS FAQ

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Ubuntu 16.04 LTS is here! Let's take a look at some of the most exciting features and common questions around this new operating system.

Ubuntu 16.04 LTS

  1. When does Ubuntu 16.04 LTS come out?

    • Ubuntu 16.04 LTS will reach general release on April 21st, 2016.
  2. I meant at what time will the release happen?

    • Ubuntu is actively being developed until the actual release happens, minus a small delay to help the mirrors propagate first. The release will be announced on the ubuntu-announce mailing list. (This page will not exist until the release.)
  3. What does "16.04 LTS" mean?

    • Ubuntu is released on a regular schedule every six months. The first release was in October 2004, and was named Ubuntu 4.10. For Ubuntu, the major version number is the year of release and the minor version number is the month of release. Ubuntu 16.04 is released on 2016-04-21, so the version number is 16.04.
    • Ubuntu releases are supported for 9 months, but many computing activities require stability. Every two years, an Ubuntu release is developed with long term support in mind. These releases, designated with "LTS" after the version number, are supported for 5 years on the server and desktop.
  4. What does "Xenial Xerus" mean?

    • Every version of Ubuntu has an alliterative development codename. After Ubuntu 6.06 LTS was released, the decision was made to choose new codenames in alphabetical order. Ubuntu 16.04 LTS is codenamed the Xenial Xerus release, or xenial for short.
    • "Xenial" is an adjective that means "friendly to others, especially foreigners, guests, or strangers." With lxd being perfect for "guest" containers, Snappy Ubuntu Core being perfect for IoT developers, snap packages being perfect for third-party software developers, and Ubuntu on Windows perfect for Windows developers who use Ubuntu in the cloud (or Ubuntu developers who are forced to use Windows at work!), xenial is a perfect description of Ubuntu 16.04!
    • "Xerus" is the genus name of the African ground squirrel. They collaborate and are not aggressive to other mammals, so they fit the description of xenial. It also makes for an adorable mascot!
  5. How long will Ubuntu 16.04 LTS be supported?

    • Ubuntu 16.04 LTS will be supported on desktops, servers, and in the cloud for 5 years, until April 2021. After this time, 16.04 LTS will enter end-of-life and no more security updates will be released.

Getting Ubuntu 16.04 LTS

  1. Where can I download Ubuntu 16.04 LTS?

    • Once released, Ubuntu 16.04 LTS will be available for download at http://www.ubuntu.com/download/. This URL will help you select the right architecture and will automatically link you to a mirror for the download. Please don't constantly refresh the direct download site!
  2. What if I find ubuntu-16.04-desktop-amd64.iso on an Ubuntu server before the official release is announced?

    • Then you've found a final release candidate that is being used to seed the mirrors before release. Downloading or linking to it will interfere with the mirrors and delay the release.
  3. What if I post a link to it anyway?

    • If you do it on /r/Ubuntu, your post or comment will be removed and you will be banned for a day. The release team works hard enough as it is!
  4. What if I want to help others get Ubuntu 16.04 LTS faster?

    • Thank you for your help! Consider using BitTorrent (Ubuntu comes with Transmission) and seeding the final release.
  5. What if I'm already running Ubuntu 14.04.4 LTS or Ubuntu 15.10?

    • Then you can simply upgrade to Ubuntu 16.04 using Software Updater.

Upgrading to Ubuntu 16.04 LTS

  1. Is upgrading to a new version of Ubuntu easy?

    • Yes, the upgrade process is supported and automated. However, you should always back up your files and data before upgrading Ubuntu. Actually, you should always keep recent backups even when you're not upgrading Ubuntu.
    • Ubuntu checks for software updates once a day, and Software Updater will inform you once a new version of Ubuntu is available. The upgrade will download a large amount of data--anywhere from 0.5 to 1.5 GB depending on the packages you have installed--and the upgrade process can take some time. Don't do any serious work on your computer during the upgrade process. Light web browsing or a simple game such as Aisleriot, Mahjongg, or Mines is safe.
  2. Should I upgrade to Ubuntu 16.04 LTS right away or wait?

    • It should be safe to upgrade immediately, and as long as you back up your home folder and have install media for your current version of Ubuntu in case you want to reinstall, there's very little risk involved.
  3. Is it better to wait until later?

    • Not necessarily, but waiting does have some benefits. Ubuntu 16.04 will receive newer release images with bug fixes about 3 months after its initial release. In addition, downloading updates can be much faster after release week. (Be sure to set up your Ubuntu mirror in Software & Updates!) Ubuntu 14.04 LTS is supported until April 2019 and Ubuntu 15.10 is supported until July 2016, so you have nothing to lose by waiting a couple of weeks.
  4. I'm running Ubuntu 15.10. How do I upgrade to Ubuntu 16.04 LTS?

    • After Ubuntu 16.04 LTS is released, Software Updater will inform you that a new version of Ubuntu is available. Make sure that all available updates for Ubuntu 15.10 have been installed first, then click the "Upgrade..." button.
  5. I'm running Ubuntu 14.04.4 LTS. How do I upgrade to Ubuntu 16.04 LTS?

    • After Ubuntu 16.04.1 LTS is released in July 2016, Software Updater will inform you that a new version of Ubuntu is available. Make sure that all available updates for Ubuntu 14.04 LTS have been installed first, then click the "Upgrade..." button.
  6. I'm running Ubuntu 12.04 LTS. How do I upgrade to Ubuntu 16.04 LTS?

    • You can't upgrade directly to Ubuntu 16.04 LTS, so you have two options:
      • Use Update Manager to upgrade to Ubuntu 14.04 LTS, then reboot and use Software Updater to upgrade again to Ubuntu 16.04 LTS.
      • Back up your computer and install Ubuntu 16.04 LTS from scratch.
  7. What is Ubuntu 16.04.1 and why can't I update Ubuntu 14.04 LTS immediately?

    • A new version of Ubuntu is released every six months, but LTS releases are used for years. So Ubuntu offers "point releases" of LTS versions. Starting 3 months after the release and then every 6 months thereafter, new install images are created that include the latest updates to all of the default software. This allows new installations to run the latest software immediately and decreases the time it takes to download updates after a new install.
    • Because LTS users depend on stability, Ubuntu 14.04 LTS will not automatically offer an update to Ubuntu 16.04 LTS until the first point release. After three months, any show-stopper bugs should be solved and the upgrade process will have been tested by many others and improved if necessary.
  8. What if I want to upgrade right now?

    • Upgrading from Ubuntu 14.04 LTS to Ubuntu 16.04 LTS should be safe and easy. If you have a recent backup of your files and data, simply open Terminal and type update-manager -d. This will tell Ubuntu to upgrade to the next release early.
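As a sketch of the commands involved (the graphical command is described in the answer above; do-release-upgrade is the standard text-mode equivalent used on servers, and in both cases you should back up first):

```shell
# On an Ubuntu desktop: offer the upgrade to the next release early.
update-manager -d

# On a server, or from a terminal-only session: the text-mode equivalent.
sudo do-release-upgrade -d
```

Both tools walk you through the same supported upgrade process; the -d flag is what tells them not to wait for the first point release.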
  9. What if I already ran update-manager -d and upgraded to a beta or pre-release version of Ubuntu 16.04 LTS?

    • If you run Software Updater after the release of Ubuntu 16.04 LTS, your version of xenial will be the same as the released version of Ubuntu.
  10. What if I don't believe that?

    • When xenial is being developed, it is constantly being improved. Milestones such as Alpha 1, Beta 2, and so on are simply points in time where developers can check progress. If you install Ubuntu from a Beta 2 image (for example), the moment you apply updates, you are no longer running Beta 2. Updates to xenial continue until release, when the Ubuntu archive is locked, images are spun, and the xenial archive is finalized and released as Ubuntu 16.04 LTS. After the release of Ubuntu 16.04 LTS, all further updates come from the xenial-updates and xenial-security repositories and the xenial repository remains unchanged. Updating from the Ubuntu repositories during and after the xenial development and release brings you along through these moments in time.
      • TRIVIA: As implied above, this means that Ubuntu 16.04 LTS doesn't exist until the Release Team names the final product. Until then, the release is simply Xenial Xerus or xenial for short.

Coming next:

Details on new features!

  • How do snap packages and deb packages work together?
  • DAE Unity 8?
  • Y U NO AMD fglrx drivers?
  • And other questions you ask in the [/r/Ubuntu comments](https://redd.it/4frg4a)!

April 21, 2016 10:17 AM

April 16, 2016

Elizabeth Krumbach

Color an Ubuntu Xenial Xerus

Last cycle I reached out to artist and creator of Full Circle Magazine Ronnie Tucker to see if he’d create a coloring page of a werewolf for some upcoming events. He came through and we had a lot of fun with it (blog post here).

With the LTS release coming up, I reached out to him again.

He quickly turned my request around, and now we have a xerus to color!

Xerus coloring page
Click the image or here to download the full size version for printing.

Huge thanks to Ronnie for coming through with this, it’s shared with a CC-SA license, so I encourage people to print and share them at their release events and beyond!

While we’re on the topic of our African ground squirrel friend, thanks to Tom Macfarlane of the Canonical Design Team I was able to update the Animal SVGs section of the Official Artwork page on the Ubuntu wiki. For those of you who haven’t seen the mascot image, it’s a real treat.

Xerus official mascot

It’s a great accompaniment to your release party. Download the SVG version for printing from the wiki page or directly here.

by pleia2 at April 16, 2016 05:03 PM

April 14, 2016

Jono Bacon

Mycroft and Building a Future of Open Artificial Intelligence

Last year a new project called Mycroft hit Kickstarter, promising to build an artificial intelligence assistant. The campaign set out to raise $99,000 and raised just shy of $128,000.

Now, artificial intelligence assistants are nothing particularly new. There are talking phones and tablets such as Apple’s Siri and Google Now, and of course the talking trash can, the Amazon Echo. Mycroft is different though and I have been pretty supportive of the project, so much so that I serve as an advisor to the team. Let me tell you why.

Here is a recent build in action, demoed by Ryan Sipes, Mycroft CTO and all round nice chap:

Mycroft is interesting both for the product it is designed to be and the way the team are building it.

For the former, artificial intelligence assistants are going to be a prevalent part of our future. Where these devices will be judged though is in the sheer scope of the functions, information, and data they can interact with. They won’t be judged by what they can do, but instead what they can’t do.

This is where the latter piece, how Mycroft is being built, really interests me.

Firstly, Mycroft is open source not just in the software, but also in the hardware and the service it connects to. You can buy a Mycroft, open it up, and peek into every facet of what it is, how it works, and how information is shared and communicated. Now, for most consumers this might not be very interesting, but from a product development perspective it offers some distinctive benefits:

  • A community can be formed that can play a role in the future development and success of the product. This means that developers, data scientists, advocates, and more can play a part in Mycroft.
  • Capabilities can be crowdsourced to radically expand what Mycroft can do. In much the same way OpenStreetmap has been able to map the world, developers can scratch their own itch and create capabilities to extend Mycroft.
  • The technology can be integrated far beyond the white box sitting on your kitchen counter and into Operating Systems, devices, connected home units, and beyond.
  • The hardware can be iterated by people building support for Mycroft on additional boards. This could potentially lower costs for future units with the integration work reduced.
  • Improved security for users with a wider developer community wrapped around the project.
  • A partner ecosystem can be developed where companies can use and invest in the core Mycroft open source projects to reduce their costs and expand the technology.

There is though a far wider set of implications with Mycroft too. Much has been written about the concerns from people such as Elon Musk and Stephen Hawking about the risks of artificial intelligence, primarily if it is owned by a single company, or a small set of companies.

While I don’t think skynet is taking over anytime soon, these concerns are valid and this raises the importance that artificial intelligence is something that is open, not proprietary. I think Mycroft can play a credible role in building a core set of services around AI that are part of an open commons that companies can invest in. Think of this as the OpenStack of AI, if you will.

Hacking on Mycroft

So, it would be remiss if I didn’t share a few details of how the curious among you can get involved. Mycroft currently has three core projects:

  • The Adapt Intent Parser converts natural language into machine readable data structures.
  • Mimic takes in text and reads it out loud to create a high quality voice.
  • OpenSTT is aimed at creating an open source speech-to-text model that can be used by individuals and companies to allow for high accuracy, low-latency conversion of speech into text.

You can also find the various projects here on GitHub and find a thriving user and developer community here.

Mycroft are also participating in the IBM Watson AI XPRIZE where the goal is to create an artificial intelligence platform that interacts with people so naturally that when people speak to it they’ll be unable to tell if they’re talking to a machine or to a person. You can find out more about how Mycroft is participating here.

I know the team are very interested in attracting developers, docs writers, translators, advocates, and more to play a role across these different parts of the project. If this all sounds very exciting to you, be sure to get started by posting to the forum.

by Jono Bacon at April 14, 2016 05:01 AM

April 13, 2016

Jono Bacon

Going Large on Medium

I just wanted to share a quick note to let you know that I will be sharing future posts both on jonobacon.org and on my Medium site.

I would love to hear what kind of content you would find interesting for me to share. Feel free to share in the comments!

Thanks!

by Jono Bacon at April 13, 2016 07:19 PM

April 12, 2016

Jono Bacon

Upcoming Speaking at Interop and Abstractions

I just wanted to share a couple of upcoming speaking engagements going on:

  • Interop in Las Vegas – 5th May 2016 – I will be participating in the keynote panel at Interop this year. The panel is called How Open-Source Changes the IT Equation and I am looking forward to participating with Colin McNamara, Greg Ferro, and Sean Roberts.
  • Abstractions in Pittsburgh – 18-20 Aug 2016 – I will be delivering one of the headlining talks at Abstractions. This looks like an exciting new conference and my first time in Pittsburgh. Looking forward to getting out there!

Some more speaking gigs are in the works. More details soon.

by Jono Bacon at April 12, 2016 03:30 PM

April 10, 2016

Jono Bacon

Community Leadership Summit 2016

On 14th – 15th May 2016 in Austin, Texas the Community Leadership Summit 2016 will be taking place. For the 8th year now, community leaders and managers from a range of different industries, professions, and backgrounds will meet together to share ideas and best practice. See our incredible registered attendee list that is shaping up for this year’s event.

This year we also have many incredible keynotes that will cover topics such as building developer communities, tackling imposter syndrome, gamification, governance, and more. Of course CLS will incorporate the popular unconference format where the audience determine the sessions in the schedule.

We are also delighted to host the FLOSS Community Metrics event as part of CLS this year too!

The event is entirely free and everyone is welcome! CLS takes place the weekend before OSCON in the same venue in Austin. Be sure to go and register to join us and we hope to see you in Austin in May!

Many thanks to O’Reilly, Autodesk, and the Linux Foundation for their sponsorship of the event!

by Jono Bacon at April 10, 2016 09:35 PM

April 05, 2016

Akkana Peck

Modifying a git repo so you can pull without a password

There's been a discussion in the GIMP community about setting up git repos to host contributed assets like scripts, plug-ins and brushes, to replace the long-stagnant GIMP Plug-in Repository. One of the suggestions involves having lots of tiny git repos rather than one that holds all the assets.

That got me to thinking about one annoyance I always have when setting up a new git repository on github: the repository is initially configured with an ssh URL, so I can push to it; but that means I can't pull from the repo without typing my ssh password (more accurately, the password to my ssh key).

Fortunately, there's a way to fix that: a git configuration can have one url for pulling source, and a different pushurl for pushing changes.

These are defined in the file .git/config inside each repository. So edit that file and take a look at the [remote "origin"] section.

For instance, in the GIMP source repositories, hosted on git.gnome.org, instead of the default of url = ssh://git.gnome.org/git/gimp I can set

pushurl = ssh://git.gnome.org/git/gimp
url = git://git.gnome.org/gimp
(disclaimer: I'm not sure this is still correct; my gnome git access stopped working -- I think it was during the Heartbleed security fire drill, or one of those -- and never got fixed.)

For GitHub the syntax is a little different. When I initially set up a repository, the url comes out something like url = git@github.com:username/reponame.git (sometimes the git@ part isn't included), and the password-free pull URL is something you can get from github's website. So you'll end up with something like this:

pushurl = git@github.com:username/reponame.git
url = https://github.com/username/reponame.git
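The same split can be made without hand-editing .git/config at all: git remote set-url writes these entries for you. A quick sketch in a throwaway repository (username/reponame are placeholders here):

```shell
# Create a scratch repo with an ssh-style origin, as GitHub sets up by default.
cd "$(mktemp -d)" && git init -q .
git remote add origin git@github.com:username/reponame.git

# Fetch over password-free https, keep pushing over ssh.
git remote set-url origin https://github.com/username/reponame.git
git remote set-url --push origin git@github.com:username/reponame.git

git remote -v   # the fetch line shows the https URL, the push line the ssh URL
```

git remote set-url --push writes exactly the pushurl key shown above, so the end result in .git/config is the same.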

Automating it

That's helpful, and I've made that change on all of my repos. But I just forked another repo on github, and as I went to edit .git/config I remembered what a pain this had been to do en masse on all my repos; and how it would be a much bigger pain to do it on a gazillion tiny GIMP asset repos if they end up going with that model and I ever want to help with the development. It's just the thing that should be scriptable.

However, the rules for what constitutes a valid git passwordless pull URL, and what constitutes a valid ssh writable URL, seem to encompass a lot of territory. So the quickie Python script I whipped up to modify .git/config doesn't claim to handle everything; it only handles the URLs I've encountered personally on Gnome and GitHub. Still, that should be useful if I ever have to add multiple repos at once. The script: repo-pullpush (yes, I know it's a terrible name) on GitHub.

April 05, 2016 06:28 PM

March 29, 2016

Elizabeth Krumbach

Tourist in Singapore

Time flies, and my recent trip to Singapore to speak at FOSSASIA snuck up on me. I wasn’t able to make time to do research into local attractions and so I found myself there the day before the conference I was there to attend with only one thing on my agenda, the Night Safari. MJ told me about it years ago when he visited Singapore and how he thought I’d enjoy it, given my love for animals and zoos.

I flew Singapore Air, frequently ranked the best airline in the world, and for good reason. Even in coach, the service is top notch and the food is edible, sometimes even good. My itinerary took me through Seoul on the way out, which felt like the long way of doing things, but my layover was short and I had the same flight number for both legs, so passengers were mostly just shuffled through security and loaded onto the next plane. I seem to have cashed in all my travel karma this trip and ended up with an entire center row to myself, which meant I could lie down and get some sleep during the flights even though I was in coach. Heavenly! I arrived in Singapore at the bright and early time of 2AM and caught a taxi to my hotel. Thankfully I was able to get some sleep there too so I was ready for my jet lag adjustment day on Wednesday.

In the morning I met up with a colleague who was also in town for the conference. With neither of us having plans, I dragged him along with me as we bought tickets for the Night Safari that evening through a tour company, which included transport and priority boarding inside the park once we arrived. Then it was on to a touristy hop on/hop off bus to give us an overview of the city.


On the tourist bus!

The first thing I’ll say about Singapore: It’s hot and humid. I’m not built for this kind of weather. As much as I enjoyed my adventures, it was a struggle each day to keep up with my “I went to school in Georgia, this is fine!” colleague and to stay hydrated.

Then there’s their love for greenery. As a city-state there is a prevalence of what they refer to as the “concrete jungle” but they also seem keen on striking a balance. Many buildings have green gardens, and even full trees, on various balconies and roofs of their tall buildings. Even throughout areas of the city you could find larger green spaces than I’m accustomed to seeing, bigger trees that they’ve clearly made an effort to make sure could still thrive. It was nice to see in a city.

The tourist bus took us through the heart of downtown where we were staying, then down to Chinatown, where we saw the Sri Mariamman Temple (which is actually a Hindu temple). The financial district was next, and then we decided to leave the bus for a time as we got to the Gardens by the Bay. This was a huge complex. There were several outdoor gardens with various themes, surrounding the main area that has a couple of indoor complexes as well as the towering outdoor tree-like structures; I got some great pictures of them.

We decided to go into the Cloud Forest, seeing as we were in town to speak about our work on cloud software. I was worried it would be even hotter inside, but it was amusing to discover that it was actually cooler, quite the welcome break for me. The massive dome structure enclosed what I would compare to the rain forest dome inside the California Academy of Sciences building in San Francisco, but much bigger and with a strong focus on flora rather than fauna. You enter the building at ground level and take the elevator to the top to walk down several stories through exhibits showing plant life of all sorts. It made for some nice views of the whole complex, and outside too.

After the dome, it was back out in the heat. We walked through some of the outdoor gardens before hopping on the tourist bus again. We took it through the Indian neighborhood where we saw the Sri Veeramakaliamman Temple and Arab section which included getting to see the beautiful Masjid Sultan (mosque), near where we had dinner later in the week at an Indian place that advertised being Halal.

By the time the bus got back to the stop near our hotel it was time for me to take a break before the Night Safari. We were picked up at 6PM by a van that brought us to the bus that took us up to the part of the island where the Night Safari is. The tour guide gave an interesting take on history and the social benefits of living in Singapore on our journey up. It did make me reflect upon the fact that while there was traffic, the congestion was nothing like I’d expect for a city of Singapore’s size. I hadn’t yet experienced the public transit, but as I’d learn later in my trip it was quite good for the southern parts of the island.

The Night Safari! First impression: Tourist trap. But it got better. Once you make your way past the crowds, shops and food places, and beyond the goofy welcome show that has various animals doing tricks, things get better. The adventure begins on a tram through the park. With the tour we didn’t have to wait in line, which, combined with the bus ride there, made the tour worth the extra fee. The tram takes you through various habitats from around the world where nocturnal animals dwell. Big cats, various types of deer, wolves and hippos were among the star attractions. I was delighted to finally get to see some tahrs, which the last Ubuntu LTS release was named after.

After the tram tour I was feeling pretty tired, heat and jet lag hitting me hard. But I decided to go on a couple of the walking trails anyway. It was worth it. The walking trails are by far the best part of the park! There are more animals, and you can take as much time as you want to see them. Exhaustion started hitting me when we completed half the trails, but I got to see fishing cats, otters, bats, a sleeping pangolin (another Ubuntu LTS animal!) and my favorite of the night, the binturong, otherwise known as a bearcat. I didn’t take any pictures of the animals, because night safari. By the end of our walking I was pretty tired and just wanted to get back to my bed, so we forewent the tour bus back to the hotel and just got a taxi.

Thursday evening the first conference events kicked off with a hot pot dinner, but prior to that we had more time for touristing. During our city tour the day before I saw the Mint – Museum of Toys. Casting away thoughts of Toy Story 2’s plot line of being sold to a Japanese toy museum, I was delighted to visit an actual toy museum. Sadly, their floor on Space and Sci-Fi toys was closed, but the rest of the museum mostly made up for it. The open parts of the museum had 5 floors of toy displays spanning about one hundred years. Most of the toys were cartoon-related, with Popeye, super heroes, various popular Anime and Disney characters all making a respectable showing. Some of the toys packed into displays had surprisingly high appraisals attached to them, and there were notes here and there about their rarity. I had a lot of fun!

After toys, we decided to find lunch. It turns out that a number of places aren’t open for lunch, so we wandered around for a bit until around noon, when we found ourselves in the Raffles Hotel courtyard in front of a menu that looked lovely for lunch. It was outdoors, so no escaping the heat, but the shade made things a bit more tolerable. It didn’t take long for us to eye the list of Slings on the cocktail menu and learn via a Google search that we were sitting where Singapore Slings were invented. How cool! Hydration took a back seat; I had to have a Singapore Sling where they were invented.

After lunch we continued our walk to make our way to the newly opened National Gallery. I had actually read about this one incidentally before arriving in Singapore, as it just opened in November and the opening was briefly covered in a travel magazine I read. This new gallery is housed in the historic former Supreme Court and City Hall buildings, and they didn’t do anything to hide this. Particularly in the Supreme Court building, it was very obvious that it was a courthouse, with much of what looked like the original benches throughout and rooms that still looked like court rooms with big wooden chairs and (jury?) boxes. In all, they were amazing buildings. The contents within made it that much better; these were some of the most impressive galleries I’ve ever had the pleasure of walking through. Art spanned centuries and styles of southern Asian talent, as well as art from colonials. I do admit enjoying the older, more realistic art rather than the modern and abstract, but there was something for everyone there. I’ll definitely go again the next time I’m in Singapore.

The National Gallery visit concluded my tourist adventures. That evening we met up with fellow FOSSASIA speakers at a hot pot restaurant not far from our hotel. It was my first time having hot pot; collecting raw meats, vegetables and fish from a buffet and dumping them in various boiling pots with seasonings was an experience I’m glad I had, but the weather got to me there too. Sitting over a boiling pot in the evening heat and humidity certainly took its toll on me. Later in the week I had the opposite culinary adventure when I ended up at Swensen’s, an ice cream chain that started in San Francisco. I’d never been to the one in San Francisco, but apparently they’ve been a big hit in south Asia. It was fascinating to be in a San Francisco-themed restaurant and order a Golden Gate Bridge sundae while sitting halfway around the world from my city by the bay. Maybe I should visit the one in San Francisco now.

More photos from my tour around Singapore here: https://www.flickr.com/photos/pleia2/albums/72157666098884052

Two days isn’t nearly enough in Singapore. Even though I don’t shop (and shopping is BIG there!) I only got a small taste of what the city had to offer.

Next stop was on to the conference at the Singapore Science Centre, which was quite the inspired venue selection for an open source conference, especially one that attracted a number of younger attendees, but that’s a story for another day.

by pleia2 at March 29, 2016 02:31 AM

March 27, 2016

Elizabeth Krumbach

Wine and dine in Napa Valley

In 2008 when I was visiting MJ in my first trip to San Francisco we had plans to go up to Napa Valley. Given the distance and crowds, the driver MJ hired for the day made an alternate suggestion: “How about Sonoma Valley instead?” That day was the beginning of us being Sonoma Valley fans. Tastings weren’t over-crowded, the wine was excellent, at the time traffic was tolerable even coming back to the city. We visited a winery with a wine cave, where we’d get engaged three years later. Last year we joined a wine club, sealing our fate to visit regularly.

We never did make it to Napa, until a couple weeks ago.

For MJ’s birthday last year I promised him a meal at the most coveted restaurant in California, The French Laundry. I worked with a concierge to complete the herculean effort to get reservations, and then rescheduled a couple of times to work around our shifting travel schedules. Finally they were firmed up for Sunday, March 13th. The timing worked out, with all our travel lately we hadn’t seen much of each other, so it was a nice excuse to get out of town and spend the weekend together. We drove up Friday night and checked into the Harvest Inn, catching a late dinner at the lovely restaurant there, Harvest Table.


Dinner at Harvest Table

Saturday morning we began our wine trail. We didn’t have a lot of time to plan this trip, so we depended upon the recommendations of my recent house guest, George Mulak (and remotely, his wife Vicki), who supplied us with a list of their favorites. Their recommendations were spot on. Our first stop was Heitz Cellar, which was conveniently almost across the street from where we were staying. They have a relatively small tasting area, and sadly when we arrived the skies had opened up to give us piles of rain, so there was no enjoying the grounds. They did have a couple things I really liked though. The first was a big surprise: I don’t typically care for Zinfandels, but we bought a bottle of theirs; it was very good. Two bottles of their port also came home with us. Next on our list was one of several Rutherford <Noun> wineries, and we ended up at the wrong one, in what was a lovely mistake. We found ourselves at Rutherford Hill, a famous winery known for their Merlots, and I love Merlot. They also had wine caves and did tours! On the rainy day that it ended up being, a wine cave tour was a fantastic shelter from the weather. Our bartender and tour guide was super friendly and inviting, and there’s a reason they are world-renowned: their wines are wonderful. We even joined their club.


Drinking wine in the Rutherford Hills wine caves

For lunch we went to Rutherford Grill, which we quickly noticed looked a lot like one of our Silicon Valley favorites, Los Altos Grill, and San Francisco haunt Hillstone. Turns out they’re all related. The familiarity was a welcome surprise, and an enjoyable lunch.

Wine adventures continued in the same parking lot as the grill when we made our way across to Beaulieu Vineyard (BV). I think planning ahead would have served us better here, we just did the basic tasting which was pretty run of the mill. A day with better weather and a planned historic wine tour would have been a better experience, maybe next time. From there we made our final stop of the day back near our hotel at Franciscan Estate Winery. We had a lovely time chatting with the Philadelphia-native pouring our wines and did a couple flights covering their range of types and qualities. A fine way to round out our afternoon. We picked up some snacks and water (time to hydrate!) at the lovely Dean & DeLuca shop (purveyors of fine food) and went back to the hotel to spend some time relaxing before dinner.


Final tasting of the day at Franciscan

In preparation for our exciting French Laundry reservation the following day, we booked late (9:45PM) dinner reservations at a related restaurant, Bouchon. Another French restaurant by Thomas Keller, the meal was delicious and the atmosphere was both fancy and casual, a lovely mix of how at home a really nice Napa Valley restaurant can make you feel. Highly recommended, and quite a bit easier to get reservations at than The French Laundry, though I still did need to plan a couple weeks ahead.


Appetizers at Bouchon

Sunday morning concluded our stay at the Harvest Inn. In spite of the rainy weekend, I did get to enjoy walking through their grounds a bit and appreciated the spacious room we had and the real wood fireplace. The location was great too, giving us a nice home base for the loop of wineries we visited. We’d stay here again. Check out was quick and then we were dressed up and on our way to the gem of our Napa adventure: Tasting menu lunch at The French Laundry!

In case I haven’t drilled this home enough, The French Laundry has been named the Best Restaurant in the World multiple times. Even when it’s not at the top, it appears on pretty much any top-10 list from the past decade. Going here was a really big, once-in-a-lifetime kind of deal.

The rainy weekend continued as we were seated downstairs and settled in with a glass of champagne to start our meal. A half bottle of red wine later joined us mid-meal. What struck me first about the meal there was the environment. French restaurants I’ve been to are either very modern or very stuffy, neither of which I’m a huge fan of. The French Laundry was a lovely mix of the two, much like Bouchon of the previous night, it seemed to reflect its home in Napa Valley. The restaurant was truly laundry themed in a very classy way, with a clothes pin as their logo and the lamps on the walls tastefully boasting clothes laundry symbols. The staff was professional, charming and witty. The food was spectacular, quickly making it into one of the top three meals I’ve ever had. The meal took about three hours, with small plates coming at a nice pace to keep us satisfied but also relaxed so we could enjoy the time there. I was definitely full at the end, especially after the stream of beautiful and delicious desserts that filled our table at the end. At the conclusion of the meal we were given a copy of the menu and gifted the wooden clothes pins that were at our table upon arrival. In all, it was an exceptional experience.


Meal at The French Laundry

With some time on our hands following our long lunch at The French Laundry, we decided to add one more winery to our itinerary before driving home: Hagafen Cellars. Their wines are Kosher, even for Passover, which makes them great for us during that no-bread time and has made them a star at the White House during major Jewish and Israeli-focused events. Best of all, their wines are wonderful. Not having grown up Jewish, I was not aware of the disappointment found with the standard Manischewitz wine until a couple of years ago, so it was refreshing to learn we have other options during Passover! We were pretty close to joining their wine club, but in the end preferred making our own selections, and with a trunk full of wine we figured we’d had enough for now.


Final stop, Hagafen Cellars

With that, our fairy tale weekend together in Napa Valley came to a close. MJ flew out to Seattle that night for work. My trip to Singapore had me leaving the next morning.

More photos from our weekend in Napa Valley here: https://www.flickr.com/photos/pleia2/albums/72157665313725990

by pleia2 at March 27, 2016 06:26 PM

March 26, 2016

Akkana Peck

Debian: Holding packages you build from source, and rebuilding them easily

Recently I wrote about building the Debian hexchat package to correct a key binding bug.

I built my own version of the hexchat packages, then installed the ones I needed:

dpkg -i hexchat_2.10.2-1_i386.deb hexchat-common_2.10.2-1_all.deb hexchat-python_2.10.2-1_i386.deb hexchat-perl_2.10.2-1_i386.deb

That's fine, but of course, a few days later Debian had an update to the hexchat package that wiped out my changes.

The solution to that is to hold the packages so they won't be overwritten on the next apt-get upgrade:

aptitude hold hexchat hexchat-common hexchat-perl hexchat-python

If you forget which packages you've held, you can find out with aptitude:

aptitude search '~ahold'

Simplifying the rebuilding process

But now I wanted an easier way to build the package. I didn't want to have to search for my old blog post and paste the lines one by one every time there was an update -- then I'd get lazy and never update the package, and I'd never get security fixes.

I solved that with a zsh function:

newhexchat() {
    # Can't set errreturn yet, because that will cause mv and rm
    # (even with -f) to exit if there's nothing to remove.
    cd ~/outsrc/hexchat
    echo "Removing what was in old previously"
    rm -rf old
    echo "Moving everything here to old/"
    mkdir old
    mv *.* old/

    # Make sure this exits on errors from here on!
    setopt localoptions errreturn

    echo "Getting source ..."
    apt-get source hexchat
    cd hexchat-2*
    echo "Patching ..."
    patch -p0 < ~/outsrc/hexchat-2.10.2.patch
    echo "Building ..."
    debuild -b -uc -us
    echo
    echo 'Installing' ../hexchat{,-python,-perl}_2*.deb
    sudo dpkg -i ../hexchat{,-python,-perl}_2*.deb
}

Now I can type newhexchat and pull a new version of the source, build it, and install the new packages.

How do you know if you need to rebuild?

One more thing. How can I find out when there's a new version of hexchat, so I know I need to build new source in case there's a security fix?

One way is the Debian Package Tracking System. You can subscribe to a package and get emails when a new version is released. There's supposed to be a package tracker web interface, e.g. package tracker: hexchat with a form you can fill out to subscribe to updates -- but for some packages, including hexchat, there's no form. Clicking on the link for the new package tracker goes to a similar page that also doesn't have a form.

So I guess the only option is to subscribe by email. Send mail to pts@qa.debian.org containing this line:

subscribe hexchat [your-email-address]

You'll get a reply asking for confirmation.

This may turn out to generate too much mail: I've only just subscribed, so I don't know yet. There are supposedly keywords you can use to limit the subscription, such as upload-binary and upload-source, but the instructions aren't at all clear on how to include them in your subscription mail -- you say keyword, or keyword your-email, so where do you put the actual keywords you want to accept? They offer no examples.

Use apt to check whether your version is current

If you can't get the email interface to work or suspect it'll be too much email, you can use apt to check whether the current version in the repository is higher than the one you're running:

apt-cache policy hexchat

You might want to automate that, to make it easy to check on every package you've held to see if there's a new version. Here's a little shell function to do that:

# Check on status of all held packages:
check_holds() {
    for pkg in $( aptitude search '~ahold' | awk '{print $2}' ); do
        # Quote "$policy" so its newlines survive; unquoted, bash would
        # flatten the output onto one line and the greps would misfire.
        policy=$(apt-cache policy "$pkg")
        installed=$(echo "$policy" | grep Installed: | awk '{print $2}')
        candidate=$(echo "$policy" | grep Candidate: | awk '{print $2}')
        if [[ "$installed" == "$candidate" ]]; then
            echo "$pkg : nothing new"
        else
            echo "$pkg : new version $candidate available"
        fi
    done
}
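A rough Python equivalent, for anyone who'd rather extend this later, might look like the sketch below. This isn't from the original post: the parsing helper is plain string handling (so it works anywhere), while the subprocess calls assume a Debian-style system with aptitude and apt-cache available.

```python
import subprocess

def parse_policy(policy_output):
    """Extract (installed, candidate) from `apt-cache policy` output."""
    installed = candidate = None
    for line in policy_output.splitlines():
        line = line.strip()
        if line.startswith('Installed:'):
            installed = line.split(':', 1)[1].strip()
        elif line.startswith('Candidate:'):
            candidate = line.split(':', 1)[1].strip()
    return installed, candidate

def check_holds():
    # Second whitespace field of each `aptitude search` line is the package.
    held = subprocess.check_output(['aptitude', 'search', '~ahold']).decode()
    for line in held.splitlines():
        fields = line.split()
        if len(fields) < 2:
            continue
        pkg = fields[1]
        policy = subprocess.check_output(['apt-cache', 'policy', pkg]).decode()
        installed, candidate = parse_policy(policy)
        if installed == candidate:
            print('%s : nothing new' % pkg)
        else:
            print('%s : new version %s available' % (pkg, candidate))

# On a Debian-style system you'd just call:
#   check_holds()
```

Keeping the parsing separate from the subprocess calls also makes it easy to unit test against captured apt-cache output.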

March 26, 2016 05:11 PM

Elizabeth Krumbach

Six years in San Francisco

February 2016 marked six years of me living here in San Francisco. It’s hard to believe that much time has passed, but at the same time I feel so at home in my latest adopted city. I sometimes find myself struggling to remember what it was like to live in the suburbs, drive every day and not be able to just walk to the dentist, or take in the beautiful sights along the Embarcadero as I go for a run. I’ve grown accustomed to the weather and seasons (or lack thereof), and barely think twice when making plans. Of course the weather will be beautiful!

I love you, California, I adore spending my time on The Dock of the Bay.

Our travel schedules this year have been a bit crazy though. I just returned from my second overseas conference of the year on Monday, and MJ has been spending almost half his time traveling for work. We’ve tried to plan things so that we’re not out of town at the same time, but haven’t always succeeded. Plus, being out of town at the same time is great for the cats and our need for a pet sitter, but it’s less great for getting time together. We ended up celebrating Valentine’s Day a day early, on February 13th, in order to work around these schedules and MJ’s plan to leave for a trip on Sunday.

It was a fabulous Valentine’s Day dinner though. We went to Jardinière over in Hayes Valley and both ordered the tasting menu, and I went with the wine pairing since I didn’t have a flight to catch the next day. Everything was exceptional, from the sea urchin to the beautifully prepared, marbled steak that melts in your mouth. I hope we can make it back at some point.

With MJ out of town I’ve had to fight the temptation to slip into workaholic mode. I definitely have a lot of work to do, especially as my for-real-this-time book deadline approaches. But I’ve grown appreciative of the need to take a break, and how it untangles the mind to be fresh again the next day and more effective at solving problems. On Presidents’ Day I treated myself to an afternoon at the zoo.

More photos from the zoo here: https://www.flickr.com/photos/pleia2/albums/72157662402671763

I’ve also gotten to make time to spend with friends here and there, recently making it out to the cinema with a friend to see the Oscar Nominated Animation Shorts. I grew to appreciate these shorts years ago after learning my beloved Wallace & Gromit films had been nominated and won in the past, but it had been some time since I’d gone to a theater to enjoy them.

While MJ has been in town, I’ve reflected on my six years here in the city and realized there were still things I’ve wanted to do in the area but haven’t had the opportunity to, so I’ve been slowly checking them off my list. Even small changes to accommodate new things have been worth it. One afternoon we took a slight detour from going to the Beach Chalet and instead went downstairs to the Park Chalet where we had never been before.


High Tide Hefeweizen at the Park Chalet

While on the topic of food, we also finally made it over to Zachary’s Chicago Pizza over in Oakland, near the Rockridge BART station. I’m definitely a New York pizza girl, but I hear so many good things about Zachary’s every time I moan about the state of California pizza. We went around 2:30 on a Saturday afternoon and were seated immediately. Eating there is a bit of an event: you order and then wait a half hour for your giant wall of deep dish pizza to cook. I had the Spinach & Mushroom. The toppings and cheese are buried inside the pizza, with the sauce covering the top. It was really good, even if I could barely finish two pieces (leftovers!).

After Zachary’s I had planned to take BART up to downtown Berkeley to hit up a comic book store, since the one I used to go to here in San Francisco has closed due to increasing rent. I was delighted to learn that there was a comic book store within walking distance of where we already were. That’s how I was introduced to The Escapist in Berkeley, just over the Oakland/Berkeley border. I picked up most of the backlog of comics I was looking for, and then hit up Dark Carnival next door, a great Sci-Fi and Fantasy book store that I’d been to in the past. I’ll be returning to both stores in the near future.

And now it’s time to take an aforementioned break. Saturday off, here I come!

by pleia2 at March 26, 2016 02:32 AM

March 25, 2016

Jono Bacon

Suggestions for Donating a Speaker fee

In August I am speaking at Abstractions and the conference organizers very kindly offered to provide a speaker fee.

Thing is, I have a job and so I don’t need the fee as much as some other folks in the world. As such, I would like to donate the speaker fee to an open source / free software / social good organization and would love suggestions in the comments.

I probably won’t donate to the Free Software Foundation, the EFF, or the Software Freedom Conservancy, as I have already financially contributed to them this year.

Let me know your suggestions in the comments!

by Jono Bacon at March 25, 2016 04:30 PM

March 17, 2016

Akkana Peck

Changing X brightness and gamma with xrandr

I switched a few weeks ago from unstable ("Sid") to testing ("Stretch") in the hope that my system, particularly X, would break less often. The very next day, I updated and discovered I couldn't use my system at night any more, because the program I use to reduce the screen brightness by tweaking X gamma no longer worked. Neither did other related programs, such as xgamma and xcalib.

The Dell monitor I use doesn't have reasonable hardware brightness controls: strangely, the brightness button works when the monitor is connected over VGA, but if I want to use the sharper HDMI connection, brightness adjustment no longer works. So I depend on software brightness adjustment in order to use my computer at night when the room is dim.

Fortunately, it turns out there's a workaround. xrandr has options for both brightness and gamma:

xrandr --output HDMI1 --brightness .5
xrandr --output HDMI1 --gamma .5:.5:.5

I've always put xbrightness on a key, so I can use a function key to adjust brightness interactively up and down according to conditions. So a command that sets brightness to .5 or .8 isn't what I need; I need to get the current brightness and set it a little brighter or a little dimmer. xrandr doesn't offer that, so I needed to script it.

You can get the current brightness with

xrandr --verbose | grep -i brightness

But I was hoping there would be a more straightforward way to get brightness from a program. I looked into Python bindings for xrandr; there are some, but with no documentation and no examples. After an hour of fiddling around, I concluded that I could waste the rest of the day poring through the source code and trying things hoping something would work; or I could spend fifteen minutes using subprocess.call() to wrap the command-line xrandr.

So subprocesses it was. It made for a nice short script, much simpler than the old xbrightness C program that used <X11/extensions/xf86vmode.h> and XF86VidModeGetGammaRampSize(): xbright on github.
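The real script is the linked xbright, but the shape of the approach can be sketched roughly like this (a sketch, not her actual code; the HDMI1 output name and the 0.1 step are illustrative assumptions):

```python
import subprocess

def parse_brightness(verbose_output):
    """Return the first Brightness: value found in `xrandr --verbose` output."""
    for line in verbose_output.splitlines():
        line = line.strip()
        if line.startswith('Brightness:'):
            return float(line.split(':', 1)[1])
    return None

def set_brightness(output, value):
    # Clamp to a sane range so a typo can't black out the screen entirely.
    value = max(0.1, min(1.0, value))
    subprocess.call(['xrandr', '--output', output,
                     '--brightness', str(value)])

def step_brightness(output, delta):
    verbose = subprocess.check_output(['xrandr', '--verbose']).decode()
    current = parse_brightness(verbose)
    if current is not None:
        set_brightness(output, current + delta)

# Bound to function keys, you might call for example:
#   step_brightness('HDMI1', -0.1)   # dimmer
#   step_brightness('HDMI1', 0.1)    # brighter
```

The subprocess wrapping is exactly the fifteen-minute trade-off described above: parse the text output rather than fight undocumented bindings.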

March 17, 2016 05:01 PM

March 09, 2016

Akkana Peck

Juniper allergy season

It's spring, and that means it's the windy season in New Mexico -- and juniper allergy season.

When we were house-hunting here, talking to our realtor about things like local weather, she mentioned that spring tended to be windy and a lot of people got allergic. I shrugged it off -- oh, sure, people get allergic in spring in California too. Little did I know.

A month or two after we moved, I experienced the worst allergies of my life. (Just to be clear, by allergies I mean hay fever, sneezing, itchy eyes ... not anaphylaxis or anything life threatening, just misery and a morbid fear of ever opening a window no matter how nice the temperature outside might be.)

[Female (left) and male junipers in spring]
I was out checking the mail one morning, sneezing nonstop, when a couple of locals passed by on their morning walk. I introduced myself and we chatted a bit. They noticed my sneezing. "It's the junipers," they explained. "See how a lot of them are orange now? Those are the males, and that's the pollen."

I had read that juniper plants were either male or female, unlike most plants which have both male and female parts on every plant. I had never thought of junipers as something that could cause allergies -- they're a common ornamental plant in California, and also commonly encountered on trails throughout the southwest -- nor had I noticed the recent color change of half the junipers in our neighborhood.

But once it's pointed out, the color difference is striking. These two trees, growing right next to each other, are the same color most of the year, and it's hard to tell which is male and which is female. But in spring, suddenly one turns orange while the other remains its usual bright green. (The other season when it's easy to tell the difference is late fall, when the female will be covered with berries.)

Close up, the difference is even more striking. The male is dense with tiny orange pollen-laden cones.

[Female juniper closeup] [male juniper closeup showing pollen cones]

A few weeks after learning the source of my allergies, I happened to be looking out the window on a typically windy spring day when I saw an alarming sight -- it looked like the yard was on fire! There were dense clouds of smoke billowing up out of the trees. I grabbed binoculars and discovered that what looked like fire smoke was actually clouds of pollen blowing from a few junipers. Since then I've gotten used to seeing juniper "smoke" blowing through the canyons on windy spring days. Touching a juniper that's ready to go will produce similar clouds.

The good news is that there are treatments for juniper allergies. Flonase helps a lot, and a lot of people have told me that allergy shots are effective. My first spring here was a bit miserable, but I'm doing much better now, and can appreciate the fascinating biology of junipers and the amazing spectacle of the smoking junipers (not to mention the nice spring temperatures) without having to hide inside with the windows shut.

March 09, 2016 03:20 AM

March 08, 2016

kdub

Mir and Vulkan Demo

This week the Mir team got a Vulkan demo working on Mir! (youtube link to demo)

I’ve been working on replumbing Mir’s internals a bit to give more fine-grained control over buffers, and my tech lead Cemil has been working on hooking that API into the Vulkan/Mir WSI.

The tl;dr on Vulkan is that it’s a recently finalized hardware-accelerated graphics API from Khronos (who also provide the OpenGL APIs). It doesn’t supplant OpenGL, but can give better performance (especially in multithreaded environments) and better debugging in exchange for more explicit control of the GPU.

Some links:
Khronos Vulkan page

Wikipedia Vulkan entry

short video from Intel at SIGGRAPH with a quick explanation

longer video from NVIDIA at SIGGRAPH on Vulkan


If you’re wondering when this will appear in a repository near you, probably right after the Ubuntu Y series opens up (we’re in a feature freeze for xenial/16.04 LTS at the moment).

by Kevin at March 08, 2016 07:31 PM

March 06, 2016

Elizabeth Krumbach

Xubuntu 16.04 ISO testing tips

As we get closer to the 16.04 LTS release, it’s becoming increasingly important for people to be testing the daily ISOs to catch any problems. This past week, we had the landing of GNOME Software to replace the Ubuntu Software Center and this will definitely need folks looking at it and reporting bugs (current ones tracked here: https://bugs.launchpad.net/ubuntu/+source/gnome-software)

In light of this, I thought I’d quickly share a few of my own tips and stumbling points. My focus is typically on Xubuntu testing, but things I talk about are applicable to Ubuntu too.


ISO testing on a rainy day

1. Downloading the ISO

Downloading an ISO every day, or even once a week can be tedious. Helpfully, the team provides the images via zsync which will only download the differences in the ISO between days, saving you a lot of time and bandwidth. Always use this option when you’re downloading ISOs, you can even use it the first time you download one, as it will notice that none exists.

The zsync URL is right alongside all the others when you choose “Link to the download information” in the ISO tracker:

You then use a terminal to cd into the directory where you want the ISO to be (or where it already is) and copy the zsync line into the terminal and hit enter. It will begin by examining the current ISO and then give you a progress bar for what it needs to download.

2. Putting the image on a USB stick

I have struggled with this for several releases. At first I was using UNetbootin (unetbootin), then usb-creator (usb-creator-gtk). Then I’d switch between the two each release when one or the other wasn’t behaving properly. What a mess! How can we expect people to test if they can’t even get the ISO on a USB stick with simple instructions?

The other day flocculant, the Xubuntu QA Lead, clued me into using GNOME Disks to put an ISO on a USB stick for testing. You pop in the USB stick, launch gnome-disks (you’ll need to install the gnome-disk-utility package in Xubuntu), select your USB stick in the list on the left and choose the “Restore Disk Image…” option in the top right to select the ISO image you want to use:

I thought about doing a quick screencast of it, but Paul W. Frields over at Fedora Magazine beat me to it by more than a year: How to make a Live USB stick using GNOME Disks

This has worked beautifully with both the Xubuntu and Ubuntu ISOs.
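If GNOME Disks isn't handy, plain dd can do the same restore from a terminal. The snippet below is only a sketch that writes to dummy files; with a real stick the input is your downloaded ISO and the output is the USB stick's whole device node (something like /dev/sdX, found with lsblk), not a partition. Triple-check the device name, since dd will happily overwrite the wrong disk.

```shell
# Stand-in files for the sketch; substitute your real ISO path and the
# USB device node when doing this for real.
dd if=/dev/zero of=/tmp/example.iso bs=1M count=4 status=none
dd if=/tmp/example.iso of=/tmp/usb-stand-in.img bs=4M conv=fsync status=none
# Flush outstanding writes before pulling the stick.
sync
```

With a real device the command has the same shape, run as root, e.g. dd if=your-daily.iso of=/dev/sdX bs=4M conv=fsync (names illustrative), followed by sync.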

3. Reporting bugs

The ISO tracker, where you report testing results, is easy enough to log into, but a fair number of people quit the testing process when it gets to actually reporting bugs. How do I report bugs? What package do I report them against? What if I do it wrong?

I’ve been doing ISO testing for several years, and have even run multiple events with a focus on ISO testing, and STILL struggle with this.

How did I get over it?

First, I know it’s a really long page, but this will get you familiar with the basics of reporting a bug using the ubuntu-bug tool: Ubuntu ReportingBugs

Often, being familiar with the basic tooling isn’t enough. It’s pretty common to run into a bug that manifests in the desktop environment rather than in a specific application. A wallpaper is gone, a theme looks wrong, you’re struggling to log in. Where do those get submitted? And is this bad enough for me to classify it as “Critical” in the ISO Tracker? This is when I ask. For Xubuntu I ask in #xubuntu-devel and for Ubuntu I ask in #ubuntu-quality. Note: people don’t hover over their keyboards on IRC; explain what you’re doing, ask your question, and be patient.

This isn’t just for bugs; we want to see more people testing, and it’s great when new testers come into our IRC channels to share their experiences and where they’re getting stuck. You’re part of our community :)


Simcoe thinks USB sticks are cat toys

Resources

I hope you’ll join us.

by pleia2 at March 06, 2016 05:54 PM

March 04, 2016

Akkana Peck

Recipe: Easy beef (or whatever) jerky

You don't need a special smoker or dehydrator to make great beef jerky.

Winter is the time to make beef jerky -- hopefully enough to last all summer, because in summer we try to avoid using the oven, cooking everything outside so as not to heat up the house. In winter, having the oven on for five hours is a good thing.

It took some tuning to get the flavor and the amount of saltiness right, but I'm happy with my recipe now.

Beef jerky

Ingredients

  • thinly sliced beef or pork: about a pound or two
  • 1-1/2 cups water
  • 1/4 cup soy sauce
  • 3/4 tbsp salt
  • Any additional seasonings you desire: pepper, chile powder, sage, ginger, sugar, etc.

Directions

Heat water slightly (30-40 sec in microwave) to help dissolve salt. Mix all ingredients except beef.

Cut meat into small pieces, trimming fat as much as possible.

Marinate in warm salt solution for 15 min, stirring occasionally. (For pork, you might want a shorter marinating time. I haven't tried other meats.)

Set the oven on its lowest temperature (170F here).

Lay out beef on a rack, with pieces not touching or overlapping.
Nobody seems to sell actual cooking racks, but you can buy "cooling racks" for cooling cookies, which seem to work fine for jerky. They're small so you probably need two racks for a pound of beef.

Ideally, put the rack on one oven shelf with a layer of foil on the rack below to catch the drips.
You want as much air space as possible under the meat. You can put the rack on a cookie sheet, but it'll take longer to cook and you'll have to turn the meat halfway through. Don't lay the beef directly on cookie sheet or foil unless you absolutely can't find a rack.

Cook until sufficiently dry and getting hard, about 4 to 4-1/2 hours at 170F depending on how dry you like your jerky. Drier jerky will keep longer unrefrigerated, but it's not as tasty. I cook mine a little less and store it in the fridge when I'm not actually carrying it hiking or traveling.

If you're using a cookie sheet, turn the pieces once at around 2-3 hours when the tops start to look dry and dark.

Tip: if you're using a rack without a cookie sheet, a fork wedged between the bars of the rack makes it easy to remove a rack from the oven.

March 04, 2016 07:24 PM

March 01, 2016

Elizabeth Krumbach

OpenStack infra-cloud sprint

Last week at the HPE offices in Fort Collins, members of the OpenStack Infrastructure team met from Monday through Thursday, focused on getting an infra-cloud into production.

The infra-cloud is an important project for our team, so important that it has a Mission!

The infra-cloud’s mission is to turn donated raw hardware resources into expanded capacity for the OpenStack infrastructure nodepool.

This means that in addition to the companies who Contribute Cloud Test Resources in the form of OpenStack instances, we’ll be running our own OpenStack-driven cloud that will provide additional instances to the pool of servers we run tests on. We’re using the OpenStack Puppet Modules (since the rest of our infra uses Puppet) and bifrost, a series of Ansible playbooks that use Ironic to automate the task of deploying a base image onto a set of known hardware.

Our target for infra-cloud was a few racks of HPE hardware provided to the team by HPE that resides in a couple HPE data centers. When the idea for a sprint came together, we thought it might be nice to have the sprint itself hosted at an HPE site where we could meet some of the folks who handle servers. That’s how we ended up in Fort Collins, at an HPE office that had hosted several mid-cycle and sprint events for OpenStack in the past.

Our event kicked off with an overview by Colleen Murphy of work that’s been done to date. The infra-cloud team that Colleen is part of has been architecting and deploying the infra-cloud over the past several months with an eye toward formalizing the process and landing it in our git repositories. Part of the aim of this sprint was to get everyone on the broader OpenStack Infrastructure team up to speed with how everything works so that the infra cores could intelligently review and provide feedback on the patches being deployed. Colleen’s slides (available here) also gave us an overview of the baremetal workflow with bifrost, the characteristics of the controller and compute nodes, networking (and differences found between the US East and US West regions) and her strategy for deploying locally for a development environment (GitHub repo here). She also spent time getting us up to speed with the HPE iLO management interfaces that we’ll have to use if we’re having trouble with provisioning.

This introduction took up our morning. After lunch it was time to talk about our plan for the rest of our time together. We discussed the version of OpenStack we wanted to focus on and broadly how and if we planned to do upgrades, along with the goals of this project. Also of great importance was that we build something that could be redeployed if we changed anything; we don’t want this infrastructure to bit rot and cause a major hassle if we need to rebuild the cloud for some reason. We then went through the architecture section of the infra-cloud documentation to confirm that the assumptions there continued to be accurate, and made notes accordingly on our etherpad when they were not.

Our discussion then shifted into broad goals for our week, out came the whiteboard! It was decided that we’d focus on getting all the patches landed to support US West so that by the end of the sprint we’d have at least one working cloud. It was during this discussion that we learned how valuable hosting our sprint at an HPE facility was. An attendee at our sprint, Phil Jensen, works in the Fort Collins data center and updated us on the plans for moving systems out of US West. The timeline that he was aware of was considerably closer than we’d been planning on. A call was scheduled for Thursday to sort out those details, and we’re thankful we did since it turns out we had to effectively be ready to shut down the systems by the end of our sprint.

Goal-setting continued for various sub-tasks, which coalesced into the main goal of the sprint: get a region added to Nodepool so I could run a test on it.

Tuesday morning we began tackling our tasks, and at 11:30 Phil came by to give us a tour of the local data center there in Fort Collins. Now, if we’re honest, there was no technical reason for this tour. All the systems engineers on our team have been in data centers before, most of us have even worked in them. But there’s a reason we got into this: we like computers. Even if we mostly interact with clouds these days, a tour through a data center is always a lot of fun for us. Plus it got us out of the conference room for a half hour, so it was a nice pause in our day. Huge thanks to Phil for showing us around.

The data center also had one of the server types we’re using in infra-cloud, the HP SL390. While we didn’t get to see the exact servers we’re using, it was fun to get to see the size and form factor of the servers in person.


Spencer Krum checks out a rack of HP SL390s

Tuesday was spent heads down, landing patches. People moved around the room as we huddled in groups, and there was some collaborative debugging on the projector as we learned more about the deployment, learned a whole lot more about OpenStack itself and worked through some unfortunate issues with Puppet and Ansible.


Not so much glamour, sprints are mostly spent working on our laptops

Wednesday was the big day for us. The morning was spent landing more patches and in the afternoon we added our cloud to the list of clouds in Nodepool. We then eagerly hovered over the Jenkins dashboard and waited for a job to need a trusty node to run a test…

Slave ubuntu-trusty-infracloud-west-8281553 Building swift-coverage-bindep #2

The test ran! And completed successfully! Colleen grabbed a couple screenshots.


We watch on Clark Boylan’s laptop as the test runs

Alas, it was not all roses. Our cloud struggled to obey the deletion command and the test itself ran considerably slower than we would have expected. We spent some quality time looking at disk configurations and settings together to see if we could track down the issue and do some tuning. We still have more work to do here to get everything running well on this hardware once it has moved to the new facility.

Thursday we spent some time getting US East patches to land before the data center moves. We also had a call mid-day to firm up the timing of the move. Our timing for the sprint ended up working out well for the move schedule, we were able to complete a considerable amount of work at the sprint before the machines had to be shut down. The call was also valuable in getting to chat with some of the key parties involved and learn what we needed to hand off to them with regard to our requirements for the new home the servers will have, in an HPE POD (cool!) in Houston. This allowed us to kick off a Network Requirements for Infracloud Relocation Deployment thread and Cody A.W. Somerville captured notes from the rest of the conversation here.

The day concluded with a chat about how the sprint went. The feedback was pretty positive, we all got a lot of work done, Spencer summarized our feedback on list here.

Personally, I liked that the HPE campus in Fort Collins has wild rabbits. Also, it snowed a little and I like snow.

I could have done without the geese.

It was also enjoyable to visit downtown Fort Collins in the evenings and meet up with some of the OpenStack locals. Plus, at Coopersmith’s I got a beer with a hop pillow on top. I love hops.

More photos from the week here: https://www.flickr.com/photos/pleia2/sets/72157662730010623/

David F. Flanders also Tweeted some photos: https://twitter.com/dfflanders/status/702603441508487169

by pleia2 at March 01, 2016 02:01 AM

February 27, 2016

Akkana Peck

Learning to Weld

I'm learning to weld metal junk into art!

I've wanted to learn to weld since I was a teen-ager at an LAAS star party, lusting after somebody's beautiful homebuilt 10" telescope on a compact metal fork mount. But building something like that was utterly out of reach for a high school kid. (This was before John Dobson showed the world how to build excellent alt-azimuth mounts out of wood and cheap materials ... or at least before Dobsonians made it to my corner of LA.)

Later the welding bug cropped up again as I worked on modified suspension designs for my X1/9 autocross car, or fiddled with bicycles, or built telescopes. But it still seemed out of reach, too expensive and I had no idea how to get started, so I always found some other way of doing what I needed.

But recently I had the good fortune to hook up with Los Alamos's two excellent metal sculptors, David Trujillo and Richard Swenson. Mr. Trujillo was kind enough to offer to mentor me and let me use his equipment to learn to make sculptures like his. (Richard has also given me some pointers.)

[My first metal art piece] MIG welding is both easier and harder than I expected. David Trujillo showed me the basics and got me going welding a little face out of a gear and chain on my very first day. What a fun start!

In a lot of ways, MIG welding is actually easier than soldering. For one thing, you don't need three or four hands to hold everything together while also holding the iron and the solder. On the other hand, the craft of getting a good weld is something that's going to require a lot more practice.

Setting up a home workshop

I knew I wanted my own welder, so I could work at home on my own schedule without needing to pester my long-suffering mentors. I bought a MIG welder and a bottle of gas (and, of course, safety equipment like a helmet, leather apron and gloves), plus a small welding table. But then I found that was only the beginning.

[Metal art: Spoon cobra] Before you can weld a piece of steel you have to clean it. Rust, dirt, paint, oil and anti-rust coatings all get in the way of making a good weld. David and Richard use a sandblasting cabinet, but that requires a big air compressor, making it as big an investment as the welder itself.

At first I thought I could make do with a wire brush wheel on a drill. But it turned out to be remarkably difficult to hold the drill firmly enough while brushing a piece of steel -- that works for small areas but not for cleaning a large piece or for removing a thick coating of rust or paint.

A bench grinder worked much better, with a wire brush wheel on one side for easy cleaning jobs and a regular grinding stone on the other side for grinding off thick coats of paint or rust. The first bench grinder I bought at Harbor Freight had a crazy amount of vibration that made it unusable, and their wire brush wheel didn't center properly and added to the wobble problem. I returned both, and bought a Ryobi from Home Depot and a better wire brush wheel from the local Metzger's Hardware. The Ryobi has a lot of vibration too, but not so much that I can't use it, and it does a great job of getting rust and paint off.

[Metal art: grease-gun goony bird] Then I had to find a place to put the equipment. I tried a couple of different spots before finally settling on the garage. Pro tip: welding on a south-facing patio doesn't work: sunlight glints off the metal and makes the auto-darkening helmet flash frenetically, and any breeze from the south disrupts everything. And it's hard to get motivated to go outside and weld when it's snowing. The garage is working well, though it's a little cramped and I have to move the Miata out whenever I want to weld if I don't want to risk my baby's nice paint job to welding fumes. I can live with that for now.

All told, it was over a month after I bought the welder before I could make any progress on welding. But I'm having fun now. Finding good junk to use as raw materials is turning out to be challenging, but with the junk I've collected so far I've made some pieces I'm pretty happy with, I'm learning, and my welds are getting better all the time.

Earlier this week I made a goony bird out of a grease gun. Yesterday I picked up some chairs, a lawnmower and an old exercise bike from a friend, and just came in from disassembling them. I think I see some roadrunner, cow, and triceratops parts in there.

Photos of everything I've made so far: Metal art.

February 27, 2016 09:02 PM

February 25, 2016

Akkana Peck

Migrating from xchat: a couple of hexchat fixes

I decided recently to clean up my Debian "Sid" system, using apt-get autoclean, apt-get purge `deborphan`, aptitude purge ~c (packages removed but not purged, with config files left behind), and aptitude purge ~o (obsolete packages no longer available from any archive). It gained me almost two gigabytes of space. On the other hand, it deleted several packages I had long depended on. One of them was xchat.

I installed hexchat, the fully open replacement for xchat. Mostly, it's the same program ... but a few things didn't work right.

Script fixes

The two xchat scripts I use weren't loading. Turns out hexchat wants to find its scripts in .config/hexchat/addons, so I moved them there. But xchat-inputcount.pl still didn't work; it was looking for a widget called "xchat-inputbox". That was fairly easy to patch: I added a line to print the name of each widget it saw, determined the name had changed in the obvious way, and changed

    if( $child->get( "name" ) eq 'xchat-inputbox' ) {
to
    if( $child->get( "name" ) eq 'xchat-inputbox' ||
        $child->get( "name" ) eq 'hexchat-inputbox' ) {
That solved the problem.

Notifying me if someone calls me

The next problem: when someone mentioned my nick in a channel, the channel tab highlighted; but when I switched to the channel, there was no highlight on the actual line of conversation so I could find out who was talking to me. (It was turning the nick of the person addressing me to a specific color, but since every nick is a different color anyway, that doesn't make the line stand out when you're scanning for it.)

The highlighting for message lines is set in a dialog you can configure: Settings→Text events...
Scroll down to Channel Msg Hilight and click on that elaborate code on the right: %C2<%C8%B$1%B%C2>%O$t$2%O
That's the code that controls how the line will be displayed.

Some of these codes are described in Hexchat: Appearance/Theming, and most of the rest are described in the dialog itself. $t is an exception: I'm not sure what it means (maybe I just missed it in the list).

I wanted hexchat to show the nick of whoever called my name in inverse video. (Xchat always made it bold, but sometimes that's subtle; inverse video would be a lot easier to find when scrolling through a busy channel.) %R is reverse video, %B is bold, and %O removes any decorations and sets the text back to normal text, so I set the code to: %R%B<$1>%O $t$2 That seemed to work, though after I exited hexchat and started it up the next morning it had magically changed to %R%B<$1>%O$t$2%O.

Hacking hexchat source to remove hardwired keybindings

But the big problem was the hardwired keybindings. In particular, Ctrl-F -- the longstanding key sequence that moves forward one character -- in hexchat brings up a search window instead. (Xchat had this problem for a little while, many years ago, but they fixed it, or at least made it sensitive to whether the GTK key theme is "Emacs".)

Ctrl-F doesn't appear in the list under Settings→Keyboard shortcuts, so I couldn't fix it that way. I guess they should rename that dialog to Some keyboard shortcuts. Turns out Ctrl-F is compiled in. So the only solution is to rebuild from source.

I decided to use the Debian package source:

apt-get source hexchat

The search for the Ctrl-F binding turned out to be harder than it had been back in the xchat days. I was confident the binding would be in one of the files in src/fe-gtk, but grepping for key, find and search all gave way too many hits. Combining them was the key:

egrep -i 'find|search' *.c | grep -i key

That gave a bunch of spurious hits in fkeys.c -- I had already examined that file and determined that it had to do with the Settings→Keyboard shortcuts dialog, not the compiled-in key bindings. But it also gave some lines from menu.c including the one I needed:

    {N_("Search Text..."), menu_search, GTK_STOCK_FIND, M_MENUSTOCK, 0, 0, 1, GDK_KEY_f},

Inspection of nearby lines showed that the last GDK_KEY_ argument is optional -- there were quite a few lines that didn't have a key binding specified. So all I needed to do was remove that GDK_KEY_f. Here's my patch:

--- src/fe-gtk/menu.c.orig      2016-02-23 12:13:55.910549105 -0700
+++ src/fe-gtk/menu.c   2016-02-23 12:07:21.670540110 -0700
@@ -1829,7 +1829,7 @@
 	{N_("Save Text..."), menu_savebuffer, GTK_STOCK_SAVE, M_MENUSTOCK, 0, 0, 1},
 #define SEARCH_OFFSET (70)
 	{N_("Search"), 0, GTK_STOCK_JUSTIFY_LEFT, M_MENUSUB, 0, 0, 1},
-		{N_("Search Text..."), menu_search, GTK_STOCK_FIND, M_MENUSTOCK, 0, 0, 1, GDK_KEY_f},
+		{N_("Search Text..."), menu_search, GTK_STOCK_FIND, M_MENUSTOCK, 0, 0, 1},
 		{N_("Search Next"   ), menu_search_next, GTK_STOCK_FIND, M_MENUSTOCK, 0, 0, 1, GDK_KEY_g},
 		{N_("Search Previous"   ), menu_search_prev, GTK_STOCK_FIND, M_MENUSTOCK, 0, 0, 1, GDK_KEY_G},
 		{0, 0, 0, M_END, 0, 0, 0},

After making that change, I rebuilt the hexchat package and installed it:

sudo apt-get build-dep hexchat
sudo apt-get install devscripts
cd hexchat-2.10.2/
debuild -b -uc -us
sudo dpkg -i ../hexchat_2.10.2-1_i386.deb

Update: I later wrote about how to automate this here: Debian: Holding packages you build from source, and rebuilding them easily.

And the hardwired Ctrl-F key binding was gone, and the normal forward-character binding from my GTK key theme took over.

I still have a couple of minor things I'd like to fix, like the too-large font hexchat uses for its channel tabs, but those are minor. At least I'm back to where I was before foolishly deciding to clean up my system.

February 25, 2016 02:00 AM

February 19, 2016

Akkana Peck

GIMP ditty: change font size and face on every text layer

A silly little GIMP ditty:
I had a Google map page showing locations of lots of metal recycling places in Albuquerque. The Google map shows stars for each location, but to find out the name and location of each address, you have to mouse over each star. I wanted a printable version to carry in the car with me.

I made a screenshot in GIMP, then added text for the stars over the places that looked most promising. But I was doing this quickly, and as I added text for more locations, I realized that it was getting crowded and I wished I'd used a smaller font. How do you change the font size for ALL font layers in an image, all at once?

Of course GIMP has no built-in method for this -- it's not something that comes up very often, and there's no reason it would have a filter like that. But the GIMP PDB (Procedural DataBase, part of the GIMP API) lets you change font size and face, so it's an easy script to write.

In the past I would have written something like this in script-fu, but now that Python is available on all GIMP platforms, there's no reason not to use it for everything.

Changing font face is just as easy as changing size, so I added that as well.

I won't bother to break it down line by line, since it's so simple. Here's the script: changefont.py: Mass change font face and size in all GIMP text layers.
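The linked changefont.py is the real thing; as a rough sketch of the kind of loop it describes (not Akkana's actual code), the relevant GIMP PDB procedures are gimp_item_is_text_layer, gimp_text_layer_set_font and gimp_text_layer_set_font_size. Inside GIMP's Python-Fu, pdb is a global provided by `from gimpfu import *`; it's taken as a parameter here only so the sketch stays self-contained:

```python
# A hypothetical sketch, not the linked changefont.py itself.
# In GIMP's Python-Fu, `pdb` is a global (from gimpfu import *);
# passing it in keeps this loop testable as plain Python.

def change_text_layers(pdb, image, fontname="Sans", size=24):
    """Set font face and size on every top-level text layer of `image`."""
    changed = 0
    for layer in image.layers:
        if pdb.gimp_item_is_text_layer(layer):
            pdb.gimp_text_layer_set_font(layer, fontname)
            # 0 is GIMP's UNIT_PIXEL, so `size` is in pixels
            pdb.gimp_text_layer_set_font_size(layer, size, 0)
            changed += 1
    return changed
```

From the Python-Fu console you'd call it as something like change_text_layers(pdb, gimp.image_list()[0], "Sans", 18); text layers nested inside layer groups would need a recursive walk, which the sketch skips.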

February 19, 2016 06:11 PM

February 18, 2016

Jono Bacon

Supporting Beep Beep Yarr!

Some of you may be familiar with LinuxVoice magazine. They put an enormous amount of effort in creating a high quality, feature-packed magazine with a small team. They are led by Graham Morrison who I have known for many years and who is one of the most thoughtful, passionate, and decent human beings I have ever met.

Well, the same team are starting an important new project called Beep Beep Yarr!. It is essentially a Kickstarter crowd-funded children’s book that is designed to teach core principles of programming to kids. The project not just involves the creation of the book, but also a parent’s guide and an interactive app to help kids engage with the principles in the book.

They are not looking to raise a tremendous amount of money ($28,684 is the goal converted to mad dollar) and they have already raised $15,952 at the time of writing. I just went and added my support – I can’t wait to read this to our 3 year-old, Jack.

I think this campaign is important for a few reasons. Firstly, I am convinced that learning to program and all the associated pieces (logic flow, breaking problems down into smaller pieces, maths, collaboration) is going to be a critical skill in the future. Programming is not just important for teaching people how to control computers but it also helps people to fundamentally understand and synthesize logic which has knock-on benefits in other types of thinking and problem-solving too.

Beep Beep Yarr! is setting out to provide an important first introduction to these principles for kids. It could conceivably play an essential role in jumpstarting this journey for lots of kids, our own included.

So, go and support the campaign, not just because it is a valuable project, but also because the team behind it are good people who do great work.

by Jono Bacon at February 18, 2016 06:05 PM

February 15, 2016

Elizabeth Krumbach

Simcoe’s January 2016 Checkups

First up, as I first wrote about back in August, since July Simcoe has been struggling with some sores and scabbing around her eyes and inside her ear. This typically goes away after a few weeks, but it keeps coming back. Over the winter holidays she started developing more scabbing; this time, in addition to her eyes and ears, it was showing up near her tail and back legs. She was also grooming excessively. What could be going on?

We went through some rounds of antibiotics and then some Neo-Poly-Dex Ophthalmic for treatment of bacterial infections around her eyes throughout the fall. Unfortunately this didn’t help much, so at the beginning of January we scheduled an appointment with a dermatologist at SFVS, where most of her care has been transferred for more specialized treatment of her renal failure as it progresses. The dermatologist determined that she’s actually suffering from allergies, which are causing the breakouts. She’s now on a daily anti-allergy pill, Atopica. The outbreaks haven’t returned, but now she seems to be suffering from increasing constipation, which we’re currently trying to treat by supplementing her diet with pumpkin mixed with a renal diet wet food she likes. It’s pretty clear that it’s causing her distress every time it happens. It’s unclear whether the two are related, but I have a call with the dermatologist and possibly the vet this week to find out.

As for her renal failure, we had an appointment on January 16th with the specialist to look at her levels and see how she’s doing. Due to the constipation we’re reluctant to put her on appetite stimulants just yet, but she is continuing to lose weight, which is a real concern. Since November she was down from 8.9 to 8.8 pounds.

Simcoe weight

Her BUN and CRE levels also are on the increase, so we’re keeping a close eye on her.

Simcoe weight
Simcoe weight

Her next formal appointment is scheduled for April, so we’ll see how things go over the next month and a half. Behavior-wise she’s still the active and happy kitty we’re accustomed to, aside from the constipation.

Simcoe on Laundry
Simcoe on Suitcase

Still getting into my freshly folded laundry and claiming my suitcases every time I dare bring them out for a trip away from her!

by pleia2 at February 15, 2016 03:58 AM

February 12, 2016

Elizabeth Krumbach

Highlights from LCA 2016 in Geelong

Last week I had the pleasure of attending my second linux.conf.au. This year it took place in Geelong, a port city about an hour train ride southwest of Melbourne. After my Melbourne-area adventures earlier in the week, I made my way to Geelong via train on Sunday afternoon. That evening I met up with a whole bunch of my HPE colleagues for dinner at a restaurant next to my hotel.

Monday morning the conference began! Every day the 1km walk from my hotel to the conference venue at Deakin University’s Waterfront Campus and back was a pleasure, as it took me along the shoreline. I passed a beach, a marina and even a Ferris wheel and a carousel.

I didn’t make time to enjoy the beach (complete with part of Geelong’s interesting post-people art installation), but I know many conference attendees did.

With that backdrop, it was time to dive into some Linux! I spent much of Monday in the Open Cloud Symposium miniconf run by my OpenStack Infra colleague over at Rackspace, Joshua Hesketh. I really enjoyed the pair of talks by Casey West, The Twelve-Factor Container (video) and Cloud Anti-Patterns (video). In both talks he gave engaging overviews of best practices and common gotchas with each technology. With containers it’s a temptation during the initial adoption phase to treat them like “tiny VMs” rather than as compute-centric, storage-free containers for horizontally-scalable applications. He also stressed the importance of a consolidated code base for development and production, of keeping any persistent storage out of containers, and more generally of Repeatability, Reliability and Resiliency. The second talk focused on how to bring applications into a cloud-native environment, using the five stages of grief repurposed for cloud-native adoption. It walked from a legacy application crammed into a container to the eventual modernization of that software into a series of microservices, including an automated build pipeline and continuous delivery with automated testing.

Unfortunately I was ill on Tuesday, so my conferencing picked up on Wednesday with a keynote by Catarina Mota who spoke on open hardware and materials, with a strong focus on 3D printing. It’s a topic that I’m already well-versed in, so the talk was mostly review for me, but I did enjoy one of the videos that she shared during her talk: Full Printed by nueveojos.

The day continued with a couple of talks that were some of my favorites of the conference. The first was Going Faster: Continuous Delivery for Firefox by Laura Thomson. Continuous Delivery (CD) has become increasingly popular for server-side applications that are served up to users, but this talk was an interesting take: delivering a client in a CD model. She didn’t offer a full solution for a CD browser, but instead walked through the problem space, design decisions and rationale behind the tooling they are using to get closer to a CD model for client-side software. Firefox is in an interesting space for this, as it already has add-ons that are released outside of the Firefox release model. What they decided to do was leverage this add-on tooling to create system add-ons, which are core to Firefox and to deliver microchanges, improvements and updates to the browser online. They’re also working to separate the browser code itself from the data that ships with it, under the premise that things like policy blacklists, dictionaries and fonts should be able to be updated and shipped independent of a browser version release. Indeed! This data would instead be shipped as downloadable content, and could also be tuned to only ship certain features upon request, like specific language support.


Laura Thomson, Director of Engineering, Cloud Services Engineering and Operations at Mozilla

The next talk that I got a lot out of was Wait, ?tahW: The Twisted Road to Right-to-Left Language Support (video) by Moriel Schottlender. Much like the first accessibility and internationalization talks I attended in the past, this is one of those talks that sticks with me because it opened my eyes to an area I’d never thought much about, as an English-only speaking citizen of the United States. She was also a great speaker who delivered the talk with humor and intrigue… “can you guess the behavior of this right-to-left feature?” The talk began by making the case for more UIs supporting right-to-left (RTL) languages, citing that there are 800 million RTL speakers in the world who we should be supporting. She walked us through the concepts of Visual and Logical Rendering, how “obvious” solutions like flipping all content are flawed, and considerations with regard to the relationship of content and the interface itself when designing for RTL. She also gave us a glimpse into the behavior of the Unicode Bidirectional Algorithm and the fascinating ways it behaves when mixing LTR and RTL languages. She concluded by sharing that expectations of RTL language users are pretty low since most software gets it wrong, but this means that there’s a great opportunity for projects that do support it to get it right. Her website on the topic, which has everything she covered in her talk and more, is at http://rtl.wtf.
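You can poke at the raw material of the Unicode Bidirectional Algorithm from a Python prompt: the Unicode database assigns every character a bidirectional class, and those classes are what the algorithm reorders on. A tiny illustration of a mixed LTR/RTL string (my example, not one from her talk):

```python
import unicodedata

# A string mixing ASCII letters (LTR), Hebrew letters (RTL) and digits;
# "L" = left-to-right, "R" = right-to-left, "EN" = European number,
# "WS" = whitespace. These per-character classes are the input the
# Unicode Bidirectional Algorithm works from.
for ch in "ab \u05d0\u05d1 12":
    print(repr(ch), unicodedata.bidirectional(ch))
```

The digits being "EN" rather than plain "L" is one reason numbers embedded in RTL text behave in the surprising ways she demonstrated.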


Moriel Schottlender, Software Engineer at Wikimedia

Wednesday night was the Penguin Dinner, which is the major, all attendees welcome conference dinner of the event. The venue was The Pier, which was a restaurant appropriately perched on the end of a very long pier. It was a bit loud, but I had some interesting discussions with my fellow attendees and there was a lovely patio where we were able to get some fresh air and take pictures of the bay.

On Thursday a whole bunch of us enjoyed a talk about a Linux-driven Microwave (video) by David Tulloh. What I liked most about his talk was that while he definitely was giving a talk about tinkering with a microwave to give it more features and make it more accessible, he was also “encouraging other people to do crazy things.” Hack a microwave, hack all kinds of devices and change the world! Manufacturing one-off costs are coming down…

In the afternoon I gave my own talk, Open Source Tools for Distributed Systems Administration (video, slides). I was a bit worried that attendance wouldn’t be good because of who I was scheduled against, but I was mistaken, the room was quite full! After the talk I was able to chat with some folks who are also working on distributed systems teams, and with someone from another major project who was seeking to put more of their infrastructure work into open source. In all, a very effective gathering. Plus, my colleague Masayuki Igawa took a great photo during the talk!


Photo by Masayuki Igawa (source)

The afternoon continued with a talk by Rikki Endsley on Speaking their language: How to write for technical and non-technical audiences (video). Helpfully, she wrote an article on the topic so I didn’t need to take notes! The talk walked through various audiences (lay, managerial and expert) and gave examples of how to craft postings for each. The announcement of a development change, for instance, will look very different when presenting it to existing developers than it may look to newcomers (perhaps “X process changed, here’s how” vs. “dev process made easier for new contributors!”), and completely different when you’re approaching a media outlet to provide coverage for a change in your project. The article dives deep into her key points, but I will say that she delivered the talk with such humor that it was fun to learn directly from hearing her speak on the topic.


Also got my picture with Rikki! (source)

Thursday night was the Speakers’ dinner, which took place at a lovely little restaurant about 15 minutes from the venue via bus. I’m shy, so it’s always a bit intimidating to rub shoulders with some of the high profile speakers that they have at LCA. Helpfully, I’m terrible with names, so I managed to chat away with a few people and not realize that they are A Big Deal until later. Hah! So the dinner was nice, but it having been a long week I was somewhat thankful when the buses came at 10PM to bring us back.

Friday began with my favorite keynote of the conference! It was by Genevieve Bell (video), an Intel fellow with a background in cultural anthropology. Like all of my favorite talks, hers was full of humor and wit, particularly around the fact that she’s an anthropologist who was hired to work for a major technology company without much idea of what that would mean. In reality, her job turned out to be explaining humans to engineers and technologists, and using their combined insight to explore potential future innovations. Her insights were fascinating! A key point was that traditional “future predictions” tend to be a bit near-sighted and very rooted in problems of the present. In reality our present is “messy and myriad” and that technology and society are complicated topics, particularly when taken together. Her expertise brought insight to human behavior that helps engineers realize that while devices work better when connected, humans work better while disconnected (to the point of seeking “disconnection” from the internet on our vacations and weekends).

Additionally, many devices and technologies aim to provide a “seamless” experience, but humans actually prefer seamful interactions so we can split up our lives into contexts. Finally, she spent a fair amount of time talking about our lives in the world of Internet of Things, and how some serious rules will need to be put in place to make us feel safe and supported by our devices rather than vulnerable and spied upon. Ultimately, technology has to be designed with the human element in mind, and her plea to us, as the architects of the future, is to be optimistic about the future and make sure we’re getting it right.

After her talk I now believe every technology company should have a staff cultural anthropologist.


Intel Fellow and cultural anthropologist Genevieve Bell

My day continued with a talk by Andrew Tridgell on Helicopters and rocket-planes (video), one on Copyleft For the Next Decade: A Comprehensive Plan (video) by Bradley Kuhn, a talk by Matthew Garrett on Troublesome Privacy Measures: using TPMs to protect users (video) and an interesting dive into handling secret data with Tollef Fog Heen’s talk on secretd – another take on securely storing credentials (video).

With that, the conference came to a close with a closing session that included raffle prizes, thanks to everyone and the hand-off to the team running LCA 2017 in Hobart next year.

I went to more talks than highlighted in this post, but with a whole week of conferencing it would have been a lot to cover. I also am typically not the biggest fan of the “hallway track” (introvert, shy) and long breaks, but I knew enough people at this conference to find people to spend time with during breaks and meals. I could also get a bit of work done during the longer breaks without skipping too many sessions, and it was easy to switch rooms between sessions without disruption. Plus, all the room moderators I saw did an excellent job of keeping things on schedule.

Huge thanks to all the organizers and everyone who made me feel so welcome this year. It was a wonderful experience and I hope to do it again next year!

More photos from the conference and beautiful Geelong here: https://www.flickr.com/photos/pleia2/albums/72157664277057411

by pleia2 at February 12, 2016 09:20 PM

February 10, 2016

iheartubuntu

OpenShot 2.0.6 (Beta 3) Released!


The third beta of OpenShot 2.0 has been officially released! To install it, add the PPA by using the Terminal commands below:

sudo add-apt-repository ppa:openshot.developers/ppa
sudo apt-get update
sudo apt-get install openshot openshot-doc

Now that OpenShot is installed, you should be able to launch it from your Applications menu, or from the terminal ($ openshot-qt). Every time OpenShot has an update, you will be prompted to update to the newest version. It's a great way to test our latest features.

Smoother Animation
Animations are now silky smooth because of improved anti-aliasing support in the libopenshot compositing engine. Zooming, panning, and rotation all benefit from this change.

Audio Quality Improvements
Audio support in this new version is vastly superior to previous versions. Popping, crackling, and other related audio issues have been fixed.

Autosave
A new autosave engine has been built for OpenShot 2.0, and it’s fast, simple to configure, and will automatically save your project at a specific interval (if it needs saving). Check the Preferences to be sure it’s enabled (it will default to enabled for new users).

Automatic Backup and Recovery
Along with our new autosave engine, a new automatic backup and recovery feature has also been integrated into the autosave flow. If your project is not yet saved… have no fear, the autosave engine will make a backup of your unsaved project (as often as autosave is configured for), and if OpenShot crashes, it will recover your most recent backup on launch.


Project File Improvements
Many improvements have been made to project file handling, including relative paths for built-in transitions and improvements to temp files being copied to project folders (i.e. animated titles). Projects should be completely portable now, between different versions of OpenShot and on different Operating Systems. This was a key design goal of OpenShot 2.0, and it works really well now.

Improved Exception Handling
Integration between libopenshot (our video editing library) and openshot-qt (our PyQt5 user interface) has been improved. Exceptions generated by libopenshot are now passed to the user interface, and no longer crash the application. Users are now presented with a friendly error message with some details of what happened. Of course, there is still the occasional “hard crash” which kills everything, but many, many crashes will now be avoided, and users will be better informed about what has happened.

Preferences Improvements
There are more preferences available now (audio preview settings - sample rate, channel layout, debug mode, etc…), including a new feature to prompt users when the application will “require a restart” for an option to take effect.


Improved Stability on Windows
A couple of pretty nasty bugs were fixed for Windows, although in theory they should have crashed on other platforms as well. But for whatever reason, certain types of crashes relating to threading only seem to happen on Windows, and many of those are now fixed.

New Version Detection
OpenShot will now check the most recent released version on launch (from the openshot.org website) and discreetly prompt the user by showing an icon in the top right of the main window. This has been a requested feature for a really long time, and it’s finally here. It will also quietly give up if no Internet connection is available, and it runs in a separate thread, so it doesn’t slow down anything.

Metrics and Anonymous Error Reporting
A new anonymous metric and error reporting module has been added to OpenShot. It can be enabled / disabled in the Preferences, and it will occasionally send out anonymous metrics and error reports, which will help me identify where crashes are happening. It’s very basic data, such as “WEBM encoding error - Windows 8, version 2.0.6, libopenshot-version: 0.1.0”, and all IP addresses are anonymized, but will be critical to help improve OpenShot over time.

Improved Precision when Dragging
Dragging multiple clips around the timeline has been improved. There were many small issues that would sometimes occur, such as extra spacing being added between clips, or transitions being slightly out of place. These issues have been fixed, and moving multiple clips now works very well.

Debug Mode
In the preferences, one of the new options is “Debug Mode”, which outputs a ton of extra info into the logs. This might only work on Linux at the moment, because it requires the capturing of standard output, which is blocked in the Windows and Mac versions (due to cx_Freeze). I hope to enable this feature for all OSes soon, or at least to provide a “Debug” version for Windows and Mac, that would also pop open a terminal/command prompt with the standard output visible.

Updated Translations
Updates to 78 supported languages have been made. A huge thanks to the translators who have been hard at work helping with OpenShot translations. There are over 1000 phrases which require translation, and seeing OpenShot run so seamlessly in different languages is just awesome! I love it!

Lots of Bug fixes

In addition to all the above improvements and fixes, here are many other smaller bugs and issues that have been addressed in this version:
  • Prompt before overwriting a video on export
  • Fixed regression while previewing videos (causing playhead to hop around)
  • Default export format set to MP4 (regardless of language)
  • Fixed regression with Cutting / Split video dialog
  • Fixed Undo / Redo bug with new project
  • Backspace key now deletes clips (useful with certain keyboards and laptop keyboards)
  • Fixed bug on Animated Title dialog not updating progress while rendering
  • Added multi-line and unicode support to Animated Titles
  • Improved launcher to use distutils entry_points


Get Involved
Please report bugs and suggestions here: https://github.com/OpenShot/openshot-qt/issues. Please contribute language translations here (if you are a non-English speaking user): https://translations.launchpad.net/openshot/2.0/+translations.

by iheartubuntu (noreply@blogger.com) at February 10, 2016 01:23 PM

Elizabeth Krumbach

Kangaroos, Penguins, a Koala and a Platypus

On the evening of January 27th I began my journey to visit Australia for the second time in my life. My first visit to the land down under was in 2014 when I spoke at and attended my first linux.conf.au in Perth. Perth was beautiful; in addition to the conference (which I wrote about here, here and here), I took some time to see the beach and visit the zoo during my tourist adventures.

This time I was headed for Melbourne to once again attend and speak at linux.conf.au, this time in the port city of Geelong. I arrived the morning of Friday the 29th to spend a couple days adjusting to the time zone and visiting some animals. However, I was surprised at the unexpected discovery of something else I love in Melbourne: historic street cars. Called trams there, they run a free City Circle Tram that uses the historic cars! There’s even The Colonial Tramcar Restaurant, which allows you to dine inside one as you make your way along the city rails. Unfortunately my trip was not long enough to ride in a tram or enjoy a meal, but this alone puts Melbourne right on my list of cities to visit again.

At the Perth Zoo I got my first glimpse of a wombat (they are BIG!) and enjoyed walking through an enclosure where the kangaroos roamed freely. This time I had some more animals on my checklist, and wanted to get a bit closer to some others. After checking into my hotel in Melbourne, I went straight to the Melbourne Zoo.

I love zoos. I’ve visited zoos in countries all over the world. But there’s something special you should know about the Melbourne Zoo: they have a platypus. Everything I’ve read indicates that they don’t do very well in captivity and captive breeding is very rare. As a result, no zoos outside of Australia have platypuses, so if I wanted to see one it had to be in Australia. I bought my zoo ticket and immediately asked “where can I find the platypus?” With that, I got to see a platypus! The platypus was swimming in its enclosure and I wasn’t able to get a photo of it (moving too fast), but I did get a lovely video. They are funny creatures, and very cute!

The rest of the zoo was very nice. I didn’t see everything, but I spent a couple hours visiting the local animals and checking out some of their bigger exhibits. I almost skipped their seals (seals live at home!) and penguins (I’d see wild ones the next day!), but I’m glad I didn’t since it was a very nice setup. Plus, I wasn’t able to take pictures of the wild fairy penguins so as not to disturb them in their natural habitat, but the ones at the zoo were fine.

I also got a video of the penguins!

More photos from the Melbourne Zoo here: https://www.flickr.com/photos/pleia2/albums/72157664216488166

When I got into a cab to return to my hotel it began to rain. I was able to pick up an early dinner and spend the evening catching up on some work and getting to bed early.

Saturday was animal tour day! I booked an AAT Kings full day Phillip Island – Penguins, Kangaroos & Koalas tour that had a tour bus picking me up right at my hotel. I selected the Viewing Platform Upgrade and it was well worth it.

Phillip Island is about two hours from Melbourne, and it’s where the penguins live. They come out onto the beach at sunset and all rush back to their homes. The rest of the tour was a series of activities leading up to this grand event, beginning with a stop at MARU Koala & Animal Park. We were in the bus for nearly two hours to get to the small park, during which the tour guide told us about the history of Melbourne and about the penguins we’d see later in the evening.

The tour included entrance fees, but I paid an extra $20 to pet a koala and get some food for the kangaroos and other animals. First up, koala! The koala I got to pet was an active critter. It sat still during my photo, but between people it could be seen reaching toward the keepers to get back the stem of eucalyptus that it got to munch on during the tourist photos. It was fun to learn that instead of being really soft like they look, their fur feels a lot more like wool.

The rest of my time at the park was spent with the kangaroos. Not only are they hopping around for everyone to see like in the Perth Zoo, but when you have a container of food you get to feed them! And pet them! In case you’re wondering, it’s one of the best things ever. They’re all very used to being around human tourists all day, and when you lay your hand flat as instructed to have them eat from your hand they don’t bite.

I got to feed and pet lots of kangaroos!

The rest of the afternoon was spent visiting a couple scenic outlooks and a beach before stopping for dinner in the town of Cowes on Phillip Island, where I enjoyed a lovely fish dinner with a stunning view at Harry’s on the Esplanade. The weather was so nice!


Selfies were made for the solo tourist

As we approached the “skinny tip of the island” the tour guide told us a bit about the history of the island and the nature preserve where the penguins live. The area had once been heavily populated with vacation homes, but with the accidental introduction of foxes, which kill penguins, and an increased human population, the island quickly saw its penguin (and other local wildlife) populations drop. We learned that a program was put in place to buy back all the private property and turn it into a preserve, and work was also done to rid the island of foxes. The program seems to have worked: the preserve no longer has private homes and we saw dozens of wild wallabies as well as some of the large native geese that were also targets of the foxes. Most exciting for me was that the penguin population was preserved for us to enjoy.

As the bus made its way through the park, we could see little penguin homes throughout the landscape. Some were natural holes built by the penguins, and others were man-made houses put in place when they tore down a private home and discovered penguins had been using it for a burrow and required some kind of replacement. The hills were also covered in deep trails that we learned were little penguin highways, used for centuries (millennia?) for the little penguins to make their way from the ocean where they hunted throughout the day, to their nests where they spend the nights. The bus then stopped at the top of a hill that looked down onto the beach where we’d spend the evening watching the penguins come ashore. I took the picture from inside the bus, but if you look closely at this picture you see the big rows of stadium seating, and then to the left, and closer, there are some benches that are curvy. The stadium-like seating was general admission and the curvy ones are the viewing platform upgrade I paid for.

The penguins come ashore when it gets dark (just before 9PM while I was there), so we had about an hour before then to visit the gift shop and get settled in to our seats. I took the opportunity to send post cards to my family, featuring penguins and sent out right there from the island. I also picked up a blanket, because in spite of the warm day and my rain jacket, the wind had picked up to make it a bit chilly and it was threatening rain by the time dusk came around.

It was then time for the penguins. With the viewing platform upgrade the penguins were still a bit far when they came out of the ocean, but we got a nice view of them as they approached up the beach, walking right past our seating area! They come out of the ocean in big clumps of a couple dozen, so each time we saw another grouping the human crowd would pipe up and notice. I think for the general admission it would be a lot harder to see them come up on the beach. The rest of the penguin parade is fun for everyone though, they waddle and scuttle up the island to their little homes, and they pass all the trails, regardless of where you were seated. Along the pathways the penguins get so close to you that you could reach out and touch them (of course, you don’t!). Photos are strictly prohibited since the risk is too high that someone would accidentally use a flash and disturb them, but it was kind of refreshing to just soak in the time with the penguins without a camera/phone. All told, I understand there are nearly 1,500 penguins each night that come out of the ocean at that spot.

The hills then come alive with penguin noises as they enjoy their evenings, chatting away and settling in with their chicks. Apparently this parade lasts well into the night, though most of them do come out of the ocean during the hour or so that I spent there with the tour group. At 10PM it was time to meet back at the bus to take us back to Melbourne. The timing was very good: about 10 minutes after getting in the bus it started raining. We got to watch the film Oddball on our journey home, about another island of penguins in Victoria that was at risk from foxes but was saved.

In all, the day was pretty overwhelming for me. In a good way. Petting some of these incredibly cute Australian animals! Seeing adorable penguins in the wild! A day that I’ll cherish for a lifetime.

More photos from the tour here: https://www.flickr.com/photos/pleia2/albums/72157664216521696

The next day it was time to take a train to Geelong for the Linux conference. An event with a whole different type of penguins!

by pleia2 at February 10, 2016 08:58 AM

February 08, 2016

Akkana Peck

Attack of the Killer Titmouse!

[Juniper titmouse attacking my window] For the last several days, when I go upstairs in mid-morning I often hear a strange sound coming from the bedroom. It's a juniper titmouse energetically attacking the east-facing window.

He calls, most often in threes, as he flutters around the windowsill, sometimes scratching or pecking the window. He'll attack the bottom for a while, moving from one side to the other, then fly up to the top of the window to attack the top corners, then back to the bottom.

For several days I've run down to grab the camera as soon as I saw him, but by the time I get back and get focused, he becomes camera-shy and flies away, and I hear EEE EEE EEE from a nearby tree instead. Later in the day I'll sometimes see him down at the office windows, though never as persistently as upstairs in the morning.

I've suspected he's attacking his reflection (and also assumed he's a "he"), partly because I see him at the east-facing bedroom window in the morning and at the south-facing office window in the early afternoon. But I'm not sure about it, and certainly I hear his call from trees scattered around the yard.

Something I was never sure of, but am now: titmice definitely can raise and lower their crests. I'd never seen one with its crest lowered, but this one flattens his crest while he's in attack mode.

His EEE EEE EEE call isn't very similar to any of the calls listed for juniper titmouse in the Stokes CD set or the Audubon Android app. So when he briefly attacked the window next to my computer yesterday afternoon while I was sitting there, I grabbed a camera and shot a video, hoping to capture the sound. The titmouse didn't exactly cooperate: he chirped a few times, not always in the group of three he uses so persistently in the morning, and the sound in the video came out terribly noisy; but after some processing in Audacity I managed to edit out some of the noise. And then this morning as I was brushing my teeth, I heard him again and he was more obliging, giving me a long video of him attacking and yelling at the bedroom window. Here's the Juniper titmouse call as he attacks my window this morning, and the Juniper titmouse call at the office window yesterday. Today's video is on YouTube: Titmouse attacking the window, but that's without the sound edits, so it's tough to hear him.

(Incidentally, since Audacity has a super confusing user interface and I'm sure I'll need this again, what seemed to work best was to highlight sections that weren't titmouse and use Edit→Delete; then use Effects→Amplify, checking the box for Allow clipping and using Preview to amplify it to the point where the bird is audible. Then find a section that's just noise, no titmouse, select it, run Effects→Noise Reduction and click Get Noise Profile. The window goes away, so click somewhere to un-select, call up Effects→Noise Reduction again and this time click OK.)

I feel a bit sorry for the little titmouse, attacking windows so frenetically. Titmice are cute, excellent birds to have around, and I hope he's saving some energy for attracting a mate who will build a nest here this spring. Meanwhile, he's certainly providing entertainment for me.

February 08, 2016 06:10 PM

February 05, 2016

Akkana Peck

Updating Debian under a chroot

Debian's Unstable ("Sid") distribution has been terrible lately. They're switching to a version of X that doesn't require root, and apparently the X transition has broken all sorts of things in ways that are hard to fix and there's no ETA for when things might get any better.

And, being Debian, there's no real bug system so you can't just CC yourself on the bug to see when new fixes might be available to try. You just have to wait, try every few days and see if the system has been fixed.

That's hard when the system doesn't work at all. Last week, I was booting into a shell but X wouldn't run, so at least I could pull updates. This week, X starts but the keyboard and mouse don't work at all, making it hard to run an upgrade.

Fortunately, I have an install of Debian stable ("Jessie") on this system as well. When I partition a large disk I always reserve several root partitions so I can try out other Linux distros, and when running the more experimental versions, like Sid, sometimes that's a life saver. So I've been running Jessie while I wait for Sid to get fixed. The only trick is: how can I upgrade my Sid partition while running Jessie, since Sid isn't usable at all?

I have an entry in /etc/fstab that lets me mount my Sid partition easily:

/dev/sda6 /sid ext4 defaults,user,noauto,exec 0 0
So I can type mount /sid as myself, without even needing to be root.

But Debian's apt upgrade tools assume everything will be on /, not on /sid. So I'll need to use chroot /sid (as root) to change the root of the filesystem to /sid. That only affects the shell where I type that command; the rest of my system will still be happily running Jessie.

Mount the special filesystems

That mostly works, but not quite, because I get a lot of errors like permission denied: /dev/null.

/dev/null is a device: you can write to it and the bytes disappear, as if into a black hole except without Hawking radiation. Since /dev is implemented by the kernel and udev, in the chroot it's just an empty directory. And if a program opens /dev/null in the chroot, it might create a regular file there and actually write to it. You wouldn't want that: it eats up disk space and can slow things down a lot.

The way to fix that is before you chroot: mount --bind /dev /sid/dev which will make /sid/dev a mirror of the real /dev. It has to be done before the chroot because inside the chroot, you no longer have access to the running system's /dev.

But there is a different syntax you can use after chrooting:

mount -t proc proc proc/
mount --rbind /sys sys/
mount --rbind /dev dev/

It's a good idea to do this for /proc and /sys as well, and Debian recommends adding /dev/pts (which must be done after you've mounted /dev), even though most of these probably won't come into play during your upgrade.

Mount /boot

Finally, on my multi-boot system, I have one shared /boot partition with kernels for Jessie, Sid and any other distros I have installed on this system. (That's somewhat hard to do using grub2 but easy on Debian, though you may need to turn off auto-update, and Debian is making it harder to use extlinux now.) Anyway, if you have a separate /boot partition, you'll want it mounted in the chroot, in case the update needs to add a new kernel. Since you presumably already have the same /boot mounted on the running system, use mount --bind for that as well.

So here's the final set of commands to run, as root:

mount /sid
mount --bind /proc /sid/proc
mount --bind /sys /sid/sys
mount --bind /dev /sid/dev
mount --bind /dev/pts /sid/dev/pts
mount --bind /boot /sid/boot
chroot /sid

And then you can proceed with your apt-get update, apt-get dist-upgrade etc. When you're finished, you can unmount everything with one command:

umount --recursive /sid

Some helpful background reading:

February 05, 2016 06:43 PM

February 02, 2016

Nathan Haines

Ubuntu Free Culture Showcase submissions are now open again!

It’s time once again for the Ubuntu Free Culture Showcase!

The Ubuntu Free Culture Showcase is a way to celebrate the Free Culture movement, where talented artists across the globe create media and release it under licenses that encourage sharing and adaptation. We're looking for content which shows off the skill and talent of these amazing artists and will greet Ubuntu 16.04 LTS users.

Not only will the chosen content be featured on the next set of pressed Ubuntu discs shared worldwide across the next two years, but it will serve the joint purposes of providing a perfect test for new users trying out Ubuntu’s live session or new installations and celebrating the fantastic talents of artists who embrace Free content licenses.

While we hope to see contributions from the video, audio, and photographic realms, I also want to thank the artists who have provided wallpapers for Ubuntu release after release. Ubuntu 15.10 shipped with wallpapers from the following contributors:

I'm looking forward to seeing the next round of entrants and a difficult time picking final choices to ship with Ubuntu 16.04 LTS.

For more information, please visit the Ubuntu Free Culture Showcase page on the Ubuntu wiki.

February 02, 2016 11:33 AM

February 01, 2016

Jono Bacon

The Hybrid Desktop

OK, folks, I want to share a random idea that cropped up after a long conversation with Langridge a few weeks back. This is merely food for thought and designed to trigger some discussion.

Today my computing experience is comprised of Ubuntu and Mac OS X. On Ubuntu I am still playing with GNOME Shell and on Mac I am using the standard desktop experience.

I like both. Both have benefits and disadvantages. My Mac has beautiful hardware and anything I plug into it just works out the box (or has drivers). While I spend most of my life in Chrome and Atom, I use some apps that are not available on Ubuntu (e.g. Bluejeans and Evernote clients). I also find multimedia is just easier and more reliable on my Mac.

My heart will always be with Linux though. I love how slick and simple Shell is and I depend on the huge developer toolchain available to me in Ubuntu. I like how customizable my desktop is and that I can be part of a community that makes the software I use. There is something hugely fulfilling about hanging out with the people who make the tools you use.

So, I have two platforms and use the best of both. The problem is, they feel like two different boxes of things sat on the same shelf. I want to jumble the contents of those boxes together and spread them across the very same shelf.

The Idea

So, imagine this (this is total fantasy; I have no idea if this would be technically feasible).

You want the very best computing experience, so you first go out and buy a Mac. They have arguably the nicest overall hardware combo (looks, usability, battery etc) out there.

You then download a distribution from the Internet. This is shipped as a .dmg and you install it. It then proceeds to install a bunch of software on your computer. This includes things such as:

  • GNOME Shell
  • All the GNOME 3 apps
  • Various command line tools commonly used on Linux
  • An ability to install Linux packages (e.g. Debian packages, RPMs, snaps) natively

When you fire up the distribution, GNOME Shell appears (or Unity, KDE, Elementary etc) and it is running natively on the Mac, full screen like you would see on Linux. For all intents and purposes it looks and feels like a Linux box, but it is running on top of Mac OS X. This means hardware issues (particularly hardware that needs specific drivers) go away.

Because Shell is native it integrates with the Mac side of the fence. All the Mac applications can be browsed and started from Shell. Nautilus shows your Mac filesystem.

If you want to install more software you can use something such as apt-get, snappy, or another service. Everything is pulled in and available natively.

Of course, there will be some integration points where this may not work (e.g. alt-tab might not be able to display Shell apps as well as Mac apps), but importantly you can use your favorite Linux desktop as your main desktop yet still use your favorite Mac apps and features.

I think this could bring a number of benefits:

  • It would open up a huge userbase as a potential audience. Switching to Linux is a big deal for most people. Why not bring the goodness to the Mac userbase?
  • It could be a great opportunity for smaller desktops to differentiate (e.g. Elementary).
  • It could be a great way to introduce people to open source in a more accessible way (it doesn’t require a new OS).
  • It could potentially bring lots of new developers to projects such as GNOME, Unity, KDE, or Elementary.
  • It could significantly increase the level of testing, translations and other supplemental services due to more people being able to play with it.

Of course, from a purely Free Software perspective it could be seen as a step back. Then again, with Darwin being open source and the desktop and apps you install in the distribution being open source, it would be a mostly free platform. It wouldn’t be free in the eyes of the FSF, but then again, neither is Ubuntu. 😉

So, again, just wanted to throw the idea out there to spur some discussion. I think it could be a great project to see. It wouldn’t replace any of the existing Linux distros, but I think it could bring an influx of additional folks over to the open source desktops.

So, two questions for you all to respond to:

  1. What do you think? Could it be an interesting project?
  2. If so, technically how do you think this could be accomplished?

by Jono Bacon at February 01, 2016 03:17 AM

January 31, 2016

Elizabeth Krumbach

SCALE14x

I have already written about the UbuCon Summit and Ubuntu booth at SCALE14x (14th annual Southern California Linux Expo), but the conference went far beyond Ubuntu for me!

First of all, I love this new venue. SCALE had previously been held at hotels near LAX, with all the ones I’d attended being at the Hilton LAX. It was a fine venue itself, but the conference was clearly outgrowing it even when I last attended in 2014 and there weren’t many food options around, particularly if you wanted a more formal meal. The Pasadena Convention Center was the opposite of this. Lots of space, lots of great food of all kinds and price ranges within walking distance! A whole plaza across from the venue made a quick lunch at a nice place quite doable.

It’s also worth mentioning that with over 3000 attendees this year, the conference has matured well. My first SCALE was 9x back in 2011, and with every year the growth and professionalism have continued, but without losing the feel of a community-run, regional conference that I love so much. Even the expo hall has continued to show a strong contingent of open source project and organization booths among the flashy company-driven booths, but even the company booths weren’t overdone. Kudos to the SCALE crew for their work and efforts that make SCALE continue to be one of my favorite open source conferences.

As for the conference itself, MJ and I were both able to attend for work, which was a nice change for us. Plus, given how much conference travel I’ve done on my own, it’s nice to travel and enjoy an event together.

Thursday was taken up pretty much exclusively by the UbuCon Summit, but Friday we started to transition into more general conference activities. The first conference-wide keynote was on Friday morning with Cory Doctorow presenting No Matter Who’s Winning the War on General Purpose Computing, You’re Losing, where he explored security and digital rights management (DRM) in the exploding field of the Internet of Things. His premise was that we did largely win the open source vs. proprietary battle, but now we’re in a whole different space where DRM is threatening our safety and stifling innovation. Security vulnerabilities in devices are going undisclosed when discovered by third parties under threat of prosecution for violating DRM-focused laws which have popped up worldwide. Depending on the device, this fear of disclosure could actually result in vulnerabilities causing physical harm to someone if compromised in a malicious way. He also dove into a more dystopian future where smart devices are given away for free/cheap but then are phoning home and can be controlled remotely by an entity that doesn’t have your personal best interest in mind. The talk certainly gave me a lot to think about. He concluded by presenting the Apollo 1201 Project, “a mission to eradicate DRM in our lifetime,” that he’s working on at the EFF; article here.

Later that morning I made my way over to the DevOpsDayLA track to present on Open Source tools for distributed systems administration. Unfortunately, the projectors in the room weren’t working. Thankfully my slides were not essential to the talk, so even though I did feel a bit unsettled to present without slides, I made it through. People even said nice things afterwards, so I think it went pretty well in spite of the technology snafu. The slides that should have been seen during the talk are available here (PDF) and since I am always asked, I do maintain a list of other open source infras. Thanks to @scalexphotos for capturing a photo during my talk.

In the afternoon I spent some time in the expo hall, where I was able to see many more familiar faces! Again, the community booths are the major draw for me, so it was great visiting with participants of projects and groups there. It was nice to swing by the Ubuntu booth to see how polished everything was looking. I also got to see Emma of System76, who I hadn’t seen in quite some time.

Friday evening had a series of Birds of a Feather (BoF) sessions. I was able to make my way over to one on OpenStack before wrapping up my evening.

Saturday morning began with a welcome from Pasadena City Council member Andy Wilson, who was enthusiastic about SCALE14x coming to Pasadena and quickly dove into his technical projects and the work being done in Pasadena around tech. I love this trend of city officials welcoming open source conferences to their area; it means a lot that the work we’re doing is being taken seriously by the cities we’re in. Then it moved into a keynote by Mark Shuttleworth on Open Source in the World of App Stores, which had many similarities to his talk at the UbuCon Summit, but was targeted more generally about how distributions can help keep pace with today’s computing that deploys “at the speed of git.”

I then went to Akkana Peck’s talk on Stupid GIMP tricks (and smart ones, too). It was a very visual talk, so I’m struggling to do it justice in written form, but she demonstrated various tools for photo editing in GIMP that I knew nothing about; I learned a lot. She concluded by talking about the features that came out in the 2.8 release and then the features planned and being worked on in the upcoming 2.9 release. Video of the talk is here. In the afternoon I attended a Kubernetes talk, noting quickly that the containers track was pretty packed throughout the conference.


Akkana Peck on GIMP

Between “hallway track” chats about everything from the Ubuntu project to the OpenStack project infrastructure tooling, Saturday afternoon also gave me the opportunity to do a bit more wandering through the expo hall. I visited my colleagues at the HPE booth and was able to see their cloud in a box. It was amusing to see the suitcase version and the Ubuntu booth with an Orange Box. Putting OpenStack clouds in a single demonstration deployment for a conference is a popular thing!

My last talk of the day was by OpenStack Magnum Project Technical Lead Adrian Otto on Docker, Kubernetes, and Mesos: Compared. He walked us through some of the basics of Magnum first, then dove into each technology. Docker Swarm is good if you want simple tooling that you’re comfortable with, that does exactly what you tell it (imperative), and have 100s-1000s of machines in the cluster. Kubernetes is more declarative (you tell it what you want, it figures out how to do it) and currently has some scaling concerns that make it better suited for a cluster of up to 200 nodes. Mesos is a more complicated system that he recommended using if you have a dedicated infrastructure team and can effectively scale to over 10k nodes. Video of the talk is here.

Sunday began with a keynote by Sarah Sharp on Improving Diversity with Maslow’s Hierarchy of Needs. She spoke about diversity across various angles, from income and internet bandwidth restrictions to gender and race, and the intersection of these things. There are many things that open source projects assume: unlimited ability to download software, ability for contributors to have uninterrupted “deep hack mode” time, access to fast systems to do development on. These assumptions fall apart when a contributor is paying for the bandwidth they use, is a caretaker who doesn’t have long periods without interruptions, or doesn’t have access to a fast system. Additionally, there are opportunities that are simply denied to many genders, as studies have shown that mothers and daughters don’t have as many opportunities or as much access to technology as the fathers and sons in their household. She also explored safety in a community, demonstrating how even a single sexist or racist contributor can single-handedly destroy diversity for your project by driving away potential contributors. Having a well-written code of conduct with a clear enforcement plan is also important, and she cited resources from organizations and people who could help you with that, warning that you shouldn’t roll your own. She concluded by asking audience members to recognize the problem and take action in their communities to help improve diversity. Her excellent slides (with notes) are here and a video of the talk here.

I then made my way to the Sysadmin track to see Jonah Horowitz and Albert Tobey on From Sys Admin to Netflix SRE. First off, their slides were hilarious. Lots of 80s references to things that were outdated as they made their way through how they’re doing Site Reliability Engineering (SRE) at Netflix and inside their CORE (Cloud Operations Reliability Engineering) team. In their work, they’ve moved past configuration management, preferring to deploy baked AMIs (essentially, golden images). They also don’t see themselves as “running applications for the developers” and instead empower developers to do their own releases and application-level monitoring. In this new world of managing fleets of servers rather than individual systems, they’ve worked to develop a blameless culture where they do postmortems so that anything that is found to be done manually or otherwise error-prone can be fixed so the issue doesn’t happen again. They also shared the open source tooling that they use to bypass traditional monitoring systems and provide SREs with a high level view of how their system is working, noting that no one in the organization “knows everything” about the infrastructure. This tooling includes Spinnaker, Atlas and Vector, along with their well-known Simian Army, which services within Netflix must run (unless they have a good reason not to) to test tolerance of random instance failures. Video of the talk can be found here and slides here.

After lunch I made my way to A fresh look at SELinux… by Daniel Walsh. I’d seen him speak on SELinux before, and found his talk valuable then too. This time I was particularly interested in how it’s progressed in RHEL7/CentOS7, like the new default rules for file types, such as knowing what permissions /home/user/.ssh should have and having a semanage command to restore those defaults instead of setting them manually. I also learned about semanage -e (equivalency) to copy permissions from one place to another, and the new mv -Z which moves things while retaining their SELinux properties. Finally, I somehow didn’t have a good grasp on improvements to the man pages: things like `man httpd_selinux` now work and are very helpful! I was also amused to learn about stopdisablingselinux.com (especially since our team does not turn it off, and that took some work on my part!). In closing, there’s also an SELinux Coloring Book (which I’ve written about before), and though I didn’t get to the session in time to get one, MJ picked one up for me in the expo hall. Video of the talk here.

With that, we were at the last talk of the conference. I went over to Dustin Kirkland’s talk on “adapt install [anything]” on your Ubuntu LTS server/desktop! Adapt is a wrapper around LXD containers that allows you, as an unprivileged user, to install software from various Ubuntu releases and run it locally on your system. The script handles provisioning the container, many default settings and keeping it updated automatically, so you really can “adapt install” a package and then run a series of adapt commands to interact with it as if it were installed locally. It all reminded me of the pile of chroot-building scripts I had back when I was doing Debian packaging, but more polished than mine ever were! He wrote a blog post following up his talk here: adapt install [anything] which includes a link to his slides. Video from the talk here (link at 4 hours 42 minutes).

With the conference complete, it was sad to leave, but I had an evening flight out of Burbank. Amusingly, even my flight was full of SCALE folks, so there were some fun chats in the boarding area before our departure.

Huge thanks to everyone who made SCALE possible, I’m looking forward to next year!

More photos from SCALE14x here: https://www.flickr.com/photos/pleia2/albums/72157663821501532

by pleia2 at January 31, 2016 09:15 PM

Akkana Peck

Setting mouse speed in X

My mouse died recently: the middle button started bouncing, so a middle button click would show up as two clicks instead of one. What a piece of junk -- I only bought that Logitech some ten years ago! (Seriously, I'm pretty amazed how long it lasted, considering it wasn't anything fancy.)

I replaced it with another Logitech, which turned out to be quite difficult to find. Turns out most stores only sell cordless mice these days. Why would I want something that depends on batteries to use every day at my desktop?

But I finally found another basic corded Logitech mouse (at Office Depot). Brought it home and it worked fine, except that the speed was way too fast, much faster than my old mouse. So I needed to find out how to change mouse speed.

X11 has traditionally made it easy to change mouse acceleration, but that wasn't what I wanted. I like my mouse to be fairly linear, not slow to start then suddenly zippy. There's no X11 property named mouse speed; it turns out that to set mouse speed, you need to set what X calls Deceleration.

But first, you need to get the ID for your mouse.

$ xinput list| grep -i mouse
⎜   ↳ Logitech USB Optical Mouse                id=11   [slave  pointer  (2)]

Armed with the ID of 11, we can find the current speed (deceleration) and its ID:

$ xinput list-props 11 | grep Deceleration
        Device Accel Constant Deceleration (259):       3.500000
        Device Accel Adaptive Deceleration (260):       1.000000

Constant deceleration is what I want to set, so I'll use that ID of 259 and set the new deceleration to 2:

$ xinput set-prop 11 259 2

That's fine for doing it once. But what if you want it to happen automatically when you start X? Those IDs might all stay the same, but what if they don't?

So let's build a shell pipeline that works even if the IDs change.

First, let's get the mouse ID out of xinput list. We want to pull out the digits immediately following "id=", and nothing else.

$ xinput list | grep Mouse | sed 's/.*id=\([0-9]*\).*/\1/'
11

Save that in a variable (because we'll need to use it more than once) and feed it in to list-props to get the deceleration ID. Then use sed again, in the same way, to pull out just the thing in parentheses following "Deceleration":

$ mouseid=$(xinput list | grep Mouse | sed 's/.*id=\([0-9]*\).*/\1/')
$ xinput list-props $mouseid | grep 'Constant Deceleration'
        Device Accel Constant Deceleration (262):       2.000000
$ xinput list-props $mouseid | grep 'Constant Deceleration' | sed 's/.* Deceleration (\([0-9]*\)).*/\1/'
262

Whew! Now we have a way of getting both the mouse ID and the ID for the "Constant Deceleration" parameter, and we can pass them in to set-prop with our desired value (I'm using 2) tacked onto the end:

$ xinput set-prop $mouseid $(xinput list-props $mouseid | grep 'Constant Deceleration' | sed 's/.* Deceleration (\([0-9]*\)).*/\1/') 2

Add those two lines (setting the mouseid, then the final xinput line) wherever your window manager will run them when you start X. For me, using Openbox, they go in .config/openbox/autostart. And now my mouse will automatically be the speed I want it to be.
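For reference, the whole thing can be wrapped up in a small script. This is just a sketch of the same pipeline with the sed patterns pulled into functions so they're easy to test on sample output (the grep pattern "Mouse" and the speed value 2 are, as above, specific to my setup):

```shell
#!/bin/sh
# Extract the number following "id=" from a line of `xinput list` output.
extract_device_id() {
    sed 's/.*id=\([0-9]*\).*/\1/'
}

# Extract the property ID from the parentheses after "Deceleration".
extract_prop_id() {
    sed 's/.* Deceleration (\([0-9]*\)).*/\1/'
}

# The autostart lines then become:
#   mouseid=$(xinput list | grep Mouse | extract_device_id)
#   propid=$(xinput list-props "$mouseid" | grep 'Constant Deceleration' | extract_prop_id)
#   xinput set-prop "$mouseid" "$propid" 2
```

The quoting around the variables is just defensive habit; xinput IDs are plain numbers.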

January 31, 2016 08:42 PM

January 30, 2016

Elizabeth Krumbach

Ubuntu at SCALE14x

I spent a long weekend in Pasadena from January 21-24th to participate in the 14th Annual Southern California Linux Expo (SCALE14x). As I mentioned previously, a major part of my attendance was focused on the Ubuntu-related activities. Wednesday evening I joined a whole crowd of my Ubuntu friends at a pre-UbuCon meet-and-greet at a wine bar (all ages were welcome) near the venue.

It was at this meet-and-greet where I first got to see several folks I hadn’t seen since the last Ubuntu Developer Summit (UDS) back in Copenhagen in 2012. Others I had seen recently at other open source conferences and still more I was meeting for the first time, amazing contributors to our community who I’d only had the opportunity to get to know online. It was at that event that the excitement and energy I used to get from UDS came rushing back to me. I knew this was going to be a great event.

The official start of this first UbuCon Summit began Thursday morning. I arrived bright and early to say hello to everyone, and finally got to meet Scarlett Clark of the Kubuntu development team. If you aren’t familiar with her blog and are interested in the latest updates to Kubuntu, I highly recommend it. She’s also one of the newly elected members of the Ubuntu Community Council.


Me and Scarlett Clark

After morning introductions, we filed into the ballroom where the keynote and plenaries would take place. It was the biggest ballroom of the conference venue! The SCALE crew really came through with support for this event, it was quite impressive. Plus, the room was quite full for the opening and Mark Shuttleworth’s keynote, particularly when you consider that it was a Thursday morning. Richard Gaskin and Nathan Haines, familiar names to anyone who has been to previous UbuCon events at SCALE, opened the conference with a welcome and details about how the event had grown this year. After handling logistics and other announcements, they quickly went through how the event would work, with a keynote, a series of plenaries and then split User and Developer tracks in the afternoon. They concluded by thanking sponsors and the various volunteers and Canonical staff who made the UbuCon Summit a reality.


UbuCon Summit introduction by Richard Gaskin and Nathan Haines

The welcome, Mark’s keynote and the morning plenaries are available on YouTube, starting here and continuing here.

Mark’s keynote began by acknowledging the technical and preference diversity in our community, from desktop environments to devices. He then reflected upon his own history in Linux and open source, starting in university when he first installed Linux from a pile of floppies. It’s been an interesting progression to see where things were twenty years ago, and how many of the major tech headlines today are driven by Linux and Ubuntu, from advancements in cloud technology to self-driving cars. He continued by talking about success on a variety of platforms, from the tiny Raspberry Pi 2 to supercomputers and the cloud, Ubuntu has really made it.

With this success story, he leapt into the theme of the rest of his talk: “Great, let’s change.” He dove into the idea that today’s complex, multi-system infrastructure software is “too big for apt-get” as you consider relationships and dependencies between services. Juju is what he called “apt-get for the cloud/cluster” and explained how LXD, the next evolution of LXC running as a daemon, gives developers the ability to run a series of containers to test deployments of some of these complex systems. This means that just like the developers and systems engineers of the 90s and 00s were able to use open source software to deploy demonstrations of standalone software on our laptops, containers allow the students of today to deploy complex systems locally.

He then talked about Snappy, the new software packaging tooling. His premise was that even a six month release cycle is too long as many people are continuously delivering software from sources like GitHub. Many places have a solid foundation of packages we rely upon and then a handful of newer tools that can be packaged quickly with Snappy rather than going through the traditional Debian packaging route, which is considerably more complicated. It was interesting to listen to this; as a former Debian package maintainer myself I always wanted to believe that we could teach everyone to do software packaging. However, watching these efforts play out as the community worked with app developers, it became clear that between their reluctance and the backlog felt by the App Review Board, it really wasn’t working. Snappy moves us away from PyPI, PPAs and such into an easier, but still packaged and managed, way to handle software on our systems. It’ll be fascinating to see how this goes.
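For a sense of why this is so much simpler than Debian packaging, here is a sketch of what a minimal snapcraft.yaml looked like around this time. The format was still evolving rapidly, so treat the specific field values here as illustrative rather than authoritative:

```yaml
name: hello
version: "2.10"
summary: GNU Hello, a tiny example snap
description: Prints a friendly greeting, packaged as a snap.
parts:
  hello:
    # The autotools plugin runs configure/make/make install for you.
    plugin: autotools
    source: http://ftp.gnu.org/gnu/hello/hello-2.10.tar.gz
```

A handful of metadata lines and a build plugin, versus the debian/ directory, control files and policy knowledge a traditional package requires.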


Mark Shuttleworth on Snappy

He concluded by talking about the popular Internet of Things (IoT) and how Ubuntu Core with Snappy is so important here. DJI, “the market leader in easy-to-fly drones and aerial photography systems,” now offers an Ubuntu-driven drone. The Open Source Robotics Institute uses Ubuntu. GE is designing smart kitchen appliances powered by Ubuntu and many (all?) of the publicly known self-driving cars use Ubuntu somewhere inside them. There was also a business model here: a company produces the hardware with a minimal feature set, sells a more advanced version of it, and industry-expert third parties build upon it further to sell industry-specific software.

After Mark’s talk there were a series of plenaries that took place in the same room.

First up was Sergio Schvezov, who followed on from Mark’s keynote nicely as he gave a demo of Snapcraft, the tool used to turn software into a .snap package for Ubuntu Core.

Next up was Jorge Castro, who gave a great talk about the state of Gaming on Ubuntu, which he said was “Not bad.” Having just had this discussion with my sister, the timing was great for me. On the day of his talk, there were 1,516 games on Steam that would natively run on Linux, a nice selection of which are modern games that are new and exciting across multiple platforms today. He acknowledged the pre-made Steam Boxes but also made the case for homebrewed Steam systems with graphics card recommendations, explaining that Intel did fine, AMD is still lagging behind high performance with their open source drivers, and giving several models of NVidia cards today that do very well (from low to high quality, and cost: 750Ti, 950, 960, 970, 980, 980Ti). He also passed around a controller that works with Linux to the audience. He concluded by talking about some issues remaining with Linux Gaming, including regressions in drivers that cause degraded performance, the general performance gap when compared to some other gaming systems and the remaining stigma that there are “no games” on Linux, which talks like this are seeking to reverse.

Plenaries continued with Didier Roche introducing Ubuntu Make, a project which makes creating a developer platform out of Ubuntu with several SDKs much easier so that developers reduce the bootstrapping time. His blog has a lot of great posts on the tooling. The last talk of the morning was by Scarlett Clark, who gave us a quick update on Kubuntu Development, explaining that the team had recently joined forces with KDE packagers in Debian to more effectively share resources in their work.

It was then time for the group photo! It included my xerus, and I had a nice chat (and selfie!) with Carla Sella as we settled in for the picture.


Me and Carla Sella

In the afternoon I attended the User track, starting off with Nathan Haines on The Future of Ubuntu. He talked about what convergence of devices means for Ubuntu and warded off concerns that the work on the phone was done in isolation and wouldn’t help the traditional (desktop, server) Ubuntu products. With Ubuntu Core and Snappy, he explained, all the work done on phones is being rolled back into progress made on the other systems, and even IoT devices, that will use them in the future. Following Nathan was the Ubuntu Redux talk by Jono Bacon. His talk could largely be divided into two parts: the history of Ubuntu and how we got here, and 5 recommendations for the Ubuntu community. He had lots of great stories and photos, including one of a very young Mark, and moved right along to today with Unity 8 and the convergence story. His 5 recommendations were interesting, so I’ll repeat them here:

  1. Focus on core opportunities. Ubuntu can run anywhere, but should it? We have finite resources, focus efforts accordingly.
  2. Rethink what community in Ubuntu is. We didn’t always have Juju charmers and app developers, but they are now a major part of our community. Understand that our community has changed and adjust our vision as to where we can find new contributors.
  3. Get together more in person. The Ubuntu Online Summit works for technical work, but we’ve missed out on the human component. In person interactions are not just a “nice to have” in communities, they’re essential.
  4. Reduce ambiguity. In a trend that would continue in our leadership panel the next day, some folks (including Jono) argue that there is still ambiguity around Intellectual Property and licensing in the Ubuntu community (Mark disagrees).
  5. Understand people who are not us.

Nathan Haines on The Future of Ubuntu

The next presentation was my own, on Building a career with Ubuntu and FOSS, where I drew upon examples from my own career and those of others I’ve worked with in the Ubuntu community to share recommendations for folks looking to contribute to Ubuntu and FOSS as a way to develop skills and tools for their career. Slides here (PDF). David Planella followed with The Ubuntu phone and the road to convergence. He walked audience members through the launch plan for the phone, going through the device launch with BQ for Ubuntu enthusiasts, the second phase for “innovators and early adopters” where they released the Meizu devices in Europe and China, and explaining how they’re tackling phase three: general customer availability. He talked about the Ubuntu Phone Insiders, a group of 30 early access individuals who came from a diverse crowd to provide early feedback and share details (via blog posts and social media) with others. He then gave a tour of the phones themselves, including how scopes (“like mini search engines on your phone”) change how people interact with their device. He concluded with a note about the availability of the SDK for phones at developer.ubuntu.com, and that they’re working to make it easy for developers to upload and distribute their applications.

Video from the User track can be found here. The Developer track was also happening, video for that can be found here. If you’re scanning through these to find a specific talk, note that each is 1 hour long.

Presentations for the first day concluded with a Q&A with Richard Gaskin and Nathan Haines back in the main ballroom. Then it was off to the Thursday evening drinks and appetizers at Porto Alegre Churrascaria! Once again, a great opportunity to catch up with friends old and new in the community. It was great running into Amber Graner and getting to talk about our respective paid roles these days, and even touched upon key things we worked on in the Ubuntu community that helped us get there.

The UbuCon Summit activities continued after a SCALE keynote with an Ubuntu Leadership panel which I participated in along with Oliver Ries, David Planella, Daniel Holbach, Michael Hall, Nathan Haines and José Antonio Rey with Jono Bacon as a moderator. Jono had prepared a great set of questions, exploring the strengths and weaknesses in our community, things we’re excited about and eager to work on and more. We also took questions from the audience. Video for this panel and the plenaries that followed, which I had to miss in order to give a talk elsewhere, are available here. The link takes you to 1hr 50min in, where the Leadership panel begins.

The afternoon took us off into unconference mode, which allowed us to direct our own conference setup. Due to the aforementioned talk I was giving elsewhere, I wasn’t able to participate in scheduling, but I did attend a couple of sessions in the afternoon. The first, proposed by Brendan Perrine, covered strategies for keeping the Ubuntu documentation up to date, and also touched on the status of the Community Help wiki, which has been locked down due to spam for nearly a month(!). I then joined cm-t arudy to chat about an idea the French team is floating around to have people quickly share stories and photos about Ubuntu in some kind of community forum. The conversation was a bit tool-heavy, but everyone was also conscious of how it would need to be moderated. I hope to see something come of this, it sounds like a great project.

With the UbuCon Summit coming to a close, the booth was the next great task for the team. I couldn’t make time to participate this year, but the booth featured lots of great goodies and a fleet of contributors working the booth who were doing a fantastic job of talking to people as the crowds continued to flow through each day.

Huge thanks to everyone who spent months preparing for the UbuCon Summit and booth on the SCALE14x expo hall. It was a really amazing event that I was proud to be a part of. I’m already looking forward to the next one!

Finally, I took responsibility for the @ubuntu_us_ca Twitter account throughout the weekend. It was the first time I’ve done such a comprehensive live-tweeting of an event from a team/project account. I recommend a browse through the tweets if you’re interested in hearing more from other great people live-tweeting the event. It was a lot of fun, but also surprisingly exhausting!

More photos from my time at SCALE14x (including lots of Ubuntu ones!) here: https://www.flickr.com/photos/pleia2/albums/72157663821501532

by pleia2 at January 30, 2016 11:40 PM

Jono Bacon

Happy Birthday, Stuart

About 15 years ago I met Stuart ‘Aq’ Langridge when he walked into the new Wolverhampton Linux Users Group I had just started with his trademark bombastic personality and humor. Ever since those first interactions we have become really close friends.

Today Stuart turns 40 and I just wanted to share a few words about how remarkable a human being he is.

Many of you who have listened to Stuart on Bad Voltage, seen him speak, worked with him, or socialized with him will know him for his larger than life personality. He is funny, warm, and passionate about his family, friends, and technology. He is opinionated, and many of you will know him for the amusing, insightful, and tremendously articulate way in which he expresses his views.

He is remarkably talented and has an incredible level of insight and perspective. He is not just a brilliant programmer and software architect, but he has a deft knowledge and understanding of people, how they work together, and the driving forces behind human interaction. What I have always admired is that while bombastic in his views, he is always open to fresh ideas and new perspectives. For him life is a journey and new ways of looking at the road are truly thrilling for him.

As I have grown as a person in my career, with my family, and particularly when moving to America, he has always supported yet challenged me. He is one of those rare friends that can enthusiastically validate great steps forward yet, with the same enthusiasm, illustrate mistakes too. I love the fact that we have a relationship that can be so open and honest, yet underlined with respect. It is his personality, understanding, humor, thoughtfulness, care, and mentorship that will always make him one of my favorite people in the world.

Stuart, I love you, pal. Have an awesome birthday, and may we all continue to cherish your friendship for many years to come.

by Jono Bacon at January 30, 2016 09:05 PM

January 29, 2016

Jono Bacon

Heading Out To linux.conf.au

On Saturday I will be flying out to linux.conf.au taking place in Geelong. Because it is outrageously far away from where I live, I will arrive on Monday morning. 🙂

I am excited to be joining the conference. The last time I made the trip was sadly way back in 2007 and I had an absolutely tremendous time. Wonderful people, great topics, and well worth the trip. Typically I have struggled to get out with my schedule, but I am delighted to be joining this year.

I will also be delivering one of the keynotes this year. My keynote will be on Thu 4th Feb 2016 at 9am. I will be delving into how we are at potentially the most exciting time ever for building strong, collaborative communities, and sharing some perspectives on how we empower a new generation of open source contributors.

So, I hope to see many of you there. If you want to get together for a meeting, don’t hesitate to get in touch. You can contact me at jono@github.com for GitHub related discussions, or jono@jonobacon.org for everything else. See you there!

by Jono Bacon at January 29, 2016 07:10 AM

January 23, 2016

iheartubuntu

Brave Browser on Ubuntu

Brendan Eich, one of the co-founders of the Mozilla project (Firefox browser), is developing a new web browser promising to block intrusive ads and third-party trackers. Enter Brave, built for Linux, Windows, OS X, iOS and Android...

https://brave.com/

Brave's browser, still in early development, speeds up web pages by stripping out not just ads but also other page elements that track online behavior and web elements used to deliver ads. By removing advertisements and trackers, Brave's browser will speed up page loading considerably. It loads pages 2x to 4x faster than other smartphone browsers and up to 2x faster than other browsers for personal computers.


Blocking ads, however, could be a challenge for the Brave team, as advertising helps fund websites and bloggers' content. The browser's workaround is to eventually display new ads from their own pool of advertisers and connect Bitcoin as a method for users to pay website owners directly for the content they are viewing for free. It's a new way of doing things for sure and could disrupt Google's ad network, which is Google's biggest source of revenue.

Brave has been built out of the open source Chromium browser, which is the foundation for Google's Chrome browser. An interesting choice considering Brave is essentially trying to take market share away from Google. Have Eich and his Brave team averted the ad blocking war or started a new kind of war?

All of Brave's source packages have been made available on GitHub. We managed to compile it on Ubuntu 16.04, but ran into problems. The GitHub page includes a readme file for installation, however it's really incomplete as of 1/22/16. We also ran into problems with the newest version of Node.js 5.x not being supported by our newest version of Ubuntu. However, installation of Node.js 5.x may work fine on Ubuntu 15.10 or older, thus getting the Brave browser installed on Ubuntu.

Keep your eyes open on their GitHub for updated installation information such as DEB files or PPAs for an easier way to install Brave.

https://github.com/brave/browser-laptop

by iheartubuntu (noreply@blogger.com) at January 23, 2016 04:56 PM

January 20, 2016

Elizabeth Krumbach

December events and a pair of tapestries

In my last post I talked some about the early December tourist stuff that I did. I also partook in several events that gave me a nice, fun distraction when I was looking for some down time after work and book writing.

It’s no secret that I like good food, so when a spot opened up with some friends to check out Lazy Bear here in San Francisco, I was pretty eager to go. They had two seatings per night; everyone sat together at long tables and was served each course at the same time. We had to skip the pork selections, but I was happy with the substitutions they provided for us. They also gave us pencils and notebooks to take notes about the dishes. An overall excellent dinner.

On December 2nd MJ and I met up with my friend Amanda to see Randall Munroe of XKCD fame talk about his new book, Thing Explainer. In this book he explains complicated concepts using only the 1000 most common words. He shared stories about the process of writing the book and some things he had a lot of fun with. It was particularly amusing to hear how much he used the word “bag” when explaining the human body. We waited around pretty late for what ended up being some marathon signing, huge thanks to him for staying around so we could get our copy signed!

The very next day I scored a ticket to a local Geek Girl Dinner here in SOMA. I’d only been to one before, and going alone always means I’m a bit on edge nervousness-wise. But it was a Star Wars themed dinner and I do enjoy hearing stories from other women in tech, so I donned my R2-D2 hoodie and made my way over. Turns out, not many people were there to celebrate Star Wars, but they did have R2-D2 cupcakes and some cardboard cutouts of the new characters, so they pulled it off. The highlight of the night for me was a technical career panel of women who were able to talk about their varied entry points into tech. As someone with a non-traditional background myself, it’s always inspiring to hear from other women who made major career changes after being inspired by technology in some way or another.


Twilio tech careers panel

I mentioned in an earlier post that our friend Danita was in town recently. The evening she arrived I was neck deep in book work… and the tail end of the Bring Back MST3K Kickstarter campaign. They hosted five hours of a telethon-style variety show with magicians, musicians, comedians and various cameos by past and future MST3K actors, writers and robots. I’m pretty excited about this reboot, MST3K was an oddly important show when I was a youth. A game based on riffing is what first brought me on to an IRC network and introduced me to a whole host of people who made major impacts in my life. We all loved MST3K. Today I still enjoy Rifftrax (including the live show I went to last week). In spite of technical difficulties it was fun to prop up my tablet while working and watch the stream of their final fundraising push as they broke the record for biggest TV kickstarter campaign ever. Congratulations everyone, I am delighted to have donated to the campaign and look forward to the new episodes!

Hanukkah was also in December. Unfortunately MJ had to be out of town for the first few days, so we did a Google Hangout video call each evening. I set the tablet up on the counter as I lit the lights. I also took pictures each night so I could share the experience further.

At the end of the month MJ had a couple of his cousins in town to visit over the Christmas holiday. I didn’t take much time off, but I did tag along on select adventures, enjoying several great meals together and snapping a bunch of tourist photos of the Golden Gate Bridge (album here). We also made our way to Pier 39 one afternoon to visit sea lions and MJ and I made a detour to the Aquarium of the Bay while the girls did some shopping. The octopus and sea otters were particularly lively that evening (album here) and I snapped a couple videos: Giant Pacific Octopus and River otters going away for the night. Gotta love the winter clothes the human family was wearing in the otter video, we had a brisk December!

To conclude, I’ll leave you with a pair of Peruvian tapestries that we picked up in Cusco in August. Peru was one of my favorite adventures to date, and it’s nice that we were able to bring home some woven keepsakes from the Center for Traditional Textiles. We bundled them together in a carry on to bring them home and then brought them to our local framing shop and art gallery for framing. It took a few months, but I think it was worth it, they did a very nice job.

And now that I’ve taken a breather, it’s time to pack for SCALE14x, which we’re leaving for tomorrow morning. I also need to see if I can tie off some loose ends with this chapter I’m working on before we go.

by pleia2 at January 20, 2016 02:04 AM

January 17, 2016

Elizabeth Krumbach

Local tourist: A mansion, some wine and the 49ers

Some will say that there are tourists and there are travelers. The distinction tends to be that tourists visit the common places and take selfies, while travelers wander off the beaten path and take a more peaceful and thoughtful approach to enjoying their chosen destination.

I’m a happy tourist. Even when I’m at home.

Back in December our friend Danita was in town and I took advantage of this fact by going full on Bay Area tourist with her.

Our first adventure was going down to the Winchester Mystery House in San Jose. Built continuously for decades by the widow Sarah Winchester (of Winchester rifle fame), the house is a maze of uneven floors, staircases that go nowhere and doors that could drop you a story or two if you don’t watch when stepping through them. It’s said that the spiritualist movement heavily influenced Mrs. Winchester’s decisions, from moving to California after her husband’s death to the need to continuously be doing construction. She had a private seance room, and after the 1906 earthquake destroyed the tower that used to be a key feature of the house, she followed spirit-driven guidance: she stopped work on the main, highly decorated front part of the house and only worked on the back half, not even fixing up the sections damaged in the earthquake.

Door to nowhere
A “door to nowhere” in the Winchester House

There certainly are bits about this place that remind me of a tourist trap, including the massive gift shop and ghost stories. But it wasn’t shopping, spiritualism or ghosts that brought me here. As an armchair history and documentary geek, I’ve known about the Winchester House for years. When I moved to the Bay Area almost six years ago, it immediately went on my “to visit” list. The beautiful Victorian architecture, the oddity of how she built it and her interest in the latest turn-of-the-20th-century innovations are what interested me. She had three elevators in the house, of varying types as the technology developed, providing a fascinating snapshot of roughly 20 years of early elevator innovation history. She was an early adopter of electricity, and the latest time- and energy-saving gadgets and tools were installed throughout to help her staff get their work done. Plus, in addition to having a car (with a chauffeur, obviously), the garage where it was kept had a car wash apparatus built in! We went on a behind-the-scenes tour to visit many of these things. The estate originally covered many acres, allowing for a large fruit orchard; fruit was actually processed on site, so we got to see the massive evaporator used to prepare it for distribution.


Fruit evaporator at Winchester House

When Mrs. Winchester died, her belongings were carefully distributed among her heirs, but no arrangements were made for the house. Instead, curious neighbors got together and made sure it was saved from demolition, effectively turning it into a tourist attraction just a few years after her passing. Still privately-owned, today it’s listed on the U.S. National Register of Historic Places.

Photos weren’t allowed inside the house, but I snapped away outside: https://www.flickr.com/photos/pleia2/albums/72157660011104133

My next round of local touristing took us north, to Sonoma county for some wine tasting! We’re members of a winery up there, so we had our shipment to pick up too, but it’s also always fun bringing our friends to our favorite stops in wine country. We started at Imagery Winery, where we picked up our wine and enjoyed tastings of several of their sweeter wines, including their port. From there we picked up fresh sandwiches at a deli and grocery store before making our way to Benziger Family Winery, where MJ and I got engaged back in 2011. We ate lunch before the rain began and then went inside to do some more wine tastings. Thankfully, the weather cleared up before our 3PM tour, where we got to see the vineyards, their processing area and the inside of the wine caves. It was cold, though, in the 40s with a stiff breeze throughout the day. Our adventure concluded with a stop at Jacuzzi Family Vineyards, where we tasted some olive oils, vinegar and mustard.

More photos from our Sonoma adventure here: https://www.flickr.com/photos/pleia2/albums/72157661706977879

For something a bit less touristy and more of a local experience, the last adventure I went on with Danita was an Amtrak trip down the bay to the brand new NFL stadium for the 49ers on Sunday, December 20th. I’m not into football, but going to an NFL game was something I wanted to experience, particularly since this brand new stadium is where the Super Bowl will be played a few weeks from now. Nice experience to have! The forecast called for rain, but we lucked out and it was merely cold (40s and 50s). I picked up a winter hat at the stadium; they appeared to be doing brisk business among us Californians unaccustomed to the chilly weather. We got to our seats before all the pre-game activities began, of which there are many; I had no idea the kind of pomp that accompanies a football game! We had really nice seats right next to the field, so close that Danita was able to find us when watching game footage later:

The game itself? I am still no football fan. As someone who doesn’t watch much, I’ll admit that it was a bit hard for me to follow. Thankfully Danita is a big fan, so she was able to explain things to me when I had questions. And regardless of the sport, it is fun to be piled into a stadium with fans. Hot dogs and pretzels, cheering and excitement, all good for the human spirit. I also found the cheerleaders to be a lot of fun; for all the stopping and starting the football players did, the cheerleaders were active throughout the game. I also learned that the stadium is near the San Jose airport, so I may have taken a lot of pictures of planes flying over the stadium. The halftime break featured some 49ers from the Super Bowl teams of the 80s, Joe Montana among them. Even as someone who doesn’t pay attention to football, I recognized him!


Airplane, cheerleaders and probably some football happening ;)

The Amtrak trip home was also an adventure, but not the good kind. Our train broke down and we had to be rescued by the next train, an hour behind us. There were high spirits among our fellow passengers though… and lots of spirits: the train bar ran out of champagne. It was raining by the time we got on the next train, so we had a bit of a late and soggy trip back. Still, all in all I’m glad I went.

More photos from the game here: https://www.flickr.com/photos/pleia2/albums/72157662674446015

by pleia2 at January 17, 2016 08:01 PM

Color me Ubuntu at UbuCon Summit & SCALE14x

This week I’ll be flying down to Pasadena, California to attend the first UbuCon Summit, which is taking place at the Fourteenth Annual Southern California Linux Expo (SCALE14x). The UbuCon Summit was the brainchild of meetings we had over the summer, where we expressed concern over the lack of in-person collaboration and connection in the Ubuntu community since the last Ubuntu Developer Summit back in 2012. Instead of creating a whole new event, we looked at the community-run UbuCon events around the world and worked with the organizers of the one at SCALE14x to bring in funding and planning help from Canonical, travel assistance for project members and speakers, and a full two days of conference and unconference event content.

UbuCon Summit

As an attendee of and speaker at these SCALE UbuCons for several years, I’m proud to see the work that Richard Gaskin and Nathan Haines have put into this event over the years turn into something bigger and more broadly supported. The event will feature two tracks on Thursday, one for Users and one for Developers. Friday will begin with a panel and then lead into an unconference all afternoon with attendee-driven content (don’t worry if you’ve never done an unconference before; a full introduction on how to participate will be provided after the panel).

As we lead up to the UbuCon Summit (you can still register here, it’s free!) on Thursday and Friday, I keep learning that more people from the Ubuntu community will be attending, several of whom I haven’t seen since that last Developer Summit in 2012. Mark Shuttleworth will be coming in to give a keynote for the event, along with various other speakers. On Thursday at 3PM, I’ll be giving a talk on Building a Career with Ubuntu and FOSS in the User track, and on Friday I’ll be one of several panelists participating in an Ubuntu Leadership Panel at 10:30AM, following the morning SCALE keynote by Cory Doctorow. Check out the full UbuCon schedule here: http://ubucon.org/en/events/ubucon-summit-us/schedule/

Over the past few months I’ve been able to hop on some of the weekly UbuCon Summit planning calls to provide feedback from folks preparing to participate and attend. During one of our calls, Abi Birrell of Canonical held up an origami werewolf and promised to send along instructions for making one. It turns out that back in October the design team had held a competition, complete with origami instructions and an award for creating an origami werewolf. I joked that I didn’t listen to the rest of the call after seeing the origami werewolf; I had already gone into planning mode!

With instructions in hand, I brought them along to an Ubuntu Hour I hosted in San Francisco last week, figuring I’d use the Ubuntu Hour as a testing ground for UbuCon and SCALE14x. Good news: we had a lot of fun, it broke the ice with new attendees and we laughed a lot. Bad news: we’re not very good at origami. There were no completed animals at the end of the Ubuntu Hour!

Origami werewolf attempt
The xerus helps at werewolf origami

At 40 steps to create the werewolf, with only one hour and a crowd inexperienced with origami, it was probably not the best activity if we wanted finished animals at the end, but it did give me a set of expectations. How much fun it was to try (and even fail) got me thinking, though: what other creative things could we do at Ubuntu events? Then I read an article about adult coloring books. That’s it! I shot an email off to Ronnie Tucker to see if he could come up with a coloring page. Most people in the Ubuntu community know Ronnie as the creator of Full Circle Magazine, the independent magazine for the Ubuntu Linux community, but he’s also a talented artist whose skills were a perfect match for this task. Lucky for me, it was a stay-home snowy day in Glasgow yesterday, and within a couple hours he had a werewolf draft to me. By this morning he had a final version ready for printing in my inbox.

Werewolf coloring page

You can download the Creative Commons-licensed original here to print your own. I have printed off several (and ordered some packets of crayons) to bring along to the UbuCon Summit and Ubuntu booth in the SCALE14x expo hall. I’m also bringing along a bunch of origami paper, so people can try their hand at the werewolf… and unicorn too.

Finally, lest we forget that my actual paid job is a systems administrator on the OpenStack Infrastructure team, I’m also doing a talk at DevOpsDayLA on Open Source tools for distributed systems administration. If you think I geek out about Ubuntu and coloring werewolves, you should see how I act when I’m talking about the awesome systems work I get to do at my day job.

by pleia2 at January 17, 2016 06:32 PM

January 14, 2016

Akkana Peck

Snow hiking

[Akk on snowshoes crossing the Jemez East Fork]

It's been snowing quite a bit! Radical, and fun, for a California ex-pat. But it doesn't slow down the weekly hiking group I'm in. When the weather turns white, the group switches to cross-country skiing and snowshoeing.

A few weeks ago, I tried cross-country skiing for the first time. (I've downhill skied a handful of times, so I know how, more or less, but never got very good at it. Ski areas are way too far away and way too expensive in California.) It was fun, but I have a chronic rotator cuff problem, probably left over from an old motorcycle injury, and found my shoulder didn't deal well with skiing. Well, the skiing was probably fine. It was probably more the falling and trying to get back up again that it didn't like.

So for the past two weeks I've tried snowshoes instead. That went just fine. It doesn't take much learning: it's just like hiking, except a little bit harder work remembering not to step on your own big feet. "Bozo goes hiking!" Dave called it, but it isn't nearly as Bozo-esque as I thought it would be.

Last week we snowshoed from a campground out to the edge of Frijoles Canyon, in a snowstorm most of the way, and ice fog -- sounds harsh when described like that, but it was lovely, and we were plenty warm when we were moving. This week, we followed the prettiest trail in the area, the East Fork of the Jemez River. In summer, it's a vibrantly green meadow with the sparkling creek snaking through it. In winter, it turns into a green and sparkling white forest. Someone took a photo of me snowshoeing across one of the many log bridges spanning the East Fork. You can't see any hint of the river itself -- it's buried in snow.

But if you hike in far enough, there's a warm spring: we're on the edge of the Valles Caldera, an old supervolcano that still has plenty of low-level geothermal activity left. The river is warm enough here that it's still running even in midwinter ... and there was a dipper there. American dippers are little birds that dive into creeks and fly under the water in search of food. They're in constant motion, diving, re-emerging, bathing, shaking off, and this dipper went about its business fifteen feet from where we were standing watching it. Someone had told me that he saw two dippers at this spot yesterday, but we were happy to get such a good look at even one.

We had lunch in a sunny spot downstream from the dipper, then headed back to the trailhead. A lovely way to spend a winter day.

January 14, 2016 02:01 AM

January 11, 2016

Jono Bacon

SCALE14x Plans

In a week and a half I am flying out to Pasadena for the SCALE14x conference. I will be there from the evening of Wed 20th Jan 2016 to Sun 24th Jan 2016.

SCALE is a tremendous conference, as I have mentioned many times before. This is a busy year for me, so I wanted to share what I will be up to:

  • Thurs 21st Jan 2016 at 2pm – in Ballroom A – Ubuntu Redux – as part of the UbuCon Summit I will be delivering a presentation about the key patterns that have led Ubuntu to where it is today and my unvarnished perspective on where Ubuntu is going and what success looks like.
  • Thurs 21st Jan 2016 at 7pm – in Ballroom DE – FLOSS Reflections – I am delighted to be a part of a session that looks into the past, present, and future of Open Source. The past will be covered by the venerable Jon ‘Maddog’ Hall, the present by myself, and the future by Keila Banks.
  • Fri 22nd Jan 2016 at 10.30am – in Ballroom DE – Ubuntu Panel – I will be hosting a panel where Mark Shuttleworth (Ubuntu Founder), David Planella (Ubuntu Community Manager), Olli Ries (Engineering Manager), and Community Council and community members will be put under the spotlight to illustrate where the future of Ubuntu is going. This is a wonderful opportunity to come along and get your questions answered!
  • Fri 22nd Jan 2016 at 8pm – in Ballroom DE – Bad Voltage: Live – join us for a fun, informative, and irreverent live Bad Voltage performance. There will be free beer, lots of prizes (including a $2200 Pogo Linux workstation, Zareason Strata laptop, Amazon Fire Stick, Mycroft, Raspberry Pi 2 kit, plenty of swag and more), and plenty of audience participation and surprises. Be sure to join us!
  • Sat 23rd Jan 2016 at 4.30pm – in Ballroom H – Building Awesome Communities On GitHub – this will be my first presentation in my new role as Director Of Community at GitHub. In it I will be delving into how you can build great communities with GitHub and I will talk about some of the work I will be focused on in my new role and how this will empower communities around the world.

I am looking forward to seeing you all there, and if you would like to have a meeting while I am there, please drop me an email at jono@github.com.

by Jono Bacon at January 11, 2016 05:48 PM

iheartubuntu

OpenShot 2.0 - Beta Released


The first beta release of OpenShot 2.0 is available to Kickstarter supporters and a much wider testing effort has started. Those who supported OpenShot's Kickstarter will gain early access and receive a separate update with links to installers. For everyone else, the source code has been published and is available online, but it's recommended to wait a little longer, until the installers are released for everyone.

More info here...

http://www.openshotvideo.com/2016/01/openshot-20-beta-released.html

For anyone who has looked for a video editor in Ubuntu, OpenShot is really, really nice. I have personally used it on several family occasions (weddings, birthdays, and a 50th anniversary) and it has produced great results.

One relative, who was a film director in the 70s before moving on to live stage show productions in the 80s and 90s, was really impressed with the work I did with OpenShot.

Definitely give OpenShot a try for your video editing needs!

If you really want to test the current version 1.1.3, there is a DEB installer here...

http://www.openshot.org/download/

...and OpenShot is in the Ubuntu Software Center as well, with version 1.4.1. Or wait for further instructions for the newest OpenShot 2.0.

by iheartubuntu (noreply@blogger.com) at January 11, 2016 05:01 PM

January 09, 2016

Elizabeth Krumbach

Going to the theater

I typically don’t spend a lot of time in theaters, for either movies or plays. Aside from some obvious exceptions, I’m not a big movie person.

This was turned on its head recently, with a total of five visits to theaters in the past month!

It began quietly, when I had a friend in town and she suggested we make our way over to The Castro Theatre to see The Nightmare Before Christmas. We’d both seen it dozens (hundreds?) of times, but it’s a favorite and I adore that theater. It’s an older theater with substantial adornments throughout. They regularly have an organist playing as you get settled into your seats, along with slides of upcoming events. A much more relaxing and entertaining experience for me than a giant screen with a series of loud commercials. The theater also sells snacks and drinks (alcoholic and otherwise) that you can take to your seats. The movie itself was full of the usual charm, even if the copy they had was older and had a few instances of skipping where the film had probably torn or otherwise degraded. We took the streetcar home, rounding off a wonderful evening.

The next theater was the A.C.T.’s Geary Theater. This is where MJ and I saw Between Riverside and Crazy a few months back, the first play I’d seen in San Francisco! Since it was close to the holidays they were playing A Christmas Carol, and we picked up tickets for the high balcony seats. It’s another of the old-style, intricately ornamented theaters, and I love simply being in that space: staring up at the decorated ceiling, inspecting the private boxes. I had never seen A Christmas Carol live before, and in spite of it not being a holiday I celebrate these days, it’s still a story I love. They did a beautiful job with it; I loved their interpretation of the various spirits! And there was no getting around falling in love with the main character, Ebenezer Scrooge.

Then there was Star Wars: The Force Awakens! MJ managed to get us tickets for opening night down in Mountain View with several of his colleagues. I may have dressed up.

And gotten a commemorative cup.

It was at a Cinemark (no beautiful theater to look at), but the theater did have big reclining seats. It was also the 2D version of the movie, which I much preferred for the first time seeing it. The movie pulled all the right nostalgic heart strings. I laughed, I cried (more than once) and I thoroughly enjoyed it.

A few days later I made my way over to the Sundance Kabuki theater to see it again, this time in 3D in their eat-in theater! We got there early to have dinner up on their balcony next to the theater. From there we picked up our 3D glasses and settled in to the big, comfy reserved seats. I didn’t partake, but they did have a series of amusing cocktails to celebrate the release.

Next I’ll have to see it 3D in the IMAX!

And then there was last night. I made my way over to The Castro Theatre yet again, this time to see a live Rifftrax performance to kick off SF Sketchfest. I’d gone to one of these back in 2013 as well, so it was a real treat to yet again see Kevin Murphy, Bill Corbett and Michael J. Nelson joined by Mary Jo Pehl, Adam Savage and others to riff on a series of old short films. The theater was packed for this event, so my friend Steve and I tried our luck up on the balcony, which I barely knew existed and had never been to. It was a brilliant decision; the balcony was really nice and gave us a great view of the show.

As I try to be less of a hermit while MJ is out of town next week, I’m hoping to see another proper in theater movie with a local friend soon. I hardly know myself!

by pleia2 at January 09, 2016 12:21 AM

January 07, 2016

Jono Bacon

We Need Your Answers

As I posted about the other day we are doing Bad Voltage Live in Los Angeles in a few weeks. It is on Fri 22nd Jan 2016 at 8pm at the SCALE14x Conference in Los Angeles. Find out more about the show here.

Now, I need every one of you to help provide some answers for a quiz we are doing in the show. It should only take a few minutes to fill in the form and your input could be immortalized in the live show (the show will be recorded and streamed live so you can see it for posterity).

You don’t have to be at the live show or a Bad Voltage listener to share your answers here, so go ahead and get involved!

If you end up joining the show in-person, you also have the potential to win some prizes (Mycroft, Raspberry Pi 2 kit, and more!) by providing the most amusing/best answers. Irrespective of whether you join the show live though, we would appreciate it if you fill it in:

Go and fill it in by clicking here

Thanks, everyone!

by Jono Bacon at January 07, 2016 11:00 PM

January 06, 2016

Akkana Peck

Speaking at SCALE 14x

I'm working on my GIMP talk for SCALE 14x, the Southern California Linux Expo in Pasadena.

[GIMP] My talk is at 11:30 on Saturday, January 23: Stupid GIMP tricks (and smart ones, too).

I'm sure anyone reading my blog knows that GIMP is the GNU Image Manipulation Program, the free open-source photo and image editing program which just celebrated its 20th birthday last month. I'll be covering an assortment of tips and tricks for beginning and intermediate GIMP users, and I'll also give a quick preview of some new and cool features that will be coming in the next GIMP release, 2.10.

I haven't finished assembling the final talk yet -- if you have any suggestions for things you'd love to see in a GIMP talk, let me know. No guarantees, but if I get any requests I'll try to accommodate them.

Come to SCALE! I've spoken at SCALE several times in the past, and it's a great conference -- plenty of meaty technical talks, but it's also the most newbie-friendly conference I've been to, with talks spanning the spectrum from introductions to setting up Linux or introductory Python programming all the way to kernel configuration and embedded boot systems. This year, there's also an extensive "Ubucon" for Ubuntu users, including a keynote by Mark Shuttleworth. And speaking of keynotes, the main conference has great ones: Cory Doctorow on Friday and Sarah Sharp on Sunday, with Saturday's keynote yet to be announced.

In the past, SCALE has been held at hotels near LAX, which is about the ugliest possible part of LA. I'm excited that the conference is moving to Pasadena this year: Pasadena is a much more congenial place to be, prettier, closer to good restaurants, and it's even close to public transportation.

And best of all, SCALE is fairly inexpensive compared to most conferences. Even more so if you use the promo-code SPEAK for a discount when registering.

January 06, 2016 11:32 PM

Jono Bacon

Bad Voltage Live in Los Angeles: Why You Should Be There

On Friday 22nd January 2016 the Bad Voltage team will be delivering a live show at the SCALE14x conference in Pasadena, California.

For those of you unfamiliar with Bad Voltage, it is a podcast that Stuart Langridge, Bryan Lunduke, Jeremy Garcia, and myself do every two weeks that delves into technology, open source, Linux, gaming, and more, featuring discussions, interviews, and reviews. It is fun, loose, and informative.

We did our very first Bad Voltage Live show last year at SCALE. To get a sense of it, you can watch it below:

Can’t see the video? Watch it here.

This year is going to be an awesome show, and here are some reasons you should join us.

1. A Fun Show

At the heart of Bad Voltage is a fun show. It is funny, informative, and totally irreverent. This is not just four guys sat on a stage talking. This is a show. It is about spectacle, audience participation, and having a great time.

While we discussed important topics in open source last year, we also had a quiz where we made video compilations of people insulting our co-hosts. We even had a half-naked Bryan review a bottle of shampoo in a hastily put together shower prop.

You never know what might happen at a Bad Voltage Live show and that is the fun of it. The audience make the evening so memorable. Be sure to be there!

2. Free Beer (and non-alcoholic beverages)

If there is one thing that people enjoy at conferences, it is an event with free beer. Well, thanks to our friends at Linode the beer will be flowing. We are also arranging to have soft drinks available too.

So, get along to the show, have a few cold ones and have a great time.

3. Lots of Prizes to be Won

We are firm believers in free stuff. So, we have pulled together a big pile of free stuff that you can all win by joining us at the show.

This includes such wonderful items as:

A Pogo Linux Verona 931H Workstation (thanks to Pogo Linux)

A Zareason Strata laptop (thanks to Zareason)

A Mycroft AI device (thanks to Mycroft)

We will also be giving away a Raspberry Pi 2 kit, Amazon Fire Stick and other prizes!

You can only win these prizes if you join us at the show, so be sure to get along!

4. Free Entry

So, how much does it cost to get into some fun live entertainment, free beer, and a whole host of prizes to be won?

Nothing.

That’s right, the entire show is free to attend.

5. At SCALE

One of the major reasons we like doing Bad Voltage Live at SCALE is because the SCALE conference is absolutely fantastic.

There are a wide range of tracks, varied content, and a wonderful community that attends every year. There are also a range of other events happening as part of SCALE such as the Ubuntu UbuCon Summit.

So, I hope all of this convinces you that Bad Voltage Live is the place to be. I hope to see you there on Friday 22nd January 2016 at 8pm. Find out more here.

Thanks to our sponsors, Linode, Microsoft, Pogo Linux, Zareason, and SCALE.

by Jono Bacon at January 06, 2016 04:30 AM

January 03, 2016

Elizabeth Krumbach

Celebrating the 1915 World’s Fair in SF, the PPIE

I’ve been fascinated with World’s Fairs ever since learning about them as a kid: the extravagance, the collections of art and innovation, the thrill of being part of something so widely publicized worldwide, and the various monuments left over when the fairs concluded. As I learned about past fairs I was always disappointed that I had missed their heyday, and struggled to understand why their time had passed. The fairs still happen, now called Expositions (2015 marked one in Milan), but unless you’re local or otherwise looking for them, you typically won’t know about them. Indeed, most people don’t realize they still exist. No longer does the media descend upon them for a flourish of publicity. The great companies, artists, innovators, cities and countries of our age no longer make grand investments in showing off their best over acres of temporary, but beautiful, pop-up cities.

In 2015 I learned a lot more about the fairs and how times have changed as San Francisco celebrated the centennial of the Panama-Pacific International Exposition (PPIE) of 1915. As I spent the year reading about the fair and walking through various exhibits around the city, all the things I knew about 1915 became so much more real. The first transcontinental telephone call, from New York to San Francisco, was made just prior to the fair opening. Planes were still quite new, and it was common for early pilots to die while operating them (aviation pioneer Lincoln J. Beachey actually died at the PPIE). Communication and travel that I take for granted simply didn’t exist. When I reflect on the “need” for a World’s Fair, I realize the major ones took place during a special time of intense innovation and cultural exchange, when we didn’t yet have a good way of sharing these things. The World’s Fair provided that space, and people would pay to see it.

I began my learning when MJ bought me Laura Ackley’s San Francisco’s Jewel City: The Panama-Pacific International Exposition of 1915. I read it cover to cover over a couple months, providing a nice foundation for exhibits I visited as the year went on. The first was the “Fair, Please” exhibit at the SF Railway Museum. The museum talks about the exhibit in their 1915 Fair Celebration blog post, which also links to a 2005 article that gets into more depth about how transit handled the increased ridership and new routes that the PPIE caused. I enjoyed the exhibit, though it was quite small (the museum and gift shop itself is only a single, large room). While I was at that exhibit I also picked up a copy of the Bay Area Electric Railroad Association Journal from spring 2007 that had a 25 page article by Grant Ute about transit and the PPIE. Predictably that was my next pile of reading.

Back in September I attended a panel about transit and the PPIE, which Grant Ute was a part of along with other transit representatives and historians who spoke about the fair and touched upon the future of transit as well. I wrote about it in a post here, excerpt:

I spent the evening at the California Historical Society, which is just a block or so away from where we live. They were hosting a lecture on City Rising for the 21st Century: San Francisco Public Transit 1915, now, tomorrow.

The talk [by Grant Ute] and panel were thoroughly enjoyable. Once the panel moved on from progress and changes made and made possible by transit changes surrounding the PPIE, topics ranged from the removal of (or refusal to build) elevated highways in San Francisco and how it’s created a beautiful transit and walk-friendly city, policies around the promotion of public transit and how funding has changed over the years.

In October I bought the book Jewel City: Art from San Francisco’s Panama-Pacific International Exposition in preparation for an exhibit at the De Young museum of the same name: Jewel City: Art from San Francisco’s Panama-Pacific International Exposition. It’s a beautiful book, and while I didn’t read it cover to cover like Laura Ackley’s, browsing through it at my own pace and focusing on parts I was most interested in was enjoyable. In early November we made it out to the exhibit at the de Young Museum.

From the Exhibit website:

To mark this anniversary, Jewel City revisits this vital moment in the inauguration of San Francisco as the West Coast’s cultural epicenter. The landmark exhibition at the de Young reassembles more than 200 works by major American and European artists, most of which were on display at this defining event.

No photos were allowed inside the exhibit, but it was a wonderful collection. As someone who is not very into modern or abstract art, it was nice to see a collection from 1915 where only a tiny sampling of these types were represented. There was a lot of impressionism (my favorite), as well as many portraits and a small corner that showed off some photos, which at the time were working to gain acceptance as art. The gift shop gave me the opportunity to pick up a beautiful silk scarf that has a drawn aerial view of the fair grounds.

In December I had to squeeze in a bunch of exhibits! On December 1st the Palace of Fine Arts’ Innovation Hangar opened its own exhibit, City Rising: San Francisco and the 1915 World’s Fair. The first time I visited the Palace of Fine Arts I didn’t quite realize what it was. What is now the Innovation Hangar used to house the Exploratorium and was the only indoor space in the area. The Palace of Fine Arts was otherwise an outdoor space: colonnades around a beautiful pond, culminating in a huge domed structure. Where were the fine arts? It turns out this is an on-site re-creation of the Palace of Fine Arts from the PPIE! The fair’s art had been featured in galleries throughout the building that later became the Exploratorium and is now the Innovation Hangar, itself also reconstructed. As we entered the exhibit it was nice to think that we were walking through the same place where so many pieces of art had been featured during the fair.

The exhibit took up a section of the massive space, a bit dwarfed by the high ceilings, but it managed to carve out a cozy corner. A few artifacts were on display, including big ones like a Model T (an on-site Ford factory producing them was a popular attraction at the fair). They also had a big map of the fair grounds, with the Palace of Fine Arts shown raised as the only remaining building on site. Various sections of the exhibit covered different areas of the fair, and also different phases, from planning to events at the fair itself to the much later formal reconstruction of the Palace of Fine Arts. I picked up the book Panorama: Tales From San Francisco’s 1915 Pan-Pacific International Exposition, which is next in my reading pile.

MJ’s cousins were in town, so it was also a nice opportunity to take a peaceful walk through the grounds of the Palace of Fine Arts. It’s a stunning place, regardless of what you know of its history.

More photos from the exhibit at the Palace of Fine Arts here: https://www.flickr.com/photos/pleia2/albums/72157660512348743

A couple of days later we went to the Conservatory of Flowers in Golden Gate Park. They had opened their own PPIE exhibit, Garden Railway: 1915 Pan-Pacific, in November. Flowers, trains and the PPIE? I’m so there! Their exhibit room isn’t very large, but they did have a lovely recreation of part of the Joy Zone, the Palace of Fine Arts and, of course, the Palace of Horticulture.

From their website:

In an enchanting display landscaped with hundreds of dwarf plants and several water features, model trains wend their way through the festive fairgrounds, zipping past whimsical recreations of the fair’s most dazzling monuments and amusements, including the Tower of Jewels, Palace of Fine Arts, and more. Interpretive signs, memorabilia and interactive activities throughout help visitors to understand the colorful history of the grand fair that signaled San Francisco’s recovery from the 1906 earthquake.

Model trains are fun, so the movement they brought to the exhibit was enjoyable.

The center of the exhibit featured a recreation of the Tower of Jewels.

More photos from the Conservatory of Flowers and their PPIE exhibit here: https://www.flickr.com/photos/pleia2/albums/72157662883839265

On January 31st MJ and I went to the closest of the PPIE exhibits, at the California History Museum, which is just a block from home. The museum was colorfully painted, and we walked through three separate rooms, plus a section at the entrance and a large main area to explore.

I’d say this was the most comprehensive of the visits with regard to the history of the PPIE and its artifacts. I really enjoyed seeing the various kinds of keepsakes that were given away at the fair, including pamphlets put out by countries, companies and the fair itself. Collectors’ items of all kinds, from picture books to spoons and watches, were bought by fairgoers. At the end of the fair they even sold the Novagems that covered the Tower of Jewels, many in commemorative boxes with certificates of authenticity.

They also had a massive scale model of the main areas of the fair grounds. Produced around 1938 for the Golden Gate International Exposition on nearby Treasure Island, it was brought out of storage for us to enjoy in this exhibit. It’s really nice that it’s been so well preserved!

More photos from the California History Museum exhibit here: https://www.flickr.com/photos/pleia2/albums/72157662962920716

As the year concluded I found myself even more in love with these old World’s Fairs than I ever was. As such, I’m still sad that I missed them, but I have a newfound appreciation for our lives today and the opportunities we have. In 2015 I visited 3 continents, spent my days working in real time with people all over the world and had immediate access to the latest news in the palm of my hand. None of this was possible for someone of my means 100 years ago. As much as I think it would be a wonderful and fascinating experience, it turns out that I don’t actually need a World’s Fair to expose me to the world and technology outside my home town.

by pleia2 at January 03, 2016 08:34 PM

January 01, 2016

Elizabeth Krumbach

The adventures of 2015

I wasn’t sure what to expect from 2015. Life circumstances meant that I wanted to travel a bit less, which meant being more selective about the conferences I would speak at. At the same time, some amazing opportunities for conferences came up that I couldn’t bring myself to turn down. Meeting new people, visiting countries very foreign to me, plus a new continent (South America!), there was much to be excited about this year! Work has gone exceptionally well. I hit some major milestones with my career, particularly with regards to my technical level, all thanks to support from those around me, dedication to some important projects and hard work on the day to day stuff.

There were also struggles this year. Early in the year I learned of the passing of a friend and local open source advocate. MJ and I navigated our way through the frailness and loss of a couple family members. I was forced to pause, reflect upon and ultimately step away from some of my open source involvement as it was causing me incredible emotional pain and stress. I also learned a lot about my work habits and what it takes to put out a solid technical book. The book continues to be a real struggle, but I am thankful for support from those around me.

I’ve been diligent in continuing to enjoy this beautiful city we live in. We went on a streetcar tour, MJ took me to a Star Wars Giants game for my birthday and we went to various Panama-Pacific International Exposition commemorative events. I finally made it down the bay to the Winchester House and to see a 49ers game. As friends and family came into town, I jumped at every opportunity to explore the new and familiar. I also spoke at a few local conferences and events which I wrote about: San Francisco Ubuntu Global Jam, Elastic{ON} 2015, Puppet Camp San Francisco 2015 and an OpenStack Meetup.


Enjoying San Francisco with a special tour on the Blackpool Boat Tram

At the Bay Bridge with visiting friend Crissi

Star Wars day at AT&T Park

At a 49ers game with visiting friend Danita

Visiting one of several PPIE15 exhibits

Health-wise, I had to go in for several diagnostic tests post-gallbladder surgery to see why some of my liver levels are off. After a bit of stress, it all looks ok, but I do need to exercise on a more regular basis. The beautiful blue sky beckons me to make a return to running, so I plan on doing that while incorporating things I learned from the trainer I worked with this past year. We’ve also been tracking Simcoe’s health with her renal failure; it’s been 4 years since her diagnosis and while her health isn’t what it was, our little Siamese is continuing to hang in there.

And then there was all my travel!


Manneken Pis in Brussels, Belgium

In front of the Sultan Qaboos Grand Mosque, Muscat, Oman

Beautiful views from the OpenStack Summit in Vancouver, Canada

With MJ in obligatory tourist photo at Machu Picchu, Peru

Kinkaku-ji (golden temple), Kyoto, Japan

Space Shuttle Discovery near Washington D.C.

I didn’t give as many talks as I did in 2014, but I felt I took a stronger aim at quality this year. Speaking at conferences like FOSSC Oman and Grace Hopper Celebration of Women in Computing exposed me to some amazing, diverse audiences that led to some fantastic conversations after my talks. Exploring new places and meeting people who enrich my life and technical expertise are why I do all of this, so it was important that I found so much value in both this year.


Speaking at FOSSC Oman in Muscat

As I kick off 2016, my book is front and center. I have an amazing contributing author working with me. A Rough Cuts version went up on Safari at the end of 2015 and I’ve launched the book website. As I push through final technical challenges I’m hopeful that the pieces will soon fall into place so I can push through to completion.

Most of all, as I reflect upon 2015, I see a lot of cheer and sorrow. High highs and low lows. I’m aiming at a more balanced 2016.

by pleia2 at January 01, 2016 07:40 PM

December 31, 2015

Akkana Peck

Weather musing, and poor insulation

It's lovely and sunny today. I was just out on the patio working on some outdoor projects; I was wearing a sweatshirt, but no jacket or hat, and the temperature seemed perfect.

Then I came inside to write about our snowstorm of a few days ago, and looked up the weather. NOAA reports it's 23°F at Los Alamos airport, last reading half an hour ago. Our notoriously inaccurate (like every one we've tried) outdoor digital thermometer says it's 26°.

Weather is crazily different here. In California, we were shivering and miserable when the temperature dropped below 60°F. We've speculated a lot on why it's so different here. The biggest difference is probably that it's usually sunny here. In the bay area, if the temperature is below 60°F it's probably because it's overcast. Direct sun makes a huge difference, especially the sun up here at 6500-7500' elevation. (It feels plenty cold at 26°F in the shade.) The thin, dry air is probably another factor, or two other factors: it's not clear what's more important, thin, dry, or both.

We did a lot of weather research when we were choosing a place to move. We thought we'd have trouble with snowy winters, and would probably want to take vacations in winter to travel to warmer climes. Turns out we didn't know anything. When we were house-hunting, we went for a hike on a 17° day, and with our normal jackets and gloves we were fine. 26° is lovely here if you're in the sun, and the rare 90° summer day, so oppressive in the Bay Area, is still fairly pleasant if you can find some shade.

But back to that storm: a few days ago, we had a snowstorm combined with killer blustery winds. The wind direction was whipping around, coming from unexpected directions -- we never get north winds here -- and it taught us some things about the new house that we hadn't realized in the nearly two years we've lived here.

[Snow coming under the bedroom door] For example, the bedroom was cold. I mean really cold. The windows on the north wall were making all kinds of funny rattling noises -- turned out some of them had leaks around their frames. There's a door on the north wall, too, that leads out onto a deck, and the area around that was pretty cold too, though I thought a lot of that was leakage through the air conditioner (which had had a cover over it, but the cover had already blown away in the winds). We put some towels around the base of the door and windows.

Thank goodness for lots of blankets and down comforters -- I was warm enough overnight, except for cold hands while reading in bed. In the morning, we pulled the towel away from the door, and discovered a small snowdrift inside the bedroom.

We knew the way that door was hung was fairly hopeless -- we've been trying to arrange for a replacement, but in New Mexico everything happens mañana -- but snowdrifts inside the room are a little extreme.

We've added some extra weatherstripping for now, and with any luck we'll get a better-hung door before the next rare north-wind snowstorm. Meanwhile, I'm enjoying today's sunshine while watching the snow melt in the yard.

December 31, 2015 06:28 PM

December 30, 2015

Jono Bacon

In Memory of Ian Murdock

Today we heard the sad news that Ian Murdock has passed away. He was 42.

Although Ian and my paths crossed relatively infrequently, over the years we became friends. His tremendous work in Debian was an inspiration for my own work in Ubuntu. At times when I was unsure of what to do in my work, Ian would share his guidance and wisdom. He never asked for anything in return. He never judged. He always supported the growth of Open Source and Free Software. He was precisely the kind of person that makes the Open Source and Free Software world so beautiful.

As such, when I heard about some of his erratic tweets a few days back as I landed back home from the UK for Christmas, I reached out with a friendly arm to see if there was anything I could do to help. Sadly, I got no response. I now know why: he had likely just passed away when I reached out to him.

While it is natural for us to grieve his passing, we should also take time to focus on what he gave us all. He gave us a sparkling personality, a passion for everyone to succeed, and a legacy of Open Source and Free Software that would be hard to match.

Ian, wherever you may be, rest in peace. We will miss you

by Jono Bacon at December 30, 2015 08:06 PM

December 28, 2015

Elizabeth Krumbach

Simcoe’s November 2015 Hospital Checkup

It’s been quite a season for Simcoe. I mentioned back in September that the scabbing around her eyes had healed up, but unfortunately it keeps coming back. The other day we also noticed a sore and chunk of missing fur at the base of the underside of her tail. She has a dermatologist appointment in the beginning of January, so hopefully we can get to the bottom of it. It would be very nice to know what’s going on, when we need to worry and what to do about it when it happens. Poor kitty!

This December marks four years with the renal failure diagnosis. With her BUN and CRE levels creeping up and her weight dropping a bit, we decided to go in for a consultation with the hospital doctor (rather than her great regular vet). The hospital vet has been really helpful with his industry contacts and experience with renal failure cats, and we trust his opinion. The bad news is that renal transplants for cats haven’t improved much since her diagnosis. It’s still risky, traumatic and expensive. Worst of all, the median survival time still lands at only about three years.

Fortunately she’s still acting normal and eating on her own, so we have a lot of options. One of them is supplementing her diet with wet food. We also had the option of switching her subcutaneous fluid injections from 150ml every other day to 100ml daily. Another is giving her pills to stimulate appetite so her weight doesn’t drop too low. We’re starting off with the food and fluid schedule adjustments, which we began this month. We bought a small pet scale for here at home so we can keep a closer eye on her weight and will likely start weekly weigh-ins next week.

During the checkup in November, they also ran her blood work, which showed the trend mostly continuing. Her BUN levels went up a lot, but the doctor was more focused on and concerned about CRE increases and weight decreases (though she did put on a few ounces).

CRE dropped a little, from 4.8 to 4.4.

CRE graph

BUN spiked, going from 54 to 75.

BUN graph

She’s still under 9lbs, but drifting in a healthy area in the high 8s, going from 8.8lbs to 8.9lbs.

Weight graph

We’re thankful that we’ve had so much time with her post-diagnosis, she’s been doing very well all things considered and she’s still a happy and active cat. She just turned nine years old and we’re aiming for several more years with her.

by pleia2 at December 28, 2015 04:14 AM

Akkana Peck

Extlinux on Debian Jessie

Debian "Sid" (unstable) stopped working on my Thinkpad X201 as of the last upgrade -- it's dropping mouse and keyboard events. With any luck that'll get straightened out soon -- I hear I'm not the only one having USB problems with recent Sid updates. But meanwhile, fortunately, I keep a couple of spare root partitions so I can try out different Linux distros. So I decided to switch to the current Debian stable version, "Jessie".

The mouse and keyboard worked fine there. Except it turned out I had never fully upgraded that partition to "Jessie"; it was still on "Wheezy". So, with much trepidation, I attempted an apt-get update; apt-get dist-upgrade.

After an interminable wait for everything to download, though, I was faced with a blue screen asking this:

No bootloader integration code anymore.
The extlinux package does not ship bootloader integration anymore.
If you are upgrading to this version of EXTLINUX your system will not boot any longer if EXTLINUX was the only configured bootloader.
Please install GRUB.
<Ok>

No -- it's not okay! I have good reasons for not using grub2 -- besides which, extlinux on this exact machine has been working fine for years under Debian Sid. If it worked on Wheezy and works on Sid, why wouldn't it work on the version in between, Jessie?

And what does it mean not to ship "bootloader integration", anyway? That term is completely unclear, and googling was no help. There have been various Debian bugs filed but of course, no explanation from the developers for exactly what does and doesn't work.

My best guess is that what Debian means by "bootloader integration" is that there's a script that looks at /boot/extlinux/extlinux.conf, figures out which stanza corresponds to the current system, figures out whether there's a new kernel being installed that's different from the one in extlinux.conf, and updates the appropriate kernel and initrd lines to point to the new kernel.
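If that guess is right, the lines in question are just the kernel and initrd references in each stanza. A stanza in /boot/extlinux/extlinux.conf looks roughly like this (the label and root device here are illustrative, not taken from the author's actual config; the kernel version matches the one mentioned later in the post):

```
LABEL jessie
    MENU LABEL Debian Jessie
    KERNEL /boot/vmlinuz-3.14-2-686-pae
    APPEND initrd=/boot/initrd.img-3.14-2-686-pae root=/dev/sda2 ro
```

Updating those two version strings by hand after a kernel upgrade would then be all the "integration" script otherwise did.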

If so, that's something I can do myself easily enough. But what if there's more to it? What would actually happen if I upgraded the extlinux package?

Of course, there's zero documentation on this. I found plenty of questions from people who had hit this warning, but most were from newbies who had no idea what extlinux was or why their systems were using it, and they were advised to install grub. I only found one hit from someone who was intentionally using extlinux. That person aborted the install, held back the package so the potentially nonbooting new version of extlinux wouldn't be installed, then updated extlinux.conf by hand, and apparently that worked fine.

It sounded like a reasonable bet. So here's what I did (as root, of course):

  • Open another terminal window and run ps aux | grep apt to find the apt-get dist-upgrade process and kill it. (sudo pkill apt-get is probably an easier approach.) Ensure that apt has exited and there's a shell prompt in the window where the scary blue extlinux warning was.
  • echo "extlinux hold" | dpkg --set-selections
  • apt-get dist-upgrade and wait forever for all the packages to install
  • aptitude search linux-image | grep '^i' to find out what kernel versions are installed. Pick one. I picked 3.14-2-686-pae because that happened to be the same kernel I was already running, from Sid.
  • ls -l /boot and make sure that kernel is there, along with an initrd.img of the same version.
  • Edit /boot/extlinux/extlinux.conf and find the stanza for the Jessie boot. Edit the kernel and append initrd lines to use the right kernel version.

It worked fine. I booted into Jessie with the kernel I had specified. And hooray -- my keyboard and mouse work, so I can continue to use my system until Sid becomes usable again.

December 28, 2015 12:28 AM

December 26, 2015

Elizabeth Krumbach

Days in Kyoto

As I mentioned in my post about Osaka, we spent our nights in Osaka and days plus evenings on Friday and Saturday in Kyoto. Since our plans got squished a bit, we didn’t get to as many sights as I had wanted to in Kyoto, but we did get to visit some of the key ones, and were able to keep our plans to go to one of the best restaurants in the city.

On Friday we took a Japanese Rail train up to Kyoto early so we could make our lunch reservations at the famous Kichisen. This was probably the best meal we had on our trip. They serve the food in the Kaiseki tradition with their beautiful and fancy take on many of the traditional Kaiseki dishes. Upon arrival we were greeted by the hosts, took our shoes off and were led into our private dining room. We enjoyed tea as the courses began, and were impressed as each course was more dazzling and delicious than the last.

After that very long and satisfying lunch, we made our way to Kinkaku-ji, the Golden temple. Being the height of autumn tourist season it was incredibly busy. In order to get to the best views of the temple we actually had to wait and then work our way through the crowds. Fortunately the photos didn’t reflect the madness and I got some really great shots, like this one which is now my desktop background.

The temple complex closed around five and we made our way over to the Kyoto Imperial Palace complex. It’s a massive park, and while we didn’t have tickets for a tour inside the palace areas, we were able to walk around it and explore the trails in the park.


Outside the Imperial Palace

We also enjoyed finding other little temples and ponds. It was a beautiful way to spend time as the sun set.


Another small temple in Imperial park

From there we went to the Gion district and walked around for a while before stopping for some tea. We had a late evening dinner at Roan Kikunoi, which was another Kaiseki-style meal. This time we were seated at a bar with several other patrons and the courses came out mostly at the same time for all of us. The dishes were good, I particularly enjoyed the sashimi courses.

Saturday morning was spent in Osaka, but we made it to Kyoto in the afternoon to go to Ginkaku-ji, the Silver Temple. The temple is not silver, but it’s called that to distinguish it from the Gold Temple across town that we saw the day before.


MJ and I at the silver temple

It was a nice walk around the grounds of the temple, and then we climbed a series of stairs to get a view of the city of Kyoto.


View from hill at silver temple

We had reservations at Tagoto Honten for dinner on Saturday. We once again had a Kaiseki-style meal but this one was much more casual than the day before. By this time we were getting a little tired of the style, but there was enough variation to keep us happy.

I’m sure our whirlwind tour of the city hardly did it justice. While we knocked out some of the key attractions, there are dozens of smaller temples, a castle to tour, plus the imperial palace, and I’ve heard there’s a long climb up a hill where you can see and feed monkeys! A dinner with a geisha was also on our list, but we couldn’t make those reservations with our time constraints either. We’d also definitely work to reserve far enough in advance to stay in Kyoto itself; while the train rides to Osaka were easy and short, all told we probably spent an hour in transit when you factor in deciding on a route and walking to and from the stations. On the topic of transit, we ended up taking cabs around Kyoto more than we did in the rest of Japan, partially because we were often short on time, and otherwise because the rail system just isn’t as comprehensive as in the other cities we went to (though buses were available). It’s worth sharing that the cabs are metered, very clean and all had friendly, professional drivers.

We don’t often make solid plans to revisit a place we’ve been to together, as there are so many places in the world we want to see. Japan is certainly an exception. Not just because we missed our segment in Tokyo, but because a week isn’t nearly enough time to enjoy this country I unexpectedly fell in love with.

More photos from our adventures in Kyoto here: https://www.flickr.com/photos/pleia2/sets/72157659834169750

by pleia2 at December 26, 2015 05:08 PM

Shinkansen to Osaka

As I mentioned in my post about Tokyo, it’s taken me a couple months to get around to writing about our journeys in Japan. But here we are! On October 22nd we landed back in Japan after our quick trip back to Philadelphia and took the N’EX train right to the high-speed Shinkansen, which took us all the way to Osaka (about 300 miles) in approximately 3 hours.

Before getting on the Shinkansen we took the advice of one of MJ’s local colleagues and picked up a boxed meal on the railway platform. We had a translation with us explaining that we don’t eat pork, and the woman selling the boxes was very helpful in finding us a few that didn’t contain any. We were grateful for her help, as I made my way through the box with no idea what I was eating. It was all delicious though, and beautifully presented.

Our original plan had been to stay in Kyoto, but we booked later than anticipated and the reasonable hotels in Kyoto had already sold out. With the beautiful weather and changing leaves, autumn in Kyoto is second only to spring (when the cherry blossoms bloom) as a busy tourist time. Staying in Osaka worked out well though, especially since there was a lot to do there after things closed in Kyoto!

We stayed at the Hotel Hankyu International, which was beautiful, if incredibly fancy in an old-style European way. It was just a quick walk from Umeda Station, which made getting around pretty easy. We took trains everywhere we went.

Most of Friday was spent in Kyoto, but Saturday morning we began exploring Osaka a bit with a train ride over to the Osaka Aquarium Kaiyukan. I had read about this aquarium before our trip, and learned that it’s one of the best in Asia. As a fan of zoos and aquariums, I was glad we got to keep this visit on our agenda.


Osaka Aquarium Kaiyukan

The aquarium is laid out as several levels, and you begin by taking an elevator to the top floor. The top floor has a natural light forest along with river otters, crabs and various fish and birds. As you go down through the aquarium you see penguins, seals, all kinds of sharks and fish. For me, the major draw was getting to see some massive whale sharks, which I hadn’t seen in captivity before.


Whale shark

After the aquarium we needed some lunch. MJ is a big fan of okonomiyaki, a Japanese pancake that’s filled with vegetables (mostly cabbage) and your choice of meat or seafood. We did some searching near the train station and found Fukutaro. It was crowded, but we got a seat pretty quickly. It’s also hot inside, since they prepare the food on a big grill at the front of the restaurant (which we sat near), and there’s another hot grill in front of you where they deliver the okonomiyaki so it stays warm as you eat. It was the best okonomiyaki I’ve ever had.

From there we made our way to Kyoto for the rest of the day and dinner. We came back to Osaka after dinner and returned to the area near the aquarium to go up on the Tempozan Ferris wheel and see the bay at night! The Ferris wheel was all lit up in blue, and since it was later in the evening there was no line; we even had no trouble getting the transparent car.

Sunday morning we had to pack up and head back to the Shinkansen for our trip back to Tokyo. After some false starts in finding lunch (it was terribly tempting to get okonomiyaki again) we found ourselves at a mall that had a tempura restaurant. We did a several course meal where they brought out an assorted selection of tempura meats and vegetables. My life is now complete that I’ve had tempura pumpkin, it was amazing.

Our train ride into Osaka had been later in the day, so it was mostly dark. I fully enjoyed this daytime ride back; we passed lots of little towns and lots of solar panels!

More photos from Osaka here: https://www.flickr.com/photos/pleia2/albums/72157659829244819

And more photos from our trip on the Shinkansen: https://www.flickr.com/photos/pleia2/sets/72157660421552335

by pleia2 at December 26, 2015 12:12 AM

December 22, 2015

kdub

New Mir Release (0.18)

Mir Image

If a new Mir release was on your Christmas wishlist (like it was on mine), Mir 0.18 has been released! I’ve been working on this the last few days, and it’s out the door now. Full text of changelog. Special thanks to the Mir team members who helped with testing, and the devs in #ubuntu-ci-eng for helping move the release along.

Graphics

  • Internal preparation work needed for Vulkan, hardware decoded multimedia optimizations, and latency improvements for nested servers.
  • Started work on plugin renderers. This will better prepare Mir for IoT, where we might not have a Vulkan/GLES stack on the device and might have to use the CPU.
  • Fixes for graphics corruption affecting Xmir (blocky black bars)
  • Various fixes for multimonitor scenarios, as well as better support for scaling buffers to suit the monitor it’s on.

Input

  • Use libinput by default. We had been leaning on an old version of the Android input stack. Completely remove this in favor of using libinput.

Bugs

  • Quite a long list of bug corrections. Some of these were never ‘in the wild’ but existed only in the course of 0.18 development.

What’s next?

It’s always tricky to pin down exactly what will make it into the next release, but I can at least comment on the stuff we’re working on, in addition to the normal rounds of bugfixing and test improvements:

  • various Internet-of-Things and convergence topics (e.g., Snappy, figuring out different rendering options on smaller devices).
  • buffer swapping rework to accommodate different render technologies (Vulkan!), accommodations for multimedia, and improved latency for nested servers.
  • more flexible screenshotting support
  • further refinements to our window management API
  • refinements to our platform autodetection

How can I help?

Writing new Shells

A fun way to help would be to write new shells! One of Mir’s goals is to make this as easy as possible, so writing a new shell always helps us make sure we’re hitting that goal.

If you’re interested in the Mir C++ shell API, then you can look at some of our demos, available in the ‘mir-demos’ package. (source here, documentation here)

Even easier than that might be writing a shell using QML like unity8 is doing via the qtmir plugin. An example of how to do that is here (instructions on running here).

Tinkering with technology

If you’re more of the nuts and bolts type, you can try porting a device, adding a new rendering platform to mir (OpenVG or pixman might be an interesting, beneficial challenge), or figuring out other features to take advantage of.

Standard stuff

Pretty much all open source projects recommend bug fixing or triaging, helping on irc (#ubuntu-mir on freenode) or documentation auditing as other good ways to start helping.

by Kevin at December 22, 2015 07:18 PM

December 20, 2015

Akkana Peck

Christmas Bird Count

Yesterday was the Los Alamos Christmas Bird Count.

[ Mountain chickadee ] No big deal, right? Most counties have a Christmas Bird Count, a specified day in late December when birders hit the trails and try to identify and count as many birds as they can find. It's coordinated by the Audubon Society, which collects the data so it can be used to track species decline, changes in range in response to global warming, and other scientific questions. The CBC has come a long way from when it split off from an older tradition, the Christmas "Side Hunt", where people would hit the trails and try to kill as many animals as they could.

But the CBC is a big deal in Los Alamos, because we haven't had one since 1953. It turns out that to run an official CBC, you have to be qualified by Audubon and jump through a lot of hoops proving that you can do it properly. Despite there being a very active birding community here, nobody had taken on the job of qualifying us until this year. There was a lot of enthusiasm for the project: I think there were 30 or 40 people participating despite the chilly, overcast weather.

The team I was on was scheduled to start at 7. But I had been on the practice count in March (running a practice count is one of the hoops Audubon makes you jump through), and after dragging myself out of bed at oh-dark-thirty and freezing my toes off slogging through the snow, I had learned that birds are mostly too sensible to come out that early in winter. I tried to remind the other people on the team of what the March morning had been like, but nobody was listening, so I said I'd be late, and I met them at 8. (Still early for me, but I woke up early that morning.)

[ Two very late-season sandhill cranes ] Sure enough, when I got there at 8, there was disappointment over how few birds there were. But actually that continued all day: the promised sun never came out, and I think the birds were hoping for warmer weather. We did see a good assortment of woodpeckers and nuthatches in a small area of Water Canyon, and later, a pair of very late-season sandhill cranes made a low flyover just above where we stood on Estante Way; but mostly, it was disappointing.

In the early afternoon, the team disbanded to go home and watch our respective feeders, except for a couple of people who drove down the highway in search of red-tailed hawks and to the White Rock gas station in search of rock pigeons. (I love it that I'm living in a place where birders have to go out of their way to find rock pigeons to count.)

I didn't actually contribute much on the walks. Most of the others were much more experienced, so mostly my role was to say "Wait, what's that noise?" or "Something flew from that tree to this one" or "Yep, sure enough, two more juncos." But there was one species I thought I could help with: scaled quail. We've been having a regular flock of scaled quail coming by the house this autumn, sometimes as many as 13 at a time, which is apparently unusual for this time of year. I had Dave at home watching for quail while I was out walking around.

When I went home for a lunch break, Dave reported no quail: there had been a coyote sniffing around the yard, scaring away all the birds, and then later there'd been a Cooper's hawk. He'd found the hawk while watching a rock squirrel that was eating birdseed along with the towhees and juncos: the squirrel suddenly sat up and stared intently at something, and Dave followed its gaze to see the hawk perched on the fence. The squirrel then resumed eating, having decided that a Cooper's hawk is too small to be much danger to a squirrel.

[ Scaled quail ] But what with all the predators, there had been no quail. We had lunch, keeping our eyes on the feeder area, when they showed up. Three of them, no, six, no, nine. I kept watch while Dave went over to another window to see if there were any more headed our way. And it turned out there was a whole separate flock, nine more, out in the yard. Eighteen quail in all, a record for us! We'd suspected that we had two different quail families visiting us, but when you're watching one spot with quail constantly running in and out, there's no way to know if it's the same birds or different ones. It took two people watching different areas to get our high count of 18. And a good thing: we were the only bird counters in the county who saw any quail, let alone eighteen. So I did get to make a contribution after all.

I carried a camera all day, but my longest regular lens (a 55-250 f/4-5.6) isn't enough when it comes to distant woodpeckers. So most of what I got was blurry, underexposed "record shots", except for the quail, cranes, and an obliging chickadee who wasn't afraid of a bunch of binocular-wielding anthropoids. Photos here: Los Alamos Christmas Bird Count, White Rock team, 2015.

December 20, 2015 09:21 PM