Planet Ubuntu California

September 28, 2016

Jono Bacon

Bacon Roundup – 28th September 2016

Here we are with another roundup of things I have been working on, complete with a juicy foray into the archives too. So, sit back, grab a cup of something delicious, and enjoy.

To gamify or not to gamify community (opensource.com)

In this piece I explore whether gamification is something we should apply to building communities. I also pull from my experience building a gamification platform for Ubuntu called Ubuntu Accomplishments.

The GitLab Master Plan (gitlab.com)

Recently I have been working with GitLab. The team has been building their vision for conversational development and I MCed their announcement of their plan. You can watch the video below for convenience:


Social Media: 10 Ways To Not Screw It Up (jonobacon.org)

Here I share 10 tips and tricks that I have learned over the years for doing social media right. This applies to tooling, content, distribution, and more. I would love to learn your tips too, so be sure to share them in the comments!

Linux, Linus, Bradley, and Open Source Protection (jonobacon.org)

Recently there was something of a spat in the Linux kernel community about when is the right time to litigate companies who misuse the GPL. As a friend of both sides of the debate, this was my analysis.

The Psychology of Report/Issue Templates (jonobacon.org)

As many of you will know, I am something of a behavioral economics fan. In this piece I explore the interesting human psychology behind issue/report templates. It is subtle nudges like this that can influence the behavioral patterns you want to see.

My Reddit AMA

It would be remiss of me not to share a link to my recent reddit AMA where I was asked a range of questions about community leadership, open source, and more. Thanks to all of you who joined and asked questions!

Looking For Talent

I also posted a few pieces about some companies who I am working with who want to hire smart, dedicated, and talented community leaders. If you are looking for a new role, be sure to see these:

From The Archives

Dan Ariely on Building More Human Technology, Data, Artificial Intelligence, and More (forbes.com)

My Forbes piece on the impact of behavioral economics on technologies, including an interview with Dan Ariely, TED speaker, and author of many books on the topic.

Advice for building a career in open source (opensource.com)

In this piece I share some recommendations I have developed over the years for those of you who want to build a career in open source. Of course, I would love to hear your tips and tricks too!

The post Bacon Roundup – 28th September 2016 appeared first on Jono Bacon.

by Jono Bacon at September 28, 2016 03:00 PM

Elizabeth Krumbach

Yak Coloring

A couple cycles ago I asked Ronnie Tucker, artist and creator of Full Circle Magazine, to create a werewolf coloring page for the 15.10 release (details here). He then created another for Xenial Xerus, see here.

He’s now created one for the upcoming Yakkety Yak release! So if you’re sick of all the yak shaving you’re doing as we prepare for this release, you may consider giving yak coloring a try.

But that’s not the only yak! We have Tom Macfarlane of the Canonical Design Team to thank once again for sending me the SVG to update the Animal SVGs section of the Official Artwork page on the Ubuntu wiki. They’re sticking with a kind of origami theme this time for our official yak.

Download the SVG version for printing from the wiki page or directly here.

by pleia2 at September 28, 2016 12:43 AM

September 26, 2016

Akkana Peck

Unclaimed Alcoholic Beverages

Dave was reading New Mexico laws regarding a voter guide issue we're researching, and he came across this gem in Section 29-1-14 G of the "Law Enforcement: Peace Officers in General: Unclaimed Property" laws:

Any alcoholic beverage that has been unclaimed by the true owner, is no longer necessary for use in obtaining a conviction, is not needed for any other public purpose and has been in the possession of a state, county or municipal law enforcement agency for more than ninety days may be destroyed or may be utilized by the scientific laboratory division of the department of health for educational or scientific purposes.

We can't decide which part is more fun: contemplating what the "other public purposes" might be, or musing on the various "educational or scientific purposes" one might come up with for a three-month-old beverage that's been sitting in the storage locker ... I'm envisioning a room surrounded by locked chain-link, with dusty shelves holding rows of half-full martini and highball glasses.

September 26, 2016 05:04 PM

Eric Hammond

Deleting a Route 53 Hosted Zone And All DNS Records Using aws-cli

fast, easy, and slightly dangerous recursive deletion of a domain’s DNS

Amazon Route 53 currently charges $0.50/month per hosted zone for your first 25 domains, and $0.10/month for additional hosted zones, even if they are not getting any DNS requests. I recently stopped using Route 53 to serve DNS for 25 domains and wanted to save on the $150/year these were costing.

Amazon’s instructions for using the Route 53 Console to delete Record Sets and a Hosted Zone make it look simple. I started in the Route 53 Console clicking into a hosted zone, selecting each DNS record set (but not the NS or SOA ones), clicking delete, clicking confirm, going back a level, selecting the next domain, and so on. This got old quickly.

Being lazy, I decided to spend a lot more effort figuring out how to automate this process with the aws-cli, and pass the savings on to you.

Steps with aws-cli

Let’s start by putting the hosted zone domain name into an environment variable. Do not skip this step! Do make sure you have the right name! If this is not correct, you may end up wiping out DNS for a domain that you wanted to keep.

domain_to_delete=example.com

Install the jq json parsing command line tool. I couldn’t quite get the normal aws-cli --query option to get me the output format I wanted.

sudo apt-get install jq

Look up the hosted zone id for the domain. This assumes that you only have one hosted zone for the domain. (It is possible to have multiple, in which case I recommend using the Route 53 console to make sure you delete the right one.)

hosted_zone_id=$(
  aws route53 list-hosted-zones \
    --output text \
    --query 'HostedZones[?Name==`'$domain_to_delete'.`].Id'
)
echo hosted_zone_id=$hosted_zone_id

Use list-resource-record-sets to find all of the current DNS entries in the hosted zone, then delete each one with change-resource-record-sets.

aws route53 list-resource-record-sets \
  --hosted-zone-id "$hosted_zone_id" |
jq -c '.ResourceRecordSets[]' |
while read -r resourcerecordset; do
  name=$(jq -r '.Name' <<<"$resourcerecordset")
  type=$(jq -r '.Type' <<<"$resourcerecordset")
  if [ "$type" != "NS" ] && [ "$type" != "SOA" ]; then
    aws route53 change-resource-record-sets \
      --hosted-zone-id "$hosted_zone_id" \
      --change-batch '{"Changes":[{"Action":"DELETE","ResourceRecordSet":
          '"$resourcerecordset"'
        }]}' \
      --output text --query 'ChangeInfo.Id'
  fi
done

Finally, delete the hosted zone itself:

aws route53 delete-hosted-zone \
  --id $hosted_zone_id \
  --output text --query 'ChangeInfo.Id'

As written, the above commands output the change ids. You can monitor the background progress using a command like:

change_id=...
aws route53 wait resource-record-sets-changed \
  --id "$change_id"

GitHub repo

To make it easy to automate the destruction of your critical DNS resources, I’ve wrapped the above commands into a command line tool and tossed it into a GitHub repo here:

https://github.com/alestic/aws-route53-wipe-hosted-zone

You are welcome to use as is, fork, add protections, rewrite with Boto3, and generally knock yourself out.

Alternative: CloudFormation

A colleague pointed out that a better way to manage all of this (in many situations) would be to simply toss my DNS records into a CloudFormation template for each domain. Benefits include:

  • Easy to store whole DNS definition in revision control with history tracking.

  • Single command creation of the hosted zone and all record sets.

  • Single command updating of all changed record sets, no matter what has changed since the last update.

  • Single command deletion of the hosted zone and all record sets (my current challenge).

This doesn’t work as well for hosted zones where different records are added, updated, and deleted by automated processes (e.g., instance startup), but for simple, static domain DNS, it sounds ideal.
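For a simple, static domain, such a template needs little more than a hosted zone resource and one record set per DNS entry. Here is a minimal sketch of what that might look like (the domain and record values are made up for illustration):

```yaml
AWSTemplateFormatVersion: "2010-09-09"
Description: DNS for example.com (records are illustrative only)
Resources:
  HostedZone:
    Type: "AWS::Route53::HostedZone"
    Properties:
      Name: example.com.
  WwwRecord:
    Type: "AWS::Route53::RecordSet"
    Properties:
      # Ref on a HostedZone resource returns its hosted zone id
      HostedZoneId: !Ref HostedZone
      Name: www.example.com.
      Type: A
      TTL: "300"
      ResourceRecords:
        - 192.0.2.10
```

With a template like this, the create, update, and delete steps each reduce to a single `aws cloudformation create-stack`, `update-stack`, or `delete-stack` command against the stack.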

How do you create, update, and delete DNS in Route 53 for your domains?

Original article and comments: https://alestic.com/2016/09/aws-route53-wipe-hosted-zone/

September 26, 2016 09:30 AM

Jono Bacon

Looking for a data.world Director of Community

data.world

Some time ago I signed an Austin-based data company called data.world as a client. The team are building an incredible platform where the community can store data, collaborate around the shape/content of that data, and build an extensive open data commons.

As I wrote about previously I believe data.world is going to play an important role in opening up the potential for finding discoveries in disparate data sets and helping people innovate faster.

I have been working with the team to help shape their community strategy and they are now ready to hire a capable Director of Community to start executing these different pieces. The role description is presented below. The data.world team are an incredible bunch with some strong heritage in the leadership of Brett Hurt, Matt Laessig, Jon Loyens, Bryon Jacob, and others.

As such, I am looking to find the team some strong candidates. If I know you, I would invite you to confidentially share your interest in this role by filling out my form here. This way I can get a good sense of who is interested and also recommend people I personally know and can vouch for. I will then reach out to those of you for whom this seems a good potential fit and play a supporting role in brokering the conversation.

This role will require candidates to either be based in Austin or be willing to relocate to Austin. This is a great opportunity, and feel free to get in touch with me if you have any questions.

Director of Community Role Description

data.world is building a world-class data commons, management, and collaboration platform. We believe that data.world is the very best place to build great data communities that can make data science fun, enjoyable, and impactful. We want to ensure we can provide the very best support, guidance, and engagement to help these communities be successful. This will involve engagement in workflow, product, outreach, events, and more.

As Director of Community, you will lead, coordinate, and manage our global community development initiatives. You will use your community leadership experience to shape our community experience and infrastructure, feed into the product roadmap with community needs and requirements, build growth and engagement, and more. You will help connect, celebrate, and amplify the existing communities on data.world and assist new ones as they form. You will help our users to think bigger, be the best they can be, and succeed more. You’ll work across teams within data.world to promote the community’s voice within our different internal teams. You should be a content expert, superb communicator, and humble facilitator.

Typical activities for this role include:

  • Building and executing programs that grow communities on data.world and empower them to do great work.
  • Taking a structured approach to community roles, on-boarding, and working with our teams to ensure community members have a simple and powerful experience.
  • Developing content that promotes the longevity and sustainability of fast growing, organically built data communities with high impact outcomes.
  • Building relationships within the industry and community to be their representative for data.world in helping to engage, be successful, and deliver great work and collaboration.
  • Working with product, user operations, and marketing teams on product roadmap for community features and needs.
  • Being a data.world representative and spokesperson at conferences, events, and within the media and external data communities.
  • Always challenging our assumptions, our culture, and being singularly focused on delivering the very best data community platform in the world.

Experience with the following is required:

  • 5-7 years of experience participating in and building communities, preferably data based, or technical in nature.
  • Experience with working in open source, open data, and other online communities.
  • Public speaking, blogging, and content development.
  • Facilitating complex and sensitive community management situations with humility, judgment, tact, and humor.
  • Integrating company brand, voice, and messaging into developed content.
  • Working independently and autonomously, managing multiple competing priorities.

Experience with any of the following preferred:

  • Data science experience and expertise.
  • 3-5 years of experience leading community management programs within a software or Internet-based company.
  • Media training and experience in communicating with journalists, bloggers, and other media on a range of technical topics.
  • Existing network from a diverse set of communities and social media platforms.
  • Software development capabilities and experience.

The post Looking for a data.world Director of Community appeared first on Jono Bacon.

by Jono Bacon at September 26, 2016 04:16 AM

September 25, 2016

Elizabeth Krumbach

Beer and trains in Germany

I spent most of this past week in Germany with the OpenStack Infrastructure and QA teams doing a sprint at the SAP offices in Walldorf, I wrote about it here.

The last (and first!) time I was in Germany was for the same purpose, a sprint, that time in Darmstadt where I snuck in a tiny amount of touristing, but due to troubles with my gallbladder I couldn't have any fried foods or beer. Well, I had one beer to celebrate Germany winning the World Cup, but I regretted it big time.

This time was different: finally I could have liters of German beer! And I did. The first night there I even had some wiener schnitzel (fried veal!), even though we were all too tired from our travels to leave the hotel that night. We went out to beer gardens every other night after that, taking in the beautiful late summer weather and enjoying great beers.


Photo in the center by Chris Hoge (source)

But I have a confession to make: I don’t like pilsners and that makes Belgium my favorite beer country in Europe. Still, Germany has quite the title. Fortunately while they are the default, pilsners were not my only option. I indulged in dark lagers and hefeweizens all week. Our evening in Heidelberg I also had an excellent Octoberfest Märzen by Heidelberger, which was probably my favorite beer of the whole trip.

Now I’m getting ahead of myself because I was excited about all the beer. I arrived on Sunday, sadly much later than I had intended. My original flights had been rescheduled, so I ended up meeting my colleague Clark at the Frankfurt airport around 4PM to catch our trains to Walldorf. The train station is right there in the airport, and clear signs meant a no fuss transfer halfway through our journey to get to the next train. We were on the trains for about an hour before arriving at Wiesloch-Walldorf station. A ten Euro cab ride then got us to the hotel where we met up with several other colleagues for drinks.

Of course we were there to work, so that’s what we spent 9-5 each day doing, but the evenings were ours to explore our little corner of Germany. The first night we just walked into Walldorf after work and enjoyed drinks and food until the sun went down. Walldorf is a very cute little town and the outdoor seating at the beer garden we went to was a wonderful treat, especially since the weather was so warm and clear. We spent Wednesday night in Walldorf too.

More photos from Walldorf here: https://www.flickr.com/photos/pleia2/sets/72157670828593814/

Tuesday night was our big night out. We all headed out to the nearby Heidelberg for a big group dinner. After parking, we had a lovely short walk to the restaurant which took me by a shop that sold post cards! I picked up a trio of cards for my mother and sisters, as I typically do when traveling. The walk also gave a couple of us time to take pictures of the city before the sun went down.

Dinner was at Zum Weissen Schwanen (The White Swan). That was my four beer night.

After the meal several of us took a nice walk around the city a bit more. We got to look up and see the massive, lit up, Heidelberg Castle. It’s a pretty exceptional place, I’d love to properly visit some time. The post cards I sent to family all included the castle.

The drive back to the hotel was fun too. I got a tiny taste of the German autobahn as we got up to 220 kilometers per hour on our way back to the hotel before our exit came up. Woo!

My pile of Heidelberg photos are here: https://www.flickr.com/photos/pleia2/albums/72157674174957385

Thursday morning was my big morning of trains. I flew into Frankfurt like everyone else, but I flew home out of Dusseldorf because it was several hundred dollars cheaper. The problem is Walldorf and Dusseldorf aren’t exactly close, but I could spend a couple hours on European ICE (Inter-City Express) and get there. MJ highly recommended I try it out since I like trains, and with the simplicity of routing he convinced me to take a route from Mannheim all the way to Dusseldorf Airport with one simple connection, which just required walking across the platform.

I’m super thankful he convinced me to take the trains. The ticket wasn’t very expensive and I really do like trains. In addition to being reasonably priced, they’re fast, on time and all the signs were great so I didn’t feel worried about getting lost or ending up in the wrong place. The signs even report where each coach will show up on the platform so I had no trouble figuring out where to stand to get to my assigned seat.

I took a few more pictures while on my train adventure, here: https://www.flickr.com/photos/pleia2/albums/72157670930346613

And so I spent a couple hours on my way to Dusseldorf. I was a bit tired since my first train left the station at 7:36AM, so I mostly just listened to music and stared out the window. My flight out of Dusseldorf was uneventful, and was a direct to San Francisco so I was able to come home to my kitties in the early evening. Unfortunately MJ had left home the day before, so I’ll have to wait until we’re both in Philadelphia next week to see him.

by pleia2 at September 25, 2016 12:16 AM

September 24, 2016

Elizabeth Krumbach

OpenStack QA/Infrastructure Meetup in Walldorf

I spent this week in the lovely town of Walldorf, Germany with about 25 of my OpenStack Quality Assurance and Infrastructure colleagues. We were there for a late-cycle sprint, where we all huddled in a couple of rooms for three days to talk, script and code our way through some challenges that are much easier to tackle when all the key players are in a room together. QA and Infra have always been a good match for an event like this since we’re so tightly linked as things QA works on are supported by and tested in the Continuous Integration system we run.

Our venue this time around was the SAP offices in Walldorf. They graciously donated the space to us for this event, and kept us blessedly fed, hydrated, and caffeinated throughout the day.

Each day we enjoyed a lovely walk to and from the hotel many of us stayed at. We lucked out and there wasn’t any rain while we were there, so we got to take in the best of late summer weather in Germany. Our walk took us through a corn field, past flowers, gave us a nice glimpse of the town of Walldorf on the other side of the highway, and then began the approach to the SAP buildings, of which there are many.

The first day began with opening remarks from our host at the SAP offices, Marc Koderer, and from the QA project lead, Ken’ichi Ohmichi. From there we went through the etherpad for the event to figure out where to begin. A big chunk of the Infrastructure team went to their own room to chat about Zuulv3 and some of the work on Ansible, and a couple of us hung back with the QA team to move some of their work along.

Spending time with the QA folks I learned about future plans for a more useful series of bugday graphs. I also worked with Spencer Krum and Matt Treinish to land a few patches related to the new Firehose service. Firehose is a MQTT-based unified message bus that seeks to encompass all the developer-facing infra alerts and updates in a single stream. This includes job results from Gerrit, updates on bugs from Launchpad, specific logs that are processed by logstash and more. At the beginning of the sprint only Gerrit was feeding into it using germqtt, but by the end of Monday we had Launchpad bugs submitting events over email via lpmqtt. The work was mostly centered around setting up Cyrus with Exim and then configuring the accounts and MX records, and trying to do this all in a way that the rest of the team would be happy with. All seems to have worked out, and at the end of the day Matt sent out an email announcing it: Announcing firehose.openstack.org.

That evening we gathered in the little town of Walldorf to have a couple beers, dinner, and relax in a lovely beer garden for a few hours as the sun went down. It was really nice to catch up with some of my colleagues that I have less day to day contact with. I especially enjoyed catching up with Yolanda and Gema, both of whom I’ve known for years through their past work at Canonical on Ubuntu. The three of us also were walk buddies back to the hotel, before which I demanded a quick photo together.

Tuesday morning we started off by inviting Khai Do over to give a quick demo of the Gerrit verify plugin. Now, Khai is one of us, so what do I mean by “come over”? Of all the places and times in the world, Khai was also at the SAP offices in Walldorf, Germany, but he was there for a Gerrit Hackathon. He brought along another Gerrit contributor and showed us how the verify plugin would replace our somewhat hacked into place Javascript that we currently have on our review pages to give a quick view into the test results. It also offers the ability in the web UI to run rechecks on tests, and will provide a page including history of all results through all the patchsets and queues. They’ve done a great job on it, and I was thrilled to see upstream Gerrit working with us to solve some of our problems.


Khai demos the Gerrit verify plugin

After Khai’s little presentation, I plugged my laptop into the projector and brought up the etherpad so we could spend a few minutes going over work that was done on Monday. A Zuulv3 etherpad had been worked on to capture a lot of the work from the Infrastructure team on Monday. Updates were added to our main etherpad about things other people worked on and reviews that were now pending to complete the work.

Groups then split off again, this time I followed most of the rest of the Infrastructure team into a room where we worked on infra-cloud, our infra-spun, fully open source OpenStack deployment that we started running a chunk of our CI tests on a few weeks ago. The key folks working on it gave a quick introduction and then we dove right into debugging some performance problems that were causing failed initial launches. This took us through poking at the Glance image service, rules in Neutron and defaults in the Puppet modules. A fair amount of multi-player (using screen) debugging was done up on the projector as we shifted around options, took the cloud out of the pool of servers for some time, and spent some time debugging individual compute nodes and instances as we watched what they did when they came up for the first time. In addition to our “vanilla” region, Ricardo Carrillo Cruz also made progress that day on getting our “chocolate” region working (next up: strawberry!).

I also was able to take some time on Tuesday to finally get notice and alert notifications going to our new @openstackinfra Twitter account. Monty Taylor had added support for this months ago, but I had just set up the account and written the patches to land it a few days before. We ran into one snafu, but a quick patch (thanks Andreas Jaeger!) got us on our way to automatically sending out our first Tweet. This will be fun, and I can stop being the unofficial Twitter status bot.

That evening we all piled into cars to head over to the nearby city of Heidelberg for dinner and drinks at Zum Weissen Schwanen (The White Swan). This ended up being our big team dinner. Lots of beers, great conversation and catching up on some discussions we didn’t have during the day. I had a really nice time and during our walk back to the car I got to see Heidelberg Castle light up at night as it looms over the city.

Wednesday kicked off once again at 9AM. For me this day was a lot of talking and chasing down loose ends while I had key people in the room. I also worked on some more Firehose stuff, this time working our way down the path to get logstash also sending data to Firehose. In the midst of which, we embarrassingly brought down our cluster due to a failure to quote strings in the config file, but we did get it back online and then more progress was made after everyone got home on Friday. Still, it was good to get part of the way there during the sprint, and we all learned about the amount of logging (in this case, not much!) our tooling for all this MQTT stuff was providing for us to debug. Never hurts to get a bit more familiar with logstash either.

The final evening was spent once again in Walldorf, this time at the restaurant just across the road from the one we went to on Monday. We weren’t there long enough to grow tired of the limited selection, so we all had a lovely time. My early morning to catch a train meant I stuck to a single beer and left shortly after 8PM with a colleague, but that was plenty late for me.


Photo courtesy of Chris Hoge (source)

Huge thanks to Marc and SAP for hosting us. The spaces worked out really well for everything we needed to get done. I also have to say I really enjoyed my time. I work with some amazing people, and come Thursday morning all I could think was “What a great week! But I better get home so I can get back to work.” Hey! This all was work! Also thanks to Jeremy Stanley, our fearless Infrastructure Project Team Leader who sat this sprint out and kept things going on the home front while we were all focused on the sprint.

A few more photos from our sprint here: https://www.flickr.com/photos/pleia2/albums/72157674174936355

by pleia2 at September 24, 2016 03:30 PM

September 20, 2016

Eric Hammond

Developing CloudStatus, an Alexa Skill to Query AWS Service Status -- an interview with Kira Hammond by Eric Hammond

Interview conducted in writing July-August 2016.

[Eric] Good morning, Kira. It is a pleasure to interview you today and to help you introduce your recently launched Alexa skill, “CloudStatus”. Can you provide a brief overview about what the skill does?

[Kira] Good morning, Papa! Thank you for inviting me.

CloudStatus allows users to check the service availability of any AWS region. On opening the skill, Alexa says which (if any) regions are experiencing service issues or were recently having problems. Then the user can inquire about the services in specific regions.

This skill was made at my dad’s request. He wanted to quickly see how AWS services were operating, without needing to open his laptop. As well as summarizing service issues for him, my dad thought CloudStatus would be a good opportunity for me to learn about retrieving and parsing web pages in Python.

All the data can be found in more detail at status.aws.amazon.com. But with CloudStatus, developers can hear AWS statuses with their Amazon Echo. Instead of scrolling through dozens of green checkmarks to find errors, users of CloudStatus listen to which services are having problems, as well as how many services are operating satisfactorily.

CloudStatus is intended for anyone who uses Amazon Web Services and wants to know about current (and recent) AWS problems. Eventually it might be expanded to talk about other clouds as well.

[Eric] Assuming I have an Amazon Echo, how do I install and use the CloudStatus Alexa skill?

[Kira] Just say “Alexa, enable CloudStatus skill”! Ask Alexa to “open CloudStatus” and she will give you a summary of regions with problems. An example of what she might say on the worst of days is:

“3 out of 11 AWS regions are experiencing service issues: Mumbai (ap-south-1), Tokyo (ap-northeast-1), Ireland (eu-west-1). 1 out of 11 AWS regions was having problems, but the issues have been resolved: Northern Virginia (us-east-1). The remaining 7 regions are operating normally. All 7 global services are operating normally. Which Amazon Web Services region would you like to check?”

Or on most days:

“All 62 regional services in the 12 AWS regions are operating normally. All 7 global services are operating normally. Which Amazon Web Services region would you like to check?”

Request any AWS region you are interested in, and Alexa will present you with current and recent service issues in that region.

Here’s the full recording of an example session: http://pub.alestic.com/alexa/cloudstatus/CloudStatus-Alexa-Skill-sample-20160908.mp3

[Eric] What technologies did you use to create the CloudStatus Alexa skill?

[Kira] I wrote CloudStatus using AWS Lambda, a service that manages servers and scaling for you. Developers need only pay for their servers when the code is called. AWS Lambda also displays metrics from Amazon CloudWatch.

Amazon CloudWatch gives statistics from the last couple weeks, such as the number of invocations, how long they took, and whether there were any errors. CloudWatch Logs is also a very useful service. It allows me to see all the errors and print() output from my code. Without it, I wouldn’t be able to debug my skill!

I used Amazon EC2 to build the Python modules necessary for my program. The modules (Requests and LXML) download and parse the AWS status page, so I can get the data I need. The Python packages and my code files are zipped and uploaded to AWS Lambda.
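Kira’s skill uses Requests and LXML for the download-and-parse step. As a rough illustration of the parsing idea only, here is a stdlib-only sketch using Python’s `html.parser` against a made-up fragment of a status table (the real page’s markup differs, and the class and sample data here are mine, not from the skill):

```python
from html.parser import HTMLParser

class StatusTableParser(HTMLParser):
    """Collect the text of table cells, row by row."""
    def __init__(self):
        super().__init__()
        self.rows = []       # completed rows, each a list of cell strings
        self._row = None     # cells of the row currently being parsed
        self._cell = None    # text fragments of the cell currently open

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td" and self._row is not None:
            self._cell = []

    def handle_data(self, data):
        if self._cell is not None:
            self._cell.append(data)

    def handle_endtag(self, tag):
        if tag == "td" and self._cell is not None:
            self._row.append("".join(self._cell).strip())
            self._cell = None
        elif tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None

# A made-up status-table fragment, standing in for the fetched page.
SAMPLE = """
<table>
  <tr><td>Amazon EC2 (N. Virginia)</td><td>Service is operating normally</td></tr>
  <tr><td>AWS Lambda (N. Virginia)</td><td>Increased error rates</td></tr>
</table>
"""

parser = StatusTableParser()
parser.feed(SAMPLE)

# Keep only the services that are not reporting normal operation.
issues = [name for name, status in parser.rows
          if "operating normally" not in status]
```

The same filter-out-the-green-checkmarks idea is what lets the skill report only the interesting rows instead of reading out every service.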

Fun fact: My Lambda function is based in us-east-1. If AWS Lambda stops working in that region, you can’t use CloudStatus to check if Northern Virginia AWS Lambda is working! For that matter, CloudStatus will be completely dysfunctional.

[Eric] Why do you enjoy programming?

[Kira] Programming is so much fun and so rewarding! I enjoy making tools so I can be lazy.

Let’s rephrase that: Sometimes I’m repeatedly doing a non-programming activity—say, making a long list of equations for math practice. I think of two “random” numbers between one and a hundred (a human can’t actually come up with a random set of numbers) and pick an operation: addition, subtraction, multiplication, or division. After doing this several times, the activity begins to tire me. My brain starts to shut off and wants to do something more interesting. Then I realize that I’m doing the same thing over and over again. Hey! Why not make a program?

Computers can do so much in so little time. Unlike humans, they are capable of picking completely random items from a list. And they aren’t going to make mistakes. You can tell a computer to do the same thing hundreds of times, and it won’t be bored.

Finish the program, type in a command, and voila! Look at that page full of math problems. Plus, I can get a new one whenever I want, in just a couple seconds. Laziness in this case drives a person to put time and effort into ever-changing problem-solving, all so they don’t have to put time and effort into a dull, repetitive task. See http://threevirtues.com/.

But programming isn’t just for tools! I also enjoy making simple games and am learning about websites.

One downside to having computers do things for you: You can’t blame a computer for not doing what you told it to. It did do what you told it to; you just didn’t tell it to do what you thought you did.

Coding can be challenging (even frustrating) and it can be tempting to give up on a debug issue. But, oh, the thrill that comes after solving a difficult coding problem!

The problem-solving can be exciting even when a program is nowhere near finished. My second Alexa program wasn’t coming along that well when—finally!—I got her to say “One plus one is eleven.” and later “Three plus four is twelve.” Though it doesn’t seem that impressive, it showed me that I was getting somewhere and the next problem seemed reasonable.

[Eric] How did you get started programming with the Alexa Skills Kit (ASK)?

[Kira] My very first Alexa skill was based on an AWS Lambda blueprint called Color Expert (alexa-skills-kit-color-expert-python). A blueprint is a sample program that AWS programmers can copy and modify. In the sample skill, the user tells Alexa their favorite color and Alexa stores the color name. Then the user can ask Alexa what their favorite color is. I didn’t make many changes: maybe Alexa’s responses here and there, and I added the color “rainbow sparkles.”

I also made a skill called Calculator in which the user gets answers to simple equations.

Last year, I took a music history class. To help me study for the test, I created a trivia game from Reindeer Games, an Alexa Skills Kit template (see https://developer.amazon.com/public/community/post/TxDJWS16KUPVKO/New-Alexa-Skills-Kit-Template-Build-a-Trivia-Skill-in-under-an-Hour). That was a lot of fun and helped me to grow in my knowledge of how Alexa works behind the scenes.

[Eric] How does Alexa development differ from other programming you have done?

[Kira] At first Alexa was pretty overwhelming. It was so different from anything I’d ever done before, and there were lines and lines of unfamiliar code written by professional Amazon people.

I found the ASK blueprints and templates extremely helpful. Instead of just being a functional program, the code is commented so developers know why it’s there and are encouraged to play around with it.

Still, the pages of code can be scary. One thing new Alexa developers can try: Before modifying your blueprint, set up the skill and ask Alexa to run it. Everything she says from that point on is somewhere in your program! Find her response in the program and tweak it. The variable name is something like “speech_output” or “speechOutput.”

It’s a really cool experience making voice apps. You can make Alexa say ridiculous things in a serious voice! Because CloudStatus started with the Color Expert blueprint, my first successful edit ended with our Echo saying, “I now know your favorite color is Northern Virginia. You can ask me your favorite color by saying, ‘What’s my favorite color?’.”

Voice applications involve factors you never need to deal with in a text app. When the user is interacting through text, they can take as long as they want to read and respond. Speech must be concise so the listener understands the first time. Another challenge is that Alexa doesn’t necessarily know how to pronounce technical terms and foreign names, but the software is always improving.

One plus side to voice apps is not having to build your own language model. With text-based programs, I spend a considerable amount of time listing all the ways a person can answer “yes,” or request help. Luckily, with Alexa I don’t have to worry too much about how the user will phrase their sentences. Amazon already has an algorithm, and it’s constantly getting smarter! Hint: If you’re making your own skill, use some built-in Amazon intents, like AMAZON.YesIntent or AMAZON.HelpIntent.
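As a rough sketch of what leaning on built-in intents looks like in a Python skill handler (the handler and response helpers below are illustrative stand-ins, not CloudStatus code):

```python
def get_help_response():
    """Illustrative help text for a status-checking skill."""
    return "You can ask me about the status of any AWS region."

def get_goodbye_response():
    return "Goodbye!"

def get_status_response(intent_request):
    # Placeholder for the skill's real work.
    return "All services are operating normally."

def on_intent(intent_request):
    """Dispatch an Alexa request by intent name, using Amazon's built-ins."""
    name = intent_request["intent"]["name"]
    if name == "AMAZON.HelpIntent":
        return get_help_response()
    if name in ("AMAZON.StopIntent", "AMAZON.CancelIntent"):
        return get_goodbye_response()
    return get_status_response(intent_request)

print(on_intent({"intent": {"name": "AMAZON.HelpIntent"}}))
```

Amazon's language model decides that "yes," "yeah," and "sure" all trigger AMAZON.YesIntent, so the skill only has to handle the intent name.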

[Eric] What challenges did you encounter as you built the CloudStatus Alexa skill?

[Kira] At first, I edited the code directly in the Lambda console. Pretty soon though, I needed to import modules that weren’t built into Python. Now I keep my code and modules in the same directory on a personal computer. That directory gets zipped and uploaded to Lambda, so the modules are right there sitting next to the code.

One challenge of mine has been wanting to fix and improve everything at once. Naturally, there is an error practically every time I upload my code for testing. Isn’t that what testing is for? But when I modify everything instead of improving bit by bit, the bugs are more difficult to sort out. I’m slowly learning from my dad to make small changes and update often. “Ship it!” he cries regularly.

During development, I grew tired of constantly opening my code, modifying it, zipping it and the modules, uploading it to Lambda, and waiting for the Lambda function to save. Eventually I wrote a separate Bash program that lets me type “edit-cloudstatus” into my shell. The program runs unit tests and opens my code files in the Atom editor. After that, it calls the command “fileschanged” to automatically test and zip all the code every time I edit something or add a Python module. That was exciting!

I’ve found that the Alexa speech-to-text conversions aren’t always what I think they will be. For example, if I tell CloudStatus I want to know about “Northern Virginia,” it sends my code “northern Virginia” (lowercase then capitalized), whereas saying “Northern California” turns into “northern california” (all lowercase). To at least fix the capitalization inconsistencies, my dad suggested lowercasing the input and mapping it to the standardized AWS region code as soon as possible.
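That suggestion might be sketched like this (the mapping covers only a few regions and is purely illustrative, not CloudStatus's real table):

```python
# Map spoken region names (lowercased) to AWS region codes.
REGIONS = {
    "northern virginia": "us-east-1",
    "northern california": "us-west-1",
    "oregon": "us-west-2",
    "ireland": "eu-west-1",
}

def normalize_region(spoken):
    """Lowercase Alexa's transcription, then map it to a region code."""
    return REGIONS.get(spoken.strip().lower())

# "northern Virginia" and "northern california" now resolve consistently.
print(normalize_region("northern Virginia"))  # us-east-1
```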

[Eric] What Alexa skills do you plan on creating in the future?

[Kira] I will probably continue to work on CloudStatus for a while. There’s always something to improve, a feature to add, or something to learn about—right now it’s Speech Synthesis Markup Language (SSML). I don’t think it’s possible to finish a program for good!

My brother and I also want to learn about controlling our lights and thermostat with Alexa. Every time my family leaves the house, we say basically the same thing: “Alexa, turn off all the lights. Alexa, turn the kitchen light to twenty percent. Alexa, tell the thermostat we’re leaving.” I know it’s only three sentences, but wouldn’t it be easier to just say: “Alexa, start Leaving Home” or something like that? If I learned to control the lights, I could also make them flash and turn different colors, which would be super fun. :)

In August a new ASK template was released for decision tree skills. I want to make some sort of dichotomous key with that. https://developer.amazon.com/public/community/post/TxHGKH09BL2VA1/New-Alexa-Skills-Kit-Template-Step-by-Step-Guide-to-Build-a-Decision-Tree-Skill

[Eric] Do you have any advice for others who want to publish an Alexa skill?

[Kira]

  • Before submitting your skill for certification, make sure you read through the submission checklist. https://developer.amazon.com/public/solutions/alexa/alexa-skills-kit/docs/alexa-skills-kit-submission-checklist#submission-checklist

  • Remember to check your skill’s home cards often. They are displayed in the Alexa App. Sometimes the text that Alexa pronounces should be different from the reader-friendly card content. For example, in CloudStatus, “N. Virginia (us-east-1)” might be easy to read, but Alexa is likely to pronounce it “En Virginia, Us [as in ‘we’] East 1.” I have to tell Alexa to say “northern virginia, u.s. east 1,” while leaving the card readable for humans.

  • Since readers can process text at their own pace, the home card may display more details than Alexa speaks, if necessary.

  • If you don’t want a card to accompany a specific response, remove the ‘card’ item from your response dict. Look for the function build_speechlet_response() or buildSpeechletResponse().

  • Never point your live/public skill at the $LATEST version of your code. The $LATEST version is for you to edit and test your code, and it’s where you catch errors.

  • If the skill raises errors frequently, don’t be intimidated! It’s part of the process of coding. To find out exactly what the problem is, read the “log streams” for your Lambda function. To print debug information to the logs, print() the information you want (Python) or use a console.log() statement (JavaScript/Node.js).

  • It helps me to keep a list of phrases to try, including words that the skill won’t understand. Make sure Alexa doesn’t raise an error and exit the skill, no matter what nonsense the user says.

  • Many great tips for designing voice interactions are on the ASK blog. https://developer.amazon.com/public/solutions/alexa/alexa-skills-kit/docs/alexa-skills-kit-voice-design-best-practices

  • Have fun!
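To illustrate the card advice above, here is a simplified version of the response-building helper that appears in the ASK Python blueprints (field names follow the Alexa response format, but this sketch omits reprompts and session attributes):

```python
def build_speechlet_response(title, speech_output, card_content=None,
                             should_end_session=True):
    """Build the response dict Alexa expects; omit the card if none is wanted."""
    response = {
        "outputSpeech": {"type": "PlainText", "text": speech_output},
        "shouldEndSession": should_end_session,
    }
    if card_content is not None:
        # The card is shown in the Alexa App and can differ from the
        # spoken text, e.g. "N. Virginia (us-east-1)" on the card while
        # Alexa says "northern virginia, u.s. east 1".
        response["card"] = {
            "type": "Simple",
            "title": title,
            "content": card_content,
        }
    return response
```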

In The News

Amazon had early access to this interview and to Kira, and wrote an article about her in the Alexa Blog:

14-Year-Old Girl Creates CloudStatus Alexa Skill That Benefits AWS Developers

which was then picked up by VentureBeat:

A 14-year-old built an Alexa skill for checking the status of AWS

which was then copied, referenced, tweeted, and retweeted.

Original article and comments: https://alestic.com/2016/09/alexa-skill-aws-cloudstatus/

September 20, 2016 04:15 AM

Akkana Peck

Frogs on the Rio, and Other Amusements

Saturday, a friend led a group hike for the nature center from the Caja del Rio down to the Rio Grande.

The Caja (literally "box", referring to the depth of White Rock Canyon) is an area of national forest land west of Santa Fe, just across the river from Bandelier and White Rock. Getting there involves a lot of driving: first to Santa Fe, then out along increasingly dicey dirt roads until the road looks too daunting and it's time to get out and walk.

[Dave climbs the Frijoles Overlook trail] From where we stopped, it was only about a six mile hike, but the climb out is about 1100 feet and the day was unexpectedly hot and sunny (a mixed blessing: if it had been rainy, our Rav4 might have gotten stuck in mud on the way out). So it was a notable hike. But well worth it: the views of Frijoles Canyon (in Bandelier) were spectacular. We could see the lower Bandelier Falls, which I've never seen before, since Bandelier's Falls Trail washed out below the upper falls the summer before we moved here. Dave was convinced he could see the upper falls too, but no one else was convinced, though we could definitely see the red wall of the maar volcano in the canyon just below the upper falls.

[Canyon Tree Frog on the Rio Grande] We had lunch in a little grassy thicket by the Rio Grande, and we even saw a few little frogs, well camouflaged against the dirt: you could even see how their darker brown spots imitated the pebbles in the sand, and we wouldn't have had a chance of spotting them if they hadn't hopped. I believe these were canyon treefrogs (Hyla arenicolor). It's always nice to see frogs -- they're not as common as they used to be. We've heard canyon treefrogs at home a few times on rainy evenings: they make a loud, strange ratcheting noise which I managed to record on my digital camera. Of course, at noon on the Rio the frogs weren't making any noise: just hanging around looking cute.

[Chick Keller shows a burdock leaf] Sunday we drove around the Pojoaque Valley following their art tour, then after coming home I worked on setting up a new sandblaster to help with making my own art. The hardest and least fun part of welded art is cleaning the metal of rust and paint, so it's exciting to finally have a sandblaster to help with odd-shaped pieces like chains.

Then tonight was a flower walk in Pajarito Canyon, which is bursting at the seams with flowers, especially purple aster, goldeneye, Hooker's evening primrose and bahia. Now I'll sign off so I can catalog my flower photos before I forget what's what.

September 20, 2016 02:17 AM

September 19, 2016

Jono Bacon

Looking For Talent For ClusterHQ

clusterhq_logo

Recently I signed ClusterHQ as a client. If you are unfamiliar with them, they provide a neat technology for managing data as part of the overall lifecycle of an application. You can learn more about them here.

I will be consulting with Cluster to help them (a) build their community strategy, (b) find a great candidate as Senior Developer Evangelist, and (c) help to mentor that person in their role to be successful.

If you are looking for a new career, this could be a good opportunity. ClusterHQ are doing some interesting work, and if this role is a good fit for you, I will also be there to help you work within a crisply defined strategy and be successful in the execution. Think of it as having a friend on the inside. 🙂

You can learn more in the job description, but you should have these skills:

  • You are a deep full-stack cloud technologist. You have a track record of building distributed applications end-to-end.
  • You either have a Bachelor’s in Computer Science or are self-motivated and self-taught such that you don’t need one.
  • You are passionate about containers, data management, and building stateful applications in modern clusters.
  • You have a history of leadership and service in developer and DevOps communities, and you have a passion for making applications work.
  • You have expertise in lifecycle management of data.
  • You understand how developers and IT organizations consume cloud technologies, and are able to influence enterprise technology adoption outcomes based on that understanding.
  • You have great technical writing skills demonstrated via documentation, blog posts and other written work.
  • You are a social butterfly. You like meeting new people on and offline.
  • You are a great public speaker and are sought after for your expertise and presentation style.
  • You don’t mind charging your laptop and phone in airport lounges so are willing and eager to travel anywhere our developer communities live, and stay productive and professional on the road.
  • You like your weekend and evening time to focus on your outside-of-work passions, but don’t mind working irregular hours and weekends occasionally (as the job demands) to support hackathons, conferences, user groups, and other developer events.

ClusterHQ are primarily looking for help with:

  • Creating high-quality technical content for publication on our blog and other channels to show developers how to implement specific stateful container management technologies.
  • Spreading the word about container data services by speaking and sharing your expertise at relevant user groups and conferences.
  • Evangelizing stateful container management and ClusterHQ technologies to the Docker Swarm, Kubernetes, and Mesosphere communities, as well as to DevOps/IT organizations chartered with operational management of stateful containers.
  • Promoting the needs of developers and users to the ClusterHQ product & engineering team, so that we build the right products in the right way.
  • Supporting developers building containerized applications wherever they are, on forums, social media, and everywhere in between.

Pretty neat opportunity.

Interested?

If you are interested in this role, there are few options for next steps:

  1. You can apply directly by clicking here.
  2. Alternatively, if I know you, I would invite you to confidentially share your interest in this role by filling in my form here. This way I can get a good sense of who is interested and also recommend people I personally know and can vouch for. I will then reach out to those of you for whom this seems a good potential fit and play a supporting role in brokering the conversation.

By the way, there are going to be a number of these kinds of opportunities shared here on my blog. So, be sure to subscribe to my posts if you want to keep up to date with the latest opportunities.

The post Looking For Talent For ClusterHQ appeared first on Jono Bacon.

by Jono Bacon at September 19, 2016 07:25 PM

September 17, 2016

Elizabeth Krumbach

Kubrick, Typeface to Interface and the Zoo

I’ve been home for almost three weeks, and now I’m back in an airport. For almost two weeks of that MJ has been on a business trip overseas and I’ve kept myself busy with work, the book release and meeting up with friends and acquaintances. The incredibly ambitious plans I had for this time at home weren’t fully realized, but with everything we have going on I’m kind of glad I was able to spend some time at home.

Mornings have changed some for me during these three weeks. Coming off of trips from Mumbai and Philadelphia in August my sleep schedule was heavily shifted and I decided to take advantage of that by going out running in the mornings. I’d been meaning to get back into it, and my doctor has gotten a bit more insistent of late based on some results from blood work, and she’s right. Instead of doing proper C25K this time I’ve just been doing interval run/walks. I walk about a half mile, do pretty even run/walk for two miles and then a half mile back. It’s not a lot, but I’ll up the difficulty level as the run/walk I have going starts to feel easier. I have been going out 4-5 days a week, and so far it feels great and seems sustainable. Fingers crossed for keeping this up during my next few weeks of travel.

With MJ out of town I’ve made plans with a bunch of local friends. Meals with my friends James, Emma, Sabice and Amanda last week were all a lot of fun and reversed my at home trend of being a hermit. Last weekend I made my way over to the Stanley Kubrick: The Exhibition. It opened in June and I’ve been interested in going, but sorting out timing and who to go with has been impossible. I finally just went by myself last Saturday after having some sushi for lunch nearby.

I wouldn’t say I’m a huge Kubrick fan, but I have enjoyed a lot of his work. The exhibit does a really exceptional job showcasing his work, with bright walls throughout and really nicely laid out scripts, cameras, costumes and props from the films. I had just recently seen Eyes Wide Shut again, but the exhibit made me want to go back and watch the rest, and ones I haven’t seen (Lolita, Spartacus). I particularly enjoyed the bits about my favorite movies of his, 2001: A Space Odyssey and Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb.

Some photos from the exhibition here: https://www.flickr.com/photos/pleia2/albums/72157670417890794

I did get out to a movie recently with my friend mct. We saw Complete Unknown which was OK, but not as good as I had hoped. Dinner at a nearby brewery rounded off the evening nicely.

With the whirlwind week of my book release, preparations for the OpenStack QA/Infrastructure sprint (which I’m on my way to now) and other things, I called it a day early on Thursday and met up with my friend Atul for some down time to visit the San Francisco Zoo. He’s been in town for several weeks doing a rotation for work, and we kept missing each other between other plans and my travel schedule. We got to the zoo in time to spend about 90 minutes there before they closed, making it around to most of the exhibits. We got a picture together by the giraffes, but they’ve opened exhibits for the Mexican Wolves and Sifaka lemurs since I last visited! It was fun to finally see both of those. I have some more zoo visits in my future too, hoping to visit the Philadelphia zoo when I’m there next weekend and then the Columbus Zoo after the Ohio LinuxFest in early October.

More zoo pictures here: https://www.flickr.com/photos/pleia2/albums/72157670651372724

Thursday night I met up with my friend Steve to go to the San Francisco Museum of Modern Art to see the Typeface to Interface exhibit. This museum is just a block from where I live and recently reopened after a few years of massive renovations. They’re open until 9PM on Thursdays and we got there around 7:30 to quite the crowd of people, so these later hours seem to be really working for them. Unfortunately I’ve never been much of a fan of modern art. This exhibit interested me though, and I’m really glad I went. It walks you through the beginning of bringing typeface work into the digital realm, presenting you with the IBM Selectric, which had a replaceable typing-element ball for different fonts. You see a variety of digital-inspired posters and other art, as well as the New York Transit Authority Graphics Standards Manual. It was fun going with Steve too, since his UX expertise meant that he actually knew a thing or two about these things beyond the geeky computer context I was approaching it from. I think they could have done a bit more to tie the exhibit together, but it’s probably the best one I’ve seen there.

We spent the rest of the evening before closing walking through several of the other galleries in the museum. Nothing really grabbed my interest, and a lot of it I found difficult to understand why it was in a museum. I do understand the aesthetically pleasing nature of much abstract art, but when it starts being really simple (panel of solid magenta) or really eclectic I struggle with understanding the appeal. Dinner was great though, both of us are east coasters by origin and we went to my favorite fish place in SOMA for east coast oysters, mussels, lobster rolls and strawberry shortcake.

Yesterday afternoon MJ got home from his work trip. In the midst of packing and laundry we were able to catch up and spend some precious time together, including a wonderful dinner at Fogo de Chão. Now I’m off to Germany for work. I had time to write this post because the first flight I had was delayed by an astonishing 6 hours, sailing past catching my connection. I’ve now been rebooked on a two stop itinerary that’s getting me in 5 hours later than I had expected. Sadly, this means I’m missing most of the tourist day in Heidelberg I had planned with colleagues on Sunday, but I expect we’ll still be able to get out for drinks in the evening before work on Monday morning.

by pleia2 at September 17, 2016 04:53 PM

September 13, 2016

Elizabeth Krumbach

Common OpenStack Deployments released!

Back in the fall of 2014 I signed a contract with Prentice Hall that began my work on my second book, Common OpenStack Deployments. This was the first book I was writing from scratch and the first where I was the lead author (the first books I was co-author on were the 8th and 9th editions of The Official Ubuntu Book). That contract started me on a nearly two year journey to write this book about OpenStack, which I talk a lot about here: How the book came to be.

Along the way I recruited my excellent contributing author Matt Fischer, who in addition to his Puppet and OpenStack expertise, shares a history with me in the Ubuntu community and Mystery Science Theater 3000 fandom (he had a letter read on the show once!). In short, he’s pretty awesome.

A lot of work and a lot of people went into making this book a reality, so I’m excited and happy to announce that the book has been officially released as of last week, and yesterday I got my first copy direct from the printer!

As I was putting the finishing touches on it in the spring, the dedication came up. I decided to dedicate the book to the OpenStack community, with a special nod to the Puppet OpenStack team.

Text:

This book is dedicated to the OpenStack community. Of the community, I’d also like to specifically call out the help and support received from the Puppet OpenStack Team, whose work directly laid the foundation for the deployment scenarios in this book.

Huge thanks to everyone who participated in making this book a reality, whether they were diligently testing all of our Puppet manifests, lent their OpenStack or systems administration experience to reviewing or gave me support as I worked my way through the tough parts of the book (my husband was particularly supportive during some of the really grim moments). This is a really major thing for me and I couldn’t have done it without all of you.

I’ll be continuing to write about updates to the book over on the blog that lives on the book’s website: DeploymentsBook.com (RSS). You can also follow updates on Twitter via @deploymentsbook, if that’s your thing.

If you’re interested in getting your hands on a copy, it’s sold by all the usual book sellers and available on Safari. The publisher’s website also routinely has sales and deals, especially if you buy the paper and digital copies together, so keep an eye out. I’ll also be speaking at conferences over the next few months and will be giving out signed copies. Check out my current speaking engagements here to see where I’ll be and I will have a few copies at the upcoming OpenStack Summit in Barcelona.

by pleia2 at September 13, 2016 06:53 PM

September 12, 2016

Akkana Peck

Art on display at the Bandelier Visitor Center

As part of the advertising for next month's Los Alamos Artists Studio Tour (October 15 & 16), the Bandelier Visitor Center in White Rock has a display case set up, and I have two pieces in it.

[my art on display at Bandelier]

The Velociraptor on the left and the hummingbird at right in front of the sweater are mine. (Sorry about the reflections in the photo -- the light in the Visitor Center is tricky.)

The turtle at front center is my mentor David Trujillo's, and I'm pretty sure the rabbit at far left is from Richard Swenson.

The lemurs just right of center are some of Heather Ward's fabulous scratchboard work. You may think of scratchboard as a kids' toy (I know I used to), but Heather turns it into an amazing medium for wildlife art. I'm lucky enough to get to share her studio for the art tour: we didn't have a critical mass of artists in White Rock, just two of us, so we're borrowing space in Los Alamos for the tour.

September 12, 2016 04:38 PM

September 09, 2016

iheartubuntu

How to Read RSS on Ubuntu with FeedReader


One of my favorite apps on linux is FeedReader. Long ago when I used Google products they had a great RSS reader built in, but they phased it out and got rid of it. If you are a fan of reading blogs but don't want emails for new articles piling into your inbox, FeedReader is a very nice app for organizing and handling all of your RSS news.

Let's get right to it and install it, which is very easy with a PPA. Cut and paste each line into a terminal and press Enter...

sudo apt-add-repository ppa:eviltwin1/feedreader-stable

sudo apt-get update && sudo apt-get install feedreader

The only caveat is this is not a true RSS manager but more of an RSS client. You will need to create an external account on one of the RSS web services (usually free) like InoReader, Feedly, or Tiny Tiny RSS. The beauty of this is that your blog feeds will all be synced up.

FeedReader has some nice features, such as sharing an article to your Pocket, Instapaper or Readability accounts, as well as desktop notifications of new articles, keyboard shortcuts, and tagging of articles. As seen in my FeedReader image above, subscribed blogs are listed in one area on the left side. However, if you have a lot of blogs you read, you can group them into categories: Linux blogs, home improvement blogs, science blogs, whatever your desires.

Another bonus is you can even add YouTube channels to your feed! For example, go to a channel's YouTube page, open its "Videos" page, and that's the URL you plug into FeedReader. Here's an example of a YouTube link...

https://www.youtube.com/user/StarTrekContinues/videos

(that's a fan-created Star Trek series in the spirit of the original Star Trek TV show)

by iheartubuntu (noreply@blogger.com) at September 09, 2016 07:30 PM

New Version of Elementary OS Has Landed

The newest version of elementary OS 0.4, dubbed Loki, has landed today and it is impressive. More than a year in development, elementary OS ships with a carefully curated selection of apps that cater to every day needs so you can spend more time using your computer and less time cleaning up bloatware.

One of the major new features of elementary OS is their new open source app store, AppCenter. It brings handcrafted apps and extensions directly to your desktop. Quickly discover new apps and easily update the ones you already have. No license keys, no subscriptions, no trial periods.

Loki is built with the Ubuntu 16.04 LTS repository, which means it comes with Gtk 3.18, Vala 0.32, and Linux 4.4, as well as a multitude of other updated libraries. Loki also replaces Ubuntu's Ayatana indicators with a brand new set of wingpanel indicators for a better experience.

From their website...

"elementary OS is Open Source... Our code is available for review, scrutiny, modification, and redistribution by anyone. We don't make advertising deals and we don't collect sensitive personal data. Our only income is directly from our users. elementary OS is Safe & Secure. We're built on Linux: the same software powering the U.S. Department of Defense, the Bank of China, and more"

If you want an impressive, beautiful and snappy linux desktop based on Ubuntu, give elementary OS a try.

by iheartubuntu (noreply@blogger.com) at September 09, 2016 06:06 PM

September 06, 2016

Elizabeth Krumbach

Labor Day Weekend in SF

Labor Day weekend was a busy one for us this year, as the last weekend MJ and I will be properly spending together for several weeks. Tomorrow he flies off to India for work. He’ll get home mid-day the following Friday, and Saturday morning I’ll be off to Germany. When I get home he’ll be in New York, and directly from there we’ll meet at the end of September in Philadelphia for a couple of events we have planned with family and friends. October will only be marginally better. But I already have three trips planned and he has at least one.

Saturday morning we went to services and then lunch in Japantown. We had a bit of a medical adventure in the afternoon (all is well) before going off to dinner. MJ started a new job at the beginning of August while I was in Mumbai and we hadn’t had time to properly schedule time to celebrate. After some searching for a nice place to go where getting reservations wasn’t too much of a hassle, we ended up at Luce. They have a Michelin star, reservations were easy to get, they offer a tasting menu Tuesday through Saturday and they’re located just a few blocks from home.

I really enjoyed the ambiance of the restaurant, in spite of it being just off the lobby for the InterContinental. It was spacious and cool, and not very busy for a Saturday night, so we had privacy to talk and didn’t feel crowded. The food was good, with the portions being just right for a tasting menu. There were a lot of fish dishes (menu here), which suited me just fine, both the salmon and the halibut were amazing. I skipped the foie gras, but I did have the duck course, which I’m not usually that keen on, but it was not as tender and rare as is usually served so I was OK with it. The desserts were light, cold and fruity, making for a really nice ending to the meal that wasn’t at all heavy. I also opted for the wine pairing, which MJ dipped into throughout the meal. There was only one red among the selection, with the rest being a series of dry and sweet whites.


Selections from the tasting menu at Luce

Sunday we had a long lunch over at Waterbar on the Embarcadero. We’d been to the restaurant next door, EPIC Steak, many times but this was our first time snagging reservations over at Waterbar. They have an amazing oyster list and the views of the Bay Bridge there are stunning, especially on a day as beautiful as Sunday was.

After lunch we went to the Exploratorium. We’d been there a few times before, so this visit was specifically to see their exhibit Strandbeest: The Dream Machines of Theo Jansen.

I knew about the Strandbeests after seeing a video about them online. These mechanical animals are created by a Dutch artist and propel themselves along beaches. He makes them out of a type of plastic piping and has come up with a whole evolution through the various animals he’s made since beginning this work in 1990. They all have names, and the exhibit went through various iterations, some of which they had examples of on display. It talked about the “nerves” and “muscles” that the Strandbeests have, and how some are propelled by wind but many also have mechanisms for limited self-propulsion. We also sat through a 32-minute video they were showing about them.

What was most striking about the exhibit was how strange it all was. This artist devoted a nice chunk of his life to this work and it’s kind of an unusual thing, but the Strandbeests are amazing. They are mechanical but appear so lifelike when they move, all in spite of being so very obviously built of plastic as they wander along beaches. More photos from the exhibit here: https://www.flickr.com/photos/pleia2/albums/72157673332520306

Monday didn’t have so many adventures. As we prepare for all this overlapping travel, we had a lot to get squared away at home and in preparation for our trips (like booking one of them! And packing!). Though we did have time to sneak out to a nice brunch together at the nearby Red Dog Restaurant.

by pleia2 at September 06, 2016 05:14 AM

Akkana Peck

The Taos Earthships (and a lovely sunset)

We drove up to Taos today to see the Earthships.

[Taos Earthships] Earthships are sustainable, completely off-the-grid houses built of adobe and recycled materials. That was pretty much all I knew about them, except that they were weird looking; I'd driven by on the highway a few times (they're on highway 64 just west of the beautiful Rio Grande Gorge Bridge) but never stopped and paid the $7 admission for the self-guided tour.

[Earthship construction] Seeing them up close was fun. The walls are made of old tires packed with dirt, then covered with adobe. The result is quite strong, though like all adobe structures it requires regular maintenance if you don't want it to melt away. For non load bearing walls, they pack adobe around old recycled bottles or cans.

The houses have a passive solar design, with big windows along one side that make a greenhouse for growing food and freshening the air, as well as collecting warmth in cold weather. Solar panels provide power -- supposedly along with windmills, but I didn't see any windmills in operation, and the ones they showed in photos looked too tiny to offer much help. To help make the most of the solar power, the house is wired for DC, and all the lighting, water pumps and so forth run off low voltage DC. There's even a special DC refrigerator. They do include an AC inverter for appliances like televisions and computer equipment that can't run directly off DC.

Water is supposedly self sustaining too, though I don't see how that could work in drought years. As long as there's enough rainfall, water runs off the roof into a cistern and is used for drinking, bathing etc., after which it's run through filters and then pumped into the greenhouse. Waste water from the greenhouse is used for flushing toilets, after which it finally goes to the septic tank.

All very cool. We're in a house now that makes us very happy (and has excellent passive solar, though we do plan to add solar panels and a greywater system some day) but if I was building a house, I'd be all over this.

We also discovered an excellent way to get there without getting stuck in traffic-clogged Taos (it's a lovely town, but you really don't want to go near there on a holiday, or a weekend ... or any other time when people might be visiting). There's a road from Pilar that crosses the Rio Grande then ascends to the mesa high above the river, continuing up to highway 64 right near the earthships. We'd been a little way up that road once, on a petroglyph-viewing hike, but never all the way through. The map said it was dirt from the Rio all the way up to 64, and we were in the Corolla, since the Rav4's battery started misbehaving a few days ago and we haven't replaced it yet.

So we were hesitant. But the nice folks at the Rio Grande Gorge visitor center at Pilar assured us that the dirt section ended at the top of the mesa and any car could make it ("it gets bumpy -- a New Mexico massage! You'll get to the top very relaxed"). They were right: the Corolla made it with no difficulty and it was a much faster route than going through Taos.

[Nice sunset clouds in White Rock] We got home just in time for the rouladen I'd left cooking in the crockpot, and then finished dinner just in time for a great sunset sky.

A few more photos: Earthships (and a great sunset).

September 06, 2016 03:05 AM

September 01, 2016

Jono Bacon

The Psychology of Report/Issue Templates

This week HackerOne, who I have been working with recently, landed Report Templates.

In a nutshell, a report template is a configurable chunk of text that can be pre-loaded into the vulnerability submission form instead of a blank white box. For example:


The goal of a report template is two-fold. Firstly, it helps security teams to think about what specific pieces of information they require in a vulnerability report. Secondly, it provides a useful way of ensuring a hacker provides all of these different pieces of information when they submit a report.

This simple feature should improve the overall quality of reports submitted to HackerOne customers, improve the success of hackers by ensuring their vulnerability reports match the needs of security teams, and result in better overall engagement on the platform.

Similar kinds of templates can be seen in platforms such as Discourse, GitLab, GitHub, and elsewhere. While a simple feature, there are some subtle underlying psychological components that I thought could be interesting to share.

The Psychology Behind the Template

When I started working with HackerOne the first piece of work I did was to (a) understand the needs/concerns of hackers and customers, and then, based on this, (b) perform a rigorous assessment of the typical community workflow to ensure that it mapped to these requirements. My view is simple: if you don’t have a simple and effective workflow, it doesn’t matter how much outreach you do; people will get confused and give up.

This view fits into a wider narrative that has accompanied my work over the years that at the core of great community leadership is intentionally influencing the behavior we want to see in our community participants.

When I started talking to the HackerOne team about Report Templates (an idea that had already been bounced around), building this intentional influence was my core strategic goal. Customers on HackerOne clearly want high quality reports. Low quality reports suck up their team’s time, compromise the value of the platform, and divert resources from other areas. Similarly, hackers should be set up for success. A core metric for a hacker is Signal, and a signal threshold is used by many of the private programs that operate on HackerOne.

In my mind, Report Templates were a logical area to focus on for a few reasons.

Firstly, as with almost everything in life, the root of most problems is misaligned expectations. Think about spats with your boss or spouse, frustrations with your cable company, and other annoyances as examples of this.

A template provides an explicit tool for the security team to state exactly what they need. This reduces ambiguity, which in turn reduces uncertainty, which has proven to be a psychological blocker, and a particularly dangerous one in communities.

There has also been some interesting research into temptation, and one of the findings is that people often make irrational choices when they are in a state of temptation or arousal. Thus, when people are in a state of temptation, it is critical for us to build systems that can responsibly deliver positive results for them. Otherwise, people feel tempted, initiate an action, do not receive the rewards they expected (e.g. validation/money in this case), and then feel discomfort at the outcome.

Every platform plays to this temptation desire. Whether it is being tempted to buy something on Amazon, temptation to download and try a new version of Ubuntu, temptation to respond to that annoying political post from your Aunt on Facebook, or a temptation to submit a vulnerability report in HackerOne, we need to make sure the results of the action, at this most delicate moment, are indeed positive.

Report Templates (or Issue/Post Templates in other platforms) play this important role. They are triggered at the moment the user decides to act. If we simply give the user a blank white box to type into, we run the risk of that temptation not resulting in said suitable reward. Thus, the Report Template greases the wheels, particularly within the expectations-setting piece I outlined above.

Finally, and as relates to temptation, I have become a strong believer in influencing behavioral patterns at the point of action. In other words, when someone decides to do something, it is better to tune that moment to influence the behavior you want rather than try to prime people to make a sensible decision before they do so.

In the Report Templates example, we could have alternatively written oodles and oodles of documentation, provided training, and delivered webinars/seminars and other content to encourage hackers to write great reports. There is, though, no guarantee that this would have influenced their behavior. With a Report Template, because it is presented at the point of action (and temptation), we can influence the right kind of behavior at the right time. This generally delivers better results.

This is why I love what I do for a living. There are so many fascinating underlying attributes, patterns, and factors that we can learn from and harness. When we do it well, we create rewarding, successful, impactful communities. While the Report Templates feature may be a small piece of this jigsaw, combined with similar efforts it helps create a pretty rewarding picture.

The post The Psychology of Report/Issue Templates appeared first on Jono Bacon.

by Jono Bacon at September 01, 2016 08:50 PM

August 30, 2016

Elizabeth Krumbach

Layover “at” Heathrow

While on my way back from Mumbai several weeks ago I ended up with a seven hour layover at Heathrow. I had planned on just camping out in an airport lounge for that time and catching up on email, open source stuff, work stuff. Then my friend Laura reached out to see if I wanted her to pick me up at the airport so we could escape for a few hours and grab breakfast.

I’d never left the airport on a layover like this, but the chance to catch up with a good friend and the ability to enter without needing to plan for a visa were too good to pass up. Leaving immigration was fun, having to explain that I’d only be out of the airport for five hours. And so, with 7 hours between flights, I properly entered England for the second time in my life.

We ended up at Wetherspoon’s pub in Woking for breakfast. Passing on the pork-heavy English breakfast, I had a lovely smoked salmon benedict and some tea.

The weather that morning was beautiful, so after breakfast we wandered around town and stopped in a shop or two. We got some more tea (this time with cake!) and generally caught up. It was really nice to chat about our latest career stuff, geek out about open source and fill each other in on our latest life plans.

Definitely the best layover I’ve ever had, I’m super glad I didn’t just stay in the airport lounge! I’ll remind myself of this the next time the opportunity arises.

A handful of other photos here: https://www.flickr.com/photos/pleia2/albums/72157671052472560

by pleia2 at August 30, 2016 03:52 PM

Jono Bacon

My Reddit AMA Is Live – Join Us

Just a quick note that my Reddit Ask Me Anything discussion is live. Be sure to head over to this link and get your questions in!

All and any questions are absolutely welcome!

The post My Reddit AMA Is Live – Join Us appeared first on Jono Bacon.

by Jono Bacon at August 30, 2016 03:43 PM

Linux, Linus, Bradley, and Open Source Protection

Last week a bun-fight kicked off on the Linux kernel mailing list that led to some interesting questions about how and when we protect open source projects from bad actors. It also shone a light on some interesting community dynamics.

The touchpaper was lit when Bradley Kuhn, president of the Software Freedom Conservancy (an organization that provides legal and administrative services for free software and open source projects) posted a reply to Greg KH on the Linux kernel mailing list:

I observe now that the last 10 years brought something that never occurred before with any other copylefted code. Specifically, with Linux, we find both major and minor industry players determined to violate the GPL, on purpose, and refuse to comply, and tell us to our faces: “you think that we have to follow the GPL? Ok, then take us to Court. We won’t comply otherwise.” (None of the companies in your historical examples ever did this, Greg.) And, the decision to take that position is wholly in the hands of the violators, not the enforcers.

He went on to say:

In response, we have two options: we can all decide to give up on the GPL, or we can enforce it in Courts.

This rather ruffled Linus’s feathers who feels that lawyers are more part of the problem than the solution:

The fact is, the people who have created open source and made it a success have been the developers doing work – and the companies that we could get involved by showing that we are not all insane crazy people like the FSF. The people who have destroyed projects have been lawyers that claimed to be out to “save” those projects.

What followed has been a long and quite interesting discussion that is still rumbling on.

In a nutshell, this rather heated (and at times unnecessarily personal) debate has focused on when the right time is to defend the rights granted by the GPL. Bradley is of the view that these rights should be intrinsically defended, as they are as important (if not more important) than the code. Linus is of the view that the practicalities of the software industry mean sending in the lawyers can have an even more damaging effect, as companies will tense up and choose to stay away.

Ethics and Pragmatism

Now, I have no dog in this race. I am a financial supporter of the Software Freedom Conservancy and the Free Software Foundation. I have an active working relationship with the Linux Foundation and I am friends with all the main players in this discussion, Linus, Greg, Bradley, Karen, Matthew, and Jeremy. I am not on anyone’s “side” here and I see value in the different perspectives brought to the table.

With that said, the core of this debate is the balance of ethics and pragmatism, something which has existed in open source and free software for a long time.

Linus and Bradley are good examples of either side of the aisle.

Linus has always been a pragmatic guy, and his stewardship of Linux has demonstrated that. Linus prioritizes the value of the GPL for practical software engineering and community-building purposes more so than for wider ideological free software ambitions. With Linus, practicality and tangible output come first.

Bradley is different. For Bradley, software freedom is first and foremost a moral issue. Bradley’s talents and interests lie with the legal and copyright aspects more so than software engineering, so naturally his work has focused on licensing, copyright, and protection.

Now, this is not to suggest Linus doesn’t have ethics or that Bradley isn’t pragmatic, but their priorities are drawn in different areas. This results in differences in expectations, tone, and approach, with this debate being a good example.

Linus and Bradley are not alone here. For a long time there have been differences between organizations such as the Linux Foundation, the Free Software Foundation, and the Open Source Initiative. Again, each of these organizations draws its ethical and pragmatic priorities differently, and they attract supporters who commonly share those similar lines in the sand.

I am a supporter of all of these organizations. I believe the Linux Foundation has had an unbelievably positive effect in normalizing and bridging the open source culture, methodology, and mindset to the wider business world. The Open Source Initiative has done wonderful work as steward of licenses that thousands of organizations depend on. The Free Software Foundation has laid out a core set of principles around software freedom that are worthy for us all to strive for.

As such, I often take the view that everyone is bringing value, but everyone is also somewhat blinded by their own priorities and biases.

My Conclusion

Unsurprisingly, I see value in both sides of the debate.

Linus rightly raises the practicalities of the software industry. This is an industry that is driven by a wide range of different forcing functions and pressures: politics, competition, supply/demand, historical precedent, cultural norms, and more. Many of these companies do great things, and some do shitty things. That is human beings for you.

As such, and like any industry, nothing is black and white. This isn’t as simple as Company A licenses code under the GPL and if they don’t meet the expectations of the license they should face legal consequences until they do. Each company has a delicate mix of these driving forces and Linus is absolutely right that a legal recourse could potentially have the inverse effect of reducing participation rather than improving it.

On the other hand, the GPL (or any other open source license) does have to have meaning. As we have seen in countless societies throughout history, if rules are not enforced, people will naturally try to break them. This always starts with small infractions but then grows more and more as the waters are tested. So, Bradley raises an important point: while we should take a realistic and pragmatic approach to the norms of the industry, we do need people who are willing and able to enforce open source licenses.

The subtlety is in how we handle this. We need to lead with nuance and negotiation, not with antagonistic legal implications. The lawyers have to be a last resort, and we should all be careful not to threaten overblown legal recourse against organizations that skirt the requirements of these licenses.

Anyone who has been working in this industry knows that the way you get things done in an organization is via a series of indirect nudges. We change organizations and industries with relationships, trust, and collaboration, and providing a supporting function to accomplish the outcome we want.

Of course, sometimes there have to be legal consequences, but this has to genuinely be a last resort. We should not be under the illusion that legal action is an isolated act of protection. While legal action may protect the GPL in that specific scenario, it will also freak out lots of people watching it unfold. Thus, it is critical that we consider the optics of legal action as much as the practical benefits within that specific case.

The solution here, as is always the case, is more dialog that is empathetic to the views of those we disagree with. Linus, Bradley, and everyone else embroiled in this debate are on the right side of history. We just need to work together to find common ground and strategies: I am confident they are there.

What do you think? Do I have an accurate read on this debate? Am I missing something important? Share your thoughts below in the comments!

The post Linux, Linus, Bradley, and Open Source Protection appeared first on Jono Bacon.

by Jono Bacon at August 30, 2016 05:43 AM

August 29, 2016

Jono Bacon

Join my Reddit AMA Tomorrow


Just a short reminder that tomorrow, Tuesday 30th August 2016 at 9am Pacific (see other time zone times here) I will be doing a Reddit AMA about community strategy/management, developer relations, open source, music, and anything else you folks want to ask about.

Want to ask questions about Canonical/GitHub/XPRIZE? Questions about building great communities? Questions about open source? Questions about politics or music? All questions are welcome!

To join, simply do the following:

  • Be sure to have a Reddit account. If you don’t have one, head over here and sign up.
  • On Tuesday 30th August 2016 at 9am Pacific (see other time zone times here) I will share the link to my AMA on Twitter (I am not allowed to share it until we run the AMA). You can look for this tweet by clicking here.
  • Click the link in my tweet to go to the AMA and then click the text box to add your question(s).
  • Now just wait until I respond. Feel free to follow up, challenge my response, and otherwise have fun!

I hope to see you all tomorrow!

The post Join my Reddit AMA Tomorrow appeared first on Jono Bacon.

by Jono Bacon at August 29, 2016 03:00 PM

Nathan Haines

Announcing the Ubuntu 16.10 Free Culture Showcase!

It’s time once again for the Ubuntu Free Culture Showcase!

The Ubuntu Free Culture Showcase is a way to celebrate the Free Culture movement, where talented artists across the globe create media and release it under licenses that encourage sharing and adaptation. We're looking for content which shows off the skill and talent of these amazing artists and will greet Ubuntu 16.10 users.

More information about the Free Culture Showcase is available at https://wiki.ubuntu.com/UbuntuFreeCultureShowcase

This cycle, we're looking for beautiful wallpaper images that will literally set the backdrop for new users as they experience Ubuntu 16.10 for the first time.  Submissions will be handled via Flickr at https://www.flickr.com/groups/ubuntu-fcs-1610/

I'm looking forward to seeing the next round of entrants, and to having a difficult time picking final choices to ship with Ubuntu 16.10.

August 29, 2016 11:13 AM

August 28, 2016

Elizabeth Krumbach

Local Sights

I’ve been going running along the Embarcadero here in San Francisco lately. These runs afford me fresh air coming off the bay, stunning views of the bay itself, a chance to run under the beautiful Bay Bridge and down to the AT&T ballpark. I run past palm trees and E-Line street cars, and the weather is cool and clear enough to pretty much do it every day. In short, it sometimes feels like we live in paradise.

Naturally, we like to share that with friends and family who visit. I’ve had a fun year of local touristing as cousins, sisters and friends have been in town visiting. Our favorite place to take them is Fort Baker. It’s almost always less chaotic than the lookout point at the north side of the bridge, and you actually get to walk around a fair amount to get some views of both the Golden Gate Bridge and San Francisco itself. It’s also where I got my head shots done, including the header image I’ve used for this blog for several years. I’m a big fan of the city skyline from there.

Back in April we made a visit up to The Marine Mammal Center, which I wrote about here. We took an alternate route back due to a closed tunnel, and that’s how we ended up looking down at the Golden Gate Bridge from the northwest edge, the one view I hadn’t seen yet. It’s a pretty exceptional one, getting to see the undeveloped hilly area on the north side and then the San Francisco city skyline in the far distance. I probably could have sat there all day.

Alas, I didn’t have all day. I had only taken the morning off from work and I had to grab a bite before catching the ferry back to San Francisco from Sausalito while MJ took everyone else on to Muir Woods. Now, I’d taken a ferry in the bay before, one to Alcatraz to do some tourist visiting, another to Alameda and back when visiting a potential location for a Partimus computer lab deployment. It’s always been a beautiful ride, but the ride from Sausalito to San Francisco lands into exceptional territory. You get views of several islands, both of San Francisco’s bridges, Alcatraz, Sausalito and the city. I was so happy on this ferry ride that I even had a conversation with a couple who was in town visiting from Canada and answered piles of questions about what we were seeing. This is something that shy, introverted me hardly ever does.

We also take folks up to Twin Peaks. How many cities in the world are there where you can climb a hill and look at downtown? In San Francisco, you can go up to Twin Peaks. It’s breathtaking.

Nice bay, right? We have an ocean too. I spent my youth on the coast of Maine. I didn’t sneak out to late night parties when I was a teenager, I snuck out to go to the park and sit by the ocean. My head clearing spot? The ocean. Needed cheering up when I was depressed? Trip to the ocean. First kiss? Happened right there on the rocks by the ocean. My love for being near the coast is a pretty deep part of who I am.

From the Cliff House on the western side of the city you get some great views of the beach stretching south.

Looking north you can see the ruins of the Sutro Baths, which opened in 1896 and lasted through the middle of the 20th century. Looking beyond, you can see the other side of the Golden Gate.

Further views we caught this spring are in a pair of albums on Flickr, by month: April and June

by pleia2 at August 28, 2016 10:07 PM

August 27, 2016

kdub

Mir 0.24 Release

Mir 0.24 was just released this week!

We’ve reworked a few things internally and fixed a fair number of bugs. Notably, our buffer swapping system and our input keymapping system were reworked (Alt-Gr should now work for international keyboards). There were also some improvements to the server API to make window management better.

I’m most excited about the internal buffer swapping mechanism changes, as they are what I’ve been working to release for a while now. The internal changes get us ready for Vulkan [1], improve our multimedia support [2], improve our WiDi support, and reduce latency in nested server scenarios [3].

This is prep work for releasing some new client API functions (perhaps in 0.25, depending on how the trade winds are blowing… they’re currently gated in non-public project directories here). More on that once the headers are released.

[1]
Vulkan is a new rendering API from Khronos designed to give finer-grained GPU control and more parallel operation between the CPU and the GPU.

[2]
Especially multimedia decoding and encoding, which need more arbitrary buffer control.

[3]
“Unity8” runs in a nested server configuration for multiuser support (among other reasons). unity-system-compositor controls the framebuffer, unity8 sessions connect to unity-system-compositor, and clients connect to the appropriate unity8 session. More fine-grained buffer submissions allow us to forward buffers more creatively, making sure the clients have zero-copy more often.

by kdub at August 27, 2016 07:19 AM

August 26, 2016

Akkana Peck

More map file conversions: ESRI Shapefiles and GeoJSON

I recently wrote about Translating track files between mapping formats like GPX, KML, KMZ and UTM. But there's one common mapping format that keeps coming up that's hard to handle using free software, and tricky to translate to other formats: ESRI shapefiles.

ArcGIS shapefiles are crazy. Typically they come as an archive that includes many different files, with the same base name but different extensions: filename.sbn, filename.shx, filename.cpg, filename.sbx, filename.dbf, filename.shp, filename.prj, and so forth. Which of these are important and which aren't?

To be honest, I don't know. I found this description in my searches: "A shape file map consists of the geometry (.shp), the spatial index (.shx), the attribute table (.dbf) and the projection metadata file (.prj)." Poking around, I found that most of the interesting metadata (trail name, description, type, access restrictions and so on) was in the .dbf file.

You can convert the whole mess into other formats using the ogr2ogr program. On Debian it's part of the gdal-bin package. Pass it the .shp filename, and it will look in the same directory for files with the same basename and other shapefile-related extensions. For instance, to convert to KML:

 ogr2ogr -f KML output.kml input.shp

Unfortunately, most of the metadata -- comments on trail conditions and access restrictions that were in the .dbf file -- didn't make it into the KML.

GPX was even worse. ogr2ogr knows how to convert directly to GPX, but that printed a lot of errors like "Field of name 'foo' is not supported in GPX schema. Use GPX_USE_EXTENSIONS creation option to allow use of the <extensions> element." So I tried

 ogr2ogr -f "GPX" -dsco GPX_USE_EXTENSIONS=YES output.gpx input.shp

but that just led to more errors. It did produce a GPX file, but it had almost no useful data in it, far less than the KML did. I got a better GPX file by using ogr2ogr to convert to KML, then using gpsbabel to convert that KML to GPX.

Use GeoJSON instead to preserve the metadata

But there is a better way: GeoJSON.

ogr2ogr -f "GeoJSON" -t_srs crs:84 output.geojson input.shp

That preserved most, maybe all, of the metadata from the .dbf file and gave me a nicely formatted file. The only problem was that I didn't have any programs that could read GeoJSON ...

[PyTopo showing metadata from GeoJSON converted from a shapefile]

But JSON is a nice straightforward format, easy to read and easy to parse, and it took surprisingly little work to add GeoJSON parsing to PyTopo. Now, at least, I have a way to view the maps converted from shapefiles, click on a trail and see the metadata from the original shapefile.
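To give a sense of how little work that parsing takes, here is a minimal sketch of walking a GeoJSON FeatureCollection and pulling out the per-feature metadata that originally lived in the .dbf. The sample data is embedded inline so it runs standalone; the field names NAME and COND are made-up examples, since real shapefiles carry whatever columns the original .dbf defined.

```python
import json

# A tiny GeoJSON FeatureCollection, shaped like ogr2ogr's output:
# each feature keeps the old .dbf columns in its "properties" dict.
sample = '''{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "properties": {"NAME": "Example Trail", "COND": "open"},
     "geometry": {"type": "LineString",
                  "coordinates": [[-106.3, 35.8], [-106.2, 35.9]]}}
  ]
}'''

data = json.loads(sample)
for feature in data["features"]:
    props = feature["properties"]            # metadata from the .dbf
    coords = feature["geometry"]["coordinates"]
    print(props.get("NAME"), props.get("COND"), len(coords))
    # prints: Example Trail open 2
```

With a real file you would replace `sample` with `open("output.geojson").read()`; the loop stays the same, which is roughly what a viewer like PyTopo needs to do to show trail names and conditions on click.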

See also:

August 26, 2016 06:11 PM

August 25, 2016

Jono Bacon

Social Media: 10 Ways To Not Screw It Up

Social media is everywhere. Millions of users, seemingly almost as many networks, and many agencies touting that they have mastered the zen-like secrets to social media and can bring incredible traction.

While social media has had undeniable benefits for many, it has also been contorted and twisted in awkward ways. For every elegant, well-delivered social media account there are countless blatant attention-grabbing efforts.

While I am by no means a social media expert, over the years I have picked up some techniques and approaches that I have found useful with the communities, companies, and clients I have worked with. My goal has always been to strike a good balance between quality, engagement, and humility.

I haven’t always succeeded, but here are 10 things I recommend you do if you want to do social media well:

1. Focus on Your Core Networks

There are loads of social media networks out there. For some organizations there is an inherent temptation to grow an audience on all of them. More audiences mean more people, right?

Well, not really.

As with most things in life, it is better to have focus and deliver quality than to spread yourself too thin. So, pick a few core networks and focus on them. Focus on delivering great content, growing your audience, and engaging well.

My personal recommendation is to focus on Twitter and Facebook for sure, as they have significant traction, but Instagram and Google+ are good targets too. It is really up to you, though, as to what works best for your organization and goals.

2. Configure Your Accounts Well

Every social media network has some options for choosing an avatar, banner, and adding a little text. It is important to get this right.

Put yourself in the position of your audience. Imagine they don’t know who you are and they stumble on your profile. Sure, a picture of a care bear and a quote from The Big Lebowski may look cool, but it doesn’t help the reader.

Their reading of this content is going to result in a judgement call about you. So, reflect yourself accurately. Want to be a professional? Look and write professionally. Want to be a movie fan who believes in magical bears? Well, erm, I guess you know what to do.

It is also important to do this for SEO (Search Engine Optimization). If you want more Google juice for your name/organization, be sure to incorporate it in your profiles and content.

3. Quality vs. Quantity

A while back I spent a bit of time working with some folks who were really into social media. They had all kinds of theories about how the Facebook and Twitter algorithms prioritize content, hide it from users, and only display certain types of content to others. Of course this is not an exact science as these algorithms are typically confidential to those networks.

There is no doubt that social networks have to make some kind of judgement on what to show – there is just too much material to show it all. So, we want to be mindful of these restrictions, but also aware that a lot of this is guesswork.

The trick here is simple: focus on delivering high quality content and just don’t overdo it. Posting 50 tweets in a day is not going to help – it will be too much and probably not high quality (likely due to the quantity). Even if your audience sees it all, it will just seem spammy.

Now, you may be asking what high quality content looks like. Fundamentally, I see it as understanding your audience and how they communicate, and mirroring those interests and that tonality. Some examples:

  • Well written content that is concise, loose, and fun.
  • Interesting thoughts, ideas, and discussions.
  • Links to interesting articles, data, and other material.
  • Interesting embedded pictures, videos, and other content.

Speaking of embedding…

4. Embed Smartly

All the networks allow you to embed pictures and videos in your social posts.

Where possible, always embed something. It typically results in higher performing posts both in terms of views and click-rate.

Video has proven to do very well on social media networks. People are naturally curious and click the video to see it. Be mindful here though – posting a 45-minute documentary isn’t going to work well. A 2-minute clip, on the other hand, will work great.

Also, check how different networks display videos. For example, on Twitter and Google+, YouTube videos get a decent sized thumbnail and are simple to play. On Facebook though, YouTube videos are noticeably smaller (likely because Facebook doesn’t want people embedding YouTube videos). So, when posting on Facebook, uploading a native video might be best.

Pictures are an interesting one. A few tips:

  • Square pictures work especially well. They resize well in most social interfaces to take up the maximum amount of space.
  • The ideal size is 505×505 pixels on Facebook. I have found this size to work well on other networks too.
  • Images that work particularly well are high contrast and have large letters. They stand out more in a feed and make people want to click them. An example of an image I am using for my Reddit AMA next week:

Social Media
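To get an arbitrary photo into that square shape, a centered crop before resizing does the trick. Here is a quick helper (my own sketch, not from any particular image library) that computes the crop box you would hand to whatever resizing tool you use:

```python
def square_crop_box(width, height):
    """Return (left, upper, right, lower) for a centered square crop.

    The resulting box can be passed to an image library's crop call
    before resizing the square down (or up) to 505x505 pixels.
    """
    side = min(width, height)          # the square's side is the shorter edge
    left = (width - side) // 2         # center horizontally
    upper = (height - side) // 2       # center vertically
    return (left, upper, left + side, upper + side)
```

For example, a 1200×800 landscape photo crops to the centered 800×800 box (200, 0, 1000, 800), which you would then scale to 505×505.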

5. Be Authentic

Authenticity is essential in any human communication. As humans we are constantly advertised to, sold to, and marketed at, and thus we have developed an increasingly sensitive bullshit radar.

This radar gets triggered when we see inauthentic content. Examples of this include content trying to be overly peppy, material that requires too many commitments (e.g. registrations), or clickbait. A classic example from our friends at Microsoft:

Social Media

Social media is fundamentally about sharing and discussion, and about presenting content and tonality that match your audience. Make sure you do both authentically.

Share openly, and discuss openly. Act and talk like a human, not a business book, don’t try to be someone you are not, and you will find your audience enjoys your content and finds your efforts rewarding.

6. Connect and Schedule Your Content

Managing all these social media networks is a pain. Of course, there are many tools that you can use for extensive analytics, content delivery, and team collaboration. While these are handy for professional social media people, for many people they are not particularly necessary.

What I do recommend for everyone though is Buffer.

The idea is simple. Buffer lets you fill a giant bucket full of social media posts that will hit the major networks such as Twitter, Facebook, Google+ (pages), and Instagram. You then set a schedule for when these posts should go out and Buffer will take care of sending them for you at an optimal chosen time.

Part of the reason I love this is that even if you have a busy week and forget to post on social media, you know that content is still going out. Speaking personally, I often line up my posts on a Sunday night and then periodically post during the week.
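To make the model concrete, here is a toy sketch of how a Buffer-style queue behaves (my own illustration, not Buffer's actual API or scheduling logic): posts pile up in a queue, and each one is drained into the next available scheduled slot, in order.

```python
from collections import deque
from datetime import datetime, time, timedelta


class PostBuffer:
    """Toy illustration of a Buffer-style posting queue.

    Posts are queued in order and each is assigned to the next
    scheduled time slot; this sketch only computes the schedule,
    it does not actually send anything anywhere.
    """

    def __init__(self, slot_times):
        self.slot_times = sorted(slot_times)   # e.g. [time(8, 30), time(17, 0)]
        self.queue = deque()

    def add(self, post):
        self.queue.append(post)

    def next_slots(self, start):
        """Yield upcoming slot datetimes strictly after `start`."""
        day = start.date()
        while True:
            for t in self.slot_times:
                slot = datetime.combine(day, t)
                if slot > start:
                    yield slot
            day += timedelta(days=1)

    def schedule(self, start):
        """Pair each queued post with its slot, in queue order."""
        return list(zip(self.queue, self.next_slots(start)))
```

Line up three posts on a Sunday night with morning and evening slots, and they drip out over the following days without any further attention.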

Speaking of optimal times…

7. Timing Is Everything

If you want your content to get a decent number of views and clicks, there are definitely better times than others to post.

Much of this depends on your audience and where you are geographically. As an example, while I have a fairly global audience for my work, a significant number of people are based in the US. As such, I have found that the best time for my content is in the morning between 8am and 9am Pacific. This still catches Europe and out towards India.

To figure out the best time for you, publish posts at different times and look at the analytics to see which perform best. Each social network has analytics available, and Buffer provides a nice analytics view too, although the nicer stats require a professional plan.
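The analysis itself is simple enough to sketch. Something like this (a hypothetical helper of mine, not any network's API) picks the posting hour with the best average engagement from whatever analytics data you export:

```python
from collections import defaultdict


def best_posting_hour(posts):
    """Return the posting hour with the highest average engagement.

    `posts` is an iterable of (hour_posted, engagements) pairs, e.g.
    pulled from a network's analytics export. A toy stand-in for the
    analytics views the networks and Buffer provide.
    """
    totals = defaultdict(lambda: [0, 0])   # hour -> [engagement sum, count]
    for hour, engagement in posts:
        totals[hour][0] += engagement
        totals[hour][1] += 1
    return max(totals, key=lambda h: totals[h][0] / totals[h][1])
```

Feed it a few weeks of data and the winner tends to stabilize on one or two hours, which you can then plug into your posting schedule.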

Knowing what is the best time to post combined with the scheduled posting capabilities of Buffer is a great combo.

8. Deliver Structured Campaigns

You might also want to explore some structured campaigns for your social media efforts. These are essentially themed campaigns designed to get people interested or involved.

A few examples:

  • Twitter Chats – here you simply choose a hashtag and some guests, announce the chat, pose questions via Twitter, and invite your guests and the audience to respond. They can be rather fun.
  • Calls For Action – again, choose a hashtag, and ask your audience for feedback on certain topics. The responses could be questions, suggestions, content, and more.
  • Thematic Content – here you post a series of posts with similar images or videos attached.

You are only limited by your imagination, but remember, be authentic. Social media is riddled with cheesy last-breath attempts at engagement. Don’t be one of those people.

9. Don’t Take Yourself too Seriously

There have been various studies suggesting that social media encourages narcissism. There is certainly observational evidence that backs this up.

You should be proud of your work, proud of your projects, and focus on doing great things. Always try to ensure that you are down to earth though, and demonstrate a grounded demeanor in your posts. No one likes ego, and it is more tempting than ever to use social media as a platform for a confidence boost and to post ego-driven, narcissistic content.

Let’s be honest, we have all made this mistake from time to time. I know I have. We are human beings, after all.

As I mentioned earlier, you always want to try to match your tonality to your audience. For some global audiences though it can be tempting to err on the side of caution and be a little too buttoned up. This often ends up being just boring. Be professional, sure, but surprise your audience with your humanity, your humility, and the fact that there is a real person behind the tweet or post.

10. What Not To Do

Social media can be a lot of fun and with some simple steps (such as these) you can perform some successful and rewarding work. There are a few things I would recommend you don’t do though:

  • Unless you want to be a professional provocateur, avoid deliberately fighting with your audience. You will almost certainly disagree with many of your followers on some political stances – picking fights won’t get you anywhere.
  • Don’t go and follow everyone for the purposes of getting followed back. When I see that Joe Bloggs has 5,434 followers and is following 5,654 people, it smacks of this behavior. 😉
  • Don’t be overtly crass. I know some folks online, and even worked with some people, who just can’t help dropping F bombs, crass jokes, and more online. Be fun, be a little edgy, but keep it classy, people.

So, that’s it. Just a few little tips and tricks I have learned over the years. I hope some of this helps. If you found it handy, click those social buttons on the side and practice what you preach and share this post. 🙂

I would love to learn from you though. What approaches, methods, and techniques have you found for doing social media better? Share your ideas in the comment box and let’s have a discussion…

The post Social Media: 10 Ways To Not Screw It Up appeared first on Jono Bacon.

by Jono Bacon at August 25, 2016 03:00 PM

August 23, 2016

Elizabeth Krumbach

FOSSCON 2016

Last week I was in Philadelphia, which was fun and I got to do some Ubuntu stuff but I was actually there to speak at FOSSCON. It’s not the largest open source conference, but it is in my adopted home city of Philadelphia and I have piles of friends, mentors and family there. I love attending FOSSCON because I get to catch up with so many people, making it a very hug-heavy conference. I sadly missed it last year, but I made sure to come out this year.

They also invited me to give a closing keynote. After some back and forth about topics, I ended up with a talk on “Listening to the Needs of Your Global Open Source Community” but more on that later.

I kicked off my morning by visiting my friends at the Ubuntu booth, and meeting up with my OpenStack and HPE colleague Ma Dong who had flown in from Beijing to join us. I made sure we got our picture taken by the beautiful Philadelphia-themed banner that the HPE open source office designed and sent for the event.

At 11AM I gave my regular track talk, “A Tour Of OpenStack Deployment Scenarios.” My goal here was to provide a gentle introduction, with examples, of the basics of OpenStack and how it may be used by organizations. My hope is that the live demos of launching instances from the Horizon web UI and the OpenStack client were particularly valuable in making the connection between the concepts of building a cloud and the actual tooling you might use. The talk was well-attended and I had some interesting chats later in the day. I learned that a number of the attendees are currently using proprietary cloud offerings and looking for options to in-house some of that.

The demos were very similar to the tutorial I gave at SANOG earlier this month, but the talk format was different. Notes from demos here and slides (219K).


Thanks to Ma Dong for taking a picture during my talk! (source)

For lunch I joined other sponsors at the sponsor lunch over at the wonderful White Dog Cafe just a couple blocks from the venue. Then it was a quick dash back to the venue for Ma Dong’s talk on “Continuous Integration And Delivery For Open Source Development.”

He outlined some of the common mechanisms for CI/CD in open source projects, and how the OpenStack project has solved them for a project that eclipses most others in size, scale and development pace. Obviously it’s a topic I’m incredibly familiar with, but I appreciated his perspective as a contributor who comes from an open source CI background and has now joined us doing QA in OpenStack.


Ma Dong on Open Source CI/CD

After his talk it was also nice to sit down for a bit to chat about some of the latest changes in the OpenStack Infrastructure. We were able to catch up about the status of our Zuul tooling and general direction of some of our other projects and services. The day continued with some chats about Jenkins, Nodepool and how we’ve played around with infrastructure tooling to cover some interesting side cases. It was really fun to meet up with some new folks doing CI things to swap tips and stories.

Just before my keynote I attended the lightning talks for a few minutes, but had to depart early to get set up in the big room.

The keynote on “Listening to the Needs of Your Global Open Source Community” was a completely new talk for me. I wrote the abstract for it a few weeks ago for another conference CFP at the suggestion of my boss. The talk walked through eight tips for facilitating the collection of feedback from your community as one of the project leaders or infrastructure representatives.

  • Provide a simple way for contributors to contact project owners
  • Acknowledge every piece of feedback
  • Stay calm
  • Communicate potential changes and ask for feedback
  • Check in with teams
  • Document your processes
  • Read between the lines
  • Stick to your principles

With each of these, I gave some examples from my work, mostly in the Ubuntu and OpenStack communities. Some of the examples were pretty funny, and likely very familiar to any systems folks who interface with users. The Q&A at the end of the presentation was particularly interesting: I was very focused on open source projects since that’s where my expertise lies, but members of the audience felt that my suggestions were more broadly applicable. In those moments after my talk I was invited to speak on a podcast and encouraged to write a series of articles related to my talk. Now I’m aiming to write some OpenSource.com content over the next couple of weeks.

Slides from the talk are here (7.3M pdf).


And thanks to Josh, José, Vincent and Nathan for snapping some photos of the talk too!

The conference wound down following the keynote with a raffle, and we then went our separate ways. For me, it was time to spend time with friends over a martini.

A handful of other photos from the conference here: https://www.flickr.com/photos/pleia2/albums/72157671843605132

by pleia2 at August 23, 2016 09:01 PM

Jono Bacon

Bacon Roundup – 23rd August 2016

Well, hello there, people. I am back with another Bacon Roundup which summarizes some of the various things I have published recently. Don’t forget to subscribe to get the latest posts right to your inbox.

Also, don’t forget that I am doing a Reddit AMA (Ask Me Anything) on Tues 30th August 2016 at 9am Pacific. Find out the details here.

Without further ado, the roundup:

Building a Career in Open Source (opensource.com)
A piece I wrote about how to build a successful career in open source. It delves into finding opportunity, building a network, always learning/evolving, and more. If you aspire to work in open source, be sure to check it out.

Cutting the Cord With Playstation Vue (jonobacon.org)
At home we recently severed ties with DirecTV (for lots of reasons, this being one), and moved our entertainment to a Playstation 4 and Playstation Vue for TV. Here’s how I did it, how it works, and how you can get in on the action.

Running a Hackathon for Security Hackers (jonobacon.org)
Recently I have been working with HackerOne and we recently ran a hackathon for some of the best hackers in the world to hack popular products and services for fun and profit. Here’s what happened, how it looked, and what went down.

Opening Up Data Science with data.world (jonobacon.org)
Recently I have also been working with data.world who are building a global platform and community for data, collaboration, and insights. This piece delves into the importance of data, the potential for data.world, and what the future might hold for a true data community.

From The Archive

To round out this roundup, here are a few pieces I published from the archive. As usual, you can find more here.

Using behavioral patterns to build awesome communities (opensource.com)
Human beings are pretty irrational a lot of the time, but irrational in predictable ways. These traits can provide a helpful foundation in which we build human systems and communities. This piece delves into some practical ways in which you can harness behavioral economics in your community or organization.

Atom: My New Favorite Code Editor (jonobacon.org)
Atom is an extensible text editor that provides a thin and sleek core and a raft of community-developed plugins for expanding it into the editor you want. Want it like vim? No worries. Want it like Eclipse? No worries. Here’s my piece on why it is neat and recommendations for which plugins you should install.

Ultimate unconference survival guide (opensource.com)
Unconferences, for those who are new to them, are conferences in which the attendees define the content on the fly. They provide a phenomenal way to bring fresh ideas to the surface. They can though, be a little complicated to figure out for attendees. Here’s some tips on getting the most out of them.

Stay up to date and get the latest posts direct to your email inbox with no spam and no nonsense. Click here to subscribe.

The post Bacon Roundup – 23rd August 2016 appeared first on Jono Bacon.

by Jono Bacon at August 23, 2016 01:48 PM

Elizabeth Krumbach

Wandering around Philadelphia

Philadelphia is my figurative (and may soon be literal…) second home. Visits are always filled with activities, events, friends and family. This trip was considerably less structured. I flew in several days before the conference I was attending and stayed in my friend’s guest room, and didn’t take much time off from work, instead working from a couch most of the week with my little dog friend Blackie.

I did have some time for adventuring throughout the week though, taking a day off to check out The Science Behind Pixar exhibit down at The Franklin Institute with a friend. On our way down we stopped at Pudge’s in Conshohocken to satisfy my chicken cheesesteak craving. It hit the spot.

Then we were off to the city! The premise of the exhibit seemed to be encouraging youth into STEM fields by way of the creative processes and interesting jobs at a company like Pixar. As such, they walked you through various phases of production of Pixar films with hands-on exhibits that let you play around with the themes of what professionals in the industry do. It’s probably a good idea to encourage interest, and even if a museum exhibit can’t begin to tackle the complexity of these fields, as a technologist I agree that the work is ultimately fun and exciting.

But let’s be honest, I’m an adult who already has a STEM career and I’ve been a Pixar fan since the beginning. I was there so I could get selfies with Wall-E (and Buzz, Sully and Mike, Edna Mode, Dory…).

A few more photos from the exhibit here: https://www.flickr.com/photos/pleia2/albums/72157671629547292

We had the whole afternoon, so I also got to see the Lost Egypt exhibit, which was fun to see after the Egypt exhibit I saw at de Young last month. We went to a couple planetarium shows and also got all the nostalgia on as I revisited all the standing exhibits. Like the trains. I love the trains. The Franklin Institute is definitely one of my favorite museums.

That evening I also got to check out the new Hive76 location. The resurgence of hackerspaces had just started when I left Philly, and while I was never super involved, I did host a few “PLUG into Hive” meetings there when I was coordinating the LUG and had friends at Hive. It was nice getting to see their new space. After dinner I had the amusing experience of going to catch Pokémon in a park after dark, along with several other folks who were there for the same reason. There really is something to be said for a game that gets people out of their house at night to go for walks and socialize over augmented reality. Even if I didn’t catch any new Pokémon. Hah!

Wednesday and Thursday nights I spent time with my best buddies Danita and Crissi. Dinner, drinks, lots of good chatting. It had absolutely been too long since we’d spent time together, and catching up was just the thing I needed. I’ll have to make sure I don’t let so much time pass between getting together in the future.

More photos from various wanderings this past week (including dinosaurs!) here: https://www.flickr.com/photos/pleia2/albums/72157671629567332

And then MJ and I spent Friday and Sunday on a secret mission before flying home. I’ll write more about that once it becomes unclassified.

by pleia2 at August 23, 2016 02:35 AM

August 22, 2016

Elizabeth Krumbach

Ubuntu in Philadelphia

Last week I traveled to Philadelphia to spend some time with friends and speak at FOSSCON. While I was there, I noticed a Philadelphia area Linux Users Group (PLUG) meeting would land during that week and decided to propose a talk on Ubuntu 16.04.

But first I happened to be out getting my nails done with a friend on Sunday before my talk. Since I was there, I decided to Ubuntu theme things up again. Drawing freehand, the manicurist gave me some lovely Ubuntu logos.

Girly nails aside, that’s how I ended up at The ATS Group on Monday evening for a PLUG West meeting. They had a very nice welcome sign for the group. Danita and I arrived shortly after 7PM for the Q&A portion of the meeting. This pre-presentation time gave me the opportunity to pass around my BQ Aquaris M10 tablet running Ubuntu. After the first unceremonious pass, I sent it around a second time with more of an introduction, and the Bluetooth keyboard and mouse combo so people could see convergence in action by switching between the tablet and desktop view. Unlike my previous presentations, I was traveling so I didn’t have my bag of laptops and extra tablet, so that was the extent of the demos.

The meeting was very well attended and the talk went well. It was nice to have folks chiming in on a few of the topics (like the transition to systemd) and there were good questions. I also was able to give away a copy of our The Official Ubuntu Book, 9th Edition to an attendee who was new to Ubuntu.

Keith C. Perry shared a video of the talk on G+ here. Slides are similar to past talks, but I added a couple since I was presenting on a Xubuntu system (rather than Ubuntu) and didn’t have pure Ubuntu demos available: slides (7.6M PDF, lots of screenshots).

After the meeting we all had an enjoyable time at The Office, which I hadn’t been to since moving away from Philadelphia almost seven years ago.

Thanks again to everyone who came out, it was nice to meet a few new folks and catch up with a bunch of people I haven’t seen in several years.

Saturday was FOSSCON! The Ubuntu Pennsylvania LoCo team showed up to have a booth, staffed by long time LoCo member Randy Gold.

They had Ubuntu demos, giveaways from the Ubuntu conference pack (lanyards, USB sticks, pins) and I dropped off a copy of the Ubuntu book for people to browse, along with some discount coupons for folks who wanted to buy it. My Ubuntu tablet also spent time at the table so people could play around with that.


Thanks to Randy for the booth photo!

At the conference closing, we had three Ubuntu books to raffle off! They seemed to go to people who appreciated them and since both José and I attended the conference, the raffle winners had 2/3 of the authors there to sign the books.


My co-author, José Antonio Rey, signing a copy of our book!

by pleia2 at August 22, 2016 07:53 PM

A lecture, a symphony and a lot of street cars

My local July adventures weren’t confined to mummies, baseball and food. I also attended a few shows and lectures.

On July 14th I met up with a friend to see Kevin Kelly speak on The Next 30 Digital Years, put on by The Long Now Foundation. This lecture covered a series of trends (not specific technologies) that Kelly felt would drive the future. This included the proliferation of “screens” on a variety of surfaces to meet our ever-increasing desire to be connected to the media we now depend on in our work and lives. He also talked about the rise of augmented reality, increased tracking for increased personalization of services (with a sidebar about privacy), and the growing sharing economy, where access continues to replace ownership.

What I enjoyed most about this talk was how optimistic he was. Even while tackling difficult topics like privacy in a very connected world, he was incredibly positive about what our future holds in store for us. This held true even when questions from the audience expressed more pessimistic views.

In a weekend that revolved around events near City Hall, the very next evening I went to the San Francisco Symphony for the first time. As a sci-fi fan with a sidebar love for movie scores, my introduction to the symphony here was appropriately Star Trek: The Ultimate Voyage — A 50th Anniversary Celebration (article). The event featured the full symphony, with a screen above them that showed clips and a narrated exploration of the Star Trek universe as they played scores from the movies and selections from each series. They definitely focused on TOS and TNG, but there was decent representation of the rest. I also learned that SF trekkies really like Janeway. Me too. It was a really fun night.

We also went to an event put on by the Western Neighborhoods Project (WNP), Streetcar San Francisco: Transit Tales of the City in Motion at Balboa Theatre.

The event featured short films and clips of historic streetcars and expertise from folks over at Market Street Railway (which may have been how I heard about it). The clips covered the whole city, including a lot of downtown, as they walked us through some of the milestones and transit campaigns in the history of the city. It was particularly interesting to learn about the streetcars on the west side of the city, where they used to have a line that ran up around Land’s End, and some neat (or tacky) hanging “sky-trams” which took you from the Cliff House to Point Lobos; an article about them is here: A Brief History of San Francisco’s Long-Lost Sky Tram, which also references the WNP page about them.

This event also clued me in to the existence of OpenSF History by WNP. They’re going through a collection of historic San Francisco photos that have been donated and are now being digitized, indexed and shared online. Very fun to browse through, and there are great pictures of historic streetcars and other transit.

by pleia2 at August 22, 2016 01:56 PM

August 18, 2016

Jono Bacon

Opening Up Data Science with data.world

Earlier this year when I was in Austin, my friend Andy Sernovitz introduced me to a new startup called data.world.

What caught my interest is that they are building a platform to make data science and discovery easier, more accessible, and more collaborative. I love these kinds of big juicy challenges!

Recently I signed them up as a client to help them build their community, and I want to share a few words about why I think they are important, not just for data science fans, but from a wider scientific discovery perspective.


Armchair Discovery

Data plays a critical role in the world. Buried in rows and rows of seemingly flat content are patterns, trends, and discoveries that can help us to learn, explore new ideas, and work more effectively.

The work that leads to these discoveries is often bringing together different data sets to explore and reach new conclusions. As an example, traffic accident data for a single town is interesting, but when we combine it with data sets for national/international traffic accidents, insurance claims, drink driving, and more, we can often find patterns that can help us to influence and encourage new behavior and technology.


Many of these discoveries are hiding in plain sight. Sadly, while talented data scientists are able to pull together these different data sets, it is often hard and laborious work. Surely if we make this work easier, more accessible, consistent, and available to all we can speed up innovation and discovery?

Exactly.

As history has taught us, the right mixture of access, tooling, and community can have a tremendous impact. We have seen examples of this in open source (e.g. GitLab / GitHub), funding (e.g. Kickstarter / Indiegogo), and security (e.g. HackerOne).

data.world are doing this for data.

Data Science is Tough

There are four key areas where I think data.world can make a potent impact:

  1. Access – while there is lots of data in the world, access is inconsistent. Data is often spread across different sites, formats, and accessible to different people. We can bring this data together into a consistent platform, available to everyone.
  2. Preparation – much of the work data scientists perform is learning and prepping datasets for use. This work should be simplified, done once, and then shared with everyone, as opposed to being performed by each person who consumes the data.
  3. Collaboration – a lot of data science is fairly ad-hoc in how people work together. In much the same way open source has helped create common approaches for code, there is potential to do the same with data.
  4. Community – there is a great opportunity to build a diverse global community, not just of data scientists, but also organizations, charities, activists, and armchair sleuths who, armed with the right tools and expertise, could make many meaningful discoveries.

This is what data.world is building and I find the combination of access, platform, and network effects of data and community particularly exciting.

Unlocking Curiosity

If we look at the most profound impacts technology has had in recent years it is in bubbling people’s curiosity and creativity to the surface.

When we build community-based platforms that tap into this curiosity and creativity, we generate new ideas and approaches. New ideas and approaches then become the foundation for changing how the world thinks and operates.


As one such example, open source tapped the curiosity and creativity of developers to produce a rich patchwork of software and tooling, but more importantly, a culture of openness and collaboration. While it is easy to see the software as the primary outcome, the impact of open source has been much deeper and impacted skills, education, career opportunities, business, collaboration, and more.

Enabling the same curiosity and creativity with the wealth of data we have in the world is going to be an exciting journey. Stay tuned.

The post Opening Up Data Science with data.world appeared first on Jono Bacon.

by Jono Bacon at August 18, 2016 03:00 PM

August 17, 2016

Akkana Peck

Making New Map Tracks with Google Earth

A few days ago I wrote about track files in maps, specifically Translating track files between mapping formats. I promised to follow up with information on how to create new tracks.

For instance, I have some scans of old maps from the 60s and 70s showing the trails in the local neighborhood. There's no newer version. (In many cases, the trails have disappeared from lack of use -- no one knows where they're supposed to be even though they're legally trails where you're allowed to walk.) I wanted a way to turn trails from the old map into GPX tracks.

My first thought was to trace the old PDF map. A lot of web searching found a grand total of one page that talks about that: How to convert image of map into vector format?. It involves using GIMP to make an image containing just black lines on a white background, saving as uncompressed TIFF, then using a series of commands in GRASS. I made a start on that, but it was looking like it might be a big job that way. Since a lot of the old trails are still visible as faint traces in satellite photos, I decided to investigate tracing satellite photos in a map editor first, before trying the GRASS method.

But finding a working open source map editor turns out to be basically impossible. (Opportunity alert: it actually wouldn't be that hard to add that to PyTopo. Some day I'll try that, but now I was trying to solve a problem and hoping not to get sidetracked.)

The only open source map editor I've found is called Viking, and it's terrible. The user interface is complicated and poorly documented, and I could input only two or three trail segments before it crashed and I had to restart. Saving often, I did build up part of the trail network that way, but it was so slow and tedious restoring between crashes that I gave up.

OpenStreetMap has several editors available, and some of them are quite good, but they're (quite understandably) oriented toward defining roads that you're going to upload to the OpenStreetMap world map. I do that for real trails that I've walked myself, but it doesn't seem appropriate for historical paths between houses, some of which are now fenced off and few of which I've actually tried walking yet.

Editing a track in Google Earth

In the end, the only reasonable map editor I found was Google Earth -- free as in beer, not speech. It's actually quite a good track editor once I figured out how to use it -- the documentation is sketchy and no one who writes about it tells you the important parts, which were, for me:

Click on "My Places" in the sidebar before starting, assuming you'll want to keep these tracks around.

Right-click on My Places and choose Add->Folder if you're going to be creating more than one path. That way you can have a single KML file (Google Earth creates KML/KMZ, not GPX) with all your tracks together.

Move and zoom the map to where you can see the starting point for your path.

Click the "Add Path" button in the toolbar. This brings up a dialog where you can name the path and choose a color that will stand out against the map. Do not hit Return after typing the name -- that will immediately dismiss the dialog and take you out of path editing mode, leaving you with an empty named object in your sidebar. If you forget, like I kept doing, you'll have to right-click it and choose Properties to get back into editing mode.

Iconify, shade or do whatever your window manager allows to get that large, intrusive dialog out of the way of the map you're trying to edit. Shade worked well for me in Openbox.

Click on the starting point for your path. If you forgot to move the map so that this point is visible, you're out of luck: there's no way I've found to move the map at this point. (You might expect something like dragging with the middle mouse button, but you'd be wrong.) Do not in any circumstances be tempted to drag with the left button to move the map: this will draw lots of path points.

If you added points you don't want -- for instance, if you dragged on the map trying to move it -- Ctrl-Z doesn't undo, and there's no Undo in the menus, but Delete removes previous points. Whew.

Once you've started adding points, you can move the map using the arrow keys on your keyboard. And you can always zoom with the mousewheel.

When you finish one path, click OK in its properties dialog to end it.

Save periodically: click on the folder you created in My Places and choose Save Place As... Google Earth is a lot less crashy than Viking, but I have seen crashes.

When you're done for the day, be sure to File->Save->Save My Places. Google Earth apparently doesn't do this automatically; I was forever confused about why it didn't remember things I had done, and why every time I started it, it would give me syntax errors on My Places saying it was about to correct the problem, then give me the exact same error the next time. Save My Places finally fixed that, so I guess it's something we're expected to do now and then in Google Earth.

Once I'd learned those tricks, the map-making went fairly quickly. I had intended only to trace a few trails then stop for the night, but when I realized I was more than halfway through I decided to push through, and ended up with a nice set of KML tracks which I converted to GPX and loaded onto my phone. Now I'm ready to explore.

August 17, 2016 11:26 PM

Jono Bacon

Join My Reddit AMA – 30th August 2016 at 9am Pacific

On Tuesday 30th August 2016 at 9am Pacific (see other time zone times here) I will be doing a Reddit AMA about my work in community strategy, management, developer relations, open source, music, and elsewhere.

For those unfamiliar with Reddit AMAs, it is essentially a forum in which people ask questions of someone. You simply add your questions (serious or fun, both welcome!) and I will respond to as many as I can.

It has been a while since my last AMA, so I am looking forward to this one.

Feel free to ask any questions you like, and this could include questions that relate to:

  • Community management, leadership, and best practice.
  • Working at Canonical, GitHub, XPRIZE, and elsewhere.
  • The open source industry, how it has changed, and what the future looks like.
  • The projects I have been involved in such as Ubuntu, GNOME, KDE, and others.
  • The driving forces behind people and groups, behavioral economics, etc.
  • My other things such as my music, conferences, writing etc.
  • Anything else – politics, movies, news, tech…ask away!

If you want to ask about something else though, go ahead! 🙂

How to Join

Joining the AMA is simple. Just follow these steps:

  • Be sure to have a Reddit account. If you don’t have one, head over here and sign up.
  • On Tuesday 30th August 2016 at 9am Pacific (see other time zone times here) I will share the link to my AMA on Twitter (I am not allowed to share it until we run the AMA). You can look for this tweet by clicking here.
  • Click the link in my tweet to go to the AMA and then click the text box to add your question(s).
  • Now just wait until I respond. Feel free to follow up, challenge my response, and otherwise have fun!

Simple as that. 🙂

A Bit of Background

For those of you unfamiliar with my work, you can read more here, but here is a quick summary:

  • I run a community strategy/management and developer relations consultancy practice.
  • My clients include Deutsche Bank, HackerOne, data.world, Intel, Sony Mobile, Open Networking Foundation, and others.
  • I previously served as director of community for GitHub, Canonical, and XPRIZE.
  • I serve as an advisor to various organizations including Open Networking Foundation, Mycroft AI, Mod Duo, and Open Cloud Consortium.
  • I wrote The Art of Community and have columns for Forbes and opensource.com. I have also written four other books and hundreds of articles.
  • I have been involved with various open source projects including Ubuntu, GNOME, KDE, Jokosher, and others.
  • I am an active podcaster, previously with LugRadio and Shot of Jaq, and now with Bad Voltage.
  • I am really into music and have played in Seraphidian and Severed Fifth.

So, I hope you manage to make it over to the AMA, ask some fun and interesting questions, and we can have a good time. Thanks!

The post Join My Reddit AMA – 30th August 2016 at 9am Pacific appeared first on Jono Bacon.

by Jono Bacon at August 17, 2016 03:00 PM

Elizabeth Krumbach

The West, Mummies and Baseball

I spent most of July at home, which gave us time to get over to the Legion of Honor for an exhibit I was looking forward to, and to another Giants game this season.

The exhibit I wanted to see was Wild West: Plains to the Pacific. There are absolutely heartbreaking things about the west story, but I grew up on westerns and stories of wagon trains. I have a visceral connection to the west story. People from the east building their new life out west, braving hardship and heartbreak. Even my own move west was a re-invention of myself. So I was definitely drawn to this exhibit.

The exhibit takes you through various periods of time, from the frontier to present day. Journeys by the first artists who captured the beauty of the western territories, wild west shows, farmers and beyond. Some of the most striking images were those advertising fruit boxes from California, each drawing distinction for their brand with bright colors and clever names.

While we were there, we decided to also go to a lecture that happened to be presented that day on “Mummies! The Medicine, Myths, and Marvels of Ancient Egyptian Mummification” by Charlotte Read, which accompanied another exhibit they had, The Future of the Past: Mummies and Medicine. It was a great talk to see prior to the exhibit, since she described many of the things we’d later see, including details about the amulets, which played a prominent role. She also gave us a glimpse into the technologies they’re using today to peer under the wrappings of mummies, non-destructively.

The exhibit itself was quite small, only taking up one room, but it was worth seeing. You get to see the pair of mummies themselves, along with facial reconstructions and the high resolution CT scans performed on them. The exhibit also presented several of the artifacts that are often buried alongside mummies.

More photos from the Legion of Honor here: https://www.flickr.com/photos/pleia2/albums/72157667790231333

Later in the month we went to see the San Francisco Giants play over at the beautiful AT&T Park.

It was the second game we saw this season, and sadly they did not triumph this time. It was a good game to watch though, and the weather was beautiful. Plus, we had great company as a friend of ours joined us.

by pleia2 at August 17, 2016 01:01 PM

August 16, 2016

Jono Bacon

Cutting the Cord With Playstation Vue

We just cut the cord, and glory is ours. I thought I would share how we did it to provide food for thought for those of you sick of cable (and maybe so people can stop bickering on my DirecTV blog post from years back).

I will walk through the requirements we had, what we used to have, and what the new setup looks like.

Requirements

The requirements for us are fairly simple:

  • We want access to a core set of channels:
    • Comedy Central
    • CNN
    • Food Network
    • HGTV
    • Local Channels (e.g. CBS, NBC, ABC).
  • Be able to favorite shows and replay them after they have aired.
  • Have access to streaming channels/services:
    • Amazon Prime
    • Netflix
    • Crackle
    • Spotify
    • Pandora
  • Be able to play Blu-ray discs, DVDs, and other optical content. While we rarely do this, we want the option.
  • Have a reliable Internet connection and uninterrupted service.
  • Have all of this both in our living room and in our bedroom.
  • Reduce our costs.
  • Bonus: access some channels on mobile devices. Sometimes I would like to watch The Daily Show or the news on my tablet while on the elliptical.

Previous Setup

Our previous setup had most of these requirements in place.

For TV we were with DirecTV. We had all of the channels that we needed and we could record TV downstairs but also replay it upstairs in the bedroom.

We have a Roku that provides the streaming channels (Netflix, Amazon Prime, Crackle, Spotify, and Pandora).

We also have a cheap Blu-ray player which, while rarely used, does come in handy from time to time.

Everything goes into a Pioneer Elite amp. I tried to consolidate the remotes with a Logitech Harmony, but it broke immediately and I have heard from others that the quality is awful. As such, we used a cheaper all-in-one remote which could do everything except the Roku, as that is Bluetooth.

The New Setup

At the core of our new setup is a Playstation 4. I have actually had this for a while but it has been sat up in my office and barely used.

The Playstation 4 provides the bulk of what we need:

  • Amazon Prime, Netflix, and Spotify. I haven’t found a Pandora app yet, but this is fine.
  • Blu-ray playback.
  • Obviously we have the additional benefit of now being able to play games downstairs. I am enjoying having a blast on Battlefield from time to time, and I installed some simple games for Jack to play.

For the TV we are using Playstation Vue. This is a streaming service that has the most comprehensive set of channels I have seen so far, and the bulk of what we wanted is in the lowest tier plan ($40/month). I had assessed some other services but key channels (e.g. Comedy Central) were missing.

Playstation Vue has some nice features:

  • It is a lot cheaper. Our $80+/month cable bill has now gone down to $40/month with Vue.
  • The overall experience (e.g. browsing the guide, selecting shows, viewing information) is far quicker, more modern, and smoother than the clunky old DirecTV box.
  • When browsing the guide you can not just watch live TV but also replay shows that have already aired. For example, missed The Daily Show this week? No worries, you can just go back and watch it.
  • Playstation Vue is also available on Android, iOS, Roku, and other devices, which means I can watch TV and play back shows wherever I am.

In terms of the remote control I bought the official Playstation 4 remote and it works pretty well. It is still a little clunky in some areas as the apps on the Playstation sometimes refer to the usual playstation buttons as opposed to the buttons on the remote. Overall though it works great and it also powers my other devices (e.g. TV and amp), although I couldn’t get volume pass-through working.

Networking wise, we have a router upstairs in the bedroom, which is where the feed comes in. I then take a cable from it and send it over our power lines with an Ethernet-over-Power adapter. Downstairs I have an additional router chained from it, and I run ethernet from that router to the Playstation. This results in considerably more reliable performance than using wireless. This is a big improvement, as the Roku doesn’t have an ethernet port.

In Conclusion

Overall, we love the new setup. The Playstation 4 is a great center-point for our entertainment system. It is awesome having a single remote, everything on one box and in one interface. I also love the higher-fidelity experience – the Roku is great but the interface looks a little dated and the apps are rather restricted.

Playstation Vue is absolutely awesome and I would highly recommend it for people looking to ditch cable. You don’t even need a Playstation 4 – you can use it on a Roku, for example.

I also love that we are future proofed. I am planning on getting Playstation VR, which will now work downstairs, and Sony are bringing more and more content and apps to the Playstation Store. For example, there are lots of movies, TV shows, and other content which may not be available elsewhere.

I would love to hear your stories though about your cord cutting. Which services and products did you move to? What do you think about a games console running your entertainment setup? What am I doing wrong? Let me know in the comments!

The post Cutting the Cord With Playstation Vue appeared first on Jono Bacon.

by Jono Bacon at August 16, 2016 09:27 PM

August 15, 2016

Jono Bacon

Running a Hackathon for Security Hackers

A few weeks ago I flew out to Las Vegas with HackerOne to help run an event we had been working on for a while called H1-702. It was a hackathon designed for some of the world’s most talented security hackers.

H1-702 was one piece of a bigger picture: ensuring HackerOne is the very best platform and community for hackers to hack, learn, and grow.

This was the event we invited the cream of the crop to…hackers who have been doing significant and sustained work and who have delivered some awesome vulnerability reports.

Hacking For Fun and Profit

For the event we booked an MGM Grand Skyloft for three evenings. We invited the most prolific hackers on HackerOne to join us, where each night they would be invited to hack on a specific company’s technology. They didn’t learn which company it was until the evening itself…this kept a bit of mystery in the air. 😉

The first night had Zenefits, the second Snapchat, and the third Panasonic Avionics. This was a nice mixture of web, mobile, and embedded.

Each evening hackers were provided with the scope and then invited to hack these different products and submit vulnerabilities. Each company had their security team and developers on hand to answer questions, review and confirm reports quickly (and then fix the issues).

Confirmed reports would result in a payout from the company and reputation points. This would then bump the hacker higher up on the H1-702 leaderboard and closer to winning the prestige of H1-702 Most Valued Hacker, complete with a pretty badass winners belt. As you can imagine, things got a little competitive. 😉

Each evening kicked off between 7pm and 8pm and ran until the wee hours. The first night, for example, I ended up heading to bed at around 5.30am and they were still going.

There was an awesome electricity in the air and these hackers really brought their A-game. Lots of hackers walked out the door having made thousands of dollars for an evening’s hacking.

While competitive, it was also social, with people having a good time and getting to know each other. Speaking personally, it was great to meet some hackers who I have been following for a while. It was a thrill to watch them work.

Taking Care of Your Best

In every community you always get a variance in quality and commitment. Some people will be casual contributors and some will invest significant time and energy in the community and their work. It is always critical to really take care of your best, and H1-702 was one way in which we want to do this at HackerOne.

Given this, we wanted to deliver a genuinely premium event for these hackers and ensure that everyone received impeccable service and attention, not just at the event but from the minute they arrived in Vegas. After all, they have earned it.

This was an exercise in detail. We ensured we had a comfortable event space in a cool hotel. We had oodles of booze, with some top-shelf liquor. We provided food throughout the evening and brought in-chair massages later in the night to re-invigorate everyone. We provided plenty of seating, both in quiet and noisier spaces, lots of power sockets and we worked to have fast and reliable Internet. We provided each hacker with a HackerOne backpack, limited edition t-shirts, and other swag such as H1-702 challenge coins. We ensured that there was always someone hackers could call to solve problems, and we were receptive to feedback each night to improve it the following night.

Throughout the evening we worked to cater to the needs of hackers. We had members of HackerOne helping hackers solve problems, keeping everyone hydrated and fed, and making sure everyone was having a good time. HackerOne CEO Mårten Mickos was also running around like a waiter (amusingly, with a white towel) ensuring everyone had drinks in their hands.

Overall, it was a fun event and while it went pretty well, there is always plenty to learn and improve for next time. If this sounds like fun, be sure to go and sign up and hack on some programs and earn a spot next year.

The post Running a Hackathon for Security Hackers appeared first on Jono Bacon.

by Jono Bacon at August 15, 2016 03:00 PM

Elizabeth Krumbach

Local Edibles

San Francisco has a lot of great food, and more restaurants than we could possibly visit. Over the past month or so we’ve tried a couple more and returned to a couple of our favorites.

While MJ was working in the city, I finally got to visit Hakkasan, an upscale Cantonese restaurant. They have an array of delicious entrees, but their “small eats” and dim sum are exceptional. The food is also beautiful. Upon receiving our first round of dishes, including the amazing crispy prawn with mango, MJ asked where I wanted to start. “I want to start by taking a picture of my food!”

Our new dining adventures continued with a visit to Tadich Grill, arguably “the oldest continuously running restaurant in San Francisco” (via Wikipedia). The place started out as a coffee stand in 1849 and has changed names and owners, making their “oldest” claim a bit tenuous, but however you count, it is an old place by San Francisco standards and they’ve been in their current location since 1967. They don’t take reservations, and we came in around 9PM and still had about a half hour wait along with a crowd that was mostly tourists. We were finally seated as one of the last seatings of the evening. They specialize in seafood dishes, and the wait staff were all wearing white jackets, looking pretty formal. The appetizers and entrees didn’t blow me away, but it was decent seafood. What did make me happy was dessert: they have a solid carrot cake, which I’m not used to finding in San Francisco. Paired with a Claiborne & Churchill 2014 Dry Gewürztraminer, it was a perfect ending to the evening. As a bonus, it made me explore Claiborne & Churchill’s wines, and their selection of sweet wines is really nice; I’ve ordered a Port, a couple of their Muscats, and of course some more Gewürztraminer.

We also recently joined friends for a dinner at Lazy Bear, which we first went to in December. As I wrote then, the seating is family style and they serve a fixed tasting menu. This time I also did the wine pairing, which was totally worth it; they had a really nice list of wines and the portions were nicely timed with the dishes.

Going to Lazy Bear is always an experience, more photos from the evening here: https://www.flickr.com/photos/pleia2/albums/72157669411350191

Last, but most important, we went back to Jardiniere to celebrate our third wedding anniversary… a couple months late. I was traveling on our actual anniversary at the end of April, and then between trips and generally being busy, it took until July to actually get reservations and settle on the evening. It was nice to finally go out to celebrate together. They have a variety of French inspired dishes that I love, but they prepare an amazing rare wagyu, both the proper Japanese Wagyu and American. We got one of each, along with a lovely dessert.

by pleia2 at August 15, 2016 02:42 PM

August 14, 2016

Akkana Peck

Translating track files between mapping formats

I use map tracks quite a bit. On my Android phone, I use OsmAnd, an excellent open-source mapping tool that can download map data generated from free OpenStreetMap, then display the maps offline, so I can use them in places where there's no cellphone signal (like nearly any hiking trail). At my computer, I never found a decent open-source mapping program, so I wrote my own, PyTopo, which downloads tiles from OpenStreetMap.

In OsmAnd, I record tracks from all my hikes, upload the GPX files, and view them in PyTopo. But it's nice to go the other way, too, and take tracks or waypoints from other people or from the web, view them in my own mapping programs, or use them to find the trails when hiking.

Translating between KML, KMZ and GPX

Both OsmAnd and PyTopo can show Garmin track files in the GPX format. PyTopo can also show KML and KMZ files, Google's more complicated mapping format, but OsmAnd can't. A lot of track files are distributed in Google formats, and I find I have to translate them fairly often -- for instance, lists of trails or lists of waypoints on a new hike I plan to do may be distributed as KML or KMZ.

The command-line gpsbabel program does a fine job translating KML to GPX. But I find its syntax hard to remember, so I wrote a shell alias:

kml2gpx () {
        gpsbabel -i kml -f $1 -o gpx -F $1:t:r.gpx
}
so I can just type kml2gpx file.kml and it will create a file.gpx for me.
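The $1:t:r in these aliases is zsh (and csh) modifier syntax: :t keeps only the filename "tail", stripping the directory, and :r strips the extension. For anyone running bash or plain sh instead, here's a quick sketch of the POSIX equivalents (the path is made up):

```shell
# POSIX parameter-expansion equivalents of zsh's :t and :r modifiers
# (the file name here is just an example):
f=/maps/trails/oldtrail.kml
t=${f##*/}     # like $f:t -> oldtrail.kml  (strip the directory)
r=${t%.*}      # then :r   -> oldtrail      (strip the extension)
echo "$r.gpx"  # the output name the alias would construct
```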

More often, people distribute KMZ files, because they're smaller. A KMZ is really a KML file inside a ZIP archive, but gunzip can extract single-member ZIP archives, so the shell alias is only a little bit longer:

kmz2gpx () {
        kmlfile=/tmp/$1:t:r.kml 
        gunzip -c $1 > $kmlfile
        gpsbabel -i kml -f $kmlfile -o gpx -F $kmlfile:t:r.gpx
}

Of course, if you ever have a need to go from GPX to KML, you can reverse the gpsbabel arguments appropriately; and if you need KMZ, run gzip afterward.

UTM coordinates

A couple of people I know use a different format, called UTM, which stands for Universal Transverse Mercator, for waypoints, and there are some secret lists of interesting local features passed around in that format.

It's a strange system. Instead of using latitude and longitude like most world mapping coordinate systems, UTM breaks the world into 60 longitudinal zones. UTM coordinates don't usually specify their zone (at least, none of the ones I've been given ever have), so if someone gives you a UTM coordinate, you need to know what zone you're in before you can translate it to a latitude and longitude. A pair of UTM coordinates then specifies easting and northing, which tell you where you are inside the zone. Wikipedia has a map of UTM zones.
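Since you need the zone before you can convert anything, it helps that the zone number can be computed from longitude: zones are 6 degrees wide, numbered 1 through 60 eastward starting at 180°W. A minimal sketch (whole-degree longitudes only, ignoring the special-case zones around Norway and Svalbard; the function name is my own):

```shell
# Compute the UTM zone for a longitude given in whole degrees.
# Zones are 6 degrees wide, numbered 1-60 eastward from 180 W.
# Shell integer arithmetic, so whole degrees only; the special
# zones around Norway and Svalbard are not handled.
utm_zone () {
        echo $(( (($1 + 180) / 6) + 1 ))
}

utm_zone -106    # prints 13, the zone used in the examples here
```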

Note that UTM isn't a file format: it's just a way of specifying two (really three, if you count the zone) coordinates. So if you're given a list of UTM coordinate pairs, gpsbabel doesn't have a ready-made way to translate them into a GPX file. Fortunately, it allows a "universal CSV" (comma separated values) format, where the first line specifies which field goes where. So you can define a UTM UniCSV format that looks like this:

name,utm_z,utm_e,utm_n,comment
Trailhead,13,0395145,3966291,Trailhead on Buckman Rd
Sierra Club TH,13,0396210,3966597,Alternate trailhead in the arroyo
then translate it like this:
gpsbabel -i unicsv -f filename.csv -o gpx -F filename.gpx
I'm in zone 13 (as are all the UTM coordinates I've had to deal with), so that's what I used for that example and hardwired into my alias; but if you're near a zone boundary, you'll need to figure out which zone to use for each coordinate.

I also know someone who tends to send me single UTM coordinate pairs, because that's what she has her Garmin configured to show her. For instance, "We'll be using the trailhead at 0395145 3966291". This happened often enough, and I got tired of looking up the UTM UniCSV format every time, so I made another shell function just for that.

utm2gpx () {
        unicsv=`mktemp /tmp/point-XXXXX.csv` 
        gpxfile=$unicsv:r.gpx 
        echo "name,utm_z,utm_e,utm_n,comment" >> $unicsv
        printf "Point,13,%s,%s,point" $1 $2 >> $unicsv
        gpsbabel -i unicsv -f $unicsv -o gpx -F $gpxfile
        echo Created $gpxfile
}
So I can say utm2gpx 0395145 3966291, pasting the two coordinates from her email, and get a nice GPX file that I can push to my phone.

What if all you have is a printed map, or a scan of an old map from the pre-digital days? That's part 2, which I'll post in a few days.

August 14, 2016 04:29 PM

August 13, 2016

Elizabeth Krumbach

Local Critters

I already wrote about some of the local drinks we’ve been enjoying over these past few months, it’s time to move on to animals! Most of which have had their existence proven by science.

Back in April I made one of my standard pilgrimages to the San Francisco Zoo, where we’re members. This time we went with my sister-in-law and her husband, and the highlight of the visit for me was finally seeing little Jasiri, the lion cub. He was a bit hard to make out, hidden under the shade of a bush, but I was able to find him, near his mother Sukari.


Jasiri and Sukari

We made our usual stops, visiting the rescued sea lions, the grizzly sisters frolicking in their pool, and of course Penguin Island. I also got my first look at Claudia, the Andean Condor who recently became a resident there. I’ll have to go back soon; they opened up their new Mexican gray wolf exhibit in June and their Sifaka Lemur exhibit opens in a week.


Rainbow at Penguin Island

More photos from that visit to the San Francisco Zoo here: https://www.flickr.com/photos/pleia2/sets/72157666612834551

During the same visit to San Francisco, the four of us also made our way up to Sausalito to visit The Marine Mammal Center (TMMC). It’s one of our favorite organizations, and following our donation this year they reached out to us to offer a tour, which we decided to take advantage of while we had family in town.

The volunteer spent about an hour with us, walking us through the public areas, including the holding pens where we saw the elephant seals being fed, a lab where they were doing blood analysis, their “fish kitchen” where they prepare food for the animals, and over to their public autopsy area. He also demonstrated how they go about capturing an animal, joking that “everything is a seal, and everything is about 100 lbs” when people call in reports. In reality, they also rescue many sea lions, and most of the animals are quite a bit heftier and more powerful than the 100 lbs claim suggests.

We then went behind the scenes. The site is owned by the US government, and the organization is granted use of what is actually an old missile facility. Part of the massive filtration system for all their tanks and pools is now located where they used to store missiles. Fortuitously, we also got to see a truck coming in with some newly rescued patients. A baby harbor seal was among the rescues, who we got to see unloaded; he nearly broke my heart when he cried his “maaa” cry. He’s in excellent hands though; they do really great work there.

Picture taking behind the scenes was limited, but I do have several more photos from The Marine Mammal Center here: https://www.flickr.com/photos/pleia2/albums/72157664556780964

Finally, I made a visit to a more… elusive critter. After giving an Ubuntu presentation at Felton LUG a few months ago, I took the opportunity of being in Felton to visit the Bigfoot Discovery Museum right down the street. It was amusing, but completely coincidental, that this visit came on the heels of my visit just weeks before to the International Cryptozoology Museum in Maine. It’s true that I’m terribly fascinated by the search for cryptids like bigfoot, but the skeptic in me won’t get me much further than fascination until there’s more solid evidence.

This museum walks you through the evidence that does exist, including various footprint casts, an analysis of the famous Patterson–Gimlin film and maps of reported sightings throughout northern California. There’s also nearly a full room devoted to the pop culture around the creatures, from toys to movie posters. The proprietor was enthusiastic about sharing stories with visitors about sightings and the evidence that exists, and hearing his enthusiasm for his work was alone worth the visit for me.

More photos from the museum here: https://www.flickr.com/photos/pleia2/albums/72157670265156315

by pleia2 at August 13, 2016 05:14 PM

Local Potions

As I look through my blog posts this year, I’ve noticed a very travel and conference-focused trend. It seems I’ve been really good about staying on top of writing about these things, and less so with some of our local adventures. At this point it seems I have to reach all the way back to April to start writing about what we’ve been up to, covering visits with friends and family, trips up to wine country, adventures to new theaters and the symphony. Instead of stepping through these chronologically, I thought it might be more interesting to group things up.

To begin, let’s talk about some of the wonderful things I’ve had to drink this year.

Back in April, my sister-in-law and her husband were in town visiting. When we have guests in town there’s a bunch of stuff we love to do, but it’s also fun to check off some of the attractions we haven’t seen yet. That brought us to the Japanese Tea Garden in Golden Gate Park. Going into this I didn’t have any expectations; I wasn’t sure what it had, how big it was, or anything. Now that I’ve been, I can definitely recommend a visit.

There’s a path that winds through the garden, taking you through various Japanese trees, flowers and other plants. Throughout they have pagodas of varied provenance, some created in Japan and shipped over, some with dedications, one that came from the Pan-Pacific International Exposition (world’s fair) that was held in San Francisco in 1915. There’s a distinctive arched bridge, stone lanterns and various water features, from fountains and streams, to a koi pond. The walk through these lovely gardens concluded for us at their tea house, where we got snacks and some hot Matcha green tea.

More photos from our visit to the Japanese Tea Garden here: https://www.flickr.com/photos/pleia2/albums/72157666201291390

Moving from tea to something a bit stronger, in June we had a pair of friends in town who we took up to Sonoma County for some wine tasting. Now, we’ve done this journey with many folks, so I won’t give a play by play this time around, but it was worth noting because we went on the Partners Wine Tour at Benziger Family Winery, something I hadn’t done in years.

But first, we had lunch! We ordered sandwiches at the nearby Glen Ellen Village Market, bought a bottle of 2013 Dragonsleaf Pinot Noir and sat in the shade outside the tasting room at Benziger. There are few things in the world so relaxing and satisfying as that picnic lunch with friends was.

It was then onward for the tour! Taking a little over an hour, the tour took us on a long ride through the vineyards, giving history of the winery and tastings throughout our journey. We then went into the wine caves, where we sampled some not-quite-finished wine right out of the cask. The dining room in the cave was closed for maintenance, so our final tour stop was into the library, where our membership status got us a sample of one of their amazing old library wines.

More photos from the Benziger Partners Tour, and stops at Jacuzzi Winery and Imagery Winery here: https://www.flickr.com/photos/pleia2/albums/72157669175719622

Wine adventures continued later in the month when MJ and I went up to Napa Valley for an afternoon of dining at Rutherford Grill and tasting at Rutherford Hill Winery, where we are members and had to drive up to pick up our shipment of wines. I’m a fan of Merlots, and Rutherford Hill is internationally famous for them, but what really made membership for us was how well we’re treated as members. They have an amazing tasting room in their wine caves, and are very flexible about wines you can pick up in your shipment, since price is determined by precisely what you buy anyway. With them, membership turns out to essentially be an agreement to buy a certain number of wine bottles per cycle, with a set of recommendations as a default. It was all very refreshing for such a large Napa winery, and their wines are exceptional.

Upon arrival, we were led back to the caves where we sat in a little alcove to sample a series of wines. Our host was wonderful and we really enjoyed the ambiance and coolness of the cave, especially with how bright and warm it was outside. Once the tasting was concluded, I took advantage of the springtime climate that had all their flowers in full bloom before going inside to pick up our wines and a few other goodies.

More photos (especially flowers!) from our day up at Rutherford Hill here: https://www.flickr.com/photos/pleia2/sets/72157667277830503

My final drink adventure was closer to home than all the rest, a quick walk from home down 2nd street and over to Black Hammer Brewing. It was here that I met my friend Pasi who was in town from Finland and another fellow in town from Copenhagen. It’s on the newer side and they don’t serve food (though outside food is allowed), so I hadn’t been to this brewery before. We arrived pretty late and only had a couple rounds, but I was really impressed with the variety available in their small batches of beers. I went with the Nautical Twilight and Sunrise Set.

I love living in a place where I can not only find some delicious things to drink, but do so in beautiful places. Even the brewery, though lacking in scenery of my tea and wine trips, had a comfortable atmosphere that’s begging me to return.

by pleia2 at August 13, 2016 12:06 AM

August 12, 2016

Elizabeth Krumbach

Laundry, Lunches, Trains, Temple and Shopping in Mumbai

My tourist adventures in Mumbai continued on Saturday following the conference. I made plans through the hotel concierge to hire a driver and tour guide for the day. I was initially a bit worried about the weather, since reports (and warnings!) had forecasted rain, but we lucked out. I was picked up from the hotel promptly at 10AM and thus began a wonderful day with my tour guide, Mala Bangera.

I’ll start off by saying that Mala was a wonderful guide, one of the best I’ve ever had. She had over 20 years of experience and so was extremely knowledgeable, had lived all over the world (including the US) and had strong relationships with people at all the places we visited. She was able to advise me when I could take pictures, and honestly explain how much and when to tip people we encountered and interacted with throughout our journey (a fellow I got my picture with, a woman whose wares I asked for a photo of, a fellow who watched our shoes at a temple). I had a wonderful day under her guidance, and I’m so glad I worked with her instead of spending the day venturing out on my own!

As for where we went, I largely let her set the agenda when I explained that I just wanted to see some popular tourist sites and temples. It was the right choice. Our first stop was the Dhobi Ghat, a huge open air laundry facility. As I’ve learned some about domestic history, laundry looms large in the list of things that were incredibly laborious and time-consuming until the modern era of washing machines that people like me now have in our homes. Of course the modern washer and dryer I have assume access to a lot of water, electricity and space, which is not ubiquitous in India. Instead, Mala explained that many people send their clothes out to be washed, dried, pressed and delivered back to their home, and a lot of this work happens at a dhobi ghat.


Dhobi Ghat (laundry)

Our next stop was to see the Dabbawalas. These folks collect your prepared lunch from your house in the morning and deliver it to your workplace around lunchtime. That means you get your home-cooked meal at lunch (which is typically a large meal there) and you don’t need to carry it with you when you leave home early in the morning. Doing this in bulk allows for an inexpensive service and a lifelong profession for the Dabbawalas; one fellow my tour guide introduced me to has been doing it for over 25 years. It was a Saturday, so the pickup location we visited wasn’t as busy as it is on weekdays, but it was worth seeing anyway.


Dabbawalas sorting and carrying lunches

From there we went to the Chhatrapati Shivaji Terminus (CST), the train station! A Gothic Revival building designed by Frederick William Stevens in 1887, the station first caught my eye on the previous Sunday when we drove by it. This time we were able to actually go inside, buy a platform ticket and wander around. I was advised against taking pictures inside the station itself, but I did see a whole bunch of trains, from the city commuter trains to the larger long-distance trains. I learned that they have cars reserved exclusively for women, which is understandable given the need to accommodate modest religious faiths on trains that get incredibly packed during weekday commutes (genders were also segregated at most security check points I went through, frequently with women given private screening behind a curtain).


Chhatrapati Shivaji Terminus

We then drove over to a highly respected shop in the area where I picked up some little gifts, and a couple not so little things for myself. I had a lovely experience, while shopping for some Indian sapphire jewelry (my birthstone) they brought over some wonderfully spiced Kashmiri Kahwa hot tea, and also gave me some to take home with me as I was checking out.

The next stop was the Babu Amichand Panalal Adishwarji Jain Temple, which is routinely called the most beautiful Jain Temple in Mumbai. As far as temples go, my guide explained that those by the Jains tend to be the most beautiful in India. This temple absolutely lived up to the hype. When you enter you’re instructed to remove your shoes as you walk around. My guide is a friend of the temple and was able to escort me inside and encouraged my photo taking; unaccompanied tourists are asked to remain in the courtyard, so it was a real honor to be able to go in.


At the Jain Temple

More photos from the Jain temple here: https://www.flickr.com/photos/pleia2/albums/72157669057260703

Our final big stop was the Hanging Gardens of Mumbai. The gardens are perched on a hill that covers a water reservoir. It’s quite pretty, with various flowering plants and bushes, and a collection of animal-shaped topiaries (giraffe! elephant! ox cart!). The garden is also located near the Zoroastrian Tower of Silence, where their dead are exposed for reclamation by nature.

On the drive back to my hotel we passed the Siddhivinayak Temple, dedicated to the Hindu Lord Ganesha. We also stopped to get some fruit-flavored ice cream from Natural Ice Cream; after a few samples, I had the watermelon and jackfruit ice creams, yum! We also passed the Haji Ali Dargah, a mosque and tomb that sits in Worli Bay. A thin strip of land goes out to the mosque, but it’s covered over during high tide, limiting when visits are possible. As a result, when you see it during high tide the building seems to be floating out in the bay. I developed a fascination for this while I was there, passing it a few times during my trip during varying levels of tide. I would love to visit it more closely some day.


Haji Ali Dargah

Saturday evening was spent out to dinner and drinks with some of the SANOG conference folks, where I had a Bira91 White Ale, a wheat beer that is one of the few solid, non-lager beers I’ve encountered in Asia. In addition to enjoying wonderful company and beer at dinner, I had my final trip in an auto rickshaw that night as we wanted to get back to the hotel without getting drenched from the rain. This rickshaw ride was outside of high traffic areas, so it was considerably less nerve-wracking than the ride earlier in the week!


Inside an auto rickshaw earlier in the week, with a cameo from a double-decker bus!

My final day in Mumbai was Sunday, which I spent tying off loose ends and visiting with some conference people, and then in the evening my friend Devdas took me on a train to south Mumbai again. Now, you all know me. I love trains. It was nice to take one on a Sunday evening: not too crowded, and as a critical piece of infrastructure the trains are kept well maintained.


Selfie on a train

We got off at the station I had visited the day before and walked down to Leopold Cafe, which was recommended by a friend who visited Mumbai in the recent past. As a landmark and tourist destination, the cafe also made headlines eight years ago as one of the sites of the 2008 Mumbai attacks. I admit that with a flight ahead of me, I just had some pretty basic continental food, though my chicken sandwich still had a nice bite to it. The walk to the restaurant took us into the Colaba Causeway market. I had already done my shopping for the trip, but it was neat to see all of the stuff they had for sale, and to make our way through the crowds of people on the sidewalks and streets.

My evening concluded by skipping the train back due to the long, soggy walk back to the station and my inclination to stay dry before my flight. I instead took a long Uber ride back to the hotel to pick up my luggage. The ride back started out in a downpour, but eventually cleared up. I was able to see the Haji Ali Dargah lit up at night, and all the people hanging out on Marine Drive, with the signature Queen’s Necklace lights along the boulevard. I also enjoyed going over the Sky Link bridge one final time at night when it wasn’t raining.

And with that, my trip concluded! More photos from throughout my adventures in Mumbai here: https://www.flickr.com/photos/pleia2/albums/72157671033977871

by pleia2 at August 12, 2016 04:26 PM

Jono Bacon

My Blog is Creative Commons Licensed

Earlier this week I was asked this on Twitter:

Screenshot from 2016-08-12 22-50-26

An entirely reasonable question given that I had entirely failed to provide clarity on how my content is licensed on each page. So, thanks, Elio, for helping me to fix this. You will now see a licensing blurb at the bottom of each post as well as a licensing drop-down item in the menu.

To clarify, all content on my blog is licensed under the Creative Commons Attribution Share-Alike license. I have been a long-time free culture and Creative Commons fan, supporter, and artist (see my archive of music, podcasts, and more here), so this license is a natural choice.

Let’s now explore what you can do with my content under the parameters of this license.

What You Can Do

The license is pretty simple. You are allowed to:

  • Share – feel free to share my blog posts with whoever you want.
  • Remix – you are welcome to use my posts and create derived works from them.

…there is a requirement though. You are required to provide attribution for my content. I don’t need a glowing missive about how the article changed your life, just a few words that reference me as the author and point to the original article, that’s all. Something like:

‘My Blog is Creative Commons Licensed’ originally written by Jono Bacon and originally published at http://www.jonobacon.org/2016/08/12/my-blog-is-creative-commons-licensed/

will be great. Thanks!

To learn more about your rights with my content, see the license details.

What I Would Love You to Do

So, that’s what you are allowed to do, but what would I selfishly love you to do with my content?

Well, a bunch of things:

  • Share it – I try to write things on this blog that are helpful to others, but it is only helpful if people read it. So, your help sharing and submitting my posts on and to social media, news sites, other blogs, and elsewhere is super helpful.
  • Include and reference it in other work – I always love to see my work included and referenced in other blog posts, books, research papers, and elsewhere. If you find something useful in my writing, feel free to go ahead and use it.
  • Translate it – I would love to see my posts translated into different languages, just like Elio offered to do. If you do make a translation, let me know so I can add a link to it in the original article.

Of course, if you have any other questions, don’t hesitate to get in touch and whether you just read my content or choose to share, derive, or translate it, thanks for being a part of it! 🙂

The post My Blog is Creative Commons Licensed appeared first on Jono Bacon.

by Jono Bacon at August 12, 2016 02:59 PM

August 11, 2016

Jono Bacon

Speaking at Abstractions

Update: my talk has been moved to 1.30pm on Friday 19th August 2016.

Just a quick note to let you know that I will be zooming my way on the reddest of red eyes to Pittsburgh, PA to speak at Abstractions next week.

I first heard about Abstractions some time ago and I was pretty stunned by the speaker roster which includes Jeffrey Zeldman, Richard Stallman, Mitchell Hashimoto, Larry Wall, and others.

I absolutely love events such as Abstractions. The team have clearly worked hard to put together a solid event with a great line-up, professional look and feel, great speaker relations, and more.

Building a Community Exoskeleton

I am going to be delivering a talk on Friday at 4.20pm called Building a Community Exoskeleton. The abstract:

Community is at the core of all successful open source projects. The challenge is that building empowered, productive, and inclusive communities is complex work that lives in the connective tissue between technology and people. In this new presentation from Jono Bacon, he will share some insight into how you can build an exoskeleton that wraps around community members to help them to do great work, form meaningful relationships, and help each other to be successful. The presentation will delve into success stories in open source and elsewhere, the underlying behavioral principles we can tap into, infrastructure and workflow decisions, and how we get people through the door and involved in our projects. Bacon will also cover the risks and potholes you will want to delicately swerve around. If you are running an existing project or company, or starting something new, be sure to get along to this presentation, all delivered in Bacon’s trademark loose and amusing style.

I am hoping I will get an opportunity to see many of you there (details of how to attend are here), and I want to offer a huge thanks to the Abstractions team for their kindness, hospitality, and service. I am looking forward to getting out there!

The post Speaking at Abstractions appeared first on Jono Bacon.

by Jono Bacon at August 11, 2016 03:53 PM

August 10, 2016

Jono Bacon

The Bacon Travel Survival Guide

plane

I spend a fair bit of time traveling. Like many of you, this involves catching planes, trains, and automobiles, schlepping around between airports and hotels, figuring out conference centers, and more.

Some years back I shared a room with my friend Pete Graner and was amused by how much crap he packed into his bag. Despite my mild mockery, whenever anyone needed something, Pete got pinged.

Over the years I too have learned how to tame the road, and I want to share some lessons learned from how to pack the right items, book your travel wisely, stay connected, and more.

Build a Backpack

Your backpack is your travel buddy. You will carry it everywhere and it will contain the most critical things you need on your journey. You want it to be comfortable, contain the essentials, and be ready for action whether in your hotel room, at the office, at an airport, or elsewhere.

The Bag

First, you need to get the bag itself. Don’t skimp on cost here, this thing is going to get thrown around and trust me, you don’t want it to drop apart in an airport. Some essential features I look for in the bag:

  • Handles that can wrap around the handle on your suitcase. This means you can then attach it to the suitcase and not have to carry the bag when rolling your suitcase along.
  • Includes at least 4 different compartments, which I would use for:
    1. Your laptop/tablet. Some bags can open up to make it easier for scanning laptops in X-Ray machines. Not essential, but nice to have.
    2. Travel documents and important things (e.g. cash).
    3. Your cables, chargers, and other peripherals/devices.
    4. Medicines and other consumables.
  • A means to attach a water bottle (e.g. an included hook) or pocket to strap it into.
  • Bonus side pockets for sunglasses, tissues, and other items are always nice.

More than anything, ensure the bag is comfortable to wear. This thing is going to be strapped to your back a lot, so make sure it feels comfortable to carry and doesn’t rub up against your shoulders.

Filling It

OK, so we have a nice shiny bag. What do we put in it?

You want to ensure you carry items not just for your common tasks, but also for a few outliers too. Also, I recommend many of these items always live in your bag (even if you buy duplicate items for your home). This then ensures that you don’t forget to pack them when you travel.

Here is a shopping list of what I carry with me, which could be inspiration for your own bag:

Tech

  • Laptop – the jewel in the crown. This always comes with me.
  • Other gadgets – I often also carry:
    • Tablet.
    • Kindle.
  • Laptop charger – obviously this is pretty essential if you want some juice in your laptop.
  • Cables – I carry a bunch of cables, including:
    • 2 x Micro-USB for phones and devices.
    • 1 x Lightning for Apple devices.
    • Fitbit charging cable.
  • Multi-outlet adapter – a handy travel multi-outlet adapter where I can plug in 4 devices into a single outlet.
  • USB outlets – these are those little gadgets with a USB socket that you plug into the wall. I carry at least two and they are used to juice my devices.
  • Outlet adapters – these are the devices that convert between different power outlet types for different countries. I have been through dozens of these, so spend your money on quality. Be sure to buy one that supports every socket in the world. I always carry 2.
  • Battery pack – this is one of those battery packs that you can use for charging your devices when out and about. Get a decent one (at least 12000mAh) with both the 1A and 2.1A ports so you can get a fast charge.
  • USB sticks – I carry a couple of USB sticks around in case I need to transfer data between machines. I often have one as a bootable Ubuntu stick just in case I need to boot into Linux on another machine.
  • Headphones – get some decent headphones (with a built in mic), you will be using them a lot. I use Bose headphones and love them. They may be more expensive, but totally worth it.
  • Notebook and pen – always handy for scribbling down ideas, notes, and other musings. Also critical if you are working with a company that doesn’t let you take a laptop into their office due to security measures – you will use this to take notes.

Personal Care

For the ladies reading, adjust to taste (e.g. perfume, not cologne):

  • First aid kit – always have this just in case.
  • Tissues – get a couple of pocket packs, useful for when you have the sniffles.
  • Mints – no one likes travel breath, so this is a handy way of combating it when you have to run straight into a meeting after a flight.
  • Hand sanitizer – other people are icky, wipe them off you.
  • Headache tablets – get tablets your doctor recommends. I carry Aleve, but make sure the ones you get are safe for you (that you are not allergic to).
  • Diarrhea medicine – always handy to have and critical for some further flung destinations. I carry Pepto-Bismol tablets.
  • Cologne – I always like to smell good and usually carry two colognes with me. You can buy an atomizer that you can pre-fill with your cologne before you travel. Or, buy a travel size cologne.
  • Deodorant – essential. You never know when you are going to be stuck in a hot place and you don’t want to get sweaty. I usually carry a roll on.
  • Band-aids/plasters – I carry a few of these, not just in case I cut myself but also in case I get blisters on my feet when I have bought new shoes.
  • Gas/heartburn medicine – always handy to have, particularly for some destinations with richer food.
  • Hangover medicine – it has been known that I have the odd beer here or there on the road. Some scientific research has resulted in me carrying some Blowfish. Be sure to check what you carry is safe (some medicines have ingredients you may be allergic to).
  • Sunscreen – as a pasty white chap, the sun can be my enemy. I carry a small spray can that I can lacquer myself with if I am going to be outside for a while.
  • Water bottle – I always carry a quality water bottle. When traveling you should stay hydrated. Be sure to get a bottle that can strap to your backpack. If there is no means to strap it, buy a carabiner hook. Also, get a bottle where the spout is covered and the cover is lockable. This will ensure you don’t get germs on the spout and that water doesn’t spray out while walking.

Other

  • Cash – I always carry a small amount of bills and coins in my bag. The bills are handy for tipping and purchasing small items when you have run out of cash in your wallet. The coins are helpful for parking meters.
  • Sunglasses – always handy in sunny climes. I have a dedicated travel pair of sunnies that always lives in my bag so I never forget them.

Get Expedited Customs Entry

If you are traveling regularly, you should strive to make your overall journey as simple and effortless as possible. One easy way to do this is with expedited customs entry.

This varies from country to country but here in the USA there are two programs that are essential – TSA Pre and Global Entry.

The latter, Global Entry, means you can skip the lines when you arrive from an international trip and simply go to a machine where your documents are checked. It can literally save you hours standing in line after a long trip.

TSA Pre is a program in which you can get expedited screening in American airports. It means you can join a shorter line and you don’t have to take off your shoes or belt, or take your laptop out of your bag. TSA Pre is awesome.

If you apply for Global Entry you also get TSA Pre, so that is the way to go. Sure, it involves you filling in a large form and taking a trip out to the airport for a meeting, but given the amount of time and frustration it saves, it is critical.

Tip: When booking your flights be sure to specify your Known Traveler number (which you get with Global Entry) when booking. If you don’t specify it you won’t be able to use Global Entry or TSA Pre on your itinerary, which is rather annoying.

Book Your Travel Wisely

For the majority of trips you take there will be a mode of transportation (e.g. plane, train, car) and a hotel. When booking either of these you should always (a) choose the wisest providers, (b) book the best trips, and (c) always work towards status/rewards.

Pick good providers

For picking the best providers, do your research. Ask your friends what their favorite airlines are, which hotels they like, and other opinions. Also do some online research.

As an example, a couple of my current viewpoints about airlines:

  • United – pretty average service but cost effective and have a great rewards program. Also easy to spend your miles (few blackout dates).
  • Virgin Atlantic – awesome airline, but more expensive. OK rewards program but I have found it difficult to spend miles (lots of blackout dates).
  • Southwest – great airline, services a lot of the USA. Really nice staff, but their rewards only really get you on the plane earlier.

Do your research and find the right balance of service and value.

Book the trip that works for you

For booking the best trips, be sure to check out some of the modern providers such as Hipmunk, Kayak, and others. This can make putting together an itinerary much easier.

A few tips for booking flights:

  • When picking seats check SeatGuru to see if it works well for you. Always pick your preferred seat when you book your flight.
  • Always check the layover time – I never layover for less than an hour. Too risky if you have a late takeoff time.
  • Sometimes I also check the cancellation/delay record of an airport. For example, Shenzhen has a pretty poor record and so I have taken a train to reduce the risk of a canceled flight.
  • Remember that bulkhead seats don’t have movable arms so if you get a row to yourself you won’t be able to stretch out.

Work towards rewards

Most airlines and hotels offer rewards programs for regular business. Where possible, you should try to book with the same providers to build up your status. This will then open up perks such as free flights, lounge access, free bags, complimentary upgrades, and more.

When evaluating which rewards plans to use, consider the following:

  • Assess how the rewards program works. Some can be fairly complex and some are simple. Make sure you understand how you can get the most out of it.
  • Choose airlines that have lots of flights from your nearest airports. This will make it easier to ensure you pick the same airline for most flights.
  • Review how easy it is to book free flights. Do they have lots of blackout dates that make it difficult?
  • Review the perks of the airline. Is it worth it and can you accomplish the different status levels with your typical travel?

Load Up Your Phone

When you are on the road your phone is your trusty companion. It will keep you entertained, informed, and connected.

Aside from ensuring it is always charged, we want to ensure it is connected and has the right apps on to make our trip easier.

Choosing a Plan

Be sure to check what your carrier’s travel/roaming rates are. This varies tremendously between carriers and getting this wrong can cost you a fortune.

Where possible, I always recommend having roaming and data when you travel. While it is often slow, it can be essential as part of your trip for contacting colleagues/customers, coordinating travel, finding places to eat, learning the local culture and more.

As an example, T-Mobile has phenomenal unlimited international roaming. Ever since they switched this on it has made travel infinitely better and more reassuring.

Be sure to check the parameters of how this works though. As an example, with T-Mobile, for me to have a call with my wife in America it is better if I call her (the rate is much lower) than if she calls me. Be sure to know these specifics so you can make the most out of your service.

Install Essential Apps

Everyone will have different requirements here, but I recommend you install the following types of apps:

  • Itinerary – I love TripIt. It is a simple app you can forward your email itinerary confirmations to and it provides a simple way of viewing them and providing additional information.
  • Airline – be sure to install the apps for the airlines you fly. Often you can check-in with the app as well as use an electronic boarding pass so you don’t have to print your boarding pass at the airport.
  • Carsharing – be sure to get Uber / Lyft and any regional travel apps (e.g. taxi apps for towns that have banned ridesharing).
  • Business discovery – be sure to install Yelp and TripAdvisor, which are hugely helpful for finding decent places to eat, fun bars, and more.
  • Translation – I also recommend you install the Google Translate app. It can translate not just typed text, but also text in photos and via the camera too.
  • Entertainment – be sure to install the video, music, reading, and games apps you love. This is always handy for evenings when you just want to relax in your hotel room or for the long trips.

So, there we have it. I hope some of these recommendations are helpful.

Travel Tips

Outside of getting prepped for your trip, here are some random tips that might be handy for while you are on your trip:

At the Airport

  • Check in as soon as your flight opens. When you make the booking, add a cell-phone number so you get texted when check-in opens. This will ensure you get a decent seat choice.
  • Before you fly, buy some essentials in case you need them in the air:
    • A few bottles of water.
    • A few snacks (e.g. protein bars).
  • I always like to eat a big meal before a big flight. Plane food is usually not great and they may have run out of the option you want.
  • Explore off-site parking options. Often it can be way cheaper than parking at the airport. Also, check for coupons, there are usually decent discounts available online.

On the Plane

  • Wipe down your tray table and arm-rest with a sanitation wipe. This can prevent getting sick while traveling (which is never fun).
  • When you get to your seat, take your headphones, e-book reader, and tissues out of your bag. This means you don’t have to wait for the seat-belt lights to go out before you can grab them.
  • Track your flight time and be sure to hit the restroom around 45 minutes before landing. When they announce the plane is descending there is often a bum rush for the loo.
  • When they offer drinks and they pour you a little cup, ask for the full can. They usually give it to you.

At the Hotel

  • If you wear shirts/blouses, be sure to check if an ironing board and iron are in your room when you arrive. If not, ask for them to be brought up before you go to bed so you are not rushed in the morning.
  • Don’t have an ironing board and have creases in your clothes? Use a hairdryer on your clothes while you wear them. It often gets most of the creases out.
  • When you go to bed, plug everything in to charge, including your portable battery pack. This will ensure you are powered up the following day.
  • You can call reception for a wake up call, but always set a wake-up call on your phone/tablet. Too many hotels forget to actually wake you up.
  • As soon as you wake up, switch the shower on and see if there is hot water. Some hotels take a while to warm up and this prevents you getting delayed.
  • Have one of those rooms where you need to enter your room card to keep the lights on? Just put any other card in there (e.g. an old subway pass) and it usually works. 😉

I would love to hear your tips though. What travel secrets have you unlocked? Be sure to let everyone know in the comments…

The post The Bacon Travel Survival Guide appeared first on Jono Bacon.

by Jono Bacon at August 10, 2016 03:00 PM

Akkana Peck

Double Rainbow, with Hummingbirds

A couple of days ago we had a spectacular afternoon double rainbow. I was out planting grama grass seeds, hoping to take advantage of a rainy week, but I cut the planting short to run up and get my camera.

[Double rainbow]

[Hummingbirds and rainbow] And then after shooting rainbow shots with the fisheye lens, it occurred to me that I could switch to the zoom and take some hummingbird shots with the rainbow in the background. How often do you get a chance to do that? (Not to mention a great excuse not to go back to planting grass seeds.)

(Actually, here, it isn't all that uncommon since we get a lot of afternoon rainbows. But it's the first time I thought of trying it.)

Focus is always chancy when you're standing next to the feeder, waiting for birds to fly by and shooting whatever you can. Next time maybe I'll have time to set up a tripod and remote shutter release. But I was pretty happy with what I got.

Photos: Double rainbow, with hummingbirds.

August 10, 2016 01:40 AM

August 07, 2016

Elizabeth Krumbach

A Gateway, a Synagogue and a Museum in Mumbai

Last Saturday I arrived in India for the first time. A conference was on my schedule, but since this was my first time visiting this country I decided to do some touristing around Mumbai. Unfortunately it’s monsoon season, so it’s been an incredibly soggy trip. I joked that coming from drought-ridden California, I was coming to visit in order to get my rain quota met for the year. Mumbai didn’t disappoint.

This first day my plan was to meet up with my friend Nigel Babu, who I met in the Ubuntu community. Our real-life paths first crossed at an Ubuntu Developer Summit in Budapest, and then again a couple of years ago when he came to my home of San Francisco for a conference. It was really nice to finally meet in his home territory. He picked me up at my hotel, and we took a drive over the Bandra–Worli Sea Link, a beautiful bridge that links the hotel where I am staying with south Mumbai. Once over the bridge, we stopped briefly to check out the views of the sea, but the rain drove us back into the car pretty quickly. It was then south to the Marine Drive, or Queen’s Necklace. That’s where I got my rainy day picture taken, before we stopped for some snacks and Masala chai at a nearby hotel cafe.

Our journey continued south, where we first walked to the Knesset Eliyahoo Synagogue, the 2nd oldest in the city. I had been clued into the existence of this synagogue by a friend of mine who had visited a few years before, and the description of the place in my tourist book cemented my desire to go. The whole building is turquoise, and that bright color extends to the inside of the building. It was a quiet day there and we were the only visitors, so they welcomed us inside and allowed some picture taking.

The stained glass inside was beautiful, but the damp climate definitely was taking a toll on the building. One of the more interesting things to see in this Sephardic synagogue was a marble slab on the wall near the ark that had the 10 commandments, not in Hebrew or a local language, but in English. They had a little gift shop and I picked up a small Haggadah branded with their location as a keepsake of my visit.

More photos from the synagogue here: https://www.flickr.com/photos/pleia2/sets/72157671835673625

From there we walked down to the Gateway of India, where we got an all important selfie.

Visiting there also offered a nice look over at the lovely Taj Mahal hotel (not to be confused with the Taj Mahal in Agra). As a tourist attraction, it was worth seeing, but there’s not much to actually do by the gateway, so we quickly were off to our next stop, Chhatrapati Shivaji Maharaj Vastu Sangrahalaya, formerly known as the Prince of Wales Museum of Western India.

On my first full day in India, it was nice to visit this museum. I like museums; it gave Nigel and me some time to chat, and it also gave me a wonderful view into the local culture through a locally curated collection. Most of the museum was only lightly air conditioned, so walking through the galleries was not challenging, though I did enjoy the select galleries that had strong air conditioning. The galleries had an interesting mix of very old Indian artifacts, statues, weaving, and weapons, as well as some paintings, furniture, and more from the colonial periods. I enjoyed seeing these galleries housed in a building that itself dates from the colonial period.


Ganesha statue at CSMVS

I bought a photo pass, so lots of photos from the museum are here: https://www.flickr.com/photos/pleia2/sets/72157668844045884

By this time evening was creeping up and we had dinner plans. We stopped for some hot chocolate and then got exceptionally lost as we looked for the local friends we were meeting for dinner. We did make it eventually, and had a lovely, if late, seafood dinner at Gajalee. The adventurous day, the heat and humidity, and jet lag were eating at my appetite; I tried everything, but it wasn’t a big meal for me. Good company though: I got to meet Mehul Ved from Ubuntu India for the first time!

The conference then took over most of the rest of my week, but I was able to sneak out to the Taraporewala Aquarium amidst the rain storms on Friday. The aquarium was redone in 2015, but it still couldn’t really compare to the world class aquariums I’ve grown accustomed to, both in size and cleanliness. I’m pretty sure most tanks in aquariums are cleaned around the clock to keep them looking spiffy. Still, the building is beautiful and I did enjoy seeing a sea turtle and the sea horses.

The entire week was also accented by amazing food, most of which was unnoticeably and unintentionally (for me) vegetarian. Most mornings I began my day with Masala Dosa with Sambar, except for the last when I went with Poori Bhaji, along with watermelon juice and a cup or two of strong coffee. I got some fruit flavored ice cream (jackfruit and watermelon) and the conference introduced me to the near-candy dessert, Jalebi.

Perhaps the crowning meal of my trip was at a vegetarian Thali place (largest picture below), where we were served endless little cups of food, which, when accompanied by various flatbreads, added up to a deceptively large amount. Given my love for animals, I rely upon cognitive dissonance to keep me a meat eater, since vegetarianism is still a challenge to pull off in the US while keeping the satisfying diet I want (a salad is not an acceptable vegetarian option). If I were living here, with this array of amazing vegetarian food, it would be a no-brainer. The only challenge for me was the spice, which my stomach is not at all accustomed to. Even ordering everything extremely mild, my antacid bottle was never far away, and I might actually go for some bland foods upon returning home.

Saturday I hired a guide through the hotel concierge and saw a whole collection of other places, but that’s for another post. More uncategorized photos from my adventures including ones the following weekend that I haven’t written about yet here: https://www.flickr.com/photos/pleia2/albums/72157671033977871/

by pleia2 at August 07, 2016 10:50 AM

SANOG 28

This week I traveled to Mumbai, India to participate in SANOG 28 (South Asian Network Operators Group). This was an unusual conference for me. My husband is the networking guru and he routinely attends NANOG meetings, for the North American group. I even had dinner here at SANOG with a woman who knows him. The closest I’ve gotten to NANOG is tagging along when the conference brings him to interesting or useful places (San Juan, Philadelphia) and doing some dinners with attendees who I know when I happen to be around. Plus, I usually go to open source or systems operations conferences. This was the first time I’d been to a conference focused on networking operations.

So, what brought me to the other side of the world to this uncharacteristic-for-me conference? I was encouraged to submit a proposal to do an OpenStack tutorial, and it was accepted! I’m really grateful to my friend Devdas Bhagat who encouraged me to submit. He has kept me in the loop all week with social activities and generally being around for me as I started interacting with a community that’s so new to me.

As the conference began, I learned that there have been nine SANOGs in India, and that this was the third time they’ve come to Mumbai. SANOG itself covers Afghanistan, Bangladesh, Bhutan, India, Maldives, Nepal, Pakistan and Sri-Lanka, but given the venue the first speaker spoke on some of the challenges confronting India specifically.

I enjoyed a keynote by Joe Abley of Dyn, where he spoke on treating your technical teams well and making sure you’re doing everything you can to support them in their work and goals. He also mentioned the splitting of technical from managerial tracks. This is becoming increasingly common in the Bay Area, where companies learned some time ago that engineering and management skills are very different and people should be leveling up on their own tracks. It’s a message I’m glad is being spread more widely; as an engineer myself, I can confidently say that I’ll be a happy person if I can continue moving up in my career to conquer more interesting technical problems, without ever having to manage other people.

Speaking directly to the technical talks we had Paul Wilson, the director of APNIC, give a keynote on the transition of IANA stewardship from the US Government to ICANN. Speaking as an operations person who is aware of the broader internet governance work because that’s where my servers live, I knew this transition had been in the works for several years but I didn’t know much about the actual plans or status. This presentation was the clearest, most concise summary of the plans, progress and status of the work they had been doing, and how close they are to finishing!

The most surprising part about this conference for me was the status of IPv6 in APAC, a view into which was presented by Byron Ellacott of APNIC. I had been under the naive assumption that given the explosive growth of network infrastructure in the regions over the past several years, it would go without saying that these green fields be IPv6 capable. I was wrong. While IPv6 adoption in the US and a few countries in Europe has continued to grow, it remains very low, to non-existent in most APAC regions. At a speaker dinner later in the week I asked about this, and the consensus was a chicken and egg problem. A considerable amount of content is still IPv4, so until that moves to IPv6, providing capability for it doesn’t make sense. As long as adoption remains low (estimated 6.5% worldwide) and IPv4 is still supported, organizations don’t have incentive to offer their content over IPv6. Instead, they’re taking extensive advantage of NAT and keep trying to find ways to get more IPv4 addresses (even if the math is against them). The whole discussion gave me some pause about the push for IPv6. Having a husband in the industry and working on a team that is eager to see strong IPv6 support in our infrastructure, I was an early adopter (I’ve had a AAAA record for this blog for years!). I thought we were all moving in the direction of adoption, but are we really?

The second day began with a talk by Anurag Bhatia of Hurricane Electric about the status of root DNS anycast in South Asia and how it impacts users. It continued with an update from Champika Wijayatunga of ICANN on the rotation of and changes to ICANN’s Root Zone Key Signing Key (KSK) and the related Verisign Zone Signing Key (ZSK), which I didn’t know a lot about, but you can learn more by checking out ICANN’s site on the topic. It was definitely surprising to learn that a rotation plan for the KSK wasn’t previously in place and that it has remained the same since 2010.

These first morning talks concluded with a pair that were amusingly juxtaposed: the first was by Matthew Jackson on how geo-restrictions in New Zealand led to the development of technologies to get around the limitations, and to subsequent policy changes. As a native of the US, I’ve only rarely been impacted by region-blocking, but it has always been troublesome to me. As he said in his talk: “The internet we built wasn’t meant to be geo-restricted.” Indeed. The talk that came after it was about ISP/network-level content filtering technologies. Hah!

As the day wound down, so did the conference. The closing event was held at the nearby Mumbai Cricket Association Indoor Cricket Academy and Recreation Center. It’s the off-season, so no Cricket was happening and the field was dark, but the inside of the building was beautiful. Though I’m not much of a party type, it was nice to meet a few folks and have some snacks before concluding my evening.


Hanging out with Devdas at the closing party!

The week continued with tutorials. On Thursday I presented mine: An Introduction to OpenStack. When my presentation was being evaluated by the committee in early July, I worked with them to tune the description to make an allowance for familiarity with Linux. Following acceptance, they strategically scheduled my tutorial the day after an Introduction to Linux hosted by Devdas.

As I wrote about in this interview, the tutorial was divided up into three parts:

  1. Introduction to some OpenStack deployments
  2. Demonstration
  3. Building your own cloud

Since the audience was very networking focused (less open source and systems), what I sought to communicate were the basic concepts around OpenStack and some of the services it can provide. Then, through a demonstration of the different components on a DevStack install, I gave people a practical view into launching instances, adding block storage, metering, and object storage. The talk concluded with a section very similar to my CodeConf talk back in June, where I explored the next steps as they begin their journey into OpenStack territory.

The tutorial was 90 minutes long, and I had a few very engaged members of the audience. Afterwards I was able to talk to a couple of folks who previously had trouble separating all the Open* named projects, and were glad to learn more about OpenStack so at least that one stood out. My publisher also gave me some coupons for the digital version of Common OpenStack Deployments so I was able to give those out to three participants, and pre-order discounts for the rest of the audience.

Slides from the tutorial are here, and include a link out to the DevStack demonstration instructions: sanog_2016_intro_to_openstack.pdf

I think what I enjoyed most about this conference was simply being exposed to a new community; it was a real pleasure to be able to sit down at dinner with some of the brilliant people solving problems with these expanding networks. Beyond our discussions about the expansion of (or lack of) IPv6, I was able to chat with a DNS engineer at RIPE about the infrastructure they use for the root server they run. I was specifically interested in how much organizational sharing happens between operators of the root DNS servers. His answer? Very little, intentionally. As a champion of open source infrastructures, it took some time for me to come around, but I conceded that in this case it does make sense. By using different tooling and methodologies, the heart of the internet is kept safe against the inevitable vulnerabilities that arise in one tool or another.

Huge thanks to the organizers of this conference and everyone who made me feel so welcome during my first visit to India. These past few nights I’ve had some great food and very friendly company of some great people from organizations whose work I admire.

More photos from the event here: https://www.flickr.com/photos/pleia2/albums/72157671053188251

by pleia2 at August 07, 2016 06:36 AM

August 06, 2016

Akkana Peck

Adding a Back button in Python Webkit-GTK

I have a little browser script in Python, called quickbrowse, based on Python-Webkit-GTK. I use it for things like quickly calling up an anonymous window with full javascript and cookies, for when I hit a page that doesn't work with Firefox and privacy blocking; and as a quick solution for calling up HTML conversions of doc and pdf email attachments.

Python-webkit comes with a simple browser as an example -- on Debian it's installed in /usr/share/doc/python-webkit/examples/browser.py. But it's very minimal, and lacks important basic features like command-line arguments. One of those basic features I've been meaning to add is Back and Forward buttons.

Should be easy, right? Of course webkit has a go_back() method, so I just have to add a button and call that, right? Ha. It turned out to be a lot more difficult than I expected, and although I found a fair number of pages asking about it, I didn't find many working examples. So here's how to do it.

Add a toolbar button

In the WebToolbar class (derived from gtk.Toolbar): In __init__(), after initializing the parent class and before creating the location text entry (assuming you want your buttons left of the location bar), create the two buttons:

        backButton = gtk.ToolButton(gtk.STOCK_GO_BACK)
        backButton.connect("clicked", self.back_cb)
        self.insert(backButton, -1)
        backButton.show()

        forwardButton = gtk.ToolButton(gtk.STOCK_GO_FORWARD)
        forwardButton.connect("clicked", self.forward_cb)
        self.insert(forwardButton, -1)
        forwardButton.show()

Now create those callbacks you just referenced:

    def back_cb(self, w):
        self.emit("go-back-requested")

    def forward_cb(self, w):
        self.emit("go-forward-requested")

That's right, you can't just call go_back on the web view, because GtkToolbar doesn't know anything about the window containing it. All it can do is pass signals up the chain.

But wait -- it can't even pass signals unless you define them. There's a __gsignals__ object defined at the beginning of the class that needs all its signals spelled out. In this case, what you need is

       "go-back-requested": (gobject.SIGNAL_RUN_FIRST,
                              gobject.TYPE_NONE, ()),
       "go-forward-requested": (gobject.SIGNAL_RUN_FIRST,
                              gobject.TYPE_NONE, ()),
Now these signals will bubble up to the window containing the toolbar.

Handle the signals in the containing window

So now you have to handle those signals in the window. In WebBrowserWindow (derived from gtk.Window), in __init__ after creating the toolbar:

        toolbar.connect("go-back-requested", self.go_back_requested_cb,
                        self.content_tabs)
        toolbar.connect("go-forward-requested", self.go_forward_requested_cb,
                        self.content_tabs)

And then of course you have to define those callbacks:

def go_back_requested_cb (self, widget, content_pane):
    # Oops! What goes here?
def go_forward_requested_cb (self, widget, content_pane):
    # Oops! What goes here?

But whoops! What do we put there? It turns out that WebBrowserWindow has no better idea than WebToolbar did of where its content is or how to tell it to go back or forward. What it does have is a ContentPane (derived from gtk.Notebook), which is basically just a container with no exposed methods that have anything to do with web browsing.

Get the BrowserView for the current tab

Fortunately we can fix that. In ContentPane, you can get the current page (meaning the current browser tab, in this case); and each page has a child, which turns out to be a BrowserView. So you can add this function to ContentPane to help other classes get the current BrowserView:

    def current_view(self):
        return self.get_nth_page(self.get_current_page()).get_child()

And now, using that, we can define those callbacks in WebBrowserWindow:

def go_back_requested_cb (self, widget, content_pane):
    content_pane.current_view().go_back()
def go_forward_requested_cb (self, widget, content_pane):
    content_pane.current_view().go_forward()

Whew! That's a lot of steps for something I thought was going to be just adding two buttons and two callbacks.

August 06, 2016 10:45 PM

August 02, 2016

Elizabeth Krumbach

Ubuntu 16.04 Release Party San Francisco Concluded!

On the evening of Thursday, July 28th I hosted the Ubuntu 16.04 Release Party in San Francisco. It was a couple months after release, but nicely lined up with the 16.04.1 release, where folks running 14.04 would finally be prompted to upgrade to 16.04. It also ended up being just a week after the release of the 9th edition of The Official Ubuntu Book, so I was able to give away a couple of copies during the party!

The evening was hosted by OpenDNS, who were incredibly welcoming and gracious hosts. Thanks so much, Jennifer Basalone and crew!

The space was excellent, having power strips set up at a pair of tables near the entrance, a whole area of seating for the presentation and an open floor plan that lent itself to casual chats as well as pulling out laptops to swap tips with each other. An Ubuntu Studio install was even started during the event. We did have the unfortunate snafu of a baseball game just down the street messing up nearby traffic a bit, but hopefully that didn’t discourage too many attendees, as public transit to the venue was still pretty easy.

The venue provided drinks and I was able to order salad and a pile of pizzas to make sure everyone was well fed throughout the event.

Like with my past presentations at LUGs in June and July, I brought along my underpowered Lenovo G575, which I had Ubuntu 16.04 running on and my Dell Mini 9 with Xubuntu 16.04. Plus I had my pair of tablets, Nexus 7 and Aquaris M10 with the hot-off-the-download OTA-12.

The tablets definitely got the most attention at this event, and showing off desktop mode (convergence!) by connecting my Lenovo keyboard+mouse combo to the Aquaris M10 was a lot of fun.

I did my release presentation a final time at this event, this time updated with OTA-12 notes. Slides available: sf_release_party_ubuntu_1604.pdf (6.0M), sf_release_party_ubuntu_1604.odp (5.4M), please feel free to use them as you see fit.

A few more photos from the event here: https://www.flickr.com/photos/pleia2/albums/72157671609240786

by pleia2 at August 02, 2016 03:06 AM

August 01, 2016

Jono Bacon

10 Lessons Learned in Training Knowledge Workers

Earlier this week, @naval (CEO and co-founder of AngelList) asked a question on Twitter:

[Screenshot of the tweet]

At the heart of his question is an interesting observation. As automation and artificial intelligence replace manual jobs, how do we retrain people in the new knowledge economy, where information handling and management is in high demand?

I thought I would share some experiences, observations, and recommendations based upon when I did this previously in my career.

OpenAdvantage

Back in 2004 I was peddling my wares as a journalist, writing for the tech press. I was living in the West Midlands in England and heard about a new organization in nearby Birmingham called OpenAdvantage.

The premise was neat: open source was becoming a powerful force in technology and OpenAdvantage was set up to provide free consultancy for companies wanting to harness open source, as well as individuals who wanted to upskill in these new technologies. At the time in the West Midlands lots of industry was being automated and moved out to Asia, so lots of Midlanders were out of jobs and looking to retrain. This required, by definition, retraining the workforce as knowledge workers.


OpenAdvantage was funded by the UK government and the University of Central England, had a decent war chest, and was founded by Scott Thompon and Paul Cooper (the latter of whom I met when he heckled me at a talk I gave at a Linux User Group once. 🙂 )

So, I went along to their launch event and wrote a piece about them. Shortly after, Paul and Scott invited me back over to the office and offered me a job there as an open source consultant.

I took the role, and this is where I cut my teeth on a lot of open source, community, and working with businesses. We had crazy targets to hit each month, so we ended up working with and training hundreds of organizations and individuals across a wide range of areas, covering a wide breadth of open source technologies and approaches.

All of our services were entirely free IF the person or organization was based in the West Midlands (as this is the area our funding was supporting.)

Training Knowledge Workers

So, what lessons did we learn from this work that can be applied to Naval’s question? I have 10 primary recommendations for training new knowledge workers…

1. Understand your audience and their diversity

Many people who are being retrained will come from varying backgrounds and have different levels of experience, goals, insecurities, and ambitions. As an example, some people may not possess the foundational computing skills required for the topic you are training, yet others will. Also, there may be different concerns about connectivity, social media, and networking based on how much your audience has been exposed to technology.

Be sure to understand your audience and craft your training to their comprehensive set of needs. A good way to do this is to survey your audience before they join. Then you can tune your training effectively.

2. Teach skills that have clear market value

When someone needs to change careers, their top concern is usually supporting their family and bringing financial security to the home. They will only consider skills that have clear market value. So, be aware of what the market needs and train based on those skills. The market is ever changing, and so are its requirements, so adapt your program accordingly.

So, even though you may love Haskell, if the market is demanding Ruby developers, teach Ruby. Sure, you may love SugarCRM, but if the market demands Salesforce, do the same. One caveat though: always keep an eye on new trends so you can provide training on technologies and services as they ripen, equipping your audience for the very best and most timely opportunities.

3. Tie the training to direct market benefits

Aside from market value, you also want to ensure your audience understands the potential of acquiring those skills before they embark on the training. Benefits such as job security, good salaries, health/insurance benefits, and more can be a useful forcing function that will get them through the training.

Also be sure to train a mixture of vocational skills (e.g. technologies) as well as best practice, methodologies, and approaches for being successful in the workplace. This could include topics such as project management, leadership skills, time management, and more.

4. Provide training at zero (or very low) cost

One of the major benefits of our work at OpenAdvantage was that we provided free services. This made it a no-brainer for many people to consume these services.

You should also try to engineer a situation where your training is also a no-brainer and the cost is free or as close to free as possible. If you charge a high sticker value for the training, many people may not be able to justify or afford it.

A good way to offset costs is with partnerships and sponsorships. Explore different vendors to see if they can sponsor the training, talk to local chambers and charities to see if they can help, and see if local businesses can provide venues, equipment, and other resources to keep the costs low and your training as accessible as possible to your audience.

5. Build in clear intrinsic/extrinsic rewards

For the training to really succeed, the audience needs to gain both intrinsic rewards (such as better capabilities, confidence, digital literacy etc) and extrinsic rewards (material items such as t-shirts, trophies, mugs etc).

Focus on the intrinsic rewards first: they are the confidence and opportunity boosting benefits that will get them over the hump to changing careers and succeeding in their new profession.

The extrinsic rewards can be a boon here though, but where possible, ensure they are useful in their career development. Items such as notepads/pens, USB sticks, books, training materials, and other items are good examples that can support your audience and make them feel rewarded. Avoid gimmicks or tat that doesn’t provide a functional benefit to a knowledge worker.

6. Teach by doing, not just by presenting

Having someone sit down in front of a day of presentations is boring. Instead, present short bursts of core skills, but get your audience doing stuff, talking, and working together. Have them execute tasks, experiment, and play. This is what seals the skills in.

My favorite approach here is to teach multiple short presentations (15 minutes or less) and then provide a “challenge” or “task” for them to complete to exercise these new skills, explore, and experiment.

This is important not just for skills development; it also encourages your audience to talk to each other in the session, collaborate, solve problems together, and build relationships.

7. Provide follow up service and connections

It is tempting to assume that when that exhausting day of training is over, you are done. Not at all.

Always follow up with your audience to see how they are doing, introduce them to local communities, show them useful tools, introduce them to other people they may find helpful, connect them to organizations looking for staff and more.

Retraining people is not just about soaking up knowledge; it is about bridging the gap to new industries and the people within them. These additional recommendations, connections, and introductions can often be one of the most empowering parts of the overall experience.

8. Teach them how to teach themselves

One of the major challenges with education is that it often teaches skills in a vacuum. Sadly, this just isn’t how the world works.

The most capable and successful people in the world develop the abilities to (a) always learn and grow new skills, (b) always be willing to challenge themselves and their assumptions, and (c) be willing to experiment and try new things. This is a lifelong process, but you should help your audience to learn how to teach themselves and expand their skills.

For example, teach them how to research problems online, how to find support forums and groups, ask meaningful questions, and how to experiment, debug issues, and solve problems. These are critical skills for knowledge workers to be successful.

9. Teach streetsmarts

Another element that is often sadly lacking in traditional education is streetsmarts, such as modern trends, memes, and methods of engaging in technology and beyond.

Teach your audience some of these streetsmarts. Examples of this could include the dos and don’ts of online communication, how to deal with trolls/critics, trending technologies and cultures, how to be successful in an internationally diverse world, and other areas. Again, this will reinforce their capabilities beyond the skills they need to do a job.

10. Build their confidence

One of the most notable things I remembered from my OpenAdvantage days (and have seen since then) is that a lot of people who are transitioning into the knowledge economy feel overwhelmed by the task. They often feel there are too many tools, too many things to learn, that they will never figure it out, and sometimes that they are too old to get started.

This is insecurity, and it can be conquered. The vast majority of people can traverse the challenge and do well, but they need confidence in themselves to get over the bumps in the road and that feeling of being overwhelmed.

Give them that confidence. Help them to understand that this is just technology, and it often looks harder than it really is. Help them to see their potential, what benefits this will open up for them, and how much bigger the market opportunity will be for them. Remind them of the abundance of choices that will open up to them, the confidence it will give them, and how their social and professional networks will grow. Remind them of the good they are doing for their family and the brighter future they will be building.


So, there we have it. I hope some of these learnings are useful to those of you doing this work, and I hope this provided some food for thought for Naval’s question on Twitter.

I would love to hear your thoughts too. What other ideas and methods can we use to make it easier to retrain people as knowledge workers? Which of my points can be expanded or improved? What are your stories from the trenches? Let us know in the squawkbox…

The post 10 Lessons Learned in Training Knowledge Workers appeared first on Jono Bacon.

by Jono Bacon at August 01, 2016 03:00 PM

July 30, 2016

kdub

ATTiny85 PWM from Timer/Counter1

 

I’ve been tinkering with the ATTiny chip a bit lately, and I wanted to hook up one of my stepper motors to it. This chip has two timers and a few pins that can output PWM signals.

I had OC1B on PB4/pin3 free, and the Timer/Counter1 module looked a bit better than the Timer/Counter0 module for the relatively-long pulse needed to control the stepper motor.

The first thing to figure out was what I wanted the PWM signal to look like. Stepper motors care more about the pulse length than the frequency. My motor accepted 700-1500us as the control range, so I decided to go with a 4000us period (250Hz).

The way the Timer/Counter1 module works is that it counts up from 0 to a certain register value (OCR1C) and then resets to zero. In PWM mode, the OC1B pin is cleared when the counter hits a certain value (OCR1B) and set when the counter is at 0. So by setting OCR1C you control the period, and by setting OCR1B you control the pulse width. I decided to use the full width of the counter (8 bits), so OCR1C would be set to 0xFF.

Next I had to figure out how quickly the counter would be incrementing. This is set by the system clock rate and the prescaler on the timer. I needed the 16MHz PLL clock on the chip (CLKSEL=0x0001), so that was a good fit. I selected 256 as the prescaler value so that:

16MHz / 256 (prescaler) / 256 (OCR1C) = ~4ms.

Time to write some code!

//fuses: L: 0xE1 H: 0xDD E: 0xff
#define F_CPU 16500000UL
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    //Set Pin3/PB4 to output
    DDRB = 1 << DDB4;

    //approximately a 700us pulse out of the ~4ms period
    OCR1B = 0x2D;
    OCR1C = 0xFF;

    TCCR1 = 1 << CTC1 |  //clear on match with OCR1C
            9 << CS10;   //set prescaling to clk/256
    GTCCR = 1 << PWM1B | //enable PWM mode on OC1B
            2 << COM1B0; //clear OC1B when we hit OCR1B

    for (;;)
        _delay_ms(100);
}

 

With that written, I hooked up a simple circuit and attached my logic analyzer to a resistor on the output pin to verify the output:

[Image: pwm-test]

The logic analyzer showed:

[Image: pwm-screenshot]

Success! A 4ms period with a 700us pulse width. I could now drive my stepper motor, and by changing OCR1B, I could designate which position the motor was in.

 

by kdub at July 30, 2016 09:14 AM

July 28, 2016

iheartubuntu

How to Run Android Apps Easily on Ubuntu For The First Time


There have been several instances where it would be more comfortable to run an Android app on my computer than to use my smart phone. I have tried running Android in VirtualBox and it does work; however, Android is its own OS that you still need to boot into. But what if you could run an Android APK directly in Ubuntu? Well.... you can!*

Google released a Chrome app named ARC Welder, which allows you to run Android apps if you’re on Chrome OS or using the Chrome web browser. Grab the ARC Welder Chrome app here (200MB)...

https://chrome.google.com/webstore/detail/arc-welder/emfinbmielocnlhgmfkkmkngdoccbadn

Just open the link in Google Chrome or search for ARC Welder in the Chrome app store and install it.

Installing Android apps takes a few extra steps, but it's not a big deal. First, find the Android app you want in the Google Play Store...

https://play.google.com/store/apps

Copy the weblink of the app and paste it in the APK-DL website...

http://apk-dl.com/

This will generate an APK download link for the app. Once you have downloaded an APK file, open the ARC Welder app in your Chrome browser (through the browser's apps link). The first time you run ARC Welder it will ask you where to store files for your apps. Create a folder wherever you like before installing an APK; I created a folder in my HOME directory. Once you have done that, simply add your APK file.


You'll also be asked about the form factor of your app, such as tablet or phone and landscape or portrait. Set them to your liking and then install!


* Here is the caveat: not ALL Android apps are going to work. Some do, some don't. There’s no guarantee the apps you try will work or that they’ll be usable.

I originally did this so I could run an investment app called "Robinhood" (pictured at the top of this post) to buy and sell stocks without paying any commission fees. The app is nice, but it's much easier to use on a desktop computer. Other apps I have tried that work are basic programs like "Bitcoin Checker", "Coin Pirates", and "Backgammon Free". Programs that require heavy graphics, like car racing games, probably won't work. Hulu DID work for me, Netflix did NOT. Your mileage may vary. Have fun and good luck!




by iheartubuntu (noreply@blogger.com) at July 28, 2016 01:33 AM

July 27, 2016

Jono Bacon

Bacon Roundup

In my work I tend to create a lot of material, both on my website here as well as on other websites (for example, my opensource.com column and my Forbes column). I also participate in interviews and other pieces.

I couldn’t think of an efficient way to pull these together for you folks to check out, so I figured I would periodically share these goings-on in a post. Let’s get this first Bacon Roundup rolling…

How hackers are making products safer (cio.com)
An interview about the work I am doing at HackerOne in building a global community of hackers that are incentivized to find security issues, build their expertise/skills, and earn some money.

8 ways to get your swag on (opensource.com)
A column about the challenges that face shipping swag out to community members. Here are 8 things I have learned to make this easier covering production, shipping, and more.

10 tips for new GitHub projects (opensource.com)
Kicking off a new GitHub project can be tough for new communities. I wrote this piece to provide 10 simple tips and tricks to ensure your new GitHub project is setting off on the right path.

The Risks of Over-Rewarding Communities (jonobacon.org)
A piece about some interesting research into the risks of over-rewarding people to the point of it impacting their performance. This covers the research, the implications for communities, and some practical ways to harness this in your community/organization.

GC On-Demand Podcast Interview (http://podcast.discoposse.com/)
I had a blast chatting to Eric Wright about community management, career development, traversing challenges, and constantly evolving and improving your craft. A fun discussion and I think a fun listen too.

Taking your GitHub issues from good to great (zenhub.com)
I was invited by my friends at ZenHub to participate in a piece about doing GitHub issues right. They wrote the lion's share of this piece, but I contributed some material.

Finally, if you want to get my blog posts directly in your inbox, simply put your email address into the box to the right of this post. This will ensure you never miss a beat.

The post Bacon Roundup appeared first on Jono Bacon.

by Jono Bacon at July 27, 2016 03:00 PM

July 26, 2016

Jono Bacon

The Risks of Over-Rewarding Communities

Incentive plays an important role in communities. We see it everywhere: community members are rewarded with extrinsic rewards such as t-shirts, stickers, gadgets, or other material, or intrinsic rewards such as increased responsibilities, kudos, reputation, or other benefits.


The logic seems sound: if someone is the bee's knees and doing a great job, they deserve to be rewarded. People like rewards, and rewards make people want to stick around and contribute more. What’s not to love?

There is, though, some interesting evidence to suggest that over-rewarding your communities, whether internal to an organization or external, has some potent risks. Let’s explore the evidence and then see how we can harness it.

The Research

Back in 1908, psychologists Yerkes and Dodson (and potential prog rock band) developed the Yerkes-Dodson Law. It suggests performance on a task increases with arousal, but only to a point. Now, before you get too hot under the collar, this study refers to mental or physiological arousal such as motivation. The study highlights a “peak arousal” time: the ideal amount of arousal to hit maximal performance.

Dan Ariely in The Upside of Irrationality took this research and built on it to test the effect of extrinsic rewards on performance. He asked a series of people in India to perform tasks with varying levels of financial reward (very small up to very high). His results were interesting:

Relative to those in the low- or medium-bonus conditions, they achieved good or very good performance less than a third of the time. The experience was so stressful to those in the very-large-bonus condition that they choked under the pressure.

I found this choke point insight interesting. We often see an inverse choke point when the stress of joining a community is too high (e.g. submitting a first code pull request to your peers). But do we also see choke points for community members under a high level of pressure to perform?

Community Strategy Implications

I am not so sure. Many communities have high performing community members with high levels of responsibility (e.g. release managers, security teams, and core maintainers) who perform with predictably high quality results.

Where we often see community rear its ugly head is with entitlement; that is, when some community members expect to be treated differently from others.

When I think back to the cases where I have seen examples of this entitlement (which shall remain anonymous to protect the innocent) it has invariably been due to an imbalance of expectations and rewards. In other words, when their expectations don’t match their level of influence on a community and/or they feel rewarded beyond that suitable level of influence, entitlement tends to brew.

As such, my graph looks a little like this:

[Image: Yerkes-Dodson curve graph]

This shows the Yerkes-Dodson curve, but subdivides the timeline into three distinct areas. The first area is used for growth, where we use rewards as a means to encourage participation. The middle area is for maintenance, ensuring regular contribution over an extended period of time. The final area is the danger zone – this is where entitlement can set in, so we want to ensure that we manage expectations and rewards carefully. In this end zone we want to reward great work but ultimately cap the size of the reward – lavish gifts and experiences are probably not going to have as much impact and may even risk the dreaded entitlement phenomenon.

This narrative matches a hunch I have had for a while that rewards have a direct line to expectations. If we can map our rewards to effectively mitigate the inverse choke point for new members (thus make it easier to get involved) and reduce the latter choke point (thus reduce entitlement), we will have a balanced community.

Things You Can Do

So, dear reader, this is where I give you some homework you can do to harness this research:

  1. Design what a ‘good’ contribution is – before you can reward people well you need to decide what a good contribution is. As an example, is a good code contribution a well formed, submitted, reviewed, and merged pull request? Decide what it is and write it down.
  2. Create a platform for effectively tracking capabilities – while you can always throw out rewards willy-nilly based on observations of performance, this risks accusations of rewarding some but not others. As such, implement an independent way of mapping this good contribution to some kind of automatically generated numeric representation (e.g. reputation/karma).
  3. Front-load intrinsic rewards – for new contributors in the growth stage, intrinsic rewards (such as respect, support, and mentoring) are more meaningful as these new members are often nervous about getting started. You want these intrinsic rewards primarily at the beginning of a new contributor on-ramp – it will build a personal sense of community with them.
  4. Carefully distribute extrinsic rewards – extrinsic rewards such as clothing, gadgets, and money should be carefully distributed along the curve in the graph above. In other words, give out great material, but don’t make it too opulent otherwise you may face the latter choke point.
  5. Create a distribution curve of expectations – in the same way we are mapping rewards to the above graph, we should do the same with expectations. At different points in the community lifecycle we need to provide different levels of expectations and information (e.g. limited scope for new contributions, much wider for regular participants). Map this out and design systems for delivering it.

If we can be mindful of the Yerkes-Dodson curve and balance expectations and rewards well, we have the ability to build truly engaging and incentivized communities and organizations.

I would love to have a discussion about this in the comments. Do you think this makes sense? What am I missing in my thinking here? What are great examples of effective rewards? How have you reduced entitlement? Share your thoughts…

The post The Risks of Over-Rewarding Communities appeared first on Jono Bacon.

by Jono Bacon at July 26, 2016 04:04 PM

July 25, 2016

Elizabeth Krumbach

The Official Ubuntu Book, 9th Edition released!

Back in 2014 I had the opportunity to lend my expertise to the 8th edition of The Official Ubuntu Book and began my path into authorship. Since then, I’ve completed the first edition of Common OpenStack Deployments, coming out in September. I was thrilled this year when Matthew Helmke invited me back to work on the 9th edition of The Official Ubuntu Book. We also had José Antonio Rey joining us for this edition as a third co-author.

One of the things we focused on with the 8th edition was future-proofing, knowing that it would have a shelf life of 2 years. With the 9th edition we continued this focus, but also wanted to add a whole new chapter: Ubuntu, Convergence, and Devices of the Future.

Taking a snippet from the book’s sample content, the chapter gives a whirlwind tour of where Ubuntu on desktops, servers and devices is going:

Chapter 10: Ubuntu, Convergence, and Devices of the Future 261

The Convergence Vision 262
Unity 263
Ubuntu Devices 264
The Internet of Things and Beyond 268
The Future of the Ubuntu Desktop 272
Summary 273

The biggest challenge with this chapter was the future-proofing. We’re in an exciting point in the world of Ubuntu and how it’s moved far beyond “Linux for Human Beings” on the desktop and into powering servers, tablets, robots and even refrigerators. With the Snappy and Ubuntu Core technologies both powering much of this progress and changing rapidly, we had to be cautious about how in depth we covered this tooling. With the help of Michael Hall, Nathan Haines and Sergio Schvezov I believe we’ve succeeded in presenting a chapter that gives the reader a firm overview of these new technologies, while being general enough to last us until the 10th edition of this book.

Also thanks to Thomas Mashos of the Mythbuntu team and Paul Mellors who also pitched in with this edition. Finally, as with the last edition, it was a pleasure to work with Matthew and José on this book. I hope you enjoy it!

by pleia2 at July 25, 2016 08:27 PM

Jono Bacon

Audio Interview: On Building Communities, Careers, and Traversing Challenges


Last week I was interviewed by the wonderful Eric Wright for the GC On-Demand Podcast.

Over the years I have participated in various interviews, and this was a particularly fun, meaty, and meaningful discussion. I think this could be worth a listen, particularly if you are interested in community growth, but also leadership and facing and traversing challenges.

Some of the topics we discussed included:

  • How I got into this business.
  • What great communities look like and how to build them.
  • How to keep communities personal, particularly when dealing with scale.
  • Managing the expectations of different parts of an organization.
  • My 1/10/100 rule for mentoring and growing your community.
  • How to evolve and grow the skills of your community members and teams in a productive way.
  • My experiences working at Canonical, GitHub and XPRIZE.
  • Increasing retention and participation in a community.
  • Building effective leadership and leading by example.
  • Balancing open source consumption and contribution.
  • My recommended reading list.
  • Lots of fun anecdotes and stories.

So, go and grab a cup of coffee, and use the handy player below to listen to the show:

You can also find the show here.

Eric is a great guy and has a great podcast. I encourage you to check out his website and subscribe to the podcast feed to stay up to date with future episodes.

The post Audio Interview: On Building Communities, Careers, and Traversing Challenges appeared first on Jono Bacon.

by Jono Bacon at July 25, 2016 03:00 PM

July 22, 2016

Elizabeth Krumbach

Ubuntu 16.04 in the SF Bay Area

Back in June I gave a presentation on the 16.04 release down at FeltonLUG, which I wrote about here.

Making my way closer to home, I continued my tour of Ubuntu 16.04 talks in the San Francisco Bay Area. A couple weeks ago I gave the talk at SVLUG (Silicon Valley Linux Users Group) and on Tuesday I spoke at BALUG (Bay Area Linux Users Group).

I hadn’t been down to an SVLUG meeting in a couple of years, so I appreciated the invitation. They have a great space set up for presentations, and the crowd was very friendly. I particularly enjoyed that folks came with a lot of questions, which made for an engaging evening and stretched what is on its own a pretty short talk into one that filled the whole presentation time. Slides: svlug_ubuntu_1604.pdf (6.0M), svlug_ubuntu_1604.odp (5.4M)


Presentation, tablets and giveaways at SVLUG

At BALUG this week things were considerably more casual. The venue is a projector-less Chinese restaurant these days and the meetings tend to be on the small side. After family style dinner, attendees gathered around my big laptop running Ubuntu as I walked through my slide deck. It worked better than expected, and the format definitely lent itself to people asking questions and having discussions throughout too. Very similar slides to the ones I had at SVLUG: balug_ubuntu_1604.pdf (6.0M), balug_ubuntu_1604.odp (5.4M)


Setup and giveaways at BALUG

Next week my Ubuntu 16.04 talk adventures culminate in the event I’m most excited about: the San Francisco Ubuntu 16.04 release party at the OpenDNS office, located at 135 Bluxome St in San Francisco!

The event is on Thursday, July 28th from 6:30 – 8:30PM.

It’s right near the Caltrain station, so wherever you are in the Bay Area it should be easy to get to.

  • Laptops running Ubuntu and Xubuntu 16.04.
  • Tablets running the latest Ubuntu build, including the bq Aquaris M10 that shipped with Ubuntu and demonstrates convergence.
  • Giveaways, including the 9th edition of the Official Ubuntu book (new release!), pens, stickers and more.

I’ll need to plan for food, so I need folks to RSVP. There are a few options for RSVP:

Need more convincing? It’ll be fun! And I’m a volunteer whose systems engineering job is unrelated to the Ubuntu project. In order to continue putting the work into hosting these events, I need the satisfaction of having people come.

Finally, event packs from Canonical are now being shipped out to LoCos! It’s noteworthy that for this release, instead of shipping DVDs, which have sharply declined in popularity over the past couple of years, they are now shipping USB sticks. These are really nice, but the distribution is limited to just 25 USB sticks in the shipment for the team. This is an order of magnitude fewer than we got with DVDs, but they’re also much more expensive.


Event pack from Canonical

Not in the San Francisco Bay Area? If you feel inspired to give an Ubuntu 16.04 presentation, you’re welcome to use my slides, and I’d love to see pictures from your event!

by pleia2 at July 22, 2016 12:17 AM

July 21, 2016

Jono Bacon

Hack The World


As some of you will know, recently I have been consulting with HackerOne.

I just wanted to share a new competition we launched yesterday called Hack The World. I think it could be interesting to those of you already hacking, but also those of you interested in learning to hack.

The idea is simple. HackerOne provides a platform where you can go and hack on popular products/services (e.g. Uber, Adobe, GitHub, Square, Slack, Dropbox, GM, Twitter, Yahoo!, and many more) and submit vulnerability reports. This is awesome for hackers as they can safely hack on products/services, try out new hacking approaches/tools, build relationships with security teams, build a resume of experience, and earn some cold hard cash.

Currently HackerOne has 550+ customers, has paid out over $8.9 million in bounties, and has fixed over 25,000 vulnerabilities, which makes for a safer Internet.

Hack The World

Hack The World is a competition that runs from 20th July 2016 – 19th September 2016. In that time period we are encouraging people to hack programs on HackerOne and submit vulnerability reports.

When you submit a valid vulnerability report, the program may award you a bounty payment (many people all over the world earn significant buckets of money from bounties). In addition, you will be awarded reputation and signal. Reputation is an indicator of activity and participation, and signal is the average reputation across your reports.

Put simply, whoever earns the most reputation in the competition can win some awesome prizes including $1337 in cash, a hackable FPV drone kit, awesome limited edition swag, and bragging rights as being one of the most talented hackers in the world.

To ensure the competition is fair for everyone, we have two brackets – one for experienced hackers and one for new hackers. There will be 1st, 2nd, and runner-up prizes in each bracket. This means those of you new to hacking have a fighting chance to win!

Joining in the fun

Getting started is simple. Just go and register an account or sign in if you already have an account.

To get you started, we are providing a free copy of Peter Yaworski’s awesome Web Hacking 101 book. Ensure you are logged in and then go here to grab the book. It will then be emailed to you.

Now go and find a program, start hacking, learn how to write a great report, and submit reports.

When your reports are reviewed by the security teams of the programs you are hacking on, reputation will be awarded. You will then start appearing on the Hack The World leaderboard, which at the time of writing looks a little like this:

[Image: Hack The World leaderboard screenshot]

This data is almost certainly out of date as you read this, so go and see the leaderboard here!

So that’s the basic idea. You can read all the details about Hack The World by clicking here.

Hack The World is a great opportunity to hack safely, explore new hacking methods/tools, make the Internet safer, earn some money, and potentially be crowned as a truly l33t hacker. Go hack and prosper, people!

The post Hack The World appeared first on Jono Bacon.

by Jono Bacon at July 21, 2016 03:00 PM