Webmaking in the UK, and face-to-face events

One of this week’s conversations was with Nesta, about Webmaker usage within the UK and whether or not we have data to support the theory that face-to-face events have an impact on getting people involved in making on the web. These are two topics that interest me greatly.

I’m basically copying some of my notes into blog form so that the conversation isn’t confined to a few inboxes.

And the TL;DR is: our data represents what we’ve done, rather than any universal truth.

Our current data would support the hypothesis that face-to-face time is important for learning, but that would simply be because that’s how our program has been designed to date. In other words, our Webmaker tools were designed primarily for use in face-to-face events, which means that adoption by ‘self-learners’ online is low because there is little guidance or motivation to play with our tools on your own. This year we’re making a stronger push on developing tools that can be used remotely, alongside our work on volunteer-led face-to-face events. This will lead to a less biased overall data set in the future, where we can begin to properly explore the impact on making and learning for people who do or don’t attend face-to-face events at various stages in their learning experience. In particular, I’m keen to understand what factors help people transition from learning to mentoring and supporting their peers.

I also took a quick look at the aggregate Google Analytics location data for the UK audience, which I hadn’t done before and which reinforces the point above.

[Screenshot: Google Analytics map of Webmaker traffic across the UK]

Above: Traffic to Webmaker (loosely indicating an interest in the topic) is roughly distributed like a population map of the UK. This is what I’d expect to see from most location data.

[Screenshot: Google Analytics map of UK visitors who made something with Webmaker]

Above: However, if you look at the locations of visitors who make something, there are lots of clusters around the UK and London is equaled by many other cities.
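For anyone who wants to poke at similar data, here is a minimal sketch of the comparison behind those two maps, assuming a hypothetical CSV export from Google Analytics with per-city sessions and a count of ‘makes’. The file and column names are illustrative, not the real export format.

    # Minimal sketch: which cities over-index for making relative to
    # raw traffic? Assumes a hypothetical GA export with per-city
    # session counts and a count of visitors who made something.
    import pandas as pd

    df = pd.read_csv("webmaker_uk_locations.csv")  # hypothetical export

    # Share of overall traffic vs. share of makers, per city.
    df["traffic_share"] = df["sessions"] / df["sessions"].sum()
    df["maker_share"] = df["makers"] / df["makers"].sum()

    # > 1 means a city produces more makers than its traffic alone predicts.
    df["maker_index"] = df["maker_share"] / df["traffic_share"]
    print(df.sort_values("maker_index", ascending=False).head(10))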

To date, usage of the Webmaker tools has been driven by those who are using the tools to teach the web (i.e. Webmaker Mentors). But we also know there are large numbers of people who find Webmaker outside of the face-to-face event scenarios and who need a better route into Webmaker’s offering.

The good news is that this year’s plans look after both sets of potential learners.

Fundraising testing update

I wrote a post over on fundraising.mozilla.org about our latest round of optimization work for our End of Year Fundraising campaign.

We’ve been sprinting on this during the Mozilla all-hands workweek in Portland, and it has been a lot of fun working face-to-face with the awesome team making this happen.

You can follow along with the campaign, and see how we’re doing, at fundraising.mozilla.org

And of course, we’d be over the moon if you wanted to make a donation.

[Photo: the fundraising team at the Portland workweek]
These amazing people are working hard to build the web the world needs.

Learning about Learning Analytics @ #Mozfest

If I find a moment, I’ll write about many of the fun and inspiring things I saw at Mozfest this weekend, but this post is about a single session I had the pleasure of hosting alongside Andrew, Doug and Simon: Learning Analytics for Good in the Age of Big Data.

We had an hour, no idea whether anyone else would be interested, and no idea what angle people would come to the session from. Given that, I think it worked out pretty well.

[Photo: the Learning Analytics session at Mozfest]

We had about 20 participants, and broke into four groups to talk about Learning Analytics from roughly three starting points (though all the discussions overlapped):

  1. Practical solutions to measuring learning as it happens online
  2. The ethical complications of tracking (even when you want to optimise for something positive – e.g. Learning)
  3. The research opportunities for publishing and connecting learning data

But, did anyone learn anything in our Learning Analytics session?

Well, I know for sure the answer is yes… as I personally learned things. But did anyone else?

I spoke to people later in the day who told me they learned things. Is that good enough?

As I watched the group during the session I saw conversations that bounced back and forth in a way that rarely happens without people learning something. But how does anyone else who wasn’t there know if our session had an impact?

How much did people learn?

This is essentially the challenge of Learning Analytics. And I did give this some thought before the session…

[Photo: the question sheet participants answered at the start and end of the session]

As a meta-exercise, everyone who attended the session had a question to answer at the start and end. We also gave them a place to write their email address, so their ‘learning data’ could be linked to them in an identifiable way. It was a little bit silly, but it was something to think about.

This isn’t good science, but it tells a story. And I hope it was a useful cue for the people joining the session.

Response rate:

  • We had about 20 participants
  • 10 returned the survey (i.e. opted in to ‘tracking’), by answering question 1
  • 5 of those answered question 2
  • 5 gave their email address (not exactly the same 5 who answered both questions)

Here is the Learning Analytics data from our session:

[Screenshot: the answers participants gave at the start and end of the session]

Is that demonstrable impact?

Even though this wasn’t a serious exercise, I think we can confidently argue that some people did learn, in much the same way certain newspapers can make a headline out of two data points…

What they learned, how much, and whether it will be useful later in their lives is another matter.

Even with a question deliberately chosen so it was almost impossible not to show improvement between the start and end of the session, one respondent claims to be less sure what the session was about after attending (but let’s not dwell on that!).

Post-it notes and scribbles

If you were at the session and want to jog your memory about what we talked about, I’ve (kind of) documented the various things we captured on paper.

[Screenshot: post-it notes and scribbles from the session; click for a gallery of bigger images]

Into 2015

I’m looking forward to exploring Learning Analytics in the context of Webmaker much more in 2015.

And to think that this was just one hour in a weekend full of the kinds of conversations that repeat in your mind all the way until next Mozfest. It’s exhausting in the best possible way.

Mozilla Contributor Analysis Project (Joint MoCo & MoFo)

I’m back at the screen after a week of paternity leave, and I’ll be working part-time for the next two weeks while we settle in to the new family routine at home.

In the meantime, I wanted to mention a Mozilla contributor analysis project in case people would like to get involved.

We have a wiki page now, which means it’s a real thing. And here are some words my sleep-deprived brain prepared for you earlier today:

The goal and scope of the work:

Explore existing contribution datasets to look for possible insights and metrics that would be useful to monitor on an ongoing basis, before the coincident workweek in Portland at the beginning of December.

We will:

  • Stress-test our current capacity to use existing contribution data
  • Look for actionable insights to support Mozilla-wide community building efforts
  • Run ad-hoc analysis before building any ‘tools’
  • If useful, prototype tools that can be re-used for ongoing insights into community health
  • Build processes so that contributors can get involved in this metrics work
  • Document gaps in our existing data / knowledge
  • Document ideas for future analysis and exploration

Find out more about the project here.

I’m very excited that three members of the community have already offered to support the project and we’ve barely even started.

In the end, these numbers we’re looking at are about the community, and for the benefit of the community, so the more community involvement there is in this process, the better.

If you’re interested in data analysis, or know someone who is, send them the link.

This project is one of my priorities over the next 4-8 weeks. On that note, this looks quite appealing right now.

So I’m going to make more tea and eat more biscuits.

“Conclusions”

[Image: a mile-long string of balloons]

  • Removing the second sentence increases conversion rate (hypothesis = simplicity is good).
  • The button text ‘Go!’ increased the conversion rate.
  • Both variations on the headline increased conversion rate, but ‘Welcome to Webmaker’ performed the best.
  • We should remove the bullet points on this landing page.
  • The log-in option is useful on the page, even for a cold audience who we assume do not have accounts already.
  • Repeating the ask ‘Sign-up for Webmaker’ at the end of the copy is useful, even when it duplicates the heading immediately above, and even at the expense of making the copy longer.
  • The button text ‘Create an account’ works better than ‘Sign up for Webmaker’ even when the headline and CTA in the copy are ‘Sign up for Webmaker’.
  • These two headlines are equivalent. In the absence of other data we should keep the version which includes the brand name, as it adds one further ‘brand impression’ to the user journey.
  • The existing blue background color is the best variant, given the rest of the page right now.
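For context on how calls like these get made: the standard tool is a two-proportion z-test, which checks whether the difference between a variant’s conversion rate and the control’s is bigger than chance would explain. Here is a minimal hand-rolled sketch; the visitor and conversion counts are made up for illustration, and this isn’t the exact tooling we used.

    # Two-proportion z-test for an A/B result.
    # The counts below are made up for illustration.
    from math import sqrt
    from statistics import NormalDist

    def z_test(conv_a, n_a, conv_b, n_b):
        """z-score and two-sided p-value for the difference between
        two conversion rates."""
        p_a, p_b = conv_a / n_a, conv_b / n_b
        p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        return z, 2 * (1 - NormalDist().cdf(abs(z)))

    # e.g. control: 200 sign-ups from 10,000 visitors (2.0%)
    #      variant: 260 sign-ups from 10,000 visitors (2.6%)
    z, p = z_test(200, 10_000, 260, 10_000)
    print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests a real lift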

The Webmaker Testing Hub

If any of those “conclusions” sound interesting to you, you’ll probably want to read more about them on the Webmaker Testing Hub (it’s a fancy name for a list on a wiki).

This is where we’ll try to share the results of any test we run, and document the tests currently running.

And why that image for this blog post?

Because blog posts need an image, and this song came on as I was writing it. And I’m sure it’s a song about statistical significance, or counting, or something…

Something special within ‘Hack the snippet’

Here are a couple of notes about ‘Hack the snippet’ that I wanted to make sure got documented.

  1. It significantly changed people’s predisposition to Webmaker before they arrived on the site
  2. Its ‘post-interaction’ click-through-rate was equivalent to most one-click snippets

Behind these observations, something special was happening in ‘Hack the snippet’. I can’t tell you exactly what it was that caused the end effect, but the effect itself is worth remembering.

1. It ‘warmed people up’ to Webmaker

  • The ‘Hack the snippet’ snippet
    • was shown to the same audience (Firefox users) as eight other snippet variations we ran during the campaign
    • had the same % of users click through to the landing page
    • had the same on-site experience on webmaker.org as all the other snippet variations we tested (the same landing page, sign-up ask etc)
  • But when people who had interacted with ‘Hack the snippet’ landed on the website, they were more than three times as likely to sign up for a Webmaker account

Same audience, same engagement rate, same ask… but triple the conversion rate (most regular snippet traffic converted ~2%, ‘Hack the snippet’ traffic converted ~7%).
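To make that arithmetic concrete, here is a tiny sketch of the funnel. Only the ~2% and ~7% conversion rates come from our data; the impression count and click-through rate are invented placeholders.

    # Hypothetical funnel numbers: only the 2% / 7% conversion rates
    # are real; the impressions and click-through rate are placeholders.
    IMPRESSIONS = 1_000_000
    CLICK_THROUGH = 0.01  # same for both snippets, which was the surprise

    for name, conversion in [("regular snippet", 0.02),
                             ("Hack the snippet", 0.07)]:
        visitors = IMPRESSIONS * CLICK_THROUGH
        signups = visitors * conversion
        print(f"{name}: {visitors:,.0f} visitors -> {signups:,.0f} sign-ups")

    # regular snippet: 10,000 visitors -> 200 sign-ups
    # Hack the snippet: 10,000 visitors -> 700 sign-ups

Same number of people through the top of the funnel, but 3.5 times the accounts out of the bottom.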

Something within that experience (and likely the overall quality of it) makes the Webmaker proposition more appealing to people who ‘hacked the snippet’. It could be one of many things: the simplicity, the guided learning, the feeling of power from editing the Firefox start page, the particular phrasing of the copy or many of the subtle design decisions. But whatever it was, it worked.

We need to keep looking for ways to recreate this.

Not everything we do going forwards needs to be a ‘Hack the snippet’ snippet (you can see how much time and effort went into that in the bug).

But when we think about these new-user experiences, we have a benchmark to compare things to. We know how much impact these things can have when all the parts align.

2. The ‘post-interaction’ CTR was as good as most one-click snippets

This is a quicker note:

  • Despite the steps involved in completing the ‘Hack the snippet’ on-page activity, the same total number of people clicked through as with a standard ‘one-click’ snippet.
  • We got the same % of the audience to engage with a learning activity and then click through to the Webmaker site as we usually get just giving them a link directly to Webmaker
    • This defies most “best practice” about minimizing the number of clicks

Again, this doesn’t give us an immediate thing we can repeat, but it gives us a benchmark to build on.

One month of Webmaker Growth Hacking

This post is an attempt to capture some of the things we’ve learned from a few busy and exciting weeks working on the Webmaker new user funnel.

I will forget some things, there will be other stories to tell, and this will be biased towards my recurring message of “yay metrics”.

How did this happen?

[Screenshot: the month’s new Webmaker user sign-up numbers]

As Dave pointed out in a recent email to Webmaker Dev list, “That’s a comma, not a decimal.”

What happened to increase new user sign-ups by 1,024% compared to the previous month?

Is there one weird trick to…?

No.

Sorry, I know you’d like an easy answer…

This growth is the result of a month of focused work and many many incremental improvements to the first-run experience for visitors arriving on webmaker.org from the promotion we’ve seen on the Firefox snippet. I’ll try to recount some of it here.

While the answer here isn’t easy, the good news is it’s repeatable.

Props

While I get the fun job of talking about data and optimization (at least it’s fun when it’s good news), the work behind these numbers was a cross-team effort.

Aki, Andrea, Hannah and I formed the working group. Brett and Geoffrey oversaw the group, sanity checked our decisions and enabled us to act quickly. And others got roped in along the way.

I think this model worked really well.

Where are these new Webmaker users coming from?

We can attribute ~60k of those new users directly to:

  • Traffic coming from the snippet
  • Who converted into users via our new Webmaker Landing pages
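For the curious, attribution like this typically comes from the UTM parameters on inbound links. Here is a toy sketch of counting sign-ups by source; the record structure, field names, and data are invented for illustration, not our actual schema.

    # Toy sketch: attributing new accounts to a traffic source, assuming
    # each sign-up record kept the utm_source it arrived with.
    # Field names and data are invented, not our actual schema.
    from collections import Counter

    signups = [
        {"user": "a", "utm_source": "snippet"},
        {"user": "b", "utm_source": "snippet"},
        {"user": "c", "utm_source": "twitter"},
        {"user": "d", "utm_source": None},  # direct / unknown
    ]

    by_source = Counter(s["utm_source"] or "other" for s in signups)
    print(by_source.most_common())
    # [('snippet', 2), ('twitter', 1), ('other', 1)]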

Data-driven iterations

I’ve tried to go back over our meeting notes for the month and capture the variations on the funnel as we’ve iterated through them. This was tricky as things changed so fast.

The image below gives you an idea, but it also hides many more detailed experiments within each of these pages.

[Image: the sequence of funnel iterations we tested]

With 8 snippets tested so far, 5 funnel variations, and at least 5 content variables within each funnel, we’ve iterated through over 200 variations (8 × 5 × 5) of this new user flow in a month.

We’ve been able to do this and get results quickly because of the volume of traffic coming from the snippet, which is fantastic. And in some cases this volume of traffic meant we were learning new things quicker than we were able to ship our next iteration.

What’s the impact?

If we’d run with our first snippet design and our first call to action, we would have had about 1,000 new Webmaker users from the snippet, instead of 60,000 (the remainder are from other channels and activities). Total new user accounts are up by ~1,000%, but new users from the snippet specifically increased by around six times that (1,000 → 60,000 is roughly a 6,000% increase).

One not-very-weird trick to growth hacking:

I said there wasn’t one weird trick, but I think the success of this work boils down to one piece of advice:

  • Prioritize time and permission for testing, with a clear shared objective, and get just enough people together who can make the work happen.

It’s not weird, and it sounds obvious, but it’s a story that often gets overlooked because it doesn’t have the simple causation-based hook we humans look for in our answers.

It’s much more appealing when someone tells you something like “Orange buttons increase conversion rate”. We love the stories of simple tweaks that have remarkable impact, but really it’s always about process.

More Growth hacking tips:

  • Learn to kill your darlings, and stay happy while doing it
    • We worked overtime to ship things that got replaced within a week
    • It can be hard to see that happen to your work when you’re invested in the product
      • My personal approach is to invest my emotion in the impact of the thing being made rather than the thing itself
      • But I had to lose a lot of A/B tests to realize that
  • Your current page is your control
    • Test ideas you think will beat it
    • If you beat it, that new page is your new control
    • Rinse and repeat (there’s a sketch of this loop after this list)
    • Optimize with small changes (content polishing)
    • Challenge with big changes (disruptive ideas)
  • Focus on areas with the most scope for impact
    • Use data to choose where to use data to make choices
    • Don’t stretch yourself too thin
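As promised above, here is the control/challenger loop written out as Python-flavoured pseudocode. run_ab_test() is a stub standing in for whatever testing tool you use; nothing here is a real API.

    import random
    from dataclasses import dataclass

    @dataclass
    class TestResult:
        lift: float     # relative change in conversion rate vs. control
        p_value: float  # e.g. from a two-proportion z-test

    def run_ab_test(control, challenger):
        """Stub: in reality this splits live traffic between the two
        pages and runs a significance test on the outcome."""
        return TestResult(lift=random.uniform(-0.2, 0.2),
                          p_value=random.random())

    def optimization_loop(control, idea_backlog, alpha=0.05):
        """'Your current page is your control': keep challenging it."""
        for challenger in idea_backlog:
            result = run_ab_test(control, challenger)
            if result.p_value < alpha and result.lift > 0:
                control = challenger  # the winner is the new control
            # losing variants are killed, darlings or not
        return control

    best = optimization_loop("landing-v1",
                             ["big-headline", "short-copy", "go-button"])
    print(best)

The real work is in generating good challengers, which is where the small-change/big-change split above comes in.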

What happens next?

  • We have some further snippet coverage for the next couple of weeks, but not at the same level we’ve had recently, so we’ll see this growth rate drop off
  • We can start testing the funnel we’ve built for other sources of traffic to see how it performs
  • We have infrastructure for spinning up and testing landing pages for many future asks
  • This work is never done, but with any optimization you see declining returns on investment
    • We need to keep reassessing the most effective place to spend our time
    • We have a solid account sign-up flow now, but there’s a whole user journey to think about after that
    • We need to gather up and share the results of the tests we ran within this process

Testing doesn’t have to be scary, but sometimes you want it to be.

Overlapping types of contribution

[Screenshot: graph of overlapping contribution types]

TL;DR: Check out this graph!

Ever wondered how many Mozfest volunteers also host events for Webmaker? Or how many code contributors have a Webmaker contributor badge? Now you can find out.

The MoFo Contributor dashboard we’re working from at the moment is called our interim dashboard because it combines numbers from multiple data sources without de-duping contributors across those systems.

So if you’re counted as a contributor because you host an event for Webmaker, you will be double counted if you also file bugs in Bugzilla. And until now, we haven’t known what those overlaps look like.
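Here is a toy sketch of the de-duping problem, keyed on email address. The source names and addresses are invented, and the real integration database may well match people on something else entirely.

    # Toy de-duping across contribution systems, keyed on email.
    # Names and data are invented for illustration.
    event_hosts = {"ada@example.org", "grace@example.org"}
    bug_filers = {"grace@example.org", "linus@example.org"}
    badge_earners = {"ada@example.org", "tim@example.org"}

    systems = [event_hosts, bug_filers, badge_earners]

    naive_total = sum(len(s) for s in systems)  # 6 (double-counts people)
    deduped_total = len(set().union(*systems))  # 4 unique people

    # The overlaps are interesting in their own right, e.g. how many
    # event hosts also file bugs:
    hosts_who_file_bugs = event_hosts & bug_filers  # {'grace@example.org'}
    print(naive_total, deduped_total, sorted(hosts_who_file_bugs))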

This interim solution wasn’t perfect, but it’s given us something to work with while we’re building out Baloo and the cross-org areweamillionyet.org (and by ‘we’, the vast credit for Baloo is due to our hard-working MoCo friends Pierros and Sheeri).

To help with prepping MoFo data for inclusion in Baloo, and generally being awesome, JP wired up an integration database for our MoFo projects (skipping a night of sleep to ship V1!).

We’ve tweaked and tuned this in the last few weeks and we’re now extracting all sorts of useful insights we didn’t have before. For example, this integration database is behind quite a few of the stats in OpenMatt’s recent Webmaker update.

The downside to this is that we will soon have a de-duped number for our dashboard, which will be smaller than the current number. That will feel like a bit of a downer, because we’ve been enthusiastically watching that number go up as we’ve built out contribution tracking systems throughout the year.

But, a smaller more accurate number is a good thing in the long run, and we will also gain new understanding about the multiple ways people contribute over time.

We will be able to see how people move around the project, and find that what looks like someone ‘stopping’ contributing might actually be them switching focus to another team, for example. There are lots of exciting possibilities here.

And while I’m looking at this from a metrics point of view today, the same data allows us to make sure we say hello and thanks to any new contributors who joined this week, or to reach out and talk to long-running active contributors who have recently stopped, and so on.