In my experience, it is vital to incorporate a user-centred and learning-centred approach from the very start of any project, particularly in digital learning products and services for children. Let me explain why and give you my tips for getting this right.
I work in the development of products for children, primarily (but not exclusively) digital ones: mainly EdTech and Edutainment (eek, never sure about that word!) products – basically, products that help children learn whilst playing and having fun. In particular, I help to bring in the user and learning perspective at every stage of a project.
It’s often a journey with new clients, who typically involve me for the first time quite late in a project, when they feel user testing is needed. However, as our relationship develops, they tend to get in touch earlier and earlier, until the call I get is: “We’re just starting to think about the next product we might develop…” or “We’re just starting work on a pitch and wanted to get you involved…”.
It’s at this point that I know they really understand the value of a truly user-centred experience and I suspect their next product will be a real winner.
Why? Because good user experience isn’t just about sticking a good UX designer on a project. A user-centred approach must pervade every stage of the process, and for the learning products I work on, this must be coupled with a sound learning-centred approach too.
In fact, in my experience, some of the most serious user-centred mistakes happen right at the start of a project. Here are some of the key reasons why:
- Too many details of the intended product/service are already fixed: To be user-centred you must start with defining what the problem is that you’re trying to solve, from a user’s perspective, and be flexible on how the solution may take shape. Otherwise, you’re starting with a solution. If you do that, it’s easy to find yourself inadvertently manipulating any user insight to back you up, rather than really using it to inspire the optimal solution.
- My recommendation: Ensure the ‘problem’/motivation for the project, from the user’s perspective, is clearly defined, agreed and communicated across the team at the very start.
- Believing your concept is completely unique: Every team hopes to create something unique in the end; realistically, however, many aspects of what you make will be present in existing products/services. Typically you will find good examples among non-competitor products as well as the more obvious competitor ones. For example, signing up to a subscription is common across many products – there is little value in reinventing the wheel and a lot to be gained by finding best practice (see case study below).
- My recommendation: It’s well worth taking the time to cherry-pick good (and bad) examples of aspects of what you are trying to achieve, then explore and analyse these. Take the time to user test various implementations: features where you hope to be innovative, to get inspiration, and features where different approaches are common but you wish to identify best practice. I would also recommend the whole team get hands-on with these examples themselves. A couple of hours of ‘play & share’ can be invaluable. All this ‘competitor’ research will give you a head start in defining features, interactions, layout and design, information architecture, onboarding, user flows and much more.
- Assuming you are experts on your users already: Of course, many companies know their customers and/or users very well. However, the context of a new product/service will inevitably be different and not everyone involved in development will have the same knowledge or perspective.
- My recommendation: Ensure existing knowledge is shared with all members of the development team and considered in the relevant context. This may be previous research reports, personas, analytics data etc., or more informal knowledge from internal experts, the UX team, marketing teams, Product Managers etc. New research is likely to be needed to fill in any gaps and answer questions that arise. Ultimately, you want your whole team to reach a common understanding of the different types of users and their needs/priorities (e.g. through personas, prioritised user needs, MoSCoW ratings etc.). These outputs should become the responsibility of someone embedded in your team with the role of Audience Advocate (whether that’s the Product Manager/Lead or otherwise). They need to be referenced regularly, particularly at key decision points, and typically continue to evolve throughout the project.
- (For learning products) Insufficient focus on learning objectives/opportunities: Strangely, the teams I’ve worked with creating educational or edutainment products tend to be either learning-led (with a strong team of educational experts) or user-led (with a strong end-to-end user-centred process) – or sadly, occasionally, neither: few teams seem to do both well. This can leave the learning objectives with insufficient focus. I’ve also found some teams developing more ‘edutainment’ products nervous about seeking advice on learning. However, even if you do not intend to sell a product on its educational credentials, if you hope for, or see opportunities for, some learning through play, it is worth understanding how to optimise these. Frankly, I wish every product for children included this thinking from the start.
- My recommendation: Share an overview of how children learn in general, and of good pedagogy in the specific learning areas of interest, with the whole development team at an early stage. This may require some internal desk research (there is a lot of knowledge available online or in books) or bringing in an expert to run a workshop or two if you do not have education/learning expertise embedded in your team. I have found that taking one day out for a workshop combining ‘who are your users?’ and ‘how do they learn?’ with a play session on relevant products and an ideation session works really well.
- Believing it’s too soon to involve users: How can it ever be too soon to understand your audience? For some teams, much of the knowledge may already exist and they may get away with little or no direct user involvement. Typically, however, there are gaps that need interviews, user testing, focus groups or other techniques to fill. I know teams with children as their audience can often be worried about involving children in anything early stage, but there are so many techniques I’ve used to make this work. Please don’t use this as an excuse.
- My recommendation: Quash this mindset with the explanations above if it ever comes up! If you’re working with children, consider free play sessions on competitor apps, running group sessions like a club or craft activity, and working with whole families or friendship pairs to help them feel comfortable.
- Providing no budget or time for early-stage insight/research: Nearly all teams have budget and time constraints, but scrimping on user-centred activities at the start is not the way to go. Some people tell me it can’t fit into an Agile process, but again this simply isn’t true – I’ve done it successfully many times.
- My recommendation: Budgeting is hard, but even with the most meagre budgets I believe it’s possible to be user-centred. Take the time to consider the risks of racing off in the wrong direction and prioritise accordingly – or give me a call! After all, as the saying goes: “There is no use galloping if you’re going in the wrong direction”.
Whilst I’m rarely permitted to give details of projects I’ve worked on for confidentiality reasons, the examples below give you an idea of how I approach embedding a user-centred, learning-centred approach at the start of projects.
Case Study 1: Laying the Groundwork for an Award-Winning Learning App
In a recent project, this initial phase involved all of the following:
- An initial overview presentation on the target age and learning principles: in this I painted a picture of what children at the top and bottom ends of the target age group are like – what they are into; the books, toys, play, music, apps and games that are popular, and why; and the differences/changes across the target age. I talked about the basic principles of learning, such as engaging interest, building on existing knowledge, consolidating, and encouraging retrieval/use of knowledge and skills. This also covered some specific information about the target subject.
- Detailed relevant pedagogy research: via books, online sources, academic publications etc., a key internal team member and I studied the most up-to-the-minute thinking on pedagogy in the specific subject of the learning app (in this case related to language development). This included analysing studies of other relevant digital tools. From this work, key priorities and approaches for the specific project were agreed and shared with the wider team.
- Learning by example: we studied a number of existing apps on the market, identifying features, interactions, user flows and other characteristics we could learn from. We included a few of these examples in our first round of user research (alongside early concepts and a range of art styles). We observed children playing them and learnt a huge amount that fed into flow, wording, screen design, interaction design, the feature set and more. In addition, with parents, we focused on the process of signing up for a subscription via a number of examples. This is a vital conversion point where barriers need to be minimised, and it allowed us to identify best practice for this audience. From this, the user flows could quickly be created, streamlining the user experience design.
Case Study 2: The Preschool Age Approach
Another recent project involved an app for preschoolers. There are so many apps on the market for this age group that there is almost too much choice available as potential inspiration.
Clearly, for this age group, interviewing the children and expecting them to talk about what they want, like and need is unrealistic. However, giving them a free choice of five or so apps and watching them make choices and play is fascinating. This was the approach we took at the start of this project. We ran the research from my home to provide a relaxed environment that wasn’t daunting for the children, and some of the core team observed the play first hand too. They found the experience so beneficial that they continued to refer to things they had observed throughout the project and beyond. I helped them to shape, prioritise and draw the most meaning from what we observed, highlighting common behaviours and how these related to our project.

My client took away a vast amount of insight about the target age: how they interact with apps, and what would appeal to them, engage them and support their learning and development. For example, it helped them realise quite how much of a barrier drag and drop can be at this young age if any accuracy is required, and how quickly you can lose their attention! Furthermore, in a very informal way, it was also possible to get some understanding of which features were clearly helping children to engage and learn and which were not, and, through interviews with the parents, what features would encourage downloads and subscription sign-ups.
Case Study 3: Quick, Low Cost & Meaningful
When a client I work with regularly started work on an early years maths app, a new territory for them, they gave me a call. Budgets and timescales were pretty restricted and, with my experience of observing many maths apps being played during my time creating the Good App Guide, I was able to give them sufficient insight without running any primary research.
In this case, I pulled together a report covering the UK Early Years Foundation Stage curriculum for Maths, the basic principles of learning through play, and an analysis of good and bad approaches across a range of maths apps on the market for the same age group. I presented it (in this case via an interactive video-conference workshop, which worked extremely well), enabling my client and their team to fully understand the material and ask questions. We also walked through their existing early ideas together, with reference to the curriculum and pedagogical insight. From there, the insight flowed out into further ideation and concept development activities. The whole thing took only 2–3 days’ effort, turned around in full within a week.
Later in the project, I reviewed concepts and ran 2 rounds of user testing before they launched. They later won an award for the app.
I’m a strong believer that early insight-rich workshops like the one described above, involving members of different teams, lead to innovative, user- and learning-centred ideas with strong buy-in and understanding between the teams involved.
On another project, we used similar methods as part of a Google Design Sprint approach, which is great for moving an early-stage project forward fast. I have also used a similar approach with agencies as part of their pitch development process, before a project is even won.
Case Study 4: Larger-Scale Research
For larger-scale projects, a broader research phase may be appropriate. I’ve led insight research across the UK (and across countries) via interviews, focus groups, play sessions and questionnaires, both remotely and in person, as well as studying analytics and other quantitative data, to gather the information required. At times this has involved sessions with the target audience themselves, and sometimes with experts such as teachers, educational consultants or relevant academics. Such research is invaluable, providing critical insight that can be returned to as the project progresses (and that is often also meaningful to feed into future projects, marketing activities etc.).
One example is a recent project for the STEM education sector (a subject area I’m particularly passionate about), which initially involved focus groups with a range of primary and secondary school teachers across the UK, covering different disciplines. We included subject leaders, form teachers, SEN specialists and careers guidance specialists in an attempt to understand the range and differences within the target audience.
This early research laid the groundwork for personas, storyboards, user needs and priorities, wireframes, user flows, interaction design, visual design approach and even copy and naming. It was followed a month or so later by user testing of prototypes I developed. All this insight and early work gave the development team a wealth of knowledge on which to start the next phase. The service, in this case, is still in development.