In Studio 3 (and later my internship), we designed an educational app with Ralf Muhlberger, CEO of The Latest Tricks and 89Friends. He was looking into creating an AR gardening app that helps school students fulfil their curriculum requirements in a fresh and interesting way. We called it Project Sapling, and it was by far the most interesting project I’ve worked on. Designing an application to fulfil both curricular and experiential goals was a fascinating challenge, as was positioning the app for funding grants and prototyping with mobile devices in mind.
It’s been a while since my team and I finished our internships with 89Friends, so I wanted to reflect on what I learned from the experience and how it applies to my projects going forward.
Designing for AR
The big challenge from the get-go was wrapping my head around designing for AR. I believe that AR is largely a gimmick as far as games are concerned. Many games prioritise projecting traditional genres (like shooters) into AR, but never actually feel like they’re made better by AR’s inclusion. It’s like the transition from PC/console games to mobile games. For the first few years, mobile games used on-screen controls that were clearly derivative of controller-based schemes, but without the tactility of a real controller. As developers came to terms with the kinds of experiences a touchscreen (and the secondary sensors on mobile devices) can provide, games became significantly more natural and intuitive to control. AR is yet to find that revolution in the games space, but it has much more interesting ramifications for utilitarian applications, like visualisation and training.
The key difference that AR provides over traditional mobile is that objects on the screen are anchored to the real world. It’s a slight difference, but being able to experience proportion and space does wonders for the educational value of a product. Imagine seeing the size of an ancient tree with your own eyes, or how far you need to space your crops to prevent cross-germination. That sense of scale is difficult to replicate without an anchor to reality.
Having this realisation was difficult. When we first discovered we were going to be using AR, a lot of our ideas gravitated to QR codes, since they’re a good way to populate the school’s buildings with anchors and encourage students to use the space to tick off objectives in their “quests”. On reflection, it was pretty clear that there wasn’t much use for AR at all in the way our games were designed.
The Virtual Garden – a digital shared space where students craft and maintain a garden by planting new plants, watering them and keeping pests at bay – was the best and most natural way to use AR. Plants were proportional to each other in size and spacing, giving students a reasonable idea of how to apply their learnings practically. Furthermore, the recurring work of running and maintaining a garden was handled through mini games that also operate in AR. Users can see how much water is needed to water a plant properly, and observe which pests are dangerous to plants and which insects are beneficial. Nothing about AR made the “game” experience special, but it absolutely helped make the “education” unique and hopefully more effective.
Designing for Funding Bodies
While Project Sapling would never be released, we had to design the game to accommodate potential funding providers. Initially, we were focussed on designing the game around the Virtual Garden: the quests players complete would award seeds to use in it. But as we looked into the available funding bodies, the most likely candidate was the National Landcare Fund, which was primarily focussed on ecosystem education and restoration. We had to shift from our initial “veggie garden” approach to an “ecosystem development” approach, which opened up a whole range of potential learning opportunities for us and the students.
With this new design, users could still enjoy the “veggie garden” systems of the Virtual Garden, but the plants they grew would shape the dominant ecosystem of the space.
We could teach students what types of plants each ecosystem has, how to identify ecosystems, and, once a dominant ecosystem had developed, what animals and insects would inhabit it. The ecosystem approach was also an avenue into the animal and insect learning categories. Students could see the animals interacting with the plants, and tap on them to learn more. By shaping the ecosystems through the plants they planted, the relationships between animals, insects and plants could be taught naturally and intuitively, shaped by the students’ actions, while also fulfilling the funding requirements for the Landcare Fund.
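To give a feel for how simple the underlying logic can be, here’s a rough sketch of how a garden’s dominant ecosystem could be derived from the plants students have planted. The plant and ecosystem names (and the whole mapping) are hypothetical — this isn’t Project Sapling’s actual code:

```python
from collections import Counter

# Hypothetical mapping of plant species to the ecosystem they belong to.
PLANT_ECOSYSTEM = {
    "eucalyptus": "dry sclerophyll",
    "banksia": "dry sclerophyll",
    "tree fern": "rainforest",
    "strangler fig": "rainforest",
    "mangrove": "wetland",
}

def dominant_ecosystem(planted):
    """Return the ecosystem with the most plants, or None for an empty garden."""
    tallies = Counter(PLANT_ECOSYSTEM[p] for p in planted if p in PLANT_ECOSYSTEM)
    if not tallies:
        return None
    return tallies.most_common(1)[0][0]
```

With a rule like this, planting two banksias and a tree fern would tip the garden toward a dry sclerophyll ecosystem, and the app could then spawn the animals and insects that belong to it.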
Designing for a Curriculum
The curricular requirements for Project Sapling were the first roadblock we ran into. Obviously, the games we created had to naturally teach students the majority of their curriculum elements, but in a way that was fun and interesting in all the ways a classroom environment can’t be. To begin with, we gathered a list of all the curriculum elements we could find online, which for us was the NSW gardening curriculum: a collection of learning outcomes for Science, Maths, English, etc. as they apply to gardening. We put them all in a table and, for each learning outcome that could possibly have a game idea hidden beneath it, wrote a quick and dirty razor for a mini game. Some elements didn’t lend themselves to games at all (measuring plants, for one), but items like identifying insects were pretty easy to make into games. To teach students how to play the games, we didn’t want to make them wade through walls of text, so we created little avatars that guide players through dialogue the first time they play a game.
The next challenge was how this app and platform could benefit teachers. We created the Teacher’s Lounge, where teachers can track their students’ progress over time. The biggest priority here is letting teachers identify their most improved, least improved and declining students. The app can use metrics from the student side to track engagement, including the games they’ve played, their performance in those games, and their activity in the Virtual Garden. The hope is that teachers will use these metrics to provide extra care to struggling students who might not otherwise show signs of decline.
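A minimal sketch of how the “declining students” flag might work, assuming per-student lists of recent game scores (the function names and the threshold are my own illustration, not the app’s real metrics):

```python
def score_trend(scores):
    """Average change between consecutive scores; negative means decline."""
    if len(scores) < 2:
        return 0.0
    deltas = [b - a for a, b in zip(scores, scores[1:])]
    return sum(deltas) / len(deltas)

def flag_declining(histories, threshold=-5.0):
    """Return students whose average score change falls below the threshold."""
    return [name for name, scores in histories.items()
            if score_trend(scores) < threshold]
```

A student whose scores drop from 80 to 70 to 55 has an average change of -12.5 per game and would be flagged, while a steadily improving student would not. The real signal would presumably blend performance with the engagement metrics above, but the shape of the problem is the same.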
The next stage of designing the teacher interactions was considering teachers who want to stagger their curriculum and save certain mini games, quests or items for later in the year. In the Teacher’s Lounge, the teacher can trigger “events”, where students are taken to a certain screen in their apps to follow along with the class, or certain new mini games are made available to them.
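The event-unlock idea boils down to a small piece of state per class. Here’s a hypothetical sketch (event and mini-game names invented for illustration) of mini games starting locked and a teacher-triggered event making a batch of them available:

```python
class ClassroomState:
    """Tracks which mini games a teacher has unlocked for their class."""

    # Hypothetical mapping from teacher-triggered events to mini games.
    EVENTS = {
        "term-2-insects": ["identify-insects", "pest-patrol"],
        "term-3-harvest": ["measure-and-harvest"],
    }

    def __init__(self):
        self.unlocked = set()

    def trigger_event(self, event):
        """Unlock every mini game attached to the event (unknown events are ignored)."""
        self.unlocked.update(self.EVENTS.get(event, []))

    def is_available(self, game):
        return game in self.unlocked
```

The student app would then only show games for which `is_available` is true, so the teacher controls pacing without any per-student configuration.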
Designing for a Client
We lucked out in Studio 3 by working with Ralf, someone who’s been on our end many times before. Ralf already knew how to communicate productively with games and tech people, and knew how long some of these tasks can take. But when we worked under him as interns, we had to start answering to his clients, down in New South Wales. These people were teachers with their own goals and a totally different understanding of and approach to games than ours. Following what we learned in class, we made sure we underpromised and overdelivered, and kept them out of the loop on technical struggles that weren’t particularly important. The biggest hurdle was translating their requests and ideas into actionable tasks, and translating our game development paradigms into plain English for them. All in all, they were easy and understanding to work with, supported our ideas, and gave actionable feedback when needed.
Designing for Ease of Use (and Rapid Prototyping)
My favourite challenge from this task was the UI design. Project Sapling is a heavily UI focussed project, and so needs a common-sense UI that utilises the UX paradigms that our target audience — kids — have already learned from using iOS and Android.
The UI needed to be dead simple with a clear sense of place and hierarchy, use bold and easily readable text and provide large touch-targets. It needed to tie seamlessly into AR and feel like AR is the centrepiece of the experience; the top of the hierarchy. It needed to be fun to look at visually and use colours to avoid a clinical mood.
The biggest touchstones for the design were the Nintendo Switch UI and iOS. I’m not a fan of Android’s specific UI conventions: the hamburger menu, for instance, never makes it obvious what it contains, and the floating action button often hides a bunch of controls you don’t expect. iOS, on the other hand, has a bunch of UI conventions that are natural and useful, like:
• Translucency to provide a natural visual sense of depth
• The “grabber”, which indicates that an item can be moved on and off screen, and rounded corners on items that can be moved
• Natural-looking light and dark interfaces
• Strong support for large, bold typography and imagery
The biggest visual reference for the app was Apple Maps. It adapts to the differing visual styles of the map view and the satellite view, and uses a rounded, translucent UI pane that displays useful information and frequent places, and fills with the currently selected address. I think it’s a very well designed application for its sheer versatility, and it solved a lot of our own design problems around organising and formatting information and giving the user a sense of place.
To prototype the UI, we used Adobe XD (Experience Design), which let us design UI screens with all of the modern flourishes and effects, like translucency, then set up buttons and link them between screens, so you can demo the entire UI experience before writing any code. It was a massive lifesaver!
The catch is that since Adobe XD is a desktop-only app locked into Adobe’s account system, it’s really hard to hand the demo to stakeholders to check out themselves. So for that part of the process, we used Marvel’s Prototyping on Paper (POP). This app does much the same thing as XD, but without the design tooling. Our workflow was as follows:
• Design Prototype in a notebook
• Take pictures and string together in POP
• Test the design
• Make changes and commit the design to Experience Design
• Test this design
• Export the images from Experience Design and transplant them into POP
• Give the POP project to the stakeholders.
Not having to do any work in Unity to set the project up was a massive time-saver. By the time we would be ready to begin production, we already had a UI that worked, and a game flow that we could put in the hands of our client with a helpful app and web interface.
This is the near-final demonstration I did of the UI (at this stage called Ardent Garden); we made a few changes since then, but nothing drastic:
The big thing Studio 3 and my internship taught me is that serious games are big business, and that games’ potential as both an entertainment and educational medium is already changing the way we train staff and experiment with learning. You can learn a lot by watching a professional do a good job, but you can learn even more by making the mistakes yourself in a safe environment to see why things are the way they are.
This project, while it never reached release, taught me so much about how I approach game design under the constraints of implementation, audience, technology, and content. Working with clients and being forced out of my comfort zone to learn about gardening, AR implementation and app development gave me incredibly useful skills that I know are a foundation for even more of what there is to learn. I just have to get out there and learn it.