AR in the City – it's finished!


We are delighted to announce that the learning resource developed in the AR in the City project is now available for use. The content (best viewed on iPads), developed in collaboration with the HEA, BSA, BSC and Jisc, encourages the user to explore three fictitious parts of London, guided by a classroom activity to uncover correlations and stimulate discussion using socio-economic data (Census and Police). It also provides interesting facts about the different data sets: Housing, Crime and Family. While it is primarily aimed at sociology students, it could also be used more widely with both A-level students and first-year undergraduates.

Initial feedback was positive, with one respondent indicating that it was a "…really good overview and intro to stats" and that it "could work really well on focused topic".

Explore or use this resource by downloading Junaio from the iPad App Store and scanning the QR code on the postcard below (this can be printed out for ease of use). When the channel has loaded, hold the iPad over the map of London to display models from three fictitious areas – Forestminster, Pinkham and Cobalt Wharf. Tapping on each building reveals illustrated data snapshots relating to Housing, Crime and Family.

AR in the City postcard

A set of worked learning activities can be downloaded below for use with students in the classroom, in conjunction with traditional group tasks.

Learning Materials

Exercise Proforma_crime

Exercise Proforma_WorkedExample

An instructional video on how to use the resource can be viewed below:

It is hoped that the resource will inspire similar examples in other disciplines, repurposing the idea and demonstrating multiple applications. Already, Dr Susanne Boyle, in collaboration with Glasgow Caledonian University, has developed an IPE (Interprofessional Education) resource around cochlear implants that was recently presented at the Thomas Jefferson University IPE Conference.


Posted in ARinthecity, Content Development, Dissemination

Leeds College of Music AR Project


Over the past couple of months, Jisc Mimas has been leading the technical development of a new Augmented Reality resource around the music production studios at Leeds College of Music, working with Craig Golding and Ruth Clark. It aims to support students working and studying in the music studios by displaying 3D visual overlays, technical documentation and other media assets, linking them to the physical production equipment in front of them. Using iPads, students stand in front of the production desk and the Augmented Reality software tracks the 3D object before snapping different coloured content accurately over it, giving the user a way to interact with different parts and surfacing contextual material. More information about the content and student feedback will appear here over the next few weeks.

Posted in Matthew Ramirez, MimasAR, mobile research, Student Experience, User Testing

Introducing the Fabulous Frogs App: Splendid and Native

What can children learn?
The development of the Fabulous Frogs App: Splendid and Native completes the second phase of the Mapping the Museum Project. Developed using Junaio, this app is an interactive AR tool targeted at 7–11 year olds. It maps to Key Stage 2 of the National Curriculum in England and to the “responsible citizens” and “successful learners” capacities of the Scottish Curriculum for Excellence: specifically, the AR app helps to develop children’s capabilities to understand the environment and to use technology for learning independently. The app addresses the following learning objectives:

  • Species information, e.g. where the Splendid Leaf Frog lives, presented with a map of its geographical extent
  • Frog anatomy – using a label overlay on the trigger image
  • Frog life cycle – an interactive quiz comparing the Splendid Leaf Frog and the native Common Frog

Other fun features

Thanks to the successful collaboration with Manchester Museum, we have been fortunate to gain access to some fantastic content for the app, such as:

  • Sir David Attenborough’s introduction to the Splendid Leaf Frog during his visit earlier this year to Manchester Museum’s Vivarium
  • An in-depth description of the Splendid Leaf Frog provided through the very popular Frog Blog
  • A high quality image of the Splendid Leaf Frog photographed by Chris Mattison (see below)

How does it work?

  • Download Junaio
  • Open Junaio and scan the QR code below
  • Hover your smartphone or iPad over the trigger image below and enjoy the interactive content

Fabulous Frogs App: Splendid and Native Trigger Image


Instructions


Fabulous Frogs App: Splendid and Native QR Code

Next steps

Andrew Gray, Curator of Herpetology, plans to implement the Fabulous Frogs App: Splendid and Native in the Vivarium Gallery over the summer, allowing visitors to try out the app there and then in the gallery with their mobile phones. He will also add a link to the Virtual Vivarium on the Frog Blog, where the Google Earth KMZ file can be downloaded so that the whole Vivarium collection can be explored and viewed in the 3D globe, either at home or at school.

To conclude…

It has been a great experience working with Andrew and Adam Bland (Vivarium Assistant) at the museum, and the Virtual Vivarium and Fabulous Frogs App are examples of a very successful collaboration with Mimas. I would also like to thank Tom Hart (User Experience Developer) for all his hard work on developing the Virtual Vivarium website and the interactive Frog Life Cycle quiz, and Matt Ramirez (AR Developer) for his help and advice while I was developing the AR app.

Find out more

If you found this post interesting, you might also like to read the complementary Frog Blog post ‘Fabulous Frogs’, featuring a link to Sir David Attenborough’s Nature episode about the fabulous frogs he encountered at the Vivarium during his visit to Manchester Museum.

Posted in Content Development, Gail Millin-Chalabi, geo-spatial, MimasAR, Museums, pedagogy

Initial thoughts on wearables in education


Object-based Augmented Reality on a tower PC to enhance instructional learning

I have been fortunate enough over the last few weeks to get hold of a pair of Google Glass and do some initial research into potential use cases in education. As stated in previous posts, at present there are some serious limitations on their use in an AR capacity – probably the most worrying being that they get very hot after only a couple of minutes! However, I wanted to see if I could demonstrate a simple application where wearables could add to the learner experience rather than replicate what is already available.

The idea of the connected world is very popular at the moment; the nirvana for many is that the sensors in our devices and wearables can provide a highly engaging, informative and personalised experience, especially where practical tasks are concerned. Augmented assets could potentially complement the physical environments we work in to assist with many technical processes. With this in mind, and with the Mimas Sys Admin conveniently sat at a desk opposite, I acquired an old PC tower with a view to building an AR experience guiding a user through removing the Riser-Card Cage. Building the assets for the channel was relatively quick; most of the time was spent putting together the 3D CAD tracking model and the animation in Blender.

The video camera on Google Glass is relatively low spec (720p), so it was necessary to sacrifice some accuracy in the trigger model to display the content. As a result, the alignment of the 3D assets was sometimes off, but still perfectly usable. The major issue with Google Glass and third-party apps at the moment is the lack of navigation afforded to the user: put simply, there is no way of interacting with the AR environment dynamically.

In the coming weeks I hope to port the same experience to the Epson Moverio, a wearable that offers a trackpad for user interaction and, in my mind, a more valuable experience. Metaio are about to release Junaio Mirage, first demoed at InsideAR last year, which adapts their current AR browser for delivery on wearables (Google Glass, Epson Moverio and Vuzix). Although we are a long way from the year of the wearable, it is interesting to see how these new additions to the consumer market could benefit instructional learning in the future.

Posted in MimasAR

Virtual Vivarium Launch @ Reptile Big Saturday Event!

I had a fabulous time this weekend at the Manchester Museum Reptile Big Saturday event! The day was full of activities, from creating a sock lizard to holding a real-life chameleon. Tom (User Experience Developer) and I (Geodata R&D Officer) were at the event to promote the new Virtual Vivarium app to visitors. We were given a fantastic large screen to project the Virtual Vivarium onto.


Virtual Vivarium in Google Earth

I had created 60 copies of a Reptile Finder Quiz Sheet, which parents and children could work through to discover key facts about the Cone Headed Lizard and Fijian Banded Iguana. I'm happy to say that all the quizzes had been used by the end of the event, with many children finding the answers to all the questions and being rewarded with a chameleon Virtual Vivarium sticker!


Virtual Vivarium Sticker (many thanks to Jennifer Matthews for printing these out on short notice)

The event had a fantastic atmosphere, I enjoyed the music from La Tinto Bros and was able to look at the tortoises on the Cheshire Chelonia Group stand.

I found that the children really engaged with the Virtual Vivarium, and parents were pleased to hear that they could use it at home. Some visitors to the stand said they would suggest the app for use in local Brownie groups and at schools. For younger children, Tom and I helped with the navigation, but children around 8 years and above were more confident navigating the app.


Younger children required assistance with navigating the app

A Virtual Vivarium feedback form was provided for visitors to give suggestions on how the app could be improved in the future. We got seven responses on the day, with many others saying they would provide feedback after using the app at home (the survey is open until 30 June 2014). All respondents found the Virtual Vivarium experience to be either Very Good (71%) or Good (29%). Ease of use was rated as Very Good (43%) and Good (57%). All seven respondents would use the Virtual Vivarium at home, with one of the reasons given being that it is “great for kids to learn about nature”. Areas for improvement included:

  • Making the app quicker
  • Making it more child-friendly
  • Making the areas overlay more

Others felt it was great the way it is currently and couldn’t think of any improvements.


Me with daughter and mother after helping them to do the Reptile Finder Quiz

A great day all round, and many thanks to Vicky, Anna and Andrew at Manchester Museum for inviting Tom and me from Mimas to participate in the Reptile Big Saturday event, which was a real success with over one thousand people attending.

Posted in Dissemination, Gail Millin-Chalabi, geo-spatial, Mimas, User Testing

Mapping Amphibians and Reptiles!

As previously posted, I am working with the Vivarium team at Manchester Museum as part of the Mapping the Museum Project. One of the main objectives of the project is to illustrate the spatial distribution of 20+ amphibians and some reptiles, reflecting all the species that live at the Vivarium. The tool selected for the final visualization was Google Earth, as this is a freely available tool that anyone can download and use at home or in the classroom.

The challenge…. to find a dataset where the spatial distribution of amphibians and reptiles has already been mapped.

The solution…. the International Union for Conservation of Nature (IUCN) 2013 Red List Spatial Data Download. This is a fabulous site that allows you to download, for free, a zip file containing a shapefile for all amphibians and reptiles. Other data is available too, e.g. mammals, corals, birds, marine fish, mangroves, seagrasses and cone snails.

Species Extraction

The shapefile from the IUCN can be viewed in a Geographical Information System (GIS) such as ArcGIS or the open-source QGIS. In ArcGIS I opened the attribute table for the ALL_AMPHIBIANS_NOV2013.shp file and used the Select by Attributes function to obtain all spatial location records for a particular species, e.g. the Oriental Fire-bellied Toad, Bombina orientalis. The selected records can then be exported as their own shapefile, which I called bombina_orientalis.shp, and viewed in ArcGIS. The same selection can also be scripted, as sketched below.
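For anyone without an ArcGIS licence, a minimal open-source sketch of the same selection and export is shown below using Python and GeoPandas. It assumes the IUCN shapefile stores species names in a BINOMIAL attribute field – check the attribute table for the exact column name in your download.

```python
import geopandas as gpd

# Load the full IUCN amphibian distribution shapefile
amphibians = gpd.read_file("ALL_AMPHIBIANS_NOV2013.shp")

# Equivalent of ArcGIS "Select by Attributes": keep only the records
# for the Oriental Fire-bellied Toad (column name assumed to be BINOMIAL)
toad = amphibians[amphibians["BINOMIAL"] == "Bombina orientalis"]

# Export the selection as its own shapefile
toad.to_file("bombina_orientalis.shp")
print(f"Exported {len(toad)} polygon record(s)")
```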


Spatial distribution of the Oriental Fire-bellied Toad in ArcGIS using the IUCN dataset.

Convert to KML

Keyhole Markup Language (KML) is defined by Wikipedia (2014) as ‘an XML notation for expressing geographic annotation and visualization within Internet-based, two-dimensional maps and three-dimensional Earth browsers’. KML is used for Google Earth visualization, and hence the shapefile (bombina_orientalis.shp) needed to be converted to KML. Using the Layer to KML tool in ArcGIS, I was able to convert the shapefile to KML and view the Oriental Fire-bellied Toad polygon in Google Earth.
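The conversion can also be scripted outside ArcGIS. Below is a minimal sketch using GeoPandas with Fiona's KML driver (which, depending on the installed version, may need to be enabled explicitly); the file names carry on from the example above.

```python
import fiona
import geopandas as gpd

# The KML driver is not always enabled by default in Fiona
fiona.supported_drivers["KML"] = "rw"

gdf = gpd.read_file("bombina_orientalis.shp")

# KML expects geographic coordinates (WGS 84 / EPSG:4326)
gdf = gdf.to_crs(epsg=4326)

gdf.to_file("bombina_orientalis.kml", driver="KML")
```

The same result can be obtained from the command line with GDAL's ogr2ogr, e.g. `ogr2ogr -f KML bombina_orientalis.kml bombina_orientalis.shp`.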


Visualization of the Oriental Fire-bellied Toad distribution in Google Earth using KML.

The KML defines a string of geographic coordinates within the following tags: MultiGeometry, outerBoundaryIs, LinearRing, coordinates. The coordinates, in geographic WGS 84 values (EPSG:4326), provide the locations required to visualize the spatial distribution of the Oriental Fire-bellied Toad (as shown above).
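To illustrate that structure, here is a short sketch that reads the generated KML and pulls the coordinate tuples out of each LinearRing. It assumes the file produced above and the standard KML 2.2 namespace; adjust both to match your own output.

```python
import xml.etree.ElementTree as ET

KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

tree = ET.parse("bombina_orientalis.kml")

# Each LinearRing holds a whitespace-separated list of
# "longitude,latitude[,altitude]" tuples in EPSG:4326
for ring in tree.iter("{http://www.opengis.net/kml/2.2}LinearRing"):
    text = ring.find("kml:coordinates", KML_NS).text
    points = [tuple(map(float, c.split(",")[:2])) for c in text.split()]
    print(f"Ring with {len(points)} vertices, first point: {points[0]}")
```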

Additional information can then be provided about the species itself such as:


Further information provided about the Oriental Fire-bellied Toad with a relevant link to the Vivarium Frog Blog.

The methodology detailed above was repeated for all species found at the Manchester Museum Vivarium. However, for one species – the Lemur Leaf Frog, Agalychnis lemur – the spatial distribution from the IUCN (green polygon outlines on the map below) was edited by Andrew Gray, Curator of Herpetology, as he felt the red polygon outlines better reflect the most up-to-date spatial extent for this critically endangered frog.


IUCN spatial distribution of the Lemur Leaf Frog (green polygon outline). Vivarium’s spatial distribution of the Lemur Leaf Frog (red polygon outline)

The Virtual Vivarium App will be available to explore and try out at Manchester Museum’s Big Saturday: World of Reptiles, 11:00–15:00, 24 May 2014.

Posted in Gail Millin-Chalabi, geo-spatial, Museums, Project Outputs

AR in the City – it’s almost built!

You may recall that back in November we introduced an exciting collaborative project called AR in the City. Well, things are moving along nicely, and our city is becoming populated with relevant facts, data and more.


A collaborative team, consisting of AR expertise at Mimas and a number of key people across the sector, has come together to create an AR Sociology experience. Come with us on a tour and understand more about different areas of the city, including the social aspects of three different topics: Housing, Crime and Family. Discover some surprising facts about the city based on a London landscape, and compare and contrast data. This is aimed at sociology students, but could be used more widely with both A-level students and first-year undergraduates.

Who’s involved?

• Matt Ramirez – Lead AR Developer (Mimas)
• Helen Jones – Higher Education Academy (HEA) and British Society of Criminology (BSC LTN)
• Judith Mudd – Chief Executive, The British Sociological Association (BSA)
• Dr Martyn Chamberlain – Senior Lecturer in Criminology and Social Policy, Loughborough University
• John MacInnes – ESRC Strategic Advisor on Quantitative Methods Training
• James Nicholson – Consultant to the SMART Centre, Durham University

Project Expectations

The AR experience will begin with a recognisable trigger image of a London landscape (see below), which will coax the user to delve deeper into different areas.

London landscape trigger image

There will be three key topics – Family, Housing and Crime – which will ‘reveal’ three fictitious selectable areas of London: an affluent area, an impoverished area and a commercial area. A dashboard will be displayed that allows the user to select various related quantitative data, such as the number of people and households, ethnicity, housing types and crime rates. Each area will allow the user to drill down to obtain further data.

Alongside the AR experience will be a teaching support pack, which will include key teaching and learning activities such as videos, ‘did you know’ facts, ‘can you find’ questions, supporting documentation and relevant links, including the Smart Plotter.

The experience will also cover a 60-year time frame, to coincide with the BSA’s 60th birthday.

Benefits for your students

Imagine your students accessing this rich information not on a computer or laptop but on their phones or iPads, in a seminar room with you or out in the field, engaged in self-directed learning. This is a step-change away from flat websites towards situated learning, increasing student engagement. This type of learning also promotes social learning in a group environment, which is difficult to achieve using websites alone. The simple fact that this learning activity can be undertaken individually, in a group or on the move makes this an exciting project, with further scope for development across a broad array of subjects.

So, if you are a Sociology or Criminology educator, you have the chance to be amongst the first to engage with a new teaching tool that will examine social inclusion and exclusion like never before.

AR in the city will be free to use and should be available in the autumn term 2014.

Want to know more?

Colleagues within the team have produced a superb article for Network: Magazine of the British Sociological Association. Helen and Judith discuss more about what AR can do for students and teachers, with further background to the project.

We will be posting more on the final outputs in due course.


Vivarium App Development

Posted on behalf of Andrew Gray, Curator of Herpetology at Manchester Museum

At the moment, Adam and I are providing information to Gail Millin-Chalabi at Mimas to support a superb new project currently underway called ‘Mapping the Museum’. It aims to enhance collections through the use of Augmented Reality (AR) and 3D mapping visualisation. The first phase of the project is focusing on amphibian and reptile species found in the Vivarium, and two new applications will be developed.

The first will be called ‘Virtual Vivarium’. Developed using Google Earth, it is the app any visitor will be able to use to find out more about the amphibians that are currently on display and behind the scenes. The Virtual Vivarium will provide exciting new content on:

    • Where in the world all the amphibian species in the Vivarium are located – visualising the most up-to-date distribution maps from the International Union for Conservation of Nature (IUCN), and our own expertise here, using the 3D Google Earth globe.
    • Detailed species descriptions including photographs, videos and relevant links.
    • All the in-situ Costa Rican conservation work we are involved in.
    • Issues surrounding the threats amphibians are facing, using a case study of Madagascar that shows, through satellite image animation, how their habitat has changed over time due to deforestation.


Virtual Vivarium developed using Google Earth

The second will be called ‘Fabulous Frogs App: Splendid & Native’. Developed using Junaio, it is an interactive Augmented Reality (AR) tool targeted at 7–11 year olds, mapping to Key Stage 2 of the National Curriculum in England and to the “responsible citizens” and “successful learners” capacities of the Scottish Curriculum for Excellence: specifically, the AR app helps to develop children’s capabilities to evaluate environmental issues and to use technology for learning independently. The app will include the following learning objectives and features based around the Splendid Leaf Frog:

  • Where the Splendid Leaf Frog lives in relation to the learner
  • Understanding the anatomy of frogs (see below).
  • Frog life cycle – comparison between the Splendid Leaf Frog and the native Common Frog, including quick quiz questions for each stage of the life cycle
  • Viewing our Splendid Leaf Frogs with David Attenborough
  • In-depth exploration of related Frog Blog Manchester content
  • How you can help – Sponsor a Frog

Fabulous Frogs App: Splendid & Native

Both of these exciting new apps are planned for release this summer and will be accessed exclusively by taking a photo of a Splendid Leaf Frog on Frog Blog Manchester with your phone or iPad.


A new medical school collaboration

Augmented Reality has the potential to enhance clinical skills development for a range of healthcare students – medical, pharmacy, nursing and dentistry – who need to be able to demonstrate competency as part of their clinical and practical assessments. Building on our experiences with the Prescribing Skills iBooks, and drawing upon new expertise within the team, we are currently working collaboratively with medics and pharmacists to develop AR resources that reinforce practical skills development for OSCEs (Objective Structured Clinical Examinations).

Using 3D object recognition on iPads, users can access overlaid virtual imagery, supporting media and textual instructions that shadow them through practical clinical procedures. Additionally, students will be able to interact with AR-ready posters around the clinical skills labs that link to video demonstrations, and formative testing of individual learning may be achieved via associated quizzes.


AR horizon scanning and Big Data

Augmented Reality has established itself over the last few years as an emerging technology without appearing to generate enough momentum to galvanise mainstream adoption – that is, until now. I predict this will rapidly change as “Big Data” becomes increasingly prevalent and easy to integrate with developing technologies and hardware. If we look at the themes emerging from CES 2014 and, most recently, MWC, many of the commercial big hitters are focusing on the use of mobile devices to serve consumers with a plethora of data to assist in their daily lives. This is important where AR is concerned, in terms of environment-aware applications that monitor user location to present relevant contextual information. Obviously, privacy is a major inhibitor to uptake, with users reluctant to allow companies like Google access to personal information. Even so, I believe education applications can exploit the potential of a marriage between AR and big data.

The healthcare industry is starting to see wearable devices coupled with patient data and imaging to identify precise placement when inserting an IV, as in the example above from Evena. This could have a massive impact on the training of medical students and the continued development of existing medical professionals. The automotive industry is also providing manual data to owners, enabling them to perform simple maintenance procedures such as topping up windscreen fluid and oil (see below).

Mini One AR concept

Using AR to top up washer fluid in Mini One

With an increasing number of sensors being incorporated into nomadic devices, imagine being able to monitor the air quality at different points in a city for an environmental tour, then compare it in real time with city averages across the globe to draw out correlations and causality.


Core geology sample

As part of an AR Geology field trip Mimas developed last year, users could listen to qualified academic commentary, uncover fossils and navigate a predefined route. Going beyond this, I envisage it will soon be possible to use big data to serve soil information and accurate core samples based on the user’s exact location. Couple this with an increased range of device sensors, and students could record pH readings for project work, adding them to datasets for analysis and enabling the prediction of future environmental impact. This data could then be shared with the Environment Agency so that preventative measures could be taken prior to any serious ecological impact.

Of course, there are a few obstacles to overcome before users are able to realise the potential of environment-aware content in an education context. Battery life is a major problem yet to be resolved, and will need to be addressed before other sensors can be reliably integrated into ubiquitous devices. Hardware capabilities, although improving rapidly, will at present prevent some of the more engaging and practical uses of AR, especially dynamic 3D modelling. Wearable devices are still in their infancy and user experiences are patchy at best. Social acceptance is another issue to overcome: merely putting frames around the technology (http://www.google.com/glass/start/) will not stop users looking like cyborgs and feeling marginalised.

Viktor Mayer-Schönberger and Kenneth Cukier talk about the value of “data reuse” in their book Big Data: how data can have multiple uses that extend its value beyond its original purpose. Consider an engineering student being able to use an AR resource to overlay BIM information (electrical circuitry, air-conditioning layout, floorplan) as 3D models on a site visit to quality-assure the construction of utilities. Fast forward a few years to a firefighter wading through the same smoke-filled room, using the same data to navigate around the building and find the nearest water supply via an AR HUD in their protective visor. It sounds like the stuff of science fiction, but it is closer than you think. This is also a great illustration of how educational research could be applied to positively affect a critical part of our lives in the emergency services.

Although I do not pretend to come from a position of neutrality where AR is concerned, I honestly believe that as big data becomes more available and consumable by mobile/wearable devices, its relationship with AR can help move towards a critical mass in education and beyond. Users will come to view this new synergy as a natural way to consume information, instead of an unwieldy technology shoehorned into their everyday lives.

Posted in geo-spatial, Mimas, MimasAR, mobile research, Uncategorized