Engaging local communities through playful visualisation of (geo-located) city data as parametric, interactive abstract faces. The project involves developing a mobile AR (Augmented Reality) app with two modes: 'Mode 1' translates city data into a parametric face of the city (neighbourhood), and 'Mode 2' applies this city 'FaceMap' as a live selfie effect using face-tracking AR technology.
Each year the Australian government collects vast amounts of potentially useful and interesting data. However, most of this data lies dormant and unused, because it is collected, stored and communicated to the public as 'boring', hard-to-read spreadsheets or text files.
Not everyone is good at reading numbers or at understanding and interpreting numeric or text-based data sets. As a result, the public at large is reluctant to engage actively with this potentially valuable source of information.
Humans, however, are very good at reading faces and facial expressions. We use our faces to communicate complex information instantly; it is in our human nature. We wake up in the morning and look in the mirror: what does our face tell us? Throughout history, portraits have played a crucial role in art and culture. We see faces on book and journal covers, advertisements and movie posters. Today, selfies are integral to social culture. AR face-tracking technology is extremely popular among mobile apps, and live selfie effects (such as Snapchat filters) are used daily by millions of people (Snapchat alone is used by over 158 million people every day).
Make data visualisation engaging and easy to 'digest'!
Use data variables as generative face attributes.
This project taps into humans' natural ability to read and understand faces. The intent is to translate geo-located city data into (hopefully cute and engaging) abstract faces, allowing people to use the city 'FaceMaps' as selfie effects or to view them as a combined map of data-faces [What is the face of your city?].
The FaceMap prototype uses the following data to visualise the "faces" of cities:
- Median Taxable Income
- Average Taxable Income
- Median Rent (weekly)
- Average Household Size
- Total Private Dwellings
- Dwellings With No Motor Vehicles
- Education: Bachelor Level and Above
- Average Total Business Income
- Average Net Tax
- Age of Community (young/old)
- Homeowner Status (renting / paid in full / paying the home loan / etc.)
- Occupation (student / working full time / unemployed)
- Park (Green) Area in the City
- Household Water Consumption
- Household Energy Consumption
- Number of Sport Facilities / Public Libraries / Art Centres
A mobile app generating an abstract face interpretation, where facial attributes are informed by local city data, including variables such as average age of the community, homeowner status, occupation, park (green) area, health level, housing sustainability (energy and water consumption), average income and education level. The app also allows people to express and share their feelings about the current state of the city 'FaceMap' by choosing happy/unhappy facial expressions to apply to this data-face interpretation.
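One way the Mode 1 mapping could work is a simple normalise-then-map pipeline: each city data variable is rescaled into a 0-1 range against plausible bounds, and each resulting parameter drives one facial attribute of the generated face. The variable names, bounds and attribute pairings below are illustrative assumptions for this sketch, not the project's actual scheme.

```python
# Sketch: mapping (hypothetical) city data variables to parametric face
# attributes. All variable names, bounds and pairings are assumptions.

def normalise(value, lo, hi):
    """Rescale value into [0, 1], clamped to the [lo, hi] bounds."""
    if hi == lo:
        return 0.0
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def city_to_face(data):
    """Turn a dict of raw city statistics into face parameters in [0, 1].

    Each parameter could later drive a drawing routine, e.g. eye_size
    scales the eye radius, mouth_curve bends the mouth from a frown (0)
    towards a smile (1).
    """
    return {
        # Younger communities -> larger, "younger-looking" eyes.
        "eye_size": 1.0 - normalise(data["median_age"], 20, 60),
        # More green (park) area -> a happier mouth curve.
        "mouth_curve": normalise(data["green_area_pct"], 0, 40),
        # Higher education level -> more prominent glasses.
        "glasses": normalise(data["bachelor_and_above_pct"], 0, 60),
        # Higher weekly rent -> a more furrowed brow.
        "brow_furrow": normalise(data["median_weekly_rent"], 100, 800),
    }

# Example: a hypothetical neighbourhood's statistics.
face = city_to_face({
    "median_age": 32,
    "green_area_pct": 18,
    "bachelor_and_above_pct": 35,
    "median_weekly_rent": 450,
})
```

Keeping every parameter in [0, 1] means the same face-drawing code works for any neighbourhood, and Mode 2 can reuse the identical parameters when rendering the face as a selfie mask.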
A mobile selfie-effect app that applies the various city faces as masks using face-tracking AR technology, allowing people to share these FaceMap selfies with the larger community and raising public awareness of the current state of Australian cities.