The Watsons

The Watsons Main Image

“Meet the Watsons” is a set of four projection-mapped sculptures that analyze and deconstruct a Twitter account live in front of an audience. The piece encourages viewers to question what they share online, who has access to it, and the tone of their communications. Expanding on themes explored in my earlier work, You Probably Live in Horsham¹, the Watsons examine metadata² security and availability, challenging audiences to consider: Do you know what you put online? Should it be public?

Accessibility was central to the piece, given the diverse demographics of Twitter users. By creating physical, human-like sculptures, I aimed to make the Watsons approachable and relatable. Their homely personas guide users through insights gathered from analyzing tweets, removing language and reading barriers. The Watsons’ friendly appearance contrasts with the dark shadows they cast, symbolizing the sinister undertones of online surveillance.

The Watsons consist of six components: the four sculptures, a projection keystone application, a language analysis system, text-to-speech functionality, a scripting engine, and a user-interaction front end. These elements were meticulously designed to provide a seamless gallery experience while preserving the depth of the information presented. Rather than rely on third-party text analysis APIs like IBM Watson, I developed a bespoke system for analyzing tweets, drawing on research papers and tools such as Ian Barber’s PHP wrappers for the Twitter Fabric API³.

The Watsons Process

Above: The process of creating the Watsons. Left: inner structure; middle: rough shape; right: finished sculpture.


Sculptures

The sculptures are crafted from porcelain, chosen for its durability and bright, matte finish that disperses projected light effectively. Each was created in ~12 hours of continuous work to ensure consistent drying and prevent kiln firing issues. Sculptural details vary depending on the level of animation projected—for example, the boy’s eyes and mouth are smoothed for projection mapping, while the baby’s features are fully formed.

Key inspirations for the Watsons’ design include Julian Opie’s Large Portrait Sculptures⁸, for their smooth aesthetic, and Mother G.O.A.T., a projection-mapped face from a 2011 Lady Gaga concert, likely created by Nick Knight. These influences shaped the Watsons to ensure their projections looked as natural as possible.


Technology

1. Front-End System

Users initiate interaction through the joe.ac website on their phones. After entering their name, Twitter username, and pronouns, their phone acts as a ticket, placing them in a queue. Once the analysis completes (~20 seconds), the Watsons speak to the user while their phone displays a live transcript. This system, built with JavaScript (using AJAX for database communication) and PHP, feeds the analysis stage described next.
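As a rough illustration, the enqueueing step might look something like the PHP sketch below. The endpoint, table, and column names are assumptions for illustration only, not the production schema.

```php
<?php
// enqueue.php: hypothetical AJAX target for the sign-up form.
// Table and column names are illustrative, not the real schema.
$pdo = new PDO('mysql:host=localhost;dbname=watsons', 'user', 'pass');

$name     = trim(isset($_POST['name']) ? $_POST['name'] : '');
$username = ltrim(trim(isset($_POST['username']) ? $_POST['username'] : ''), '@');
$pronouns = trim(isset($_POST['pronouns']) ? $_POST['pronouns'] : '');

if ($name === '' || $username === '') {
    http_response_code(400);
    exit(json_encode(['error' => 'Name and Twitter username are required.']));
}

// Insert the visitor into the queue; their phone then polls this row,
// acting as a "ticket" until the Watsons are ready to speak to them.
$stmt = $pdo->prepare(
    'INSERT INTO queue (name, twitter_username, pronouns, status)
     VALUES (?, ?, ?, "waiting")'
);
$stmt->execute([$name, $username, $pronouns]);

header('Content-Type: application/json');
echo json_encode(['ticket' => (int) $pdo->lastInsertId()]);
```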

2. Analysis Engine

The analysis engine, written in PHP, processes tweets as follows (a condensed sketch appears after the list):

  1. Fetch the user’s 10 most recent tweets via the Twitter Fabric API³.
  2. Extract geotags or locations to deduce where the user likely resides.
  3. Identify affiliations with universities or schools by analyzing user bios and tweet content.
  4. Perform sentiment analysis using a Naive Bayes algorithm implemented by James Hennessey⁴.
  5. Extract keywords by removing stopwords (based on DarrenN’s GitHub list⁵).
  6. Use a POS (part of speech) tagger to categorize and prioritize words.
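The pipeline can be sketched in PHP as below. The helpers (fetchRecentTweets, findAffiliations, NaiveBayesSentiment, posTag) are hypothetical stand-ins for the project’s real classes, shown only to make the flow of the six steps concrete.

```php
<?php
// Condensed sketch of the analysis pipeline; helper names are
// hypothetical stand-ins, not the project's actual classes.

function analyzeUser($username)
{
    $tweets = fetchRecentTweets($username, 10);          // 1. Fabric API wrapper

    $location = null;
    foreach ($tweets as $tweet) {
        if (!empty($tweet['geo'])) {                     // 2. prefer explicit geotags
            $location = $tweet['geo'];
            break;
        }
    }

    $affiliations = findAffiliations($tweets);           // 3. bio/tweet matching

    $stopwords = file('stopwords.txt', FILE_IGNORE_NEW_LINES);
    $analyzed  = [];
    foreach ($tweets as $tweet) {
        $sentiment = NaiveBayesSentiment::classify($tweet['text']);   // 4.

        $words    = preg_split('/\W+/', strtolower($tweet['text']), -1,
                               PREG_SPLIT_NO_EMPTY);
        $keywords = array_diff($words, $stopwords);      // 5. strip stopwords
        $ranked   = posTag($keywords);                   // 6. prioritize by word class

        $analyzed[] = ['text'      => $tweet['text'],
                       'sentiment' => $sentiment,
                       'keywords'  => $ranked];
    }

    return ['location'     => $location,
            'affiliations' => $affiliations,
            'tweets'       => $analyzed];
}
```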

3. Scripting Engine

Once the data has been analyzed, the scripting engine dynamically generates a conversation script, typically lasting ~90 seconds. The script follows this structure (a simplified generation sketch appears after the example below):

  1. Greeting the user with one of 455 variations.
  2. Discussing the user’s likely location with live, contextually relevant data (via GeckoLandmarks).
  3. Exploring affiliations with universities or schools.
  4. Analyzing and commenting on specific tweets, focusing on strong sentiments.
  5. Concluding with a thank-you and goodbye.

Example JSON Script

Above: Example JSON script used for scripting the Watsons.
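To make the structure concrete, here is a simplified sketch of how a script like the one above might be assembled. The speaker names, field layout, and two-item greeting pool are illustrative assumptions, not the production format.

```php
<?php
// Simplified sketch of script generation. Speaker names, fields, and
// the greeting pool are illustrative; the real pool has 455 variations.

function buildScript(array $analysis, $name)
{
    $greetings = ['Hello %s, lovely to meet you.',
                  'Oh, %s! We were hoping you would stop by.'];

    $lines   = [];
    $lines[] = ['speaker' => 'mother',                    // 1. greeting
                'text'    => sprintf($greetings[array_rand($greetings)], $name)];

    if ($analysis['location'] !== null) {                 // 2. likely location
        $lines[] = ['speaker' => 'father',
                    'text'    => "You tweet a lot from around {$analysis['location']}, don't you?"];
    }

    foreach ($analysis['affiliations'] as $place) {       // 3. school/university
        $lines[] = ['speaker' => 'father',
                    'text'    => "And how are things at $place these days?"];
    }

    foreach ($analysis['tweets'] as $tweet) {             // 4. strong sentiments only
        if ($tweet['sentiment'] === 'negative') {
            $lines[] = ['speaker' => 'boy',
                        'text'    => 'You seemed upset when you wrote: "'
                                     . $tweet['text'] . '"'];
        }
    }

    $lines[] = ['speaker' => 'mother',                    // 5. thank-you and goodbye
                'text'    => "Thank you for talking to us, $name. Goodbye."];

    return json_encode(['lines' => $lines], JSON_PRETTY_PRINT);
}
```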

4. Projection Application

The projection application, written in Processing and Java, uses a modified version of David Bouchard’s Keystone framework⁹ to map animations onto the sculptures. Textures, created in Photoshop, combine features from friends, family, and royalty-free celebrity images. Animations include moving eyes, mouths, and layered facial features.

Text-to-speech functionality, powered by Amazon’s IVONA⁶, streams audio with low latency using chunked encoding⁷. A custom class monitors audio amplitude to synchronize speech with animations.
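The streaming side can be sketched as a small PHP relay: because the response carries no Content-Length header, the web server falls back to chunked transfer encoding⁷, and each flushed block reaches the projection application as soon as it arrives rather than after the whole file is ready. The endpoint URL below is a placeholder, not IVONA’s actual API.

```php
<?php
// Sketch of a low-latency audio relay. No Content-Length is sent, so
// the server emits chunked transfer encoding and every flushed block
// is forwarded to the client immediately.
// $ttsUrl is a placeholder, not IVONA's real endpoint or auth flow.
$text   = isset($_GET['text']) ? $_GET['text'] : '';
$ttsUrl = 'https://tts.example.com/speak?text=' . urlencode($text);

header('Content-Type: audio/mpeg');

$ch = curl_init($ttsUrl);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) {
    echo $data;   // forward this chunk of audio as soon as it arrives
    flush();      // push it to the client instead of buffering the file
    return strlen($data);
});
curl_exec($ch);
curl_close($ch);
```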


Final Words

During the Symbiosis exhibition, nearly 300 people interacted with the Watsons, with many expressing surprise at the accuracy of the information revealed. The project’s reliability—despite occasional server and API limitations—demonstrated the robustness of its design. Attendees reported greater awareness of online privacy, with some deciding to make their Twitter accounts private.

The Watsons achieved their goal: raising awareness of online intimacy through thoughtful, engaging dialogue rather than shock tactics.


Exhibitions and Awards

  • Symbiosis Exhibition, Goldsmiths, University of London (April 28, 2016)
  • Generation: Computing Degree Show, Goldsmiths, University of London (June 2, 2016)
  • Awarded Best Creative at the Computing Degree Show 2016

“It was clear that this project had a solid conceptual center but also a really robust delivery method made up of layers of different technologies that remarkably coalesced into a single engaging experience for the audience.” — Justin Spooner, Unthinkable Digital


Sources

  1. You Probably Live in Horsham
  2. Metadata
  3. Twitter’s Fabric API
  4. James Hennessey
  5. DarrenN’s Stopwords List
  6. IVONA
  7. Chunked Encoding
  8. Julian Opie’s Sculptures
  9. David Bouchard’s Keystone Framework