Contextual Normalcy
What
Contextual Normalcy is a participatory AI research project in which we use artificial intelligence and crowd-sourced data to create alternative visions of normalcy for mental health.
When
2018-ongoing
Who (Current)
Christine Meinders
Shiveesh Fotedar
Carey Crooks
Michaella Moon
Michelle Gong
Matt Asato-Adams
Past Collaborators
Yun Lucy Zhang Bencharit
Don Brittain
Stephanie M. Cedeño
Yvonne Cruz
Judy DeTar
Shiveesh Fotedar
Michelle Gong
Qiao Huang
Christine Meinders
Kedar Reddy
Theeba Soundararajan
Jana Thompson
XRStudio

Inspirations
Alison Adam
Sara Ahmed
Karen Barad
Rebecca Fiebrink
N. Katherine Hayles
Dara Blumenthal
Margaret Boden

Our current project Contextual Normalcy is exploratory and responsive. Rather than simply focusing on feelings, we ask about stories. As we experience the shared trauma of the pandemic, how might we understand and process collective feelings? What patterns or experiences reflect this time, and how can we communicate them? Could we create a COVID emoji? What is the shared sound of a pandemic across locations? How might we continue to explore feelings from multiple entry points, not seeking to trigger trauma but to explore and to honor shared, embodied experiences?

Do you remember the sound of spring 2020? When everything except the hospitals seemed to freeze? The sound of empty public spaces, the absence of engines? With the sounds of human movement deadened, seismologists could record more nuanced tectonic vibrations, and people worldwide reported that birds sounded louder.

Sound has much to do with space and time, meaning it also has much to do with place. Sound connects place to our senses, and thus our feelings. In this project, using sound as a key, events and locations will be mapped by each individual while also being shared and distributed across a social landscape.

Our project will use Machine Learning, data tagging, and user-contributed stories to explore the sound-feeling of a space. Through a “feelings kernel,” we will be able to explore the relationships between our sound mappings in a shared time of distance.
This example is imagined as an AR filter, but may also be accessed (manifested) through a webpage. This AR filter would operate as an interface that “maps out” or translates your personal “kernel” to an audio database. The interface would also do the reverse and “map in,” translating other people’s feelings to your kernel and allowing the user to connect with an individual in their social circle. This transformation would be generated using Machine Learning to cluster keywords and audio files, finding patterns to further develop the sound mapping project.
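
As a rough illustration of that clustering step, the sketch below combines keyword and audio features for a few contributed kernels, clusters them, and “maps out” a new personal kernel to its closest contribution. It is a hypothetical example only: the synthetic tones, feature choices (tf-idf plus mean MFCCs), and cluster count are assumptions, not the project’s implementation.

    import numpy as np
    import librosa
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans
    from sklearn.metrics.pairwise import cosine_similarity

    SR = 22050

    def tone(freq, seconds=1.0):
        """Synthetic stand-in for an uploaded recording (real sounds would be loaded from files)."""
        t = np.linspace(0, seconds, int(SR * seconds), endpoint=False)
        return np.sin(2 * np.pi * freq * t).astype(np.float32)

    def audio_features(y):
        """Summarize a waveform as its mean MFCC vector."""
        return librosa.feature.mfcc(y=y, sr=SR, n_mfcc=13).mean(axis=1)

    def kernel_matrix(kernels, vectorizer):
        """One row per kernel: keyword (tf-idf) features joined with audio features."""
        text = vectorizer.transform([" ".join(k["keywords"]) for k in kernels]).toarray()
        audio = np.vstack([audio_features(k["sound"]) for k in kernels])
        # Features are left unscaled for brevity in this sketch.
        return np.hstack([text, audio])

    # Hypothetical shared database of contributed kernels.
    database = [
        {"keywords": ["quiet", "empty", "street"], "sound": tone(220)},
        {"keywords": ["birdsong", "morning"],      "sound": tone(880)},
        {"keywords": ["stillness", "quiet"],       "sound": tone(240)},
    ]

    vectorizer = TfidfVectorizer().fit([" ".join(k["keywords"]) for k in database])
    X = kernel_matrix(database, vectorizer)
    clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

    # "Mapping out": place a new personal kernel among the contributions and
    # surface the closest one within its cluster.
    mine = kernel_matrix([{"keywords": ["quiet", "stillness"], "sound": tone(230)}], vectorizer)
    label = int(clusters.predict(mine)[0])
    members = np.flatnonzero(clusters.labels_ == label)
    closest = members[cosine_similarity(mine, X[members]).argmax()]
    print(f"new kernel joins cluster {label}; closest contribution is #{closest}")

The same matrix could be searched in reverse to “map in” another person’s kernel to one’s own.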

We are proposing the beginning of a multi-stage project: a pre-data-collection stage in which we co-craft our research tools, incorporating research tools and community workshops from previous Contextual Normalcy work with Feminist.AI. We will ask people to explore sonic, location-based feelings and associations by prompting them to upload a sound and think about where they encountered it. They are also encouraged to upload a story about this sound or location. As sound data comes in, we will work with sound engineers, biologists, environmental scientists, psychologists, artists, and makers to identify patterns and shape the sound design for the sound collection tool. In Stage 2 of the project, we will use a custom “sound authoring tool” to continue this research. This will map against existing research on location-based feeling, movement patterns, and diet, continuing to explore wellness holistically from a multi-sensory perspective.

While our project is multi-sensory at scale, for this segment of the project we are focusing on sound and story (narrative). We are starting to ask the questions that will help to create this sound authoring tool.

In a past portion of the larger project, we began to explore “releasing” feelings (Contextual Normalcy, 2018); now, we are exploring feelings through sound and location. We believe that these explorations, when taken holistically, will offer new dimensions to traditional diagnostic medicine and clinical psychology.

This work can prompt questions that will start to bridge individuals and communities around the world through sound. In this active research project we consider: What is the sonic-feeling of a location? How might we use crowdsourced data collection and storytelling to explore a single dimension within multi-dimensional approaches to feelings?

There are three ways an individual may be a part of this arts-feeling sound research: as an author, through participating in experiences, or by taking on both roles. As an author, the individual creates their own feeling sound kernel, which can be used to map their concept of a feeling to someone else's feeling or to “global feelings.” The author is asked to record or upload a sound. This may happen at a location, or they might write or speak about the location they are thinking of, or there may be no location at all. They may pair keywords with the feeling, simply identify a feeling, or record a story that includes multiple feelings (some identified, others not) or a representation of a place (an actual park or an imagined one).
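
To make these authoring options concrete, the sketch below outlines one possible shape for a single contribution, with the sound required and location, keywords, feelings, and story all optional. The field names and example values are assumptions for illustration, not the project’s actual schema.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class KernelContribution:
        """One author's entry: a sound plus whatever context they choose to add."""
        sound_path: str                          # recorded or uploaded audio file
        location: Optional[str] = None           # actual, imagined, or left out entirely
        keywords: list[str] = field(default_factory=list)   # keywords paired with the feeling
        feelings: list[str] = field(default_factory=list)   # feelings the author chose to identify
        story: Optional[str] = None              # free-form story; may hold unnamed feelings

    # Example: a recording paired with one named feeling and a short story.
    entry = KernelContribution(
        sound_path="uploads/park_morning.wav",
        location="imagined park",
        feelings=["calm"],
        story="I kept returning to a park I have never visited.",
    )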

Theory

This project is a critical exploration of the idea of normalcy in the post-human age, given that the ideas of mental health and normalcy are largely defined from the point of view of Western culture. This research explores alternative ontologies of normalcy as a function of language, body, and space by leveraging participatory design principles, artificial intelligence, and augmented reality technologies.

On this basis we explore the change in the idea of “normalcy” with the advent of COVID-19. The research includes sourcing “feeling” data from a diverse set of individuals across various demographic, psychographic, and geographic groups.

In this project we:
-Leverage spatial computing and AR technologies to explore the environmental and spatial dimension of a “feeling.”
-Leverage imaging and sensory technologies to capture embodied “feeling” data of the human body and movement.
-Leverage NLP and other algorithms to explore existing and emerging relationships in the concepts of “feeling” (a minimal sketch of one such step follows this list).
-Leverage various Machine Learning techniques to compose and cluster “feelings” across the globe to create a holistic representation of “normalcy.” We engage in participatory design methodologies to bring in communities to identify their thinking about feelings.
-Explore and display the changes in “normalcy” across cultures and geographies since the COVID-19 pandemic through the creation of various multidimensional (spatial, bodily, textual, object, time) designed artifacts representing contextual normalcy.
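
As a minimal, assumed illustration of the NLP item above (not the project’s actual pipeline), the sketch below counts how often feeling words co-occur within the same contributed story; frequent pairs hint at relationships between feeling concepts. The stories and the feeling vocabulary are invented placeholders.

    from collections import Counter
    from itertools import combinations
    import re

    # Invented placeholders standing in for contributed stories and for a
    # community-defined vocabulary of feeling words.
    feeling_vocab = {"calm", "isolation", "grief", "relief", "stillness"}
    stories = [
        "The empty street felt like isolation, but also a strange calm.",
        "Birdsong in the morning brought relief and a kind of stillness.",
        "Stillness everywhere, and a quiet grief underneath the calm.",
    ]

    def feelings_in(text):
        """Return the feeling words that appear in a story."""
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        return sorted(tokens & feeling_vocab)

    co_occurrence = Counter()
    for story in stories:
        for pair in combinations(feelings_in(story), 2):
            co_occurrence[pair] += 1

    # Pairs that recur across stories suggest emerging relationships
    # between feeling concepts.
    print(co_occurrence.most_common(3))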


Process

Using the social tech design tool poieto (patent pending), this project explores key elements of data, form, and frames to create new thinking about feeling. In this example, we continue to investigate one part of our multimodal exploration: data and sound. We use a participatory design process, encouraging communities to design the reality they want to inhabit. In this stage of the Contextual Normalcy project, focusing on sound, we are returning to a world where reducing human-created sound inputs may sound like isolation to us, but may also allow us to hear other sounds or movements more clearly and move us toward a collective, post-human design space. This will occur alongside data collection that can modify and contextualize how we think about feeling, and how that thinking then applies to diagnosis and treatment in individual and collective health.

poieto frames this research by critically exploring data, models, and critical making. This is achieved by:

-Iterating on various Machine Learning models and observing emergent patterns in Contextual Normalcy.
-Bringing in communities to contribute what they think the different rules should be based on different inputs (in this case, feelings).
-“Rules” can be the rules and guidelines one follows for a specific input.
-“Rules” can also be the “things people do” given a situation.
-“Rules” can be an imagined or metaphysical set of rules and relationships of ideas through which one might express/communicate or decide around a topic.
-“Rules” are contextual and fluid and can change as a function of people, time, space, situation, emotion, or other “Rules.”

We can collect data on, and find patterns in, the rules as a function of different cultures, demographics, locations, natural environments, and psychographics. As we continue to collect data, we will continue to document our co-creation through workshops and Feminist.AI community meetings.
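
One way such patterns might be surfaced, sketched below under the assumption of simple tagged records (invented here for illustration), is to group contributions by a contextual field such as location and count which feelings recur in each group; the same grouping could run over culture, demographic, or time.

    from collections import defaultdict, Counter

    # Invented placeholder records; real ones would come from the workshops
    # and community meetings described above.
    records = [
        {"location": "city",  "feelings": ["isolation", "calm"]},
        {"location": "city",  "feelings": ["isolation", "grief"]},
        {"location": "coast", "feelings": ["stillness", "relief"]},
        {"location": "coast", "feelings": ["stillness", "calm"]},
    ]

    patterns = defaultdict(Counter)
    for record in records:
        patterns[record["location"]].update(record["feelings"])

    # The most common feelings per location sketch how "rules" of feeling
    # might shift with context.
    for location, counts in patterns.items():
        print(location, counts.most_common(2))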

About Feminist.AI:
Feminist.AI works to put technology into the hands of makers, researchers, thinkers, and learners to amplify unheard voices and create more accessible AI for all. We create spaces where intergenerational BIPOC and LGBTQIA+ womxn and non-binary folks can gather to build tech together that is informed by our cultures, identities, and experiences. We engage with intersectional feminism to spotlight our stories, inventions, designs, and leadership, and to co-create more equitable futures.
(Feminist.AI)