poetry in motion

A multisensory fusion of art, tech and language

when

duration, year
4 weeks, Fall 2023

Where

location, type
Berkeley, MDes

who

the team
S. Alexx Zaki, Bob Wei, Junjie Li

Which

skills, tools
Experience Design, Microcontrollers, AI, Code

Role

my responsibilities
Planner, Researcher, Designer, Developer

Methods

techniques used
Immersive/Sensory Considerations, Usability Testing
AI-generated image of the art installation, from a prompt referencing the original exhibition plan.
WHAT

Background:

Poetry in Motion is an interactive art installation that emerged from the fusion of technology, art, and linguistics. It creates a dynamic, immersive experience in which participants' movements are transformed into AI-generated poetry within a multisensory environment.

Challenge:

The primary challenge was: "Immerse yourself in the technology; understand its workings, and identify potential applications. Create a realistic scenario where the technology can be applied. Build a demonstration that enables the audiences to engage with the technology." The technologies in question were mainly:

  • Microcontrollers, sensors, and actuators.
  • Digital modeling, simulation, and fabrication tools.
  • LLM-based AI.

Idea:

"Poetry in Motion" combines Particle's Photon 2, Adafruit’s Triple-axis Accelerometer, and OpenAI’s API for real-time poetry generation, and a carefully crafted multisensory environment. Participants interact with the installation using a motion-sensing wand, with their movements generating unique poetic expressions.

How

Research Insights:

Exploration into motion sensors, AI-driven poetry creation, and multisensory feedback highlighted the need for precise technology integration. Testing various sensors and AI models was crucial in developing a system that could respond fluidly to participants' movements.

Design Process:

The project journey involved extensive experimentation with motion sensors, AI poetry generation, and multisensory integration. Challenges in technology compatibility and implementation were overcome through iterative testing and teamwork, resulting in a cohesive and engaging experience.

The journey began with a personal idea that blossomed into a collaborative project during a pin-up critique session for our final semester project.

Towards the end of the session, two of my classmates approached me, saying they would love to work on my project idea together. Their unexpected interest took me by surprise.

However, their enthusiasm was both affirming and exhilarating, transforming my solo venture into a promising team project. This phase was filled with excitement and the shared anticipation of what we could create together.

With the team established, we dove into developing the concept, refining the idea, and exploring its potential. Assigning roles was a natural process, as each team member gravitated towards their strengths and interests.

Early on, we faced the challenge of integrating our diverse skills and interests into a cohesive development strategy. I worked to resolve this quickly, ensuring that each contribution was both valued and pivotal to the project's success.

I also enlisted the help of my mentor, Professor Purin Phanichphant, who contributed to the FigJam board in the planning stage.

The prototyping phase was pivotal, focusing on integrating Adafruit's ADXL326 accelerometer with the OpenAI API for real-time poetry generation. We encountered significant challenges in calibrating the accelerometer to accurately translate physical movements into digital signals.
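
As a rough illustration of that calibration step, the sketch below converts raw ADC counts into acceleration in g. It assumes the Photon 2's 12-bit ADC and the ADXL326's datasheet-typical sensitivity; the constants are nominal placeholders, whereas our actual offsets came from measurement.

    # Nominal constants; real values should come from calibration, since the
    # ADXL326 is ratiometric and each unit's zero-g offset differs slightly.
    ADC_MAX = 4095                         # 12-bit ADC on the Photon 2
    V_SUPPLY = 3.3                         # sensor supply voltage in volts
    SENSITIVITY = 0.057 * V_SUPPLY / 3.0   # datasheet-typical 57 mV/g at 3 V

    def calibrate_zero_g(resting_counts: list[int]) -> float:
        """Average readings taken with the wand at rest to find the 0 g voltage."""
        mean = sum(resting_counts) / len(resting_counts)
        return mean / ADC_MAX * V_SUPPLY

    def counts_to_g(counts: int, zero_g_volts: float) -> float:
        """Convert one raw ADC reading into acceleration in g."""
        volts = counts / ADC_MAX * V_SUPPLY
        return (volts - zero_g_volts) / SENSITIVITY

    # Example: mid-scale readings sit near 0 g on a well-centered axis.
    zero = calibrate_zero_g([2040, 2052, 2047, 2049])
    print(round(counts_to_g(2048, zero), 3))   # ~0.013 g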

Additionally, the shift from ZeroWidth (which we initially planned to use) to OpenAI's API just days before the showcase added pressure, necessitating rapid adaptation and extensive testing to ensure seamless interaction.
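
For context, the call we moved to looks roughly like the sketch below. It assumes the official openai Python client (v1+); the model name and prompt wording are illustrative, not the exact ones used at the showcase.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def generate_poem(motion_summary: str) -> str:
        """Turn a short description of the wand's motion into a few lines of verse."""
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",  # illustrative; any chat model works
            messages=[
                {"role": "system",
                 "content": "You are a poet. Respond with a short, vivid poem."},
                {"role": "user",
                 "content": f"Write a four-line poem inspired by this motion: {motion_summary}"},
            ],
            max_tokens=120,
        )
        return response.choices[0].message.content

    print(generate_poem("slow, wide circles ending in a sudden upward flick"))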

Integrating the motion-sensing technology with the AI-generated poetry system was complex. We aimed to create a multisensory environment where movement seamlessly translated into poetic expressions.

Challenges included synchronizing the motion data (x, y, z coordinates) with the poetry output and ensuring the interactive installation was intuitive and engaging.
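
One way to bridge that gap is to reduce each window of samples to a few descriptive words before prompting the model. The thresholds and vocabulary below are invented for illustration, not the values we actually tuned.

    import math

    def summarize_motion(samples: list[tuple[float, float, float]]) -> str:
        """Condense a window of (x, y, z) accelerations into prompt-ready words."""
        magnitudes = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
        avg = sum(magnitudes) / len(magnitudes)
        peak = max(magnitudes)
        tempo = "slow, sweeping arcs" if avg < 1.2 else "quick, darting strokes"
        accent = " ending in one sharp flourish" if peak > 2.5 else ""
        return tempo + accent

    # Example: one gesture window, summarized and ready to feed generate_poem().
    window = [(0.1, 0.2, 1.0), (0.3, 0.1, 1.1), (2.8, 0.4, 1.2)]
    print(summarize_motion(window))  # "quick, darting strokes ending in one sharp flourish"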

Refining the user experience involved continuous testing and adjustments, enhancing the installation's responsiveness to movements and its overall aesthetic appeal.

Acknowledging the spatial constraints at the showcase event, I later created a detailed 3D mockup of our installation. This allows me to offer an immersive online experience that effectively conveys the essence of our interactive installation.

The virtual representation was meticulously designed to capture the interactive nuances of the physical installation, ensuring that viewers could experience our work, even if they weren't at the event.

interactive 3D model coming soon

why

Impact:

This installation contributes to cultural and linguistic appreciation, offering a unique blend of art and technology. It promotes mindfulness, emotional well-being, and social inclusion, providing a communal space for diverse groups to engage creatively with AI. The project not only demonstrates the potential of AI in the arts but also stands as a testament to the power of interdisciplinary collaboration, offering an example of how technology can enrich cultural landscapes and stimulate meaningful debate about the role of AI in society.

Reflections:

Poetry in Motion has been a profound learning journey, pushing the boundaries of interactive art and technology integration. Feedback from the Jacobs Winter Showcase highlighted potential areas for expansion, such as AR integration and advanced personalization, setting the stage for future developments.

What else

A photo taken during the first test