A micro-interaction concept that lets you attach a quick note (a mood, a memory, a lyric) when sharing a song. Designed for both mobile and smartwatch, with a focus on making music sharing feel more human.
Sending someone a song is a deeply personal act. But most music-sharing flows treat it like any other link, with no context, no emotion, no story behind it. VibeNote asks: what if you could attach a feeling to a song before you send it?
The concept centers on a single micro-interaction: an "Add a Note" feature that lets users attach a quick Mood, Memory, or Lyric context tag before sharing. Designed for both mobile and smartwatch, the experience had to be frictionless: quick to use, emotionally resonant, and clear at every step.
The "Add a Note" feature is triggered from the song card. A "+" button appears directly below the track details, paired with clear labelling so it's discoverable without any onboarding. Tapping it opens a compact panel with three context options and a short text field.
Sending a song with no context feels impersonal. But adding a long caption feels effortful. VibeNote lives in between: one tap adds just enough meaning without slowing the interaction down.
The user selects a song, taps Add Note, picks a context type (Mood, Memory, or Lyric), types a short note, and hits Send. A confirmation animation plays. The receiver gets a message card with the song and the note attached.
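The flow above can be sketched as a tiny state machine. Everything here, the type names, the action shapes, is a hypothetical illustration of the steps, not a real implementation:

```typescript
// Sketch of the VibeNote send flow as a state machine (all names hypothetical).
type NoteType = "mood" | "memory" | "lyric";

type FlowState =
  | { step: "browsing" }
  | { step: "composing"; noteType?: NoteType; text: string }
  | { step: "sent" };

type Action =
  | { kind: "tapAddNote" }
  | { kind: "pickType"; noteType: NoteType }
  | { kind: "type"; text: string }
  | { kind: "send" };

// advance() walks the flow forward one user action at a time.
function advance(state: FlowState, action: Action): FlowState {
  switch (action.kind) {
    case "tapAddNote":
      return { step: "composing", text: "" };
    case "pickType":
      return state.step === "composing"
        ? { ...state, noteType: action.noteType }
        : state;
    case "type":
      return state.step === "composing"
        ? { ...state, text: action.text }
        : state;
    case "send":
      // Sending is only valid once a context type has been chosen.
      return state.step === "composing" && state.noteType
        ? { step: "sent" }
        : state;
  }
}
```

Modelling the flow this way makes the "clear at every step" requirement checkable: there is no path to the sent state without first picking a context type.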
The song card sits at the top. Below it, a popup panel appears with the three note type icons, a text field, and a Send button. A smiling emoji animation plays as playful feedback once the note is sent. Light, warm, human.
The Now Playing screen shows album art, song title, four large reaction emojis, a small "+ Note" button, and playback controls at the bottom. Visual hierarchy guides the eye: react first, add a note second, control playback third.
"Turning a plain link into a quick, meaningful connection."
— Design brief, VibeNote concept

On the smartwatch interface, tapping a reaction emoji is its own micro-interaction with three distinct phases. Breaking it down this way helped ensure each moment felt intentional rather than accidental.
Hovering over a reaction emoji enlarges it slightly, signalling that something will happen. Tapping begins the reaction animation. The trigger is intentional but lightweight, reducing accidental activations on the small circular display.
When tapped, a small version of the emoji slowly appears on top of the album cover, as if a sticker is being placed down. This visual feedback, combined with the hover scale-up and a short haptic tap, confirms to the user that their reaction was added successfully.
The sticker emoji settles on the album cover and stays there briefly, then the interface resets automatically. The user is not stuck in any mode and can immediately continue interacting: react again, add a note, or skip the track.
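The three phases can be reasoned about as a simple timing table. The durations below are purely illustrative placeholders, not values from the actual design:

```typescript
// Illustrative timing sketch for the three-phase reaction micro-interaction.
// Durations are invented placeholders, not measured design values.
interface Phase {
  name: string;
  durationMs: number;
  haptic: boolean;
}

const reactionPhases: Phase[] = [
  { name: "trigger", durationMs: 100, haptic: false }, // emoji enlarges on touch
  { name: "feedback", durationMs: 400, haptic: true }, // sticker settles on the album art
  { name: "reset", durationMs: 800, haptic: false },   // interface returns to idle
];

// Total time the user is "inside" the interaction before the UI resets itself.
const totalMs = reactionPhases.reduce((sum, p) => sum + p.durationMs, 0);
```

Keeping the whole sequence around a second matters on a watch: the automatic reset is what guarantees the user is never stuck in a mode.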
As part of the course, I applied five UX laws directly to the VibeNote design. Each one shaped a specific decision in the interface, from how information is grouped to how users move through the flow.
The screen is divided into clear sections: song info at the top, preset reaction icons in the middle, and the note entry field at the bottom. These chunks let users process each part of the sharing flow one step at a time, reducing cognitive load on a small screen.
The reaction icons all share the same size, color style, and circular container shape. Even though each represents a different context type, their visual similarity signals that these options behave the same way. They are all quick-add context tags.
The confirmation message "Note Sent!" appears close to the note box where the user typed. This proximity connects the success state directly to the action just completed, reducing uncertainty and making the interaction feel clear and grounded.
The note input area is enclosed within a bordered rectangle, creating a common region that separates it from the rest of the screen. This focuses attention on the message field and helps users clearly identify where to type without distraction.
After tapping "Add a Note," users move through small, clearly sequential steps: pick a note type, choose a contact, send. Each step feels like progress. The final confirmation screen with a clear completion state and a "Back to Listening" button reinforces that the goal was reached, motivating users to repeat the action.
Beyond UX laws, three core interaction design principles shaped specific decisions in the watch interface, particularly around how the app guides users and confirms their actions.
The "+ Note" button appears directly below the song info, paired with clear labelling and an icon. It appears at exactly the right moment in the flow, so users understand what they can do without needing extra guidance or onboarding.
After sending, a green checkmark appears with a short animation and haptic tap, giving visual and physical confirmation simultaneously. This immediate feedback reassures users the message was delivered without requiring them to navigate or guess what happened next.
Every shared song appears as a structured message card: album art left, song title and artist right, sender's note below. This consistent layout means recipients can instantly recognise a shared song in any context, reducing cognitive load across the experience.
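The consistent card layout maps naturally onto a small data model. The field names below are my own assumptions for illustration:

```typescript
// Hypothetical data model for the shared-song message card described above.
interface SharedSongCard {
  albumArtUrl: string; // shown on the left
  title: string;       // song title, to the right of the art
  artist: string;
  note?: {             // sender's note, rendered below the song info
    type: "mood" | "memory" | "lyric";
    text: string;
  };
}

// A one-line text summary, e.g. for notifications on the watch face.
function renderSummary(card: SharedSongCard): string {
  const base = `${card.title} by ${card.artist}`;
  return card.note ? `${base} · "${card.note.text}"` : base;
}
```

Making the note optional in the model mirrors the design intent: a song can still be shared plainly, and the card degrades gracefully when no context is attached.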
Designing for a circular smartwatch display introduced real constraints that shaped the interaction in useful ways. Typing on a smartwatch is slow and uncomfortable, so the note selection screen offers only a few short preset options: "This reminded me of you," "Listen to the lyrics," "I think you would like this," with a custom text field as a fallback. The constraint keeps the task simple and fast.
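The preset-plus-fallback pattern is simple to express in code. The length cap below is an invented placeholder to show the idea of keeping custom notes glanceable:

```typescript
// The three preset notes from the watch flow, with a custom-text fallback.
const presetNotes = [
  "This reminded me of you",
  "Listen to the lyrics",
  "I think you would like this",
] as const;

// Hypothetical cap: custom notes stay short enough to read at a glance.
const MAX_NOTE_LENGTH = 60;

function isValidNote(text: string): boolean {
  return text.trim().length > 0 && text.length <= MAX_NOTE_LENGTH;
}
```

A validation guard like this keeps the fallback path as fast as the presets: a note either fits the glanceable format or is rejected before sending.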
Visual hierarchy was equally critical. On a small display, every pixel matters. The album art anchors the screen. Reaction emojis are large because reacting is the primary action. The "+ Note" button is small and secondary. Playback controls sit at the very bottom, available but not dominant.
Limiting note options to a few preset phrases was not just a workaround for small-screen typing. It actively reduced decision fatigue and kept the interaction feeling effortless. The constraint became a feature.
Laying out a circular interface forced a clear prioritisation: what does the user need to see first, second, third? The answer shaped every sizing and placement decision, and made the conceptual model immediately clear to first-time users.
A single micro-interaction can carry a lot of emotional weight. One button, one animation, one confirmation. Getting the VibeNote flow right meant thinking carefully about every state, even the ones that last under a second.
Designing for a tiny circular display forced decisions that actually made the design better. The limitations of the form factor pushed me toward clarity I might not have found on a larger canvas.
Applying UX laws like Proximity and Goal Gradient felt mechanical at first. But working through how each one showed up in my own design made them feel intuitive rather than prescriptive. Now I reach for them naturally.
"The more we design with inclusion in mind, the more natural and human our solutions become."
— Personal reflection, SI 207