Senior product designer with over six years of experience leading 0→1 products. Also a maker, coder, photographer, and more.
I was the sole designer on the team, and the app launched on the App Store and Google Play in September 2020.
Made in: 2020
Made of: Interaction Design, Visual Design, Motion Design, User Research
The Foot.Science app let people scan their own feet using just a phone camera. Using photogrammetry, the app could generate a precise 3D model from three photos of each foot, along with an in-depth analysis of the feet.
The output could be used not only for creating custom-fit insoles and footwear, but also to help any footwear shopper find the most comfortable fit.
While it’s fun to see a 3D model of your own feet, the capture process itself was quite involved and demanding. Users had to sit down, place a sheet of paper for scale, and move their phone through three exact spots — each at a very specific tilt.
The app is composed of four key parts that guide users from setup to discovery:
- Onboarding – Previews what’s ahead and sets clear expectations
- Instructions & preparation – Guides users to set up correctly before capture
- Photo capture – The core and most technically demanding stage, where users take six images (three per foot) from specific angles
- Foot facts – The reward after completion, offering interactive insights about their feet
01
Onboarding
The pagination design incorporates an arc with three dots that is visually consistent with the rest of the app. Each video transitions smoothly to the next, creating a seamless, fluid experience between pages.
02
Instructions & preparation
Through research, I learned that long, text-heavy instructions often lead to frustration and drop-off. To make the setup feel more approachable, I designed it as a conversational UI. Users could simply scroll through the chat-like flow to revisit guidance at any time.
Short video demos were embedded to show, rather than tell, how each step works, with UI elements superimposed directly in the footage to preview what comes next.
03
Photo capture
Since users had to move their phone through several positions in space—each requiring just the right tilt—we needed a way to guide them visually in real time and help them stay on track.
To spark ideas, I looked to apps with similar spatial interactions, like panorama capture, measuring, and leveling, to see how they keep users oriented.
Several ideas emerged from these inspirations and a quick brainstorm with the team, ranging from how to guide users to start on the correct side of the foot to how to nudge them toward the right tilt.
However, these versions also required users to multitask—watching the wheel while adjusting their phone. We decided to simplify, breaking the task into smaller, focused steps.
To put our best foot forward (no pun intended), we built a proof of concept, tested with users, and gathered insights that informed the following iterations.
Tilting the phone turned out to be the part users struggled with the most. The original foot overlay didn’t help much. Most users either didn’t notice it or weren’t sure what it was meant to do.
To make the guidance clearer, I replaced the overlay with a paper outline that conveyed two things:
- Tilt the phone until the shapes align
- Adjust the distance so the sizes match
When the box turned green, users knew they were in the right spot, and the app automatically captured the photo a moment later.
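The align-then-auto-capture behavior can be sketched in a few lines. This is a minimal illustration, not the production logic: the names, thresholds, and hold duration are all assumptions, and the real app would read tilt from the device's motion sensors and outline scale from the camera feed.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    tilt_deg: float       # device tilt reported by the motion sensors
    outline_scale: float  # detected paper outline size relative to the on-screen guide

TILT_TOLERANCE_DEG = 3.0  # hypothetical tolerances; real values would be tuned in testing
SCALE_TOLERANCE = 0.05
HOLD_FRAMES = 15          # roughly half a second at 30 fps before auto-capture

def is_aligned(frame: Frame, target_tilt_deg: float) -> bool:
    """Green state: tilt and distance (via outline scale) are both within tolerance."""
    tilt_ok = abs(frame.tilt_deg - target_tilt_deg) <= TILT_TOLERANCE_DEG
    scale_ok = abs(frame.outline_scale - 1.0) <= SCALE_TOLERANCE
    return tilt_ok and scale_ok

def should_capture(frames: list[Frame], target_tilt_deg: float) -> bool:
    """Auto-capture once the user has held the aligned position briefly."""
    recent = frames[-HOLD_FRAMES:]
    return len(recent) == HOLD_FRAMES and all(
        is_aligned(f, target_tilt_deg) for f in recent
    )
```

Requiring a short hold before capturing avoids firing the shutter the instant the box flickers green while the phone is still moving.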
To help users move and adjust the phone with confidence, I introduced subtle animations that demonstrated each step and offered immediate feedback when something went wrong.
During testing, we learned that capturing the side photos was noticeably harder than the top view. Users often couldn’t clearly see the screen while positioning the phone, which made visual cues alone less reliable.
To keep the guidance effective in these moments, we layered in other senses — using short voice prompts and gentle haptic feedback to help users stay oriented throughout the process.
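The layered-feedback idea can be sketched as a simple cue selector. Everything here is illustrative: the function, cue strings, and tolerance are hypothetical stand-ins for what would actually drive the app's speech and haptics engines.

```python
def feedback_cues(tilt_error_deg: float, screen_visible: bool) -> list[str]:
    """Pick feedback channels for the current tilt error (illustrative only).

    For the side photos the screen is hard to see, so the guidance leans on
    voice and haptics; for the top view the visual outline alone suffices.
    """
    cues: list[str] = []
    if abs(tilt_error_deg) <= 3.0:  # assumed alignment tolerance
        cues.append("haptic:success")  # a gentle tap confirms alignment
        if not screen_visible:
            cues.append("voice:hold still")
    elif not screen_visible:
        direction = "toward you" if tilt_error_deg > 0 else "away from you"
        cues.append(f"voice:tilt {direction}")
        cues.append("haptic:nudge")
    return cues
```

The key design choice is that the non-visual channels only activate when they add information, so users aren't bombarded with voice prompts while the on-screen outline is already doing the job.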
The original flow began with the inside, followed by the top and outside photos. During testing, we found that the first photo was the hardest. Users were still figuring out how the process worked, and it was difficult to see the screen from that angle.
To build early confidence, I reversed the sequence to start with the easiest angle — the top view — before moving to the sides.
I also introduced a progress tracker that transitioned in and out of the wheel before each photo. It helped users understand where they were in the sequence and how each angle connected to the next. This simple addition made the experience feel more guided and cohesive.
04
Foot facts
The Foot.Science app successfully democratized the foot scanning process, allowing users to capture accurate 3D models of their feet at home. The app's launch in September 2020 marked a significant milestone in making custom-fit footwear accessible to a broader audience.
Let’s chat