Philo Homes: Closing the Gap
UXD · E-commerce · Internship · In Progress

Role
Organization
Timeline
01/2026-Present
Tools
The Contradiction
Philo Homes is an AI furniture startup that turns empty rooms into shoppable, fully-styled designs through 3D scanning and AI recommendations.
I joined as the sole UX designer through the Zell Lurie Institute's ZLT program, matched to the startup when it had a live MVP but incomplete flows, no Figma source files for most screens, and no template browsing experience at all.
The AI capabilities were real, the experience around them wasn't. The gap was the design problem.
Deliverable Highlights
What Shaped the Design
01 Strict ≠ confusing.
Translating AI requirements into human-readable guidance.
The room scan is what makes everything else in the Philo Homes AI Studio work. But the AI model has strict input requirements: users need to start at the entrance, do a slow 360° sweep, keep the ceiling and floor in frame, stay within a 13-foot height limit, and keep the video under two minutes. These constraints are what make the 3D reconstruction accurate.

The original flow buried the requirements. Instructions were scattered, unclear, and easy to skip. Users were failing without knowing why. The fix wasn't simplifying the rules — it was making them impossible to miss.

02 Not everyone starts with a scan.
Designing for the browser.
Flow A assumes users are ready to commit — scan, quiz, recommendation. But many users need to browse first. I designed that browsing experience, Flow B, end-to-end, starting with a competitive audit of IKEA, Wayfair, and Havenly.
The core takeaway: inspiration-first browsing always follows the same skeleton — image-first discovery → product list → add to cart.
I used that as the foundation and added Philo Homes' differentiator: the option to fit any template into the user's actual scanned room. That's where Flow B connects back to the AI capability that makes this app different.
03 Designing for six months from now.
The question that changed how I think about design.
My first search design covered all the expected bases — search bar, recent history, keyword results, category filters. It worked, in the sense that it looked like every other shopping app out there.
In a design review, my mentor asked a question that named what I'd been sensing:
"You're designing for now. What will people be using six months from now?"
That landed. I'd been designing from what exists today rather than where things are heading. For a product that won't reach users for months, that gap matters a lot.
After that, I researched where search was actually heading.
Google Shopping AI
Google had begun rolling out AI-powered product search — users describe what they want in natural language and receive curated recommendations, bypassing keyword filtering entirely.
IKEA, Wayfair, Amazon
Despite having AI features elsewhere, all three still defaulted to traditional keyword search.
The gap between where AI search was heading and where shopping apps currently were confirmed the opportunity.
Our usability testing results showed that users preferred AI-native search, not because the interface was radically different, but because it felt like the product was working with them, not just waiting to be searched.
04 Cross-platform isn't translation. It's fit.
Finding what each platform actually does better.
The initial ask was to move the scanning flows to web. My mentor pushed back, and that reframe led to a better decision. The right question wasn't how to port mobile to web. It was what the web platform actually does better.
Scanning a room is obviously a mobile task. But a larger screen is genuinely better at something mobile struggles with: comparing multiple options side by side.

The web design is still in progress — but this reframe is the most important thing I've taken from the work so far.
05 Usability Testing
7 participants / High-fidelity Figma prototype / 4 task flows
No major flow breakdowns. Most findings were interaction-level or content problems.


Current status
This case study is still being updated as the project wraps up. Final prototype and additional screens coming soon.
Reflection
Defensible isn't the same as right.
The traditional search design covered all the bases. But it was just copying what already exists — and my mentor's question made me realize I'd been designing for today, not for when the product actually reaches users. That gap can easily be six months or more.
Cross-platform means finding strengths, not porting screens.
Before this project, cross-platform for me meant adapting visuals for different screen sizes. Working on the web experience taught me it's actually about asking what each platform genuinely does better — and designing toward that instead of just translating.
When the product is live, decisions have real weight.
Working with a startup that had a live MVP, real users, and no complete Figma files felt different from any academic project. There was no structured brief, no safety net. And because the startup had already done extensive research before I joined, my job was to move fast and execute — learning to trust existing work rather than re-validating everything.