grabbit Food Finder App POC
A Visual Feed for Food Discovery
grabbit was a concept developed during my time at Israel Tech Challenge. I led UX/UI from concept through execution — defining the product direction, designing the experience, and shaping how the core model would work.
Role
User strategy, UI design
Team
Solo project
Deliverables
POC prototype
Assumption
Killing the “I Don’t Care, You Pick” Loop
“Where should we eat?” isn’t just small talk — it’s a daily decision loop that wastes time and leads to mediocre outcomes.
The assumption:
People often don’t know what they’re in the mood for
Others default to indecision to accommodate the group
Existing tools assume you already know what you want because they are built around intent-based search (e.g., “Thai food,” “brunch,” “restaurants near me”)
General searches require guessing the right keywords
Reviews are subjective and often conflicting
Content is buried in long-form blogs or scattered across platforms
Great local spots are often invisible unless you know what to look for
The result: decision fatigue, wasted time, and unsatisfying choices.
Original mock-ups
Key Features
Designing for Indecision
Instead of keyword-based searching, users would browse real, recent food photos near them, making faster, more confident “gut” decisions without needing to know what they want upfront.
We focused on a tight MVP to validate the core behavior: fast, visual decision-making without search.
Real-time, user-generated food photos
The foundation of the experience: what people are actually eating right now. No reviews, no long descriptions, just immediate visual signal. Content is primarily user-generated, supplemented by scraped imagery from sources like social media, Google Maps, Yelp, and restaurant-provided photos, ensuring density and variety even in less active areas.
Recency-driven feed
A ranking model prioritizes recent, relevant content, ensuring users see what’s good right now rather than what was popular months ago, while still allowing light filtering when needed.
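The POC described the ranking only at concept level. A minimal sketch of one way it could work, assuming an exponential recency decay blended with a mild proximity penalty (the half-life, weights, and post shape are all illustrative assumptions, not the actual model):

```python
import time

HALF_LIFE_HOURS = 6.0  # assumption: a 6-hour-old photo scores half as high as a fresh one

def recency_score(posted_at, now=None):
    """Exponential decay so fresher photos rank higher."""
    now = now if now is not None else time.time()
    age_hours = max(0.0, (now - posted_at) / 3600.0)
    return 0.5 ** (age_hours / HALF_LIFE_HOURS)

def rank_feed(posts, now=None):
    """Sort posts by blended score; posts = [(photo_id, posted_at_epoch, distance_km), ...]."""
    def score(post):
        _, posted_at, distance_km = post
        # proximity acts as a soft penalty, not a hard cutoff (0.2 weight is an assumption)
        return recency_score(posted_at, now) / (1.0 + 0.2 * distance_km)
    return sorted(posts, key=score, reverse=True)
```

A real feed would likely fold in engagement and quality signals, but this captures the core idea: freshness dominates, distance dampens.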
Location-based discovery
Content is surfaced based on proximity, making discovery hyper-relevant. Users can explore a specific area and seamlessly transition from browsing to action with one-tap directions via native maps.
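Proximity-based surfacing reduces to a radius filter over great-circle distance. A self-contained sketch using the standard haversine formula (the 2 km default radius and post tuple shape are assumptions for illustration):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby(posts, user_lat, user_lon, radius_km=2.0):
    """Keep only posts within the radius; posts = [(photo_id, lat, lon), ...]."""
    return [p for p in posts
            if haversine_km(user_lat, user_lon, p[1], p[2]) <= radius_km]
```

At production scale this would be a geospatial index query rather than a linear scan, but the behavior is the same: only content within walking or driving range surfaces.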
“GrabbIt” swipe for the undecided
For users who don’t know what they want, a swipe-based flow removes the need to search entirely. Set a few lightweight parameters, then quickly pass or “GrabbIt” — turning indecision into momentum.
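The swipe flow is essentially a filtered card stack consumed one decision at a time. A minimal sketch, assuming the “lightweight parameters” are a distance cap and an optional category set (field names and the callback shape are hypothetical):

```python
from collections import deque

def swipe_queue(posts, max_distance_km=2.0, categories=None):
    """Build a card stack from lightweight parameters (all field names illustrative)."""
    return deque(p for p in posts
                 if p["distance_km"] <= max_distance_km
                 and (not categories or p["category"] in categories))

def swipe(cards, decide):
    """Pop cards until the user grabs one; decide(card) returns True for 'GrabbIt'."""
    while cards:
        card = cards.popleft()
        if decide(card):
            return card  # the one-tap handoff to native-map directions would start here
    return None
```

The point of the design is that `decide` is a gut reaction to a photo, not a search query, so indecision turns into momentum.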
Lightweight content classification
Images are organized using metadata and simple categorization (e.g., burgers, sushi, drinks), making content browsable without requiring manual tagging or effort from users.
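“Simple categorization” here can be as light as matching metadata tags against a keyword map. A sketch under that assumption (the keyword lists and category names are illustrative, not a real taxonomy):

```python
# Hypothetical keyword-to-category map; a real one would be broader and curated
CATEGORY_KEYWORDS = {
    "burgers": {"burger", "cheeseburger", "patty"},
    "sushi": {"sushi", "sashimi", "maki", "nigiri"},
    "drinks": {"coffee", "latte", "cocktail", "smoothie"},
}

def classify(metadata_tags):
    """Map free-form metadata tags to browsable categories; no user tagging required."""
    tags = {t.lower() for t in metadata_tags}
    matched = {cat for cat, keywords in CATEGORY_KEYWORDS.items() if tags & keywords}
    return sorted(matched) or ["uncategorized"]
```

Because classification runs on metadata the content already carries, the feed stays browsable by category with zero effort from the people posting.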
