Traffic is healthy, conversion isn't
The gap sits in the analytics, visible to anyone who looks, and mostly ignored because nobody owns it.
Independent Consultant
Location
Kuala Lumpur, Malaysia (GMT+8).
Working with clients across Malaysia, Australia, the US, and Zimbabwe.
Availability
Currently accepting projects for Q2 2026 and beyond.
Get in touch
Product and Experience
Users arrive, find what they need, and still do not act. I identify where they stop, why, and what each fix is worth before a design tool opens.
These are the patterns I see most often when product teams call about UX. Usually more than one applies.
The friction is structural. The redesign polished the surface while the underlying user journey stayed the same.
Proposals are built from screenshots and memory, with no usability testing or behavioural data to anchor decisions.
UX strategy produces three things. Each one goes directly into your team's workflow.
Root-cause analysis grounded in session data and mapped across the full user journey. Each failure point is documented with its cause and its relationship to adjacent drop-off in the flow. Your team can see exactly where the journey breaks and trace why it breaks there.
What matters to the business comes first. Each recommendation includes an effort estimate and a commercial impact assessment so your team can weigh return against resource before committing to any fix.
Detailed enough to hand over cleanly. The engagement ends when your team has what they need to implement, not when you have a document that still requires interpretation.
Most engagements follow four phases over two to four weeks, depending on the number of user journeys in scope.
Review session recordings, analytics funnels, heatmaps, and drop-off data. Identify where users stop, hesitate, or take unexpected paths.
Move from where users stop to why they stop. Each failure point is examined for its root cause. The cause determines the fix.
Friction points are ranked by commercial impact. Output is recommendations with rationale and implementation guidance, most impactful first. I include expected effort alongside expected return when possible so your team can sequence the work realistically (some high-impact fixes take an afternoon, others require a sprint).
Review design and dev work against the spec. Validate changes post-launch using the same data, so each fix is measurable.
From clients across strategy, execution, and advisory engagements. Different sectors, different problems.
“Working with him was straightforward and collaborative. He is very detailed, reliable, and quick to translate discussions into clear, actionable steps. He also takes the time to guide and teach the team, which really helped build our internal capability. I can see why people describe him as a ‘walking Google’ — he brings a wide range of knowledge and connects things quickly.”
Melissa Jailani
Digital Experience & Marketing Lead
“…every decision for the web site is always based on our side and Yassir best experience on the do’s and donts. The price you pay is what you get. Before production, during production and after production it is worth every ringgit spend. Hands down, the best experience and he knows what he is doing. My advice, don’t just get a web site, but get the web site. Great job Yassir, appreciate your work. The best.”
Elisa
Real Estate Negotiator
Answers to the questions I get asked most often.
Analytics platform access, session recording tool access, and a walkthrough of the key user journeys you want to improve. If you have existing user research, that is useful context but not a requirement.
UX strategy is the diagnostic phase. It identifies what to fix and why, ranked by impact. Design executes the fix. The output of this engagement goes to your designer or developer. It does not require working with me further, though I can provide implementation support if useful.
No. The output is a prioritised recommendation spec. Your team implements using whatever designer or developer they choose. The spec is written so it can be handed over cleanly, and the engagement ends when your team has what they need to act.
Both are covered in the audit. Behavioural patterns frequently differ between mobile and desktop in ways that are not obvious from analytics alone. Users fail at different points, for different reasons. The audit treats them separately where the data warrants it.
Post-launch validation is included provided the launch occurs within 90 days of the strategy being delivered. Engagements that extend beyond this are scoped separately.
If the direction is unclear or the backlog keeps shifting, a structured strategy engagement can change that. Let's talk.