Yassir Yahya

Independent Consultant

Product and Experience

Users leave without acting. The cause is rarely the design.

Users arrive, find what they need, and still do not act. I identify where they stop, why, and what each fix is worth before a design tool opens.

The situations I see most often

These are the patterns that come up when product teams call about UX. Usually more than one applies.

  • Traffic is healthy, conversion isn't

    The gap sits in the analytics, visible to anyone who looks, and mostly ignored because nobody owns it.

  • The redesign improved the screens, but conversion did not follow

    The friction was structural. The redesign addressed the surface while the user journey remained the same.

  • Every improvement proposal is based on instinct

    Proposals come from screenshots and memory, with no usability testing or behavioural data to anchor decisions.

UX strategy produces three things. Each one goes directly into your team's workflow.

  1. A documented map of where and why users fail

    Root-cause analysis grounded in session data and mapped across the full user journey. Each failure point is documented with its cause and its relationship to adjacent drop-off in the flow. Your team can see exactly where the journey breaks and trace why it breaks there.

  2. Recommendations ranked by commercial impact

    What matters to the business comes first. Each recommendation includes an effort estimate and a commercial impact assessment so your team can weigh return against resource before committing to any fix.

  3. Specifications your designer or developer can act on without a briefing

    Detailed enough to hand over cleanly. The engagement ends when your team has what they need to implement, not when you have a document that still requires interpretation.

How it works

Most engagements follow four phases over two to four weeks, depending on the number of user journeys in scope.

  • Behavioural data audit

    Review session recordings, analytics funnels, heatmaps, and drop-off data. Identify where users stop, hesitate, or take unexpected paths.

  • Friction mapping

    Move from where users stop to why they stop. Each failure point is examined for its root cause. The cause determines the fix.

  • Prioritised recommendation spec

    Friction points are ranked by commercial impact. Output is recommendations with rationale and implementation guidance, most impactful first. I include expected effort alongside expected return when possible so your team can sequence the work realistically (some high-impact fixes take an afternoon, others require a sprint).

  • Implementation support

    Review design and dev work against the spec. Validate changes post-launch using the same data, so each fix is measurable.

What clients say about the work

From clients across strategy, execution, and advisory engagements. Different sectors, different problems.

“Working with him was straightforward and collaborative. He is very detailed, reliable, and quick to translate discussions into clear, actionable steps. He also takes the time to guide and teach the team, which really helped build our internal capability. I can see why people describe him as a ‘walking Google’ — he brings a wide range of knowledge and connects things quickly.”

Melissa Jailani

Digital Experience & Marketing Lead

“…every decision for the web site is always based on our side and Yassir best experience on the do’s and donts. The price you pay is what you get. Before production, during production and after production it is worth every ringgit spend. Hands down, the best experience and he knows what he is doing. My advice, don’t just get a web site, but get the web site. Great job Yassir, appreciate your work. The best.”

Elisa

Real Estate Negotiator

Answers to the questions I get asked most often.

  • What access do you need?

    Analytics platform access, session recording tool access, and a walkthrough of the key user journeys you want to improve. If you have existing user research, that is useful context but not a requirement.

  • Is this the same as a UX design engagement?

    UX strategy is the diagnostic phase. It identifies what to fix and why, ranked by impact. Design executes the fix. The output of this engagement goes to your designer or developer. It does not require working with me further, though I can provide implementation support if useful.

  • Do you redesign the product?

    No. The output is a prioritised recommendation spec. Your team implements using whatever designer or developer they choose. The spec is written so it can be handed over cleanly, and the engagement ends when your team has what they need to act.

  • How do you handle mobile vs desktop?

    Both are covered in the audit. Behavioural patterns frequently differ between mobile and desktop in ways that are not obvious from analytics alone. Users fail at different points, for different reasons. The audit treats them separately where the data warrants it.

  • How long is post-launch validation included?

    Post-launch validation is included provided the launch occurs within 90 days of the strategy being delivered. Engagements that extend beyond this are scoped separately.

Time to give your team a strategy they can commit to.

If the direction is unclear or the backlog keeps shifting, a structured strategy engagement can change that. Let's talk.