A shortlist of polished demos and no way to test the claims
Every vendor shows the happy path. Reference calls go to pre-approved customers. Edge cases never quite resolve.
Independent Consultant
Location
Kuala Lumpur, Malaysia (GMT+8).
Working with clients across Malaysia, Australia, the US, and Zimbabwe.
Availability
Currently accepting projects for Q2 2026 and beyond.
Get in touch
Operations
Every vendor has a sales process built to answer questions they want you to ask. An independent evaluation follows the evidence.
Procurement decisions made without independent review tend to follow the vendor's narrative. The demos look good, the references check out, and the gaps only surface after the contract is signed.
The people using the platform cannot stress-test the architecture. That gap is where bad decisions get made.
The wrong platform. Switching costs that exceed the licence fees. It happens when demos replace structured requirements.
A decision the whole organisation can stand behind.
Weighted criteria, scoring logic, and written rationale. Defensible to the board and to the team.
Licence fees, implementation, training, integration, support, and migration: every cost is accounted for in the evaluation.
A clear recommendation with evidence and rationale, including why alternatives were not chosen.
Every evaluation follows four phases. The structure is consistent; the depth scales with the complexity of the decision and the number of vendors involved.
Interview stakeholders. Map must-haves against nice-to-haves. Document decision criteria before any vendor conversations begin.
Your team compiles the candidate list. I review it against the documented requirements and produce a shortlist of three to five vendors, with written rationale for every vendor excluded.
Your team schedules the demos and reference calls. I attend, run the questioning against a scripted framework, and document what the vendor says and what they avoid. Where a proof of concept is warranted, I define the test criteria and review the results.
Completed scoring matrix with full rationale, including a vendor risk assessment for the recommended option. A recommendation document clear enough to present to leadership. Debrief call included.
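The scoring matrix mentioned above is, at its core, simple arithmetic: each criterion gets a weight, each vendor gets a score per criterion, and the weighted sum ranks the options. A minimal sketch follows; the criteria, weights, vendor names, and scores are purely illustrative, not taken from any real engagement.

```python
# Illustrative weighted-scoring sketch. All criteria, weights, and
# scores below are invented for demonstration; a real matrix comes
# out of the requirements-definition phase.

CRITERIA = {  # criterion -> weight (weights sum to 1.0)
    "fit_to_requirements": 0.40,
    "total_cost_of_ownership": 0.25,
    "integration_effort": 0.20,
    "vendor_risk": 0.15,
}

# Hypothetical raw scores per vendor on a 1-5 scale.
scores = {
    "Vendor A": {"fit_to_requirements": 4, "total_cost_of_ownership": 3,
                 "integration_effort": 4, "vendor_risk": 3},
    "Vendor B": {"fit_to_requirements": 5, "total_cost_of_ownership": 2,
                 "integration_effort": 3, "vendor_risk": 4},
}

def weighted_total(vendor_scores: dict[str, float]) -> float:
    """Sum of score x weight across all criteria."""
    return sum(vendor_scores[c] * w for c, w in CRITERIA.items())

# Rank vendors by weighted total, highest first.
for vendor, s in sorted(scores.items(),
                        key=lambda kv: -weighted_total(kv[1])):
    print(f"{vendor}: {weighted_total(s):.2f}")
```

The point the sketch makes is the one in the prose: the weights determine the outcome before any vendor is scored, which is why requirements definition happens before vendor conversations begin.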
From clients across strategy, execution, and advisory engagements. Different sectors, different problems.
“Working with him was straightforward and collaborative. He is very detailed, reliable, and quick to translate discussions into clear, actionable steps. He also takes the time to guide and teach the team, which really helped build our internal capability. I can see why people describe him as a ‘walking Google’ — he brings a wide range of knowledge and connects things quickly.”
Melissa Jailani
Digital Experience & Marketing Lead
“…every decision for the web site is always based on our side and Yassir best experience on the do’s and don’ts. The price you pay is what you get. Before production, during production and after production it is worth every ringgit spend. Hands down, the best experience and he knows what he is doing. My advice, don’t just get a web site, but get the web site. Great job Yassir, appreciate your work. The best.”
Elisa
Real Estate Negotiator
No affiliations, no referral fees, no preferred vendors.
There are no vendor affiliations here. No referral fees, no preferred platform partners, no certifications that would create a financial reason to recommend one system over another. The recommendation follows the evidence. That is a constraint I impose deliberately, because it is the only basis on which an independent evaluation is worth paying for.
I have been on both sides of this process. I have evaluated vendors as a buyer at enterprise scale, been on the vendor side being evaluated, and written RFPs as an independent consultant. Each perspective taught me something different about where the process breaks down.
Requirements definition is where most evaluations go wrong. By the time you reach the scoring stage, the criteria have already determined the outcome. Get the requirements wrong and the scoring becomes a formality.
This is a single-person practice. The person who scopes the evaluation runs it. There is no hand-off to a junior analyst after the kickoff call.
Answers to the questions I get asked most often.
CMS, CRM, ERP, marketing automation, analytics, e-commerce platforms, and any SaaS platform with significant budget and high switching costs. If the decision is consequential and the criteria are not obvious, an independent evaluation is useful.
No. No referral agreements, no preferred partner programmes, no platform certifications that create a conflict of interest. The only reason to recommend a platform is that it fits the documented requirements better than the alternatives.
Four to eight weeks, depending on the number of vendors and the depth of evaluation required. Requirements definition and shortlisting typically take two to three weeks; structured evaluation of three to five vendors takes another two to four. Your team handles scheduling; my time goes into the framework, the scoring, and the recommendation. A fixed timeline can be discussed if you are working to a board approval date.
That is a valid finding, and the output will say so clearly. Sometimes the market has not caught up with the requirements, or the requirements need to be revised before a good platform decision is possible. A recommendation on next steps is included either way.
If you are evaluating platforms or preparing an RFP, bring me in early. Independent review, documented rationale.