Review Methodology
I don’t write about products I haven’t used.
That sounds obvious, but scroll through any “best WordPress plugins” article on the internet and you’ll find lists assembled from screenshots of other people’s screenshots. No installation. No testing. No real opinion.
That’s not how it works here.
Every product reviewed on gauravtiwari.org goes through a testing process I’ve refined over 16+ years and 850+ client projects. This page explains exactly how I evaluate tools, what my rating system means, and why you can trust the recommendations you find here.
My Testing Environment
I test products on real WordPress installations. Not demo sites. Not sandboxes provided by the company. My actual sites.
My primary testing environment:
- WordPress: Latest stable release (currently 6.7.x)
- Server: Hetzner Cloud VPS (CX32), Ubuntu 24.04 LTS, managed via xCloud
- PHP: 8.3 with OPcache enabled
- Database: MariaDB 11.x with Redis object caching
- CDN: Cloudflare with R2 image optimization
- Performance stack: FlyingPress + Perfmatters
- SEO: Rank Math Pro
- Theme: Marketers Delight (block theme)
- Page builder: GenerateBlocks (when needed)
This is the same stack powering gauravtiwari.org right now. When I show you a benchmark or screenshot, it’s from a site running real traffic with real content.
For plugins and themes that need isolated testing (to check performance impact without other variables), I spin up a clean VPS with the same spec and a fresh WordPress install. Same server, same PHP config, just without my other plugins interfering.
How I Select Products to Review
Not everything gets reviewed. I’m selective for a reason.
A product makes it onto my review list when:
- I need it myself. Most reviews start because I’m solving a real problem on my own sites or a client project. I found Rank Math because I needed a better SEO plugin. I found FlyingPress because WP Rocket wasn’t fast enough.
- Readers ask about it. If I get 5+ emails or comments asking “have you tried X?”, that’s signal enough to test it.
- It fills a gap in existing coverage. If I’ve written about a category (like caching plugins or email tools) but missed a major contender, I’ll test it.
- The product genuinely does something different. Me-too products that clone existing features with a different UI don’t get airtime. I look for tools that solve problems in a way others haven’t.
What I won’t review: products that are too early (no stable release), products I can’t install independently (vendor-locked demos only), and products in categories I don’t have experience with.
The Testing Process
Every review follows the same steps. No shortcuts.
Step 1: Purchase and Install
If it’s a paid product, I buy it. Full price. My own credit card.
I’ve accepted review licenses from companies exactly twice in 16 years, and both times I disclosed it in the first paragraph. If a company offers a free license, I’ll take it if I was going to buy it anyway, but I note the arrangement.
Installation happens on a production site. I document the process with screenshots from my actual WordPress dashboard. If the installation is confusing or breaks something, that goes in the review too.
Step 2: Configuration and Setup
I spend real time configuring the product. Not 10 minutes of clicking through settings. I read the documentation, try different configurations, and figure out what works for my use case.
For complex products (like SEO plugins or caching tools), I document my exact settings. In my Rank Math review, you can see every module I’ve enabled, every setting I’ve changed, and the code snippets I use to extend it. That level of detail isn’t possible without actually running the plugin.
Step 3: Extended Use (Minimum 2 Weeks)
This is where most reviewers fail. They install something, take screenshots, and publish.
I run products for a minimum of two weeks before I write the review. For core tools (hosting, SEO plugins, performance tools), I’ve often used them for months or years before reviewing. My Perfmatters review reflects three years of daily use. My hosting comparisons come from running production sites on each provider.
During this period, I’m watching for:
- Performance impact (load times, server response, Core Web Vitals)
- Stability (does it crash? conflict with other plugins?)
- Support quality (when something goes wrong, how fast do they respond?)
- Update frequency (is the team shipping improvements?)
- Real-world usability (is it actually easier, or just different?)
Step 4: Benchmarking
For performance-related products (caching plugins, hosting, CDNs, image optimization), I run quantitative benchmarks.
My benchmarking process:
- Tools: Google PageSpeed Insights + WebPageTest + custom server-side timing
- Methodology: 5 test runs per configuration; the median value is reported
- Pages tested: Homepage + a long-form article (2,000+ words with images)
- Conditions: Tested with CDN enabled and disabled, with and without the product active
- Metrics captured: TTFB, LCP, CLS, INP, total page weight, number of requests
I publish the actual numbers. Not vague claims like “it made my site faster.” Specific numbers like “TTFB dropped from 340ms to 180ms” or “total page weight reduced by 38%.”
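As a rough illustration (not the exact scripts behind my benchmarks), the median-of-five aggregation and the before/after comparison can be sketched in a few lines of Python. The sample TTFB values below are hypothetical:

```python
from statistics import median

def summarize(runs_ms):
    """Take the median of repeated test runs to damp out noisy outliers."""
    return median(runs_ms)

def percent_change(before, after):
    """Relative change between two medians, e.g. TTFB with vs. without a product."""
    return (after - before) / before * 100

# Hypothetical TTFB samples (ms) from 5 runs, without and with the product active
baseline = summarize([352, 340, 336, 345, 331])      # median: 340
with_product = summarize([184, 180, 176, 191, 178])  # median: 180

print(f"TTFB: {baseline:.0f}ms -> {with_product:.0f}ms "
      f"({percent_change(baseline, with_product):+.0f}%)")
# prints "TTFB: 340ms -> 180ms (-47%)"
```

The median (rather than the mean) is the sensible aggregate here because a single cold-cache or network-hiccup run would otherwise skew the result.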
Step 5: Honest Assessment
After testing, I write the review with a structure that’s consistent across all product reviews:
- Why I tried this product (the real reason, not marketing copy)
- What it actually does (features that matter, not a feature list scraped from the sales page)
- My complete setup (active modules, configuration, code customizations)
- What it does well (with evidence)
- What it doesn’t do well (with evidence)
- Pricing and alternatives
- Who should use it (and who shouldn’t)
- My personal verdict
The “who shouldn’t use it” section matters. Every product has blind spots. Pretending otherwise is dishonest.
My Rating System
I use a 5-star rating scale. Here’s what each rating actually means:
| Rating | What It Means |
|---|---|
| 5/5 | I use this daily. It’s the best option I’ve found for its category. I’d recommend it to anyone who needs this type of tool. |
| 4.5/5 | Excellent product with minor limitations. I use it or have used it. There might be a better option for specific edge cases, but for most people, this is the right choice. |
| 4/5 | Good product that does its job. Some notable drawbacks keep it from being my top pick. Worth considering if it fits your specific needs better than alternatives. |
| 3.5/5 | Decent product with significant caveats. Works, but there are better options. I’d only recommend it for specific situations. |
| 3/5 | Average. Does the bare minimum. You can find better. |
| Below 3/5 | I don’t publish reviews for products I’d rate below 3. If something is that bad, I mention it briefly in comparison articles. I’d rather spend time writing about something worth your attention. |
What Affects the Rating
I weight these factors differently depending on the product category:
For Plugins and Themes
- Does it work reliably? (30%)
- Performance impact (20%)
- Ease of use (15%)
- Feature completeness (15%)
- Support and documentation (10%)
- Value for money (10%)
For Hosting
- Server performance and uptime (35%)
- Support quality (20%)
- Ease of management (15%)
- Pricing transparency (15%)
- Scaling options (15%)
For SaaS Tools
- Core functionality (30%)
- WordPress integration (20%)
- Pricing and value (20%)
- Support and documentation (15%)
- Long-term viability (15%)
These aren’t arbitrary percentages. They come from what I’ve seen matter most across 850+ client projects. A fast plugin with terrible support will eventually cost you time. A cheap host that goes down monthly will cost you more than a premium one.
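To make the weighting concrete, here is a minimal sketch of how per-factor scores combine into an overall rating. The weights mirror the plugin/theme percentages above; the sub-scores for the example product are purely illustrative:

```python
# Weights for the plugins-and-themes category (must sum to 1.0)
PLUGIN_WEIGHTS = {
    "reliability": 0.30,
    "performance_impact": 0.20,
    "ease_of_use": 0.15,
    "feature_completeness": 0.15,
    "support_docs": 0.10,
    "value_for_money": 0.10,
}

def weighted_rating(scores, weights):
    """Combine per-factor scores (0-5) into one weighted 0-5 rating."""
    if set(scores) != set(weights):
        raise ValueError("scores and weights must cover the same factors")
    total = sum(scores[k] * weights[k] for k in weights)
    # Round to the nearest half star, matching the 5-star scale
    return round(total * 2) / 2

# Hypothetical sub-scores for a plugin under review
example = {
    "reliability": 5.0,
    "performance_impact": 4.5,
    "ease_of_use": 4.0,
    "feature_completeness": 4.0,
    "support_docs": 3.5,
    "value_for_money": 5.0,
}
print(weighted_rating(example, PLUGIN_WEIGHTS))  # prints 4.5
```

Note how the 30% reliability weight dominates: a product that scores poorly there cannot recover a top rating on features alone, which matches the reasoning above.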
What I Disclose
Transparency isn’t optional. Here’s what I always tell you:
- Affiliate links: When a link earns me a commission, that’s disclosed. The link is managed through my link management system so you can always identify affiliate links by the URL structure.
- Free licenses: If a company provided a review copy, I say so in the first paragraph.
- Previous relationships: If I’ve done client work for a company whose product I’m reviewing, I disclose that.
- What I actually use: My toolbox page lists every tool I currently run on my own sites. If I’m reviewing something that competes with a tool I use, I’m upfront about my existing preference.
I won’t pretend to be neutral. I have opinions formed by years of experience. But I’ll always tell you when those opinions might be influenced by something beyond the product itself.
How Reviews Get Updated
Products change. My opinions change. What was true 18 months ago might not hold today.
When I update a review:
- The “Last Updated” date changes
- Significant changes (new pricing, removed features, changed recommendation) are noted
- If my rating changes, I explain why
I don’t update reviews just to change the date. If nothing meaningful has changed, the review stays as-is until something does.
Products I’ve stopped recommending get a clear notice at the top: what changed, when, and what I recommend instead.
Questions About My Process
If something about a review feels off, or you want to know more about how I tested something, reach out. I keep notes and benchmarks from every review. If you ask “how did you get that number?”, I can show you.
I’ve been writing reviews since 2008. My reputation is the one asset I can’t buy back if I damage it. Every review reflects that.
Related Policies
- Editorial Policy — How all content is created and managed
- Copyright, Disclaimer and FTC Disclosure — Legal disclosures and affiliate transparency
- My Toolbox — What I actually use
Last updated: February 24, 2026