Testing Done Differently

We're a small crew based in Hat Yai who got tired of watching systems crumble under real-world pressure. So we started testing things the way they actually get used—not just how they look in a demo.

Built on Real Experience

Back in 2023, our founder Linnea Vesper was debugging a payment gateway that kept failing every Friday afternoon. It turned out nobody had tested it under actual end-of-week traffic patterns. That kind of oversight happens more than you'd think.

We formed sparkly-lumora because performance testing shouldn't be an afterthought. It's not glamorous work—spending hours simulating database queries or stress-testing APIs—but it prevents the kind of 3am emergency calls that nobody wants to receive.

Our approach is straightforward. We look at how your system performs when things go sideways. Because perfect conditions don't exist in production, and your users won't care that it worked fine during your internal tests.

[Image: Team member analyzing performance metrics on multiple monitors]

What Drives Our Work

These aren't corporate values written by committee. They're the habits that stuck around because they actually made our work better.

Test What Matters

We focus on scenarios that actually break systems—concurrent users, data spikes, network hiccups. Not checkbox compliance tests that look good on paper but miss real problems.
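For the curious, here's a rough sketch of what "concurrent users" means in practice: fire a batch of simultaneous requests at one endpoint and look at tail latency and failures, not the average. The URL, user count, and thresholds below are placeholders, not a real client setup, and this uses only Python's standard library rather than our actual tooling:

    # Rough sketch: N simultaneous requests at one endpoint.
    # URL and counts are hypothetical placeholders.
    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    TARGET_URL = "https://example.com/api/checkout"  # placeholder endpoint
    CONCURRENT_USERS = 50

    def one_user(_):
        # Simulate one user making one request; record status and latency.
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
                status = resp.status
        except Exception:
            status = None  # timeouts and connection errors count as failures
        return status, time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = list(pool.map(one_user, range(CONCURRENT_USERS)))

    latencies = sorted(latency for _, latency in results)
    failures = sum(1 for status, _ in results if status != 200)
    # Tail latency (p95) is what users under load actually feel.
    p95 = latencies[min(len(latencies) - 1, int(len(latencies) * 0.95))]
    print(f"p95 latency: {p95:.3f}s, failures: {failures}/{len(results)}")

Real engagements replay recorded traffic shapes instead of a flat burst, but even a toy run like this surfaces problems that a single manual request never will.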

Clear Communication

Technical reports that read like dense instruction manuals help nobody. We explain what we found, why it matters, and what options you have. No jargon walls or vague recommendations.

Honest Assessments

Sometimes your infrastructure is fine and you're just experiencing normal growing pains. We'll tell you that instead of inventing problems to solve. Integrity matters more than billable hours.

Continuous Learning

Technology shifts constantly. We dedicate time each month to exploring new testing frameworks and methodologies—not because it's trendy, but because staying current serves our clients better.

[Image: Collaborative workspace with performance testing setup and analysis tools]

How We Actually Work

Most performance testing feels like archaeology—digging through layers of old assumptions to find what's actually slowing things down. We start by understanding how your system gets used in reality, not how the documentation says it should work.

Our team of four brings different perspectives. Alongside Linnea, Kasper handles database optimization and has an unusual talent for spotting inefficient queries. Thora focuses on front-end performance and user experience under load. Rowan manages our testing infrastructure and writes the scripts that simulate real usage patterns.

We work remotely most days but meet in person at our Hat Yai office every Tuesday to review active projects. That face-to-face time helps us catch things that slip through Slack conversations—like when someone discovers an interesting pattern that might apply to another client's situation.

Performance testing isn't about finding every possible issue. It's about identifying which problems will actually impact your users and prioritizing those first. Perfect systems don't exist, but well-tested ones can handle real-world chaos gracefully.

Let's Talk About Your System

If you're concerned about how your application handles actual usage—or you've already experienced problems under load—we should probably talk. We can start with a conversation about what you're seeing and whether testing makes sense right now.

Start a Conversation