
Your website isn’t art. It’s a business tool that either drives revenue or wastes opportunity. After evaluating hundreds of designs over the past 15 years, we’ve found the difference between sites that perform and sites that don’t comes down to one thing: whether design decisions were made from customer research or from taste.

Most business owners ask “Do I like this design?” Wrong question. When design decisions come from documented customer research, revenue goes up. When they come from taste and templates, you pay a disconnection tax in lower conversion, lower trust, and price pressure.

This guide gives you the insider framework to tell the difference. You’ll learn to ask the right questions, spot template work, and evaluate whether your designer made decisions that drive business outcomes.

Design decisions vs. design preferences

There’s a distinction most people miss when they evaluate a website.

Design preferences sound like: “I don’t like blue.” “This font feels too corporate.” “Can we make the logo bigger?”

Design decisions sound like: “Does this homepage hierarchy match what customer interviews revealed prospects care about most?” “Is this navigation structure based on how users actually think, or how we’re organized internally?” “Were these trust signals positioned based on journey mapping or gut feel?”

Design decisions drive revenue because they shape:

  • Conversion rates: Does the page architecture drive action or create friction?
  • Trust signals: Do prospects feel confident or uncertain?
  • Willingness to pay: Does design communicate premium value or commodity service?
  • Competitive differentiation: Do you look like everyone else or stand out with intention?

The framework: how to evaluate any design decision

Before you react to a design, evaluate the thinking behind it. Ask these five questions about any significant design choice:

  1. Business alignment: Does this solve a specific business problem we identified in discovery?
  2. User impact: Will users actually value this, or does it just look nice to us?
  3. Competitive position: Based on competitive analysis, is this different from what competitors do?
  4. Evidence base: What customer insight or research data informed this choice?
  5. Measurable success: Can we measure whether this decision worked?

When a designer can answer all five questions with specific evidence, you’re looking at research-driven design. When they can’t, you’re looking at guesswork dressed up as professionalism.

Use this framework whether you’re working with us or someone else. The thinking should be provable, not just claimed.

How to tell a strategist from a salesperson


The difference between research-driven design and template work becomes obvious when you know what questions to ask and what answers reveal real thinking.

The questions that reveal everything

When your designer presents the homepage, ask these three questions:

“What customer insights informed this layout?”

  • Research-driven answer: “In interviews, most prospects mentioned regulatory compliance as their primary concern before price. That’s why we lead with your certifications and compliance story.”
  • Template answer: “Best practices suggest leading with a hero image and value proposition.”

“How does this address patterns you found in user research?”

  • Research-driven answer: “Customer language mining from support tickets showed prospects consistently use ‘headache-free implementation’ but rarely say ‘seamless integration.’ We’re using their actual words.”
  • Template answer: “We optimized for keywords and clear messaging.”

“What did you learn from client interviews that shaped this design?”

  • Research-driven answer: “Your best clients all mentioned they chose you because of your documentation and training. Competitors don’t highlight this. So we made it prominent and differentiated.”
  • Template answer: “We focused on modern design and clear calls to action.”

What you’re listening for

Specific insights they can point to: Customer quotes that influenced decisions. Patterns from multiple data sources. Competitive white space they identified. Trade-offs they considered based on evidence.

References to actual methodology: “In stakeholder interviews we learned…” “Customer language mining revealed…” “Journey mapping showed prospects care about…” “Competitive analysis identified this opportunity…”

Red flags that signal template work

Generic justifications without specifics: “Best practices suggest…” “We’ve seen this work before…” “This template converts well…” “Studies show that users prefer…”

Inability to connect design to discovery: Can’t explain what customer research informed the choice. No reference to your specific competitors. Defends decisions with general principles, not specific insights. Suggests changes would require “starting over” (which signals templates, not custom work).

The integration test

This is the single question that reveals the most about how your agency actually operates:

“What did the person designing my website learn from client interviews?”

If they look confused, you’re looking at coordination between separate departments. Brand team did interviews, handed off findings to web team, web team read a summary, designer never heard actual customer voices. This is typical agency structure: separate departments optimize for their own efficiency, then try to coordinate at the end.

If they can point to specific insights, you’re looking at true integration. The designer participated in customer interviews, heard prospects describe problems in their own words, identified patterns firsthand, and made design choices informed by direct customer understanding.

The difference matters more than most people realize. When brand research, web design, and marketing all start from the same evidence base, everything naturally connects. Your brand positioning reflects customer language. Your website architecture matches how prospects think. Your marketing messages resonate because they came from the same research.

That’s not coordination. That’s integration. And it’s rare because most agencies are structured in department silos that compete for resources and profit independently. A creative director who also oversees strategy has incentives aligned with the outcome. A design department that receives a brief from a separate strategy department has incentives aligned with throughput.

The documentation test

Ask if your agency maintains a comprehensive record of every research insight, every design decision, and the business rationale behind choices. We call ours a Client Journal, but the name doesn’t matter. What matters is that the thinking is documented, searchable, and connected to specific design decisions.

Firms that do research document everything systematically, creating institutional knowledge about your business that compounds over time. They can show you exactly where research insights live and how they connect to design decisions.

Template firms use project management tools to track tasks, but don’t document the thinking. If your agency can’t show you where the research insights are recorded and how they connect to design decisions, the research either didn’t happen or didn’t survive the handoff between departments.

The design decisions that matter most


Not all design decisions carry equal weight. Some determine whether your website becomes a revenue engine or an expensive digital brochure. These are the ones worth scrutinizing.

Homepage hierarchy: what gets attention first?

This is the decision with the highest revenue impact, so it’s worth going deep.

The question is simple: what appears above the fold and in what order? Prospects make fast judgments. If your homepage doesn’t immediately address what matters most to them (not to you, to them), you’ve already lost ground.

The way to evaluate this: ask your designer what customer research determined the hierarchy. If they say “customer interviews revealed prospects care about industry experience first, pricing second, and methodology third, so we lead with your decades in healthcare before discussing approach or investment,” you’re in good shape. If they say “we followed best practices: hero image, value proposition, social proof, features, CTA,” they’ve applied a generic formula that has nothing to do with your specific customers.

The subtlety most people miss: homepage hierarchy isn’t just about what’s “above the fold.” It’s about how well the entire page sequence maps to the way your specific prospects evaluate options. A B2B services company where trust is the primary barrier needs a completely different page arc than an e-commerce brand where product clarity is the barrier. Templates can’t account for this. Only research can.

One pattern we’ve seen repeatedly: companies whose homepage hierarchy matches their internal org chart instead of their customers’ decision process. The CEO wants the “about us” story front and center because it’s what they’re proud of. But customer interviews almost always reveal that prospects care about proof of results and relevant experience before they care about your origin story. When you flip the hierarchy to match customer priorities, conversion goes up because you’re answering the questions prospects actually have, in the order they actually ask them.

Navigation, conversion, and trust: the structural decisions

Three other decisions shape how well your site performs, and they’re worth evaluating as a group because they interact with each other.

Navigation structure should mirror customer thinking, not your org chart. The test: ask how your designer determined the navigation labels and organization. If the answer references how customers actually search for solutions (“prospects described looking for ‘compliance solutions’ not ‘products and services,’ and they think about problems by industry, not by your service lines”), you’re looking at research-informed architecture. If the answer is “we used clear categories: About, Services, Industries, Resources, Contact,” you’re looking at convention.

Conversion architecture is where most template work reveals itself most clearly. Random “Contact Us” buttons scattered across every page signal no understanding of the buyer journey. Research-driven CTA placement acknowledges that not everyone is ready to schedule a call on their first visit. Journey mapping might reveal that prospects need to see case studies from their industry before they’re ready to talk, so the primary CTA on service pages leads to relevant case studies, with a secondary consultation CTA that appears after they’ve engaged with proof. That level of specificity can’t come from a template.

Trust signals are the area where generic and research-driven approaches look most different. Every agency website has client logos, testimonials, and awards. These work for everyone, which means they differentiate for no one. Research-driven trust signals address actual objections. If customer interviews reveal that prospects worry about being handed off to junior teams after the sale, the trust solution isn’t a testimonial carousel. It’s practitioner bios, team credentials, and explicit commitments about who they’ll work with. You can only know which trust signals matter by asking.

Competitive differentiation: do you look like everyone else?

This one deserves its own treatment because it’s where the cost of template work compounds most over time.

If your site looks like your competitors’ sites, prospects comparison shop on price. That’s just how it works. Visual and messaging differentiation is what allows you to compete on value instead.

The evaluation is simple: ask what competitive analysis revealed and how the design differentiates you. A research-driven answer points to specific findings (“every competitor uses stock photos of generic office spaces and leads with broad capability claims; competitive white space analysis showed no one highlights methodology or shows their actual process, so we lead with your systematic approach and use real project artifacts instead of stock imagery”). A template answer sounds like “we created a modern, professional design that reflects your brand.”

The compounding problem with template-based competitive positioning: if your designer didn’t study what competitors do, they’ll inadvertently make you look like them. Most agencies in a given industry gravitate toward the same visual conventions, the same stock photo aesthetic, the same messaging patterns. Without deliberate competitive analysis, you end up with a site that could belong to any of your competitors with a logo swap.

This is also the decision that’s hardest to fix later. You can adjust CTA placement, swap trust signals, and reorganize navigation relatively easily. But core competitive positioning, the visual and messaging DNA of the site, is baked into the design system. Getting it wrong means a much more expensive correction down the road.

When to walk away: red flags in the design process


Sometimes the problem isn’t the design. It’s the process that created it.

Walk away if:

Your designer can’t explain the rationale behind decisions. Every choice should connect to business goals and customer insights. If you hear “that’s just how it’s done” or “best practices suggest,” you’re not getting the thinking you’re paying for.

They skipped discovery or rushed through it. Understanding customers deeply takes time. A 30-minute kickoff call isn’t discovery.

Design came faster than research. If you saw designs before customer interviews were complete, they’re guessing and hoping you like it.

They defend designs with personal taste. “I think this looks good” isn’t rationale. “Customer research showed prospects care about X, so we emphasized it” is.

Changes require “starting over.” This signals template-based work, not custom design built on a research foundation. Custom work built on solid research can adapt because the decisions are documented and the rationale is understood. Template work can’t adapt because there’s no underlying logic, just a layout.

Your account manager can’t explain the research rationale. Good firms assign account managers who act as advisors. They participated in discovery, understand the research, and can explain why decisions were made. If your account manager just relays your requests to the production team and can’t speak to the thinking, you’re working with an order-taker, not an advisor.

The tradeoff worth acknowledging: Finding an agency willing to do systematic research takes longer and costs more upfront than hiring someone who starts designing immediately. But companies that invest in research-driven design face less competition precisely because most of their competitors won’t do the work. That’s the advantage.

What this actually costs (honestly)

Let’s talk about what research-driven design takes and what it costs. And since this article argues that you should demand evidence for claims, we’ll frame our own numbers honestly: these ranges come from our experience and the projects we’ve seen, not from industry-wide studies.

Why the research timeline is an advantage

Research-driven track (typical in our experience): 2 to 4 months

  • Customer interviews: 2 to 4 weeks
  • Competitive analysis: 1 to 2 weeks
  • Strategy development: 1 to 3 weeks
  • Design execution: 3 to 8 weeks

Fast template track: 4 to 5 weeks

  • Discovery: 1 to 2 days
  • Strategy: 1 week
  • Design: 2 to 3 weeks

Fast design starts with assumptions and templates. Research-driven design starts with understanding.

Each decision we covered (homepage hierarchy, navigation structure, conversion architecture, trust signals, competitive differentiation) requires investigation. You can’t determine what trust signals address actual objections without interviewing prospects. You can’t create navigation based on customer thinking without understanding how they search for solutions. You can’t position against competitors without studying what white space exists.

Web design at its best follows five clear phases: Discovery (research and understanding), Strategy (informed planning based on evidence), Execution (building based on strategy), Results (measurement and validation), and Evolution (ongoing optimization). Template design skips straight to Execution and hopes for Results.

Most agencies skip thorough research for understandable reasons: it’s time-intensive and hard to scope, clients push to “see design” quickly, templates are easier than custom solutions, and clients often pick the cheaper or faster quote.

What skipping research tends to cost (and we say “tends to” because every situation is different): you build a site based on what you think customers care about rather than what they actually care about. You position like competitors because no one did competitive white space analysis. You optimize for the wrong conversion goals because no one mapped the actual customer journey. Then you spend months and budget fixing preventable problems.

When your brand strategist, web designer, and marketing team all participate in the same customer interviews, they work from a shared foundation. That kind of integration is only possible when you invest the time in proper discovery.

The competitive advantage nobody talks about: The companies willing to invest 3 to 4 months in research-driven design face less competition than those demanding designs in 3 weeks. While your competitors are launching sites built on assumptions, you’re building a competitive position based on customer evidence. The longer timeline isn’t a bug. It’s what creates the gap.

What you’re actually paying for

Research-driven web design typically ranges from $30,000 to $75,000+ because it includes:

  • Discovery phase: Customer interviews, competitive analysis, journey mapping, stakeholder alignment (2 to 4 weeks)
  • Strategy development: Evidence-based planning informed by research (1 to 3 weeks)
  • Custom execution: Building based on your specific insights, not generic templates (3 to 8 weeks)
  • Senior expertise: Work led by experienced practitioners, not junior teams supervised from afar
  • Integration: One team working from shared research, not separate departments coordinating after the fact

Template-based design costs less because it skips these steps. That’s fine if you need commodity execution and differentiation doesn’t matter. But if you’re building a competitive advantage, research isn’t optional.

If budget doesn’t align with full scope: Research-driven firms can phase projects to prove value before you commit to the full investment. Phase 1 might focus on homepage and key service pages with foundational research. Phase 2 expands based on what you learned and what performed. Template firms can’t phase meaningfully because there’s no research foundation to build on.

What great evaluation actually looks like


When you evaluate a design with this framework, the conversation sounds different:

You: “Walk me through the thinking behind this homepage layout.”

Designer: “In customer interviews, we found that prospects have been burned by agencies that overpromise on timelines. The majority mentioned timeline concerns before discussing price or capabilities. Competitive analysis showed every competitor leads with ‘fast turnaround’ or ‘quick results.’

So we made a deliberate choice: lead with realistic timelines and position our research process as competitive advantage. The headline addresses their actual concern (‘Results That Take the Time They Take’), and the hero section explains why research prevents expensive rework.

This positioning differentiates us from competitors who promise speed, and it speaks directly to the documented pain point. It’s contrarian, but it’s backed by evidence.”

You: “I love this. But will prospects actually respond to this?”

Designer: “That’s the right question. We’re betting that prospects who value substance over speed will resonate with this positioning. Those who want quick results will self-select out. Based on customer interviews, the prospects you actually want to work with mentioned they’d rather wait for work grounded in research than rush through templated solutions.

If testing shows this hypothesis is wrong, we’ll adjust. But we’re starting with documented customer insights, not assumptions.”

That’s what the evaluation should sound like. Every choice connects to evidence. Every decision can be explained. Trade-offs are acknowledged.

Compare that to:

You: “Walk me through the thinking behind this homepage layout.”

Designer: “We followed best practices for service-based businesses: strong hero image, clear value proposition, social proof, services overview, and CTAs throughout. This structure has been proven to convert well.”

One conversation reveals thinking backed by your specific customer research. The other reveals generic application backed by conventional principles.

Your evaluation checklist

Before you sign off on a design, run through these five checks. If all five pass, you’re evaluating work worth investing in. If any fail, you’re evaluating template work dressed up as strategy.

Evidence-based decisions. Can the designer point to specific customer research that informed key choices, or is this based on templates and assumptions?

Business alignment. Does each major decision solve a documented business problem identified in discovery?

Competitive differentiation. Does this design stand out from competitors based on competitive analysis, or could it work for any company with a logo swap?

True integration. Did the designer participate in customer interviews and hear prospects directly, or did they just read a summary from another team?

Measurable hypotheses. Is there a plan to test whether design decisions worked, or is success based on hope?

What you should demand of yourself, too. This isn’t one-sided. Great design requires that you invest time in proper discovery, provide access to customers for interviews, trust expertise while requiring explanations, evaluate the thinking before the aesthetics, and measure results. The best agency relationship is a partnership where both sides do the work.

Where to go from here


If you’re evaluating a design right now, use this framework to assess whether the thinking is there.

If you’re about to start a website project, use these questions to evaluate whether your agency does research-driven work or template-based work before you sign.

If you’re working with an agency that can’t answer these questions with specific evidence, you’re probably working with the wrong agency.

The companies that invest in research-driven design pull ahead over time. Most won’t do the work. Your choice.

At Connective, we structure branding, web design, and marketing as one team. Everyone participates in the same customer research. We document everything in your Client Journal: every decision, every insight, every trade-off. Our account managers act as advisors who can explain the research rationale behind every design decision.

We’ll tell you honestly if we’re the right fit, or recommend alternatives when someone else would serve you better. Want to talk about whether research-driven design makes sense for your business? Let’s talk about your situation.

This framework works whether you work with Connective or any other agency. Use it to demand real thinking, not just professional polish. Your business deserves design decisions that drive revenue.

Rodney Warner

Founder & CEO

As the Founder and CEO, he is the driving force behind the company’s vision, spearheading all sales and overseeing the marketing direction. His role encompasses generating big ideas, managing key accounts, and leading a dedicated team. His journey from a small town in Upstate New York to establishing a successful 7-figure marketing agency exemplifies his commitment to growth and excellence.
