
Your Software Requirements Are Worthless

Every day, software teams burn millions of pounds building the wrong thing because they mistake fuzzy feelings and opinioneering for engineering specifications.

Software teams continue writing requirements like ‘user-friendly’, ‘scalable’, and ‘high-performance’ as if these phrases mean anything concrete.

They don’t.

What they represent is ignorance of quantification, disguised first as intellectual laziness and then as collaboration. When a product manager says an interface should be ‘intuitive’ and a developer nods in agreement, no communication has actually occurred. Both parties have simply agreed to postpone the hard work of thinking and talking until later—usually until users complain or products break.

The solution isn’t better communication workshops or more stakeholder alignment meetings. It’s operational definitions—the rigorous practice of quantifying every requirement so precisely that a computer could verify compliance.

What Are Operational Definitions?

An operational definition specifies exactly how to measure, observe, or identify something in terms that are meaningful to the Folks That Matter™. Instead of abstract concepts or assumptions, operational definitions state the precise criteria, procedures, or observable behaviours that determine whether something meets a standard—and why that standard creates value for those Folks That Matter™.

The term originates from scientific research, where researchers must ensure their experiments are replicable. Instead of saying a drug ‘improves patient outcomes’, researchers operationally define improvement as ‘a 15% reduction in Hamilton Depression Rating Scale scores measured by trained clinicians using the 17-item version at 6-week intervals, compared to baseline scores taken within 72 hours of treatment initiation, with measurements conducted between 9-11 AM in controlled clinical environments at 21°C ±2°C, amongst patients aged 18-65 with major depressive disorder diagnosed per DSM-5 criteria, excluding those with concurrent substance abuse or psychotic features’.

This example only scratches the surface—a complete operational definition would specify dozens more variables including exact clinician training protocols, inter-rater reliability requirements, patient positioning, statistical procedures, and missing data handling. This precision is what makes scientific breakthroughs reproducible and medical treatments safe.

The Software Development Challenge

Software teams constantly wrestle with ambiguous terms that everyone assumes they understand:

  • ‘This feature should be fast’
  • ‘The user interface needs to be intuitive’
  • ‘We need better code quality’
  • ‘This bug is critical’

These statements appear clear in conversation, but they’re loaded with subjective interpretations. What’s ‘fast’ to a backend engineer may be unacceptably slow to a mobile developer. ‘Intuitive’ means different things to designers, product managers, and end users.

Worse: these fuzzy requirements hide the real question—what specifically do the Folks That Matter™ actually need?

How Operational Definitions Transform Software Teams

1. Connect Features to the Needs of the Folks That Matter™

Consider replacing ‘the API should be fast’ with an operational definition: ‘API responses return within 200ms for 95% of requests under normal load conditions, as measured by our monitoring system, enabling customer support agents to resolve inquiries 40% faster and increasing customer satisfaction scores by 15 points as measured on <date>.’

This eliminates guesswork, creates shared understanding across disciplines, and directly links technical decisions to the needs of the Folks That Matter™.
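As a sketch of how such a definition becomes machine-checkable, the 95th-percentile threshold can be verified directly against collected response times. The nearest-rank percentile method, sample figures, and 200ms threshold below are illustrative assumptions, not tied to any particular monitoring system:

```python
def p95(samples_ms):
    """Return the 95th percentile of a list of latencies (nearest-rank method)."""
    ordered = sorted(samples_ms)
    rank = -(-len(ordered) * 95 // 100)  # ceil(n * 0.95)
    return ordered[rank - 1]

def meets_latency_objective(samples_ms, threshold_ms=200):
    """True when 95% of requests complete within the threshold."""
    return p95(samples_ms) <= threshold_ms

# Illustrative sample: 19 fast responses plus one slow outlier.
samples = [120] * 19 + [450]
print(meets_latency_objective(samples))  # True: the outlier falls in the top 5%
```

In practice the samples would come from the monitoring system named in the definition, and the check would run on every deploy rather than ad hoc.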

2. Turn Subjective Debates Into Objective Decisions

Operational definitions end pointless arguments about code quality. Stop debating whether code is ‘maintainable’. Define maintainability operationally:

  • Code coverage above 80% to reduce debugging time by 50%
  • Cyclomatic complexity below 10 per function to enable new team members to contribute within 2 weeks
  • No functions exceeding 50 lines to support 90% of feature requests completed within single sprint
  • All public APIs documented with examples to achieve zero external developer support tickets for basic integration

Each criterion ties directly to measurable benefits for the Folks That Matter™.
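One way to make criteria like these enforceable is a simple gate that compares measured values against the agreed thresholds. The metric names and numbers below mirror the list above; the gate itself is a minimal sketch, assuming the measurements are gathered elsewhere:

```python
# Each criterion: (measured-metric key, comparison, threshold).
MAINTAINABILITY_CRITERIA = [
    ("coverage_pct",       ">=", 80),  # reduce debugging time
    ("max_complexity",     "<",  10),  # enable fast onboarding
    ("max_function_lines", "<=", 50),  # keep features within a sprint
    ("undocumented_apis",  "==", 0),   # zero basic-integration tickets
]

def failing_criteria(metrics):
    """Return the names of criteria a measured codebase fails; empty means it passes."""
    ops = {">=": lambda a, b: a >= b, "<": lambda a, b: a < b,
           "<=": lambda a, b: a <= b, "==": lambda a, b: a == b}
    return [name for name, op, limit in MAINTAINABILITY_CRITERIA
            if not ops[op](metrics[name], limit)]

measured = {"coverage_pct": 84, "max_complexity": 12,
            "max_function_lines": 45, "undocumented_apis": 0}
print(failing_criteria(measured))  # only the complexity criterion fails
```

The point is that ‘maintainable’ stops being a debate and becomes a list the build either passes or fails.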

3. Accelerate Decision Making

With operationally defined acceptance criteria, teams spend less time in meetings clarifying requirements and more time attending to folks’ needs. Developers know exactly what ‘done’ looks like, and the Folks That Matter™ verify completion through measurable outcomes.

4. Bridge Cross-Functional Disciplines

Different roles think in different terms. Operational definitions create a common vocabulary focused on the needs of the Folks That Matter™:

  • Product: Transform ‘User-friendly’ into ‘Users complete the checkout flow within 3 steps, with less than 2% abandonment at each step, increasing conversion rates by 12% and generating £2M additional annual revenue’
  • Design: Transform ‘Accessible’ into ‘Meets WCAG 2.1 AA standards as verified by automated testing and manual review, enabling compliance with federal accessibility requirements and expanding addressable market by 15%’
  • Engineering: Transform ‘Scalable’ into ‘Handles 10x current load with response times under 500ms, supporting planned user growth without additional infrastructure investment for 18 months’

5. Evolutionary Improvement

Operational definitions evolve as the needs of the Folks That Matter™ become clearer. Start with basic measurements, then refine scales of measure as you learn what truly drives value. A ‘fast’ system might initially mean ‘under 1 second response time’ but evolve into sophisticated performance profiles that optimise for different user contexts and business scenarios.

Real-World Implementation: Javelin’s QQO Framework

Some teams have already embraced this precision. Falling Blossoms’ Javelin process demonstrates operational definitions in practice through Quantified Quality Objectives (QQOs)—a systematic approach to transforming vague non-functional requirements into quasi or actual operational definitions.

Instead of accepting requirements like ‘the system should be reliable’ or ‘performance must be acceptable’, Javelin teams create detailed QQO matrices where every quality attribute gets operationally defined with:

  • Metric: Exact measurement method and scale
  • Current: Baseline performance (if known)
  • Best: Ideal target level
  • Worst: Minimum acceptable threshold
  • Planned: Realistic target for this release
  • Actual: Measured results for actively monitored QQOs
  • Milestone sequence: Numeric targets at specific dates/times throughout development

A Javelin team might operationally define ‘reliable’ as: ‘System availability measured monthly via automated uptime monitoring: 99.5% by March 1st (MVP launch), 99.7% by June 1st (full feature release), 99.9% by December 1st (enterprise rollout), with worst acceptable level never below 99.0% during any measurement period.’
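A definition like that can be encoded directly as data, with the milestone sequence and worst-acceptable floor checked mechanically. This is a sketch, not Javelin's own notation; the years are illustrative (the definition gives only months):

```python
from datetime import date

# Illustrative encoding of the 'reliable' QQO above.
qqo_reliable = {
    "metric": "monthly availability (%) via automated uptime monitoring",
    "worst": 99.0,
    "milestones": [  # (date target takes effect, planned level)
        (date(2025, 3, 1), 99.5),   # MVP launch
        (date(2025, 6, 1), 99.7),   # full feature release
        (date(2025, 12, 1), 99.9),  # enterprise rollout
    ],
}

def target_for(qqo, on_date):
    """Planned level in force on a given date (the worst floor before the first milestone)."""
    level = qqo["worst"]
    for effective, planned in qqo["milestones"]:
        if on_date >= effective:
            level = planned
    return level

def meets_qqo(qqo, measured, on_date):
    """A measurement must clear both the worst-acceptable floor and the current target."""
    return measured >= qqo["worst"] and measured >= target_for(qqo, on_date)

print(meets_qqo(qqo_reliable, 99.6, date(2025, 4, 15)))  # True: target is 99.5
print(meets_qqo(qqo_reliable, 99.6, date(2025, 7, 1)))   # False: target is now 99.7
```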

This transforms the entire conversation. Instead of debating what ‘reliable enough’ means, teams focus on achievable targets, measurement infrastructure, and clear success criteria. QQO matrices grow organically as development progresses, following just-in-time elaboration of folks’ needs. Teams don’t over-specify requirements months in advance; they operationally define quality attributes exactly as needed for immediately upcoming development cycles.

This just-in-time approach prevents requirements from going stale whilst maintaining precision where it matters. A team might start with less than a dozen operationally defined QQOs for an MVP, then expand to hundreds as they approach production deployment and beyond—each new QQO addressing specific quality concerns as they become relevant to actual development work.

Toyota’s Product Development System (TPDS) demonstrates similar precision in manufacturing contexts through Set Based Concurrent Engineering (SBCE). Rather than committing to single design solutions early, Toyota teams define operational criteria for acceptable solutions—precise constraints for cost, performance, manufacturability, and quality. They then systematically eliminate, at scheduled decision points, design alternatives that fail to meet these quantified thresholds, converging on optimal solutions through measured criteria rather than subjective judgement.

Both Javelin’s QQOs and Toyota’s SBCE prove that operational definitions work at scale across industries—turning fuzzy requirements into systematic, measurable decision-making frameworks that deliver value to the Folks That Matter™.

Practical Examples in Software Development

User Story Acceptance Criteria

Before: ‘As a user, I want the search to be fast so I can find results quickly.’

After: ‘As a user, when I enter a search query, I should see results within 1 second for 95% of searches, with a loading indicator appearing within 100ms of pressing enter.’

Bug Priority Classification

Before: ‘This is a critical bug.’

After: ‘Priority 1 (Critical): Bug prevents core user workflow completion OR affects >50% of active users OR causes data loss OR creates security vulnerability.’
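Because the ‘after’ version is a disjunction of observable conditions, it translates directly into an executable check. A minimal sketch (the parameter names are my own, for illustration):

```python
def is_priority_1(blocks_core_workflow, pct_users_affected,
                  causes_data_loss, security_vulnerability):
    """Priority 1 per the operational definition: any single condition suffices."""
    return (blocks_core_workflow
            or pct_users_affected > 50
            or causes_data_loss
            or security_vulnerability)

# A bug affecting 60% of users is critical even if core workflows still complete.
print(is_priority_1(False, 60, False, False))  # True
print(is_priority_1(False, 10, False, False))  # False
```

Note what has disappeared: nobody needs to argue about what ‘critical’ feels like.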

Code Review Standards

Before: ‘Code should be clean and well-documented.’

After: Operationally defined code quality standards with measurable criteria:

Documentation Requirements:

  • 100% of public APIs include docstrings with purpose, parameters, return values, exceptions, and working usage examples
  • Complex business logic (cyclomatic complexity >5) requires inline comments explaining the ‘why’, not the ‘what’
  • All configuration parameters documented with valid ranges, default values, and business impact of changes
  • Value to the Folks That Matter™: Reduces onboarding time for new developers from 4 weeks to 1.5 weeks, cuts external API integration support tickets by 80%

Code Structure Metrics:

  • Functions limited to 25 lines maximum (excluding docstrings and whitespace)
  • Cyclomatic complexity below 8 per function as measured by static analysis tools
  • Maximum nesting depth of 3 levels in any code block
  • No duplicate code blocks exceeding 6 lines (DRY principle enforced via automated detection)
  • Value to the Folks That Matter™: Reduces bug fix time by 60%, enables 95% of feature requests completed within single sprint
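The structure metrics above lend themselves to automated enforcement. As a minimal sketch using Python's standard ast module, here is a check for the 25-line function limit, excluding the docstring as the definition requires (whitespace exclusion is omitted for brevity, and the complexity and duplication checks would normally come from a dedicated static-analysis tool):

```python
import ast

MAX_FUNCTION_LINES = 25  # from the operational definition above

def overlong_functions(source):
    """Return (name, line_count) for each function exceeding the limit."""
    offenders = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            lines = node.end_lineno - node.lineno + 1
            # Exclude the docstring, per the definition's exclusions.
            if ast.get_docstring(node, clean=False) is not None:
                doc_node = node.body[0]
                lines -= doc_node.end_lineno - doc_node.lineno + 1
            if lines > MAX_FUNCTION_LINES:
                offenders.append((node.name, lines))
    return offenders

print(overlong_functions("def tiny():\n    return 1\n"))  # []
```

Wired into CI, a check like this turns the standard from a review-time opinion into a build-time fact.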

Naming and Clarity:

  • Variable names must be pronounceable and searchable (no abbreviations except industry-standard: id, url, http)
  • Boolean variables/functions use positive phrasing (isValid not isNotInvalid)
  • Class/function names describe behaviour, not implementation (PaymentProcessor not StripeHandler)
  • Value to the Folks That Matter™: Reduces code review time by 40%, decreases bug report resolution from 3 days to 8 hours average

Security and Reliability:

  • Zero hardcoded secrets, credentials, or environment-specific values in source code
  • All user inputs validated with explicit type checking and range validation
  • Error handling covers all failure modes with logging at appropriate levels
  • All database queries use parameterised statements (zero string concatenation)
  • Value to the Folks That Matter™: Eliminates 90% of security vulnerabilities, reduces production incidents by 75%
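The parameterised-statement rule, for instance, is the difference between the two queries below (sqlite3 shown; every mainstream database driver offers the same placeholder mechanism):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # hostile input

# Violates the standard: string concatenation lets the input rewrite the query.
unsafe = conn.execute(
    "SELECT role FROM users WHERE name = '" + user_input + "'").fetchall()

# Meets the standard: the placeholder keeps the input as pure data.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)).fetchall()

print(unsafe)  # [('admin',)] -- the injection succeeded
print(safe)    # [] -- no user is literally named that
```

‘Zero string concatenation’ is checkable by grep or linter; ‘write secure queries’ is not.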

Testing Integration:

  • Every new function includes unit tests with >90% branch coverage
  • Integration points include contract tests verifying interface expectations
  • Performance-critical paths include benchmark tests with acceptable thresholds defined
  • Value to the Folks That Matter™: Reduces regression bugs by 85%, enables confident daily deployments

Review Process Metrics:

  • Code reviews completed within 4 business hours of submission
  • Maximum 2 review cycles before merge (initial review + addressing feedback)
  • Review comments focus on maintainability, security, and business logic—not style preferences
  • Value to the Folks That Matter™: Maintains development velocity whilst ensuring quality, reduces feature delivery time by 25%

Performance Requirements

Before: ‘The dashboard should load quickly.’

After: ‘Dashboard displays initial data within 2 seconds on 3G connection, with progressive loading of additional widgets completing within 5 seconds total.’

The Competitive Advantage

Teams that master operational definitions gain significant competitive advantages:

  • Faster delivery cycles from reduced requirement clarification—deploy features 30-50% faster than competitors
  • Higher quality output through measurable standards—reduce post-release defects by 60-80%
  • Improved confidence from the Folks That Matter™ from predictable, verifiable results—increase project approval rates and budget allocations
  • Reduced technical debt through well-defined standards—cut maintenance costs whilst enabling rapid feature development
  • Better team morale from decreased frustration and conflict—retain top talent and attract better candidates

Most importantly: organisations that operationally define their quality criteria can systematically out-deliver competitors who rely on subjective judgement.

Start Today

Choose one ambiguous term your team uses frequently and spend 30 minutes defining it operationally. Ask yourselves:

  1. What value does this QQO deliver to the Folks That Matter™?
  2. What specific, observable criteria determine if this value is achieved?
  3. What scale of measure will we use—percentage, time, count, ratio?
  4. How will we measure this, and how often?
  5. What does ‘good enough’ look like vs. ‘exceptional’ for the Folks That Matter™?

Aim for precision that drives satisfaction of folks’ needs, not perfection. Even rough operational definitions linked to the needs of the Folks That Matter™ provide more clarity than polished ambiguity.

Implementation Strategy

Start Small and Build Consensus

Begin by operationally defining one or two concepts that cause the most confusion in your team. Start with:

  • Definition of ‘done’ for user stories linked to specific value for the Folks That Matter™
  • Bug severity levels tied to business impact measures
  • Performance benchmarks connected to user experience goals
  • Code standards that enable measurable delivery improvements

Define Scales of Measure

Write operational definitions that specify not just the criteria, but the scale of measure—the unit and method of measurement. Include:

  • Measurement method: How you will measure (automated monitoring, user testing, code analysis)
  • Scale definition: Units of measure (response time in milliseconds, satisfaction score 1-10, defect rate per thousand lines)
  • Measurement infrastructure: Tools, systems, and processes needed
  • Frequency: How often measurements occur and when they’re reviewed
  • Connection to the Folks That Matter™: What business need each measurement serves
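These five elements can be captured as a lightweight record, so every definition carries its scale and measurement method with it. A sketch, with illustrative field values (the lower-is-better comparison in met_by is an assumption suited to this particular metric):

```python
from dataclasses import dataclass

@dataclass
class OperationalDefinition:
    """One operationally defined requirement, with its scale of measure."""
    name: str
    method: str       # how it is measured
    scale: str        # unit of measure
    frequency: str    # how often measured, and when reviewed
    serves: str       # which need of the Folks That Matter it addresses
    threshold: float  # the agreed pass level, in scale units

    def met_by(self, measured: float) -> bool:
        # Lower is better for this metric; invert for higher-is-better scales.
        return measured <= self.threshold

dashboard_load = OperationalDefinition(
    name="dashboard initial load",
    method="synthetic monitoring on a throttled 3G profile",
    scale="seconds to first data render",
    frequency="hourly, reviewed at sprint review",
    serves="agents triage incidents without waiting on the UI",
    threshold=2.0,
)
print(dashboard_load.met_by(1.4))  # True
```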

Evolve Based on Learning

Operational definitions evolve as you learn what truly drives meeting the needs of the Folks That Matter™. Start with basic measurements, then refine scales as you discover which metrics actually predict success. Regular retrospectives can examine not just whether definitions were met, but whether they satisfied the intended needs of the Folks That Matter™.

Document and Automate

Store operational definitions in accessible locations—team wikis, README files, or project documentation. Automate verification through CI/CD pipelines, monitoring dashboards, and testing frameworks wherever possible. The goal is measurement infrastructure that runs automatically and surfaces insights relevant to the needs of the Folks That Matter™.

Conclusion

Operational definitions represent a paradigm shift from ‘we all know what we mean’ to ‘we are crystal clear about what value we’re delivering to the Folks That Matter™’. In software development, where precision enables competitive advantage and the satisfaction of the needs of the Folks That Matter™ determines success, this shift separates organisations that struggle with scope creep and miscommunication from those that systematically out-deliver their competition.

Creating operational definitions pays dividends in reduced rework, faster delivery, happier teams, and measurable value for the Folks That Matter™. Most importantly, it transforms software development from a guessing game into a needs-meeting discipline—exactly what markets demand as digital transformation accelerates and user expectations rise.

Operational definitions aren’t just about better requirements. They’re about systematic competitive advantage through measurable satisfaction of the needs of the Folks That Matter™.

Take action: Pick one fuzzy requirement from your current sprint. Define it operationally in terms of specific needs of the Folks That Matter™. Watch how this precision changes every conversation your team has about priorities, trade-offs, and success.

Further Reading

American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). American Psychiatric Publishing.

Beck, K. (2000). Extreme programming explained: Embrace change. Addison-Wesley.

Cockburn, A. (2004). Crystal clear: A human-powered methodology for small teams. Addison-Wesley.

DeMarco, T. (1982). Controlling software projects: Management, measurement, and estimation. Yourdon Press.

DeMarco, T., & Lister, T. (2013). Peopleware: Productive projects and teams (3rd ed.). Addison-Wesley.

Falling Blossoms. (2006). Our Javelin™ process (Version 2.0a). Falling Blossoms.

Gilb, T. (1988). Principles of software engineering management. Addison-Wesley.

Gilb, T. (2005). Competitive engineering: A handbook for systems engineering management using Planguage. Butterworth-Heinemann.

Gilb, T., & Graham, D. (1993). Software inspection. Addison-Wesley.

Hamilton, M. (1960). A rating scale for depression. Journal of Neurology, Neurosurgery, and Psychiatry, 23(1), 56-62.

Kennedy, M. N., & Harmon, K. (2008). Ready, set, dominate: Implement Toyota’s set-based learning for developing products and nobody can catch you. Oaklea Press.

Morgan, J. M., & Liker, J. K. (2006). The Toyota product development system: Integrating people, process, and technology. Productivity Press.

Sobel, A. E., & Clarkson, M. R. (2002). Formal methods application: An empirical tale of software system development. IEEE Transactions on Software Engineering, 28(3), 308-320.

W3C Web Accessibility Initiative. (2018). Web content accessibility guidelines (WCAG) 2.1. World Wide Web Consortium.

Ward, A. C. (2007). Lean product and process development. Lean Enterprise Institute.

Weinberg, G. M. (1985). The secrets of consulting: A guide to giving and getting advice successfully. Dorset House.

Yourdon, E. (1997). Death march: The complete software developer’s guide to surviving ‘mission impossible’ projects. Prentice Hall.

Why Science Gets No Look-in In Business

“Business is not an exact science” is a phrase often heard in corporate corridors and meeting rooms. It’s a near-universal assumption, but one not supported by the scientific evidence.

A closer look at this phrase reveals something rather intriguing – the inherent need of those in charge, the decision-makers, for it to be so.

In an exact science, laws and theories remain constant. The predictability they provide allows for clear, unambiguous paths to solutions. If business were recognised as an exact science, decision-making would be deterministic. However, this undermines the role of leaders, reducing them to mere implementers of pre-defined formulas. Leaders and their lackeys claim their art lies in making decisions amidst uncertainty, demonstrating the ability to take calculated risks, and applying intuition and experience where data falls short (a.k.a. HiPPO – highest paid person’s opinion). To maintain this dynamic, those in charge need business to remain neither a science nor exact.

A parallel is observed when we talk about “dealing with people.” This phrase encompasses a broad spectrum of situations, from human resources to customer relations, from team building to conflict resolution. People, with their diverse backgrounds, perspectives, emotions, and motivations, are incredibly complex. If dealing with people were accepted as an exact science, every interaction would follow a more or less predictable pattern. But again, those in charge need it to be seen as different from that.

If dealing with people were reduced to an exact science, leaders fear their highly rewarded personal touch, empathy, and agency would lose their kudos – and their premium.

In definitive terms, leaders actively choose to uphold the notion of business and dealing with people as non-exact sciences to preserve their role as highly rewarded key decision-makers. Their profiles are enhanced by the unpredictability and intricacies of these domains. If every business decision or human interaction could be distilled down to a precise formula, leadership would lose its gloss.

Furthermore, this narrative is conspiratorially upheld by consultants, analysts, and other business intermediaries. Their existence and remunerations rely heavily on the continued perception of business and human interaction as art forms that demand expert insights, not exact science.

Implicitly, they understand that their sponsors, primarily composed of business leaders, favour the preservation of this “non-exact” paradigm. Consequently, they conspire in maintaining the fiction, weaving it into their advice, thereby safeguarding their relevance and demand.

I Don’t Give a DAMN What You Think

It’s all talk. And no substance.

As they say, “Actions speak louder than words.” In my book, anyways.

Attachment

I have found that some folks are so attached to their thoughts, and their self-image as rational animals, that they fulminate greatly when their thoughts are discounted or dismissed. As if their thoughts were superior to any other’s.

I’m not a great fan of science either. Cf. Feyerabend (2010). But I’ll take experimental evidence over opinion EVERY day of the week. Cf. Rother (2010).

So, please don’t EVER tell me what you think. It’s only ever pompous windbaggery.

I am however ALWAYS interested in how you’re feeling (and your needs, what’s alive in you).

– Bob

Further Reading

Rother, M. (2010). Toyota Kata: Managing People* for Continuous Improvement and Superior Results. McGraw-Hill.

YouTube. (2020). How to say BS in giraffe | Nonviolent Communication explained by Marshall Rosenberg [Video]. Available at: https://www.youtube.com/watch?v=wtXogwq80vI [Accessed 24 Nov. 2021].

Feyerabend, P. (2010). Against Method. Verso.

* “You manage things, you lead people.” ~ Grace Hopper