Unit 1 — Diagnostic: Where Are You Really?
Unit ID: FO-FND-01
Estimated Time: 60–90 minutes
Delivery Mode: Self-Guided or Catalyst-Led
Applies To: Founders, operators, managers, team leads
Prerequisites: None
Unit Purpose and Role in FlowOps
This unit establishes the foundation for all FlowOps work. Every unit that follows—process creation, handoffs, metrics, capacity, automation—assumes that this diagnostic has been completed honestly and correctly.
The purpose of this unit is not to design solutions.
The purpose is to understand reality.
Organizations that skip or rush this step consistently:
- Fix the wrong problems
- Apply the wrong level of complexity
- Burn time and goodwill
- Blame people instead of systems
This diagnostic creates a shared, objective baseline so improvement efforts are intentional, sequenced, and appropriately scoped.
1. What This Diagnostic Solves
Understanding the Problem Itself
Most operational pain is not caused by a lack of effort. Teams are usually busy, engaged, and trying to do the right thing. The problem is that work is flowing through invisible or unstable systems.
This diagnostic helps surface what is currently implicit, assumed, or improvised and makes it explicit, discussable, and measurable.
Common Operational Symptoms
These symptoms indicate low or inconsistent process maturity:
- Work feels chaotic even when people are working hard
- Problems recur after being “fixed”
- Different people do the same work in different ways
- Progress requires constant follow-up
- Improvements stall or are abandoned
These symptoms are signals — not failures.
Why These Symptoms Exist
They typically stem from:
- Undefined or unclear processes
- Missing ownership
- Poorly designed handoffs
- Inconsistent or missing data
- Lack of visibility into work in progress
Without a diagnostic, teams react to symptoms instead of addressing causes.
Cost of Skipping This Step
When organizations move directly to solutions:
- Improvements are misaligned
- Tools are over- or under-used
- Complexity increases without benefit
- Trust in “process work” erodes
This unit exists to prevent that outcome.
2. The Standard: What “Good” Looks Like
Purpose of This Section
This section defines the target state FlowOps is working toward. It is not a description of perfection — it is a description of operational health. Teams use this standard to judge:
- Whether something is “good enough”
- When to automate
- When to move on
The standard removes subjectivity from these decisions.
A Healthy Process Environment
A healthy process environment is one where:
- The current state is understood
- Ownership is explicit
- Work progresses predictably
- Exceptions are intentional
- Improvement is possible without disruption
Non-Negotiable Principles
These principles apply regardless of company size or industry:
- Diagnose before designing
- Document reality, not intent
- Start small and build outward
- Separate process health from people performance
These principles govern every unit in FlowOps.
3. Diagnostic Framework
What Is Being Evaluated
This diagnostic evaluates process maturity, not individual performance. It focuses on how reliably work moves through the system — not how talented or motivated people are.
Each workflow is evaluated across six dimensions that represent the minimum conditions required for stable flow.
The Six Maturity Dimensions
1. Process Clarity
Is the work clearly defined, repeatable, and understood by those performing it?
2. Ownership
Is there a clearly accountable owner responsible for outcomes and improvement?
3. Handoffs
When work changes hands, are expectations, inputs, and acceptance criteria clear?
4. Data Quality
Is required information defined, structured, and enforced?
5. Visibility & Tracking
Can progress be understood without asking individuals for updates?
6. Exception Handling
Are deviations managed intentionally or improvised case-by-case?
Why These Dimensions Matter
Each dimension corresponds to a common failure mode. Together, they provide a complete picture of process health.
These same dimensions reappear throughout later units, making this diagnostic the baseline for future improvement.
4. Scoring Model (Numeric and Descriptive)
Purpose of the Scoring Model
The scoring model allows teams to:
- Compare workflows consistently
- Identify patterns
- Track improvement over time
However, numbers alone are insufficient.
Scoring Scale
Each dimension is scored from 0 to 4:
| Score | Meaning |
|-------|---------|
| 0 | Does not exist |
| 1 | Informal / tribal knowledge |
| 2 | Documented but inconsistent |
| 3 | Consistent and owned |
| 4 | Measured and improving |
Required Written Justification
Every numeric score must include a written explanation.
- Numbers provide structure
- Explanations provide insight
A score without context is invalid.
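The rule that a score is invalid without a written justification can be enforced mechanically. Below is a minimal validation sketch in Python; the function name, the `(score, justification)` tuple shape, and the example inputs are illustrative assumptions, not part of the FlowOps framework itself.

```python
# Reject any dimension score that is missing, out of range, or
# submitted without a written justification.
DIMENSIONS = ["Process Clarity", "Ownership", "Handoffs",
              "Data Quality", "Visibility", "Exceptions"]

def validate_scores(entries: dict) -> list:
    """Return a list of problems; an empty list means the scoring is usable."""
    problems = []
    for dim in DIMENSIONS:
        entry = entries.get(dim)
        if entry is None:
            problems.append(f"{dim}: not scored")
            continue
        score, justification = entry
        if not 0 <= score <= 4:
            problems.append(f"{dim}: score {score} outside 0-4")
        if not justification.strip():
            problems.append(f"{dim}: score without justification is invalid")
    return problems

# One scored dimension with an empty justification, five unscored:
print(validate_scores({"Ownership": (3, "")}))
```

A complete, justified scoring sheet returns an empty problem list and can proceed to tier calculation.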
How to Interpret Scores
Scores are signals, not grades.
- Low scores indicate missing structure
- High scores indicate discipline, not completion
Scores should never be used for performance evaluation.
5. Running the Diagnostic
5.1 Selecting Workflows
The quality of the diagnostic depends on workflow selection.
Select 3–5 recurring workflows that:
- Occur frequently
- Are reasonably bounded
- Involve real friction
Avoid enterprise-wide or highly complex flows at this stage.
5.2 Scoring Execution
For each workflow:
- Score all six dimensions
- Document justification for each score
- Capture observed breakdowns
This can be done:
- Self-Guided, or
- Catalyst-Led to reduce bias and align stakeholders
5.3 Identifying Constraints
After scoring, patterns will emerge.
For each workflow, ask:
- Where does work slow or stall?
- Where does rework originate?
- Where does improvisation occur?
Rank constraints by:
- Frequency
- Impact
- Ease of improvement
FlowOps prioritizes leverage, not effort.
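The ranking above can be sketched as a simple leverage calculation. The 1–5 scales and the multiplicative formula below are illustrative assumptions — any consistent scheme that combines frequency, impact, and ease of improvement works:

```python
# Rank constraints by a leverage score: frequency x impact x ease.
# Constraint names and ratings are hypothetical examples.
constraints = [
    {"name": "Missing intake data",        "frequency": 5, "impact": 4, "ease": 4},
    {"name": "Unclear handoff to finance", "frequency": 3, "impact": 5, "ease": 2},
    {"name": "Ad hoc status updates",      "frequency": 4, "impact": 2, "ease": 5},
]
for c in constraints:
    c["leverage"] = c["frequency"] * c["impact"] * c["ease"]

ranked = sorted(constraints, key=lambda c: c["leverage"], reverse=True)
for c in ranked:
    print(f'{c["leverage"]:>3}  {c["name"]}')
```

The top-ranked constraint is usually the strongest candidate for the first process (see 5.4): frequent, painful, and realistically fixable.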
5.4 Selecting the First Process
Choose one process to improve first.
Strong candidates are:
- Single-owner
- Repetitive
- Low dependency
- High frustration
This choice determines early momentum.
6. Readiness Tier Output
Purpose of Readiness Tiers
Readiness tiers translate diagnostic data into a simple directional signal. They prevent teams from applying tactics that exceed their current maturity.
Readiness Tiers
| Tier | Average Score | Meaning |
|------|---------------|---------|
| Red – Ad Hoc | 0.0–1.4 | Processes unstable |
| Yellow – Emerging | 1.5–2.4 | Inconsistent structure |
| Green – Operational | 2.5–3.4 | Repeatable and owned |
| Blue – Scalable | 3.5–4.0 | Measured and improving |
The tier determines pace, depth, and sequencing.
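The tier assignment is a direct mapping from the average of the six dimension scores. A minimal sketch, assuming equal weighting of the dimensions (the cut-offs come from the table in this unit; the function name and example scores are illustrative):

```python
def readiness_tier(avg_score: float) -> str:
    """Map an average diagnostic score (0.0-4.0) to a readiness tier."""
    if not 0.0 <= avg_score <= 4.0:
        raise ValueError("average score must be between 0.0 and 4.0")
    if avg_score <= 1.4:
        return "Red - Ad Hoc"
    if avg_score <= 2.4:
        return "Yellow - Emerging"
    if avg_score <= 3.4:
        return "Green - Operational"
    return "Blue - Scalable"

# Hypothetical scoring of one workflow across the six dimensions:
scores = {"Process Clarity": 2, "Ownership": 1, "Handoffs": 2,
          "Data Quality": 1, "Visibility": 2, "Exceptions": 1}
avg = sum(scores.values()) / len(scores)
print(round(avg, 2), readiness_tier(avg))  # 1.5 Yellow - Emerging
```

Note that a single low dimension can pull a workflow down a tier, which is intended: the tier signals how much structure exists overall, not how well the best parts perform.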
7. KPIs and Signals
Purpose of Measurement Here
These metrics evaluate the quality of the diagnostic, not operational performance.
Leading Signals
- % of workflows scored
- % with written justification
Lagging Signals
- Alignment on starting process
- Adoption of structured improvement work
8. Governance
Why Governance Is Required
Without governance, diagnostics decay or become noise.
Governance ensures:
- Consistency
- Trust
- Intentional reuse
Owner
A single role owns the diagnostic framework and scoring integrity.
Cadence
Quarterly or before major initiatives.
Change Control
Framework changes are versioned and are never made mid-cycle.
9. Common Failure Modes
This diagnostic fails when teams:
- Treat scores as performance reviews
- Select overly complex workflows
- Skip written justification
- Rush to tools or automation
Discomfort often indicates honesty, not failure.
10. Catalyst-Led Option
Catalyst involvement may include:
- Facilitated scoring
- Bias normalization
- Synthesis and roadmap creation
Catalyst accelerates clarity but does not replace ownership.
11. Completion Criteria
Unit 1 is complete only when:
- 3–5 workflows are scored
- Scores include justification
- Constraints are ranked
- A readiness tier is assigned
- A first process is selected
If these conditions are not met, do not proceed.
COPY-PASTE DIAGNOSTIC TEMPLATE
(Word / Docs friendly)
Organization: __________________________
Date: __________________________
Workflow Name: __________________________
Owner: __________________________
| Dimension | Score (0–4) | Justification |
|-----------|-------------|---------------|
| Process Clarity | | |
| Ownership | | |
| Handoffs | | |
| Data Quality | | |
| Visibility | | |
| Exceptions | | |
Observed Constraints:
Average Score: ______
Readiness Tier: ______
Selected First Process: __________________________