Process Digitisation
Readiness Scorecard

A 26-point self-audit for any repeatable workflow. Run it before buying software, commissioning automation, or starting an AI build — it tells you whether the process is ready to digitise or will stall at the first tool change.

Who it’s for

Operations owners, process leads, and anyone about to scope AI, automation, or a full digitisation build.

When to use it

Before committing budget or build time. Run it alone first, then share it with the process owner before any scoping conversation.

How to score

Tick every statement that is true for the process as it actually runs today — not how you intend it to run. Add up your ticks. Maximum 26.

What the score means

20–26: Ready to digitise.  11–19: Partially ready, blockers ahead.  Under 11: Stabilise first.
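The banding is simple enough to express as a rule. A minimal sketch of the threshold logic (the band labels follow the scorecard; the function name is illustrative):

```python
def readiness_band(ticks: int) -> str:
    """Map a tick count (0-26) to the scorecard's readiness band."""
    if not 0 <= ticks <= 26:
        raise ValueError("tick count must be between 0 and 26")
    if ticks >= 20:
        return "Ready to digitise"
    if ticks >= 11:
        return "Partially ready, blockers ahead"
    return "Stabilise first"
```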


Part 1: Is the process stable enough to digitise?

Automation applied to an unstable or unclear process creates faster chaos. Answer honestly.

  • The process has one clear trigger (what starts it) and one clear end state (what “done” looks like).
  • More than one person knows how it works — it is not locked in one person’s head.
  • The process produces the same type of output every time, even if the inputs vary.
  • You can describe the steps in roughly 10 bullets without debating what counts as a step.
  • When the usual person is away, someone else can cover it (even if they complain).

If you ticked fewer than 3: stabilise the process before you automate it. Digitising a mess produces a digital mess.

Part 2: Where does it break?

Every process has a failure signature. Naming it early is worth more than any tool comparison.

  • There is one step where work regularly stops or waits for someone to act.
  • Errors are discovered late — after the next person in the chain has already acted on the bad data.
  • There is a manual re-entry step (copying data from one system to another by hand).
  • There is an approval or sign-off that blocks progress but rarely changes the outcome.
  • The process produces rework: work that gets done, then undone, then redone.
  • You have to chase people to find out where something is.

  • Bottleneck (work waits): look for routing automation or approval triggers.
  • Late discovery (errors found downstream): look for validation or checking steps.
  • Manual re-entry: look for integration or structured intake.
  • Phantom approvals: consider delegating or removing the approval.
  • Rework: fix the upstream intake before automating the downstream steps.
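The signature-to-remedy pairing above is essentially a lookup. A sketch, with invented signature keys, of how you might record it during a scoping conversation:

```python
# Illustrative mapping from failure signature to the first remedy to explore.
REMEDIES = {
    "bottleneck": "routing automation or approval triggers",
    "late_discovery": "validation or checking steps",
    "manual_reentry": "integration or structured intake",
    "phantom_approval": "delegate or remove the approval",
    "rework": "fix the upstream intake before automating downstream steps",
}

def suggest_remedy(signature: str) -> str:
    """Return the remedy for a known signature, or a prompt to keep diagnosing."""
    return REMEDIES.get(signature, "diagnose further before choosing a tool")
```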

Part 3: What data does it run on?

Automation depends on structured, accessible, trustworthy data. This section catches the most common failure mode.

  • The input data is consistent — same fields, same format, same source — at least 80% of the time.
  • The data lives in one primary system (not spread across email, spreadsheets, and a shared drive).
  • You can look at a completed case and reconstruct what happened without asking the person who did it.
  • The data does not regularly contain sensitive personal or client information that would require special handling.
  • You know what a bad or incomplete input looks like and could write a rule for it.

If data is inconsistent or scattered: structured intake (a form, a template, a defined handoff) is often the first and most valuable digitisation step — before any AI or automation is involved.
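The last item in Part 3, being able to write a rule for a bad or incomplete input, can be as small as a field check at intake. A hypothetical sketch (the required field names are invented for illustration):

```python
# Illustrative required fields for a structured intake form.
REQUIRED_FIELDS = {"request_id", "requester_email", "due_date"}

def intake_errors(record: dict) -> list[str]:
    """Return the problems with an intake record; an empty list means it passes."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    for field, value in record.items():
        if isinstance(value, str) and not value.strip():
            errors.append(f"empty field: {field}")
    return errors
```

If you cannot write even a rule this simple for your process, that is itself a finding: the intake is not yet structured enough to automate.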

Part 4: Where could AI or automation realistically help?

Match the capability to the actual job. Not every step benefits from the same treatment.

  • There is a step that is repetitive, rule-based, and produces the same output for similar inputs — candidate for automation.
  • There is a step that involves reading, summarising, classifying, or drafting from variable text inputs — candidate for AI-assist.
  • There is a step where a human currently makes a judgment call based on pattern recognition — candidate for AI recommendation with human approval.
  • There is a step where someone currently monitors for a condition (a date, a threshold, a missing item) — candidate for automated alerting or triggering.
  • There is a step that requires authoritative decision-making with accountability — keep this with a human, with a clear audit trail.

Part 5: What are the risk factors?

Automation risk is usually underestimated. These items are not reasons to stop — they are reasons to design carefully.

  • The process touches personally identifiable information, client data, or regulated data. (Requires compliance review before automation.)
  • Errors in this process have real consequences for customers, employees, or regulators. (Requires human review checkpoints in the future-state design.)
  • The people who run this process have not been consulted about the redesign. (High risk of rejection, workarounds, or shadow processes post-launch.)
  • There is no existing way to log what happened in the process. (Requires audit trail design before going live.)
  • The technology you plan to use is not yet approved or procured. (Add procurement lead time to your plan.)

Scoring guide

Count your ticked items across all five sections (maximum: 26).

20–26 Ready to digitise

The process is stable, the failure points are clear, and the data is workable. A Process Digitiser engagement or a structured build sprint is the logical next step.

11–19 Partially ready

You have enough to start designing but will hit blockers. Fix the stability or data issues before committing to a full build.

Under 11 Not ready yet

The process needs stabilisation before digitisation. Start by documenting the current state, naming the owner, and resolving the data fragmentation.

What to do with this

Run it on your target process, share it with the process owner, and discuss the gaps before scoping any technical work.

If the audit surfaces real problems you want expert eyes on, Process Digitiser is the next step. Upload the evidence pack, spend 30 minutes with Rob, and get a current-state map, failure-point diagnosis, and a practical AI-accelerated fix plan in 48 hours.

Further reading: Why AI automation fails when the process is not ready — the four patterns that show up most often and what to do about them.

Want this sent to your email?

Send a note and Rob will email you a clean copy of the checklist, plus the occasional Signal Desk digest when there is something worth pulling together.

Cite or link

If you reference this scorecard in a post, presentation, or client handout, please link to the canonical URL:

https://roballandale.com/process-digitisation-checklist/

Plain-text copy available to share or print: download checklist (.txt)
