Differentiation without rewriting everything: introducing the Scaffolding Assistant
Create Low, Low+, Medium and High support versions of the same exam question in minutes — with scaffolds aligned to your marking criteria, not generic 'helpful' prompts.
Quick Summary
- Generate multiple scaffold levels for the same question without rewriting everything.
- Choose between Low, Low+, Medium and High depending on how much support students need.
- Scaffolds are built backwards from your marking criteria, so they stay mark-scheme focused.
- Teachers stay in control: generate fast, then review and tweak before students see it.
The problem: differentiation takes time you don't have
You've got one exam question… and you need three different ways into it.
One group can handle a clean structure and a nudge. Another group will produce two lines unless you give them sentence starters. And your top end will groan if it looks like a writing frame.
You can differentiate properly. You just can't justify spending your evening rewriting the same support sheet three times.
That's why we built the Scaffolding Assistant in Teach Edge.
What we mean by "scaffolding" (in plain English)
Scaffolding is the support that helps students do today what they can't yet do independently, so that tomorrow they can.
Done well, it doesn't "give the answer".
It reduces the blank-page problem, nudges students towards the thinking the mark scheme rewards, and helps them build a response that would otherwise be out of reach.
The hard part isn't understanding scaffolding. It's having the time to pitch it right, consistently, across lots of questions, across lots of classes.
So what is the Scaffolding Assistant?
In one sentence: it generates differentiated response plans (and optional model answers and mark schemes) for any exam question, aligned to your marking criteria.
The workflow is deliberately simple:
- Choose your marking prompt (or paste your own criteria)
- Add the exam question (and any extract/data if relevant)
- Pick a scaffolding level (Low, Low+, Medium, High)
- Generate — then review and tweak if you want
It's fast, but it's not hands-off. We've built it to be teacher-facing:
- you decide what level of support is appropriate
- you check the output
- you decide what goes in front of students
The four scaffolding levels
We built four levels because "support" isn't one thing. It's a spectrum.
Low
Structure only. A clear response framework without content nudges.
Best for confident students, stretch, and retrieval practice.
Low+
Structure plus light hints. A nudge towards the right areas without doing the thinking for them.
Medium
Key content signposted. More guidance on what a strong answer typically covers, including clearer chains of reasoning and the kinds of points that often lift students into higher levels.
High
Maximum support. Sentence starters and detailed prompts that get students moving, build confidence, and can be gradually removed over time.
What it looks like in practice (same question, different support)
Question: Evaluate the view that a rise in interest rates will significantly reduce inflation in the UK.
Low scaffolding (structure only)
Introduction
Define inflation and set out a clear line of argument (how far you agree).
Paragraph 1
Explain how higher interest rates affect aggregate demand, and link this to inflation.
Paragraph 2
Evaluate the strength of that channel (how quickly it works, how strongly it works, what affects it).
Paragraph 3
Consider supply-side or cost-push inflation and why interest rates may be less effective here.
Paragraph 4
Discuss trade-offs and wider impacts (growth, unemployment, distributional effects). Link back to "significantly".
Conclusion
Balanced judgement: when is it likely to reduce inflation significantly, and when not?
High scaffolding (detailed prompts and sentence starters)
Introduction
Start with: "Inflation is a sustained rise in the general price level. A rise in interest rates can reduce inflation by…"
Judgement starter: "Overall, I agree to an extent because… however the impact depends on…"
Paragraph 1: Demand-side channel
Sentence starter: "Higher interest rates increase the cost of borrowing for households and firms, which tends to reduce…"
Chain of reasoning prompt: "Lower spending and investment reduces aggregate demand, which reduces demand-pull inflation because…"
Add a supporting point: "This is likely to be stronger when households have high mortgage exposure / when confidence is fragile / when debt levels are high."
Paragraph 2: Why it might not be significant
Time lag prompt: "Monetary policy affects inflation with a delay because…"
Sensitivity prompt: "If demand is relatively interest-inelastic (for example…), the fall in spending may be limited."
Expectations prompt: "If inflation expectations are already high, firms may still raise prices because…"
Paragraph 3: Cost-push inflation
Prompt: "If inflation is being driven by energy prices, supply disruption, or imported inflation, interest rate rises…"
Trade-off starter: "In this case, higher rates may reduce inflation only slightly, but they can still reduce growth by…"
Paragraph 4: Final evaluation and context
Prompt: "Bring in UK context: wage growth, energy prices, exchange rate effects, and the risk of recession."
Judgement starter: "This suggests interest rate rises are most effective when…, but less effective when…"
Conclusion
Judgement frame: "Therefore, interest rates can reduce inflation significantly when…, but they are unlikely to be sufficient on their own when…"
Same question. Same objective. Different levels of support — without you rebuilding everything from scratch.
Why this is different from generic AI chatbots
You can paste a question into a general AI tool and ask for a plan. The output often looks tidy.
The issue is that it regularly fails in ways teachers spot immediately:
- it confuses "helpful" with "mark-winning", so students write plenty but miss the assessment objectives
- it blurs command words, so "evaluate" becomes "explain a lot"
- it isn't anchored to your criteria, so the pitch drifts away from what actually earns marks
The Scaffolding Assistant works backwards from the marking prompt you've chosen. That keeps the structure and prompts tied to the exam-board expectations you're teaching towards.
A two-minute way to try it
If you want a quick, low-risk test:
- Take a question you're setting next week
- Generate Low+ and High
- Use Low+ with most of the class, and High with the students who tend to stall
- Next lesson, step one group up or down and see what changes in the quality of work
You'll get a fast read on whether the scaffolding levels match your students — and you'll start to see where removing scaffolds over time becomes realistic.
The principle behind it
We don't think AI should replace teacher judgement. We think it should remove the repeatable grind that eats your time.
You still choose the level. You still adapt it. You still teach.
The assistant just helps you get to a usable, well-pitched scaffold quickly, so you can spend your energy where it matters.
Gary Roebuck is Head of Economics at Holy Cross School, New Malden, and the creator of Teach Edge.