Results, not reports.

HAPSS measures its work by what changes in public institutions after we leave, not by what we delivered while we were there. These are the numbers that matter.

10,800
Target
Civil servants to be trained across the South-South region
6
States
Sub-national governments in active programme scope
4
Tracks
Integrated delivery tracks per engagement: records, data, services, security
100%
Standard
Independently verified milestones before programme disbursement
Case studies

Programmes and engagements.

Digital Capacity · Multi-state Programme · South-South Region

South-South Digital Capacity Programme (SSDC)

A three-year regional initiative commissioned through the South-South Development Commission to build digital capability across six states' civil services — Akwa Ibom, Bayelsa, Cross River, Delta, Edo, and Rivers. Phase 1 covers two states and 2,000 participants over 12 months, with a full independent evaluation before any Phase 2 commitment.

2,000
Civil servants targeted in Phase 1 across 2 states
12
Months to complete Phase 1 with independent evaluation gate
4
Delivery tracks: records, analytics, citizen services, cybersecurity
What is being delivered
  • Baseline capacity assessment across 12 agencies before programme design
  • Blended training delivery: residential cohorts plus on-the-job modules
  • Applied project requirement: each participant delivers a workstream improvement
  • Participant tracking system with six-month post-programme tool usage monitoring
Programme structure
  • Phase 1: 2 states, 2,000 participants, ₦550M–₦750M, 12 months
  • Independent evaluator contracted by the South-South Development Commission, not by HAPSS
  • Disbursement tied to verified participant completion rates
  • Phases 2 and 3 proceed only on independently verified milestones
Records Modernisation · Agency Engagement · Federal Level

Federal Agency Records Digitisation Programme

An end-to-end records transition engagement for a revenue-generating federal agency, covering audit, digitisation, metadata tagging, and implementation of an electronic document and records management system (EDRMS). Designed to eliminate the physical records-processing bottlenecks that were adding an average of 17 days to citizen service-request resolution.

340K+
Physical records audited and classified for digitisation
17 days
Average service resolution time targeted for reduction
98%
Chain-of-custody logging completeness target for digitised records
Scope of work
  • Records audit and classification framework co-designed with agency archivist
  • Secure digitisation with chain-of-custody logging at every step
  • EDRMS configuration mapped to existing workflow roles
  • Staff training programme covering all 240 records-handling employees
Transition approach
  • Parallel-run period: physical and digital systems operated simultaneously
  • Phased handover: one department at a time to manage risk
  • 90-day post-go-live support embedded in contract
  • Disaster recovery and backup procedures documented and tested
Data and Analytics · State Government · IGR Improvement

State Revenue Intelligence Dashboard

A decision intelligence engagement for a state revenue service facing significant internally generated revenue (IGR) leakage and limited visibility into collection patterns. HAPSS designed and deployed a real-time revenue dashboard giving senior leadership line-of-sight into taxpayer registration, assessment, billing, and payment reconciliation across all collection channels.

Real-time
Visibility into multi-channel revenue collection — first time in the agency's history
14 days
From data integration to first operational dashboard used by Commissioners
3 levels
Role-based access tiers: Commissioners, Directors, and field supervisors
Technical delivery
  • Data pipelines integrating 6 existing revenue collection systems
  • Anomaly detection alerts for collection pattern irregularities
  • Automated monthly IGR report generation replacing manual compilation
  • Mobile-optimised interface for field supervisors
Capability transfer
  • In-house data analyst trained and certified on dashboard administration
  • Full system documentation and runbooks handed to agency IT team
  • 6-month post-deployment support with knowledge transfer tracking
  • Agency now independently maintains and extends the system

More case studies in preparation

Additional programmes are underway. Detailed case studies are published after independent evaluation is complete. Request a briefing →

Measurement standards

How we define and verify results.

Independent verification

Every impact number we publish is verified by an independent evaluator contracted by the client institution, not by HAPSS. We do not self-certify outcomes.

Six-month tracking

We track tool adoption, process changes, and participant outcomes six months after programme completion — because delivery day is not impact day.

No survivorship bias

We report on all participants and all tracked metrics — including those who did not complete or did not show measurable improvement. Selective reporting is not reporting.

Request a briefing

Want the full programme numbers?

For procurement committees, donor programme officers, or institutional leadership: we can provide a detailed programme briefing with full methodology and verification documentation upon request.

Request a briefing