Why Spreadsheets Still Win in GRC (and How to Finally Move Past Them)
93 practitioners in the State of GRC 2026 survey use spreadsheets as their primary compliance tool — more than any commercial vendor. The switching cost isn't price. It's confidence. Here's what the data says about why spreadsheets persist and what actually gets teams to move.

93 GRC practitioners told the largest independent survey of the profession that their primary compliance tool is a spreadsheet. That's more than ServiceNow (86), more than Vanta (49), more than Drata (32), more than every commercial platform in the market.
The State of GRC 2026 surveyed 795 practitioners with no vendor funding and no sponsor influence. The finding that spreadsheets remain the #1 tool in GRC isn't a failure of the market to build better products. It's a signal that the market hasn't understood why practitioners don't switch.
Who Uses Spreadsheets
The 93 spreadsheet users aren't a monolith. The survey breaks them down by team size:
| Team Size | Spreadsheet Users |
|---|---|
| Solo | 21 |
| Small (2-4) | 41 |
| Mid (5-10) | 23 |
| Enterprise (11+) | 8 |
44% of spreadsheet users sit in teams of 2-4. This is the sweet spot where the pain of manual compliance is real but the team isn't large enough (or funded enough) to justify an enterprise GRC platform. At 11+ people, spreadsheets nearly disappear: only 8 users, just under 9% of the total.
The average technical skill of spreadsheet users is 4.9 out of 10. Not the lowest in the survey (that's LogicGate users, with zero high-skill practitioners), but firmly in the mid-range. These aren't beginners who don't know better. They're working practitioners who've looked at the alternatives and decided to stay.
The Switching Cost Isn't Financial
Every GRC vendor assumes the barrier to adoption is price. It isn't. The survey data points to something more fundamental: the switching cost is cognitive.
Entry-level practitioners have the most concentrated tool landscape
Entry-level practitioners have an HHI (Herfindahl-Hirschman Index, a measure of market concentration) of 1777 — the highest of any seniority level. Spreadsheets dominate their landscape because the spreadsheet is the tool they learned on, the tool their manager uses, and the tool every tutorial references.
At Director level, the same metric drops to 924 — the lowest concentration of any seniority level, meaning Directors evaluate and use the widest variety of tools. The higher you climb, the more alternatives you encounter. But by the time you reach Director, you've already built years of spreadsheet-based processes that would take months to migrate.
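For reference, HHI is simply the sum of squared market shares, so a single dominant tool drives the index up fast. A quick Python illustration (the share distributions below are made up to show the shape of the difference, not the survey's underlying data):

```python
def hhi(shares_percent):
    """Herfindahl-Hirschman Index: the sum of squared market shares,
    each expressed in percent. Ranges from near 0 (fragmented market)
    to 10,000 (one tool with 100% share)."""
    return sum(s ** 2 for s in shares_percent)

# Hypothetical tool-share distributions (percent of practitioners per tool).
# Illustrative only, NOT the survey's actual data.
entry_level = [38, 20, 15, 10, 8, 5, 4]          # one tool dominates
director = [15, 13, 12, 11, 10, 10, 9, 8, 7, 5]  # evenly spread

print(hhi(entry_level))  # 2274: concentrated landscape
print(hhi(director))     # 1078: diverse landscape
```

The survey's actual figures (1777 vs. 924) sit between these extremes, but the mechanism is the same: fewer tools with bigger shares means a higher index.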
The confidence gap
Mid-skill practitioners (4-6 out of 10) represent 50.4% of the survey. These are people who can prompt an AI assistant but not write code. They can follow a configuration guide but can't debug when something breaks. For this group, the risk of misconfiguring a commercial GRC platform — and only discovering the problem during an audit — feels worse than the known pain of doing things manually.
Spreadsheets persist not because practitioners don't want to switch. They persist because switching requires a level of technical confidence that half the industry doesn't have.
62.3% of mid-skill practitioners already own a platform
Here's the finding that should alarm every GRC vendor: 62.3% of mid-skill practitioners already own a commercial platform. They have the platform. They have the license. They can't unlock it. The gap between owning a tool and operating it is where most GRC programs stall. Vendor onboarding covers the first week. Nobody covers months two through twelve.
This is the same dynamic we see with Drata and Vanta: the platform handles infrastructure monitoring through API integrations, but the application-level evidence that auditors care about still requires manual work. The platform is partially configured. The evidence is partially automated. And the practitioner still spends 40+ hours per audit cycle on screenshots.
What Spreadsheet Compliance Actually Looks Like
The abstract "93 practitioners use spreadsheets" becomes concrete when you look at a typical SOC 2 audit cycle for a spreadsheet-based team.
The Control Tracker
A Google Sheet or Excel workbook with tabs for each Trust Services Category. Columns for control ID, control description, owner, frequency, last tested date, evidence location, and status. This part actually works. Spreadsheets are genuinely good at tracking structured data.
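That tracking layer is simple to model, which is exactly why it works. A minimal sketch of one tracker row with a staleness check layered on top (the field names and frequency buckets here are assumptions, not a standard):

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Days allowed between tests per frequency bucket (assumed values).
FREQ_DAYS = {"monthly": 31, "quarterly": 92, "annual": 366}

@dataclass
class ControlRow:
    control_id: str
    description: str
    owner: str
    frequency: str          # "monthly" | "quarterly" | "annual"
    last_tested: date
    evidence_location: str
    status: str

    def is_stale(self, today: date) -> bool:
        """True if the control is overdue for re-testing."""
        return today - self.last_tested > timedelta(days=FREQ_DAYS[self.frequency])

row = ControlRow("CC6.1", "Logical access restricted by role", "j.doe",
                 "quarterly", date(2026, 1, 15), "Drive/CC6.1/", "tested")
print(row.is_stale(date(2026, 6, 1)))  # True: 137 days since the last test
```

A spreadsheet does the same thing with a date formula and conditional formatting; the schema is trivial, and tracking it is the part of the workflow spreadsheets genuinely handle well.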
The Evidence Folder
A shared Google Drive or SharePoint folder — sometimes a local folder on someone's laptop — with subfolders by control. Each subfolder contains:
- Screenshots named "Screenshot 2026-01-15 at 10.42.33 AM.png"
- Maybe a Word doc describing what the screenshot shows
- Sometimes a second version named "Screenshot 2026-01-15 at 10.42.33 AM (1).png" because the first one was blurry
There is no standardized naming convention. No metadata linking the screenshot to a specific control test. No timestamp verification beyond whatever the OS added to the filename. When the auditor asks "when was this test performed and by whom?" the answer requires cross-referencing the file's creation date with whoever was responsible for that control that quarter.
We've written about how to fix this exact problem in how to standardize manual SOC 2 evidence collection with screenshots.
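As a sketch of what that standardization can look like, here's a pass that renames macOS-style screenshots into a control-linked scheme. The `<control_id>_<date>_<tester>_<seq>.png` convention is one possible choice, not a standard:

```python
import re
from datetime import datetime
from pathlib import Path

def standardized_name(control_id: str, test_date: datetime,
                      tester: str, seq: int) -> str:
    """e.g. CC6.1_2026-01-15_jdoe_01.png"""
    return f"{control_id}_{test_date:%Y-%m-%d}_{tester}_{seq:02d}.png"

def rename_screenshots(folder: Path, control_id: str, tester: str) -> list[str]:
    """Rename macOS-style 'Screenshot YYYY-MM-DD at ....png' files in place,
    reusing the date already embedded in the original filename."""
    pattern = re.compile(r"Screenshot (\d{4}-\d{2}-\d{2}) at .*\.png")
    renamed = []
    for seq, path in enumerate(sorted(folder.glob("Screenshot *.png")), start=1):
        match = pattern.match(path.name)
        if not match:
            continue
        shot_date = datetime.strptime(match.group(1), "%Y-%m-%d")
        new_name = standardized_name(control_id, shot_date, tester, seq)
        path.rename(path.with_name(new_name))
        renamed.append(new_name)
    return renamed
```

The date in the filename is still only the capture date, so this doesn't replace real metadata. It just makes the folder cross-referenceable when the auditor asks who tested what, and when.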
The Evidence Pack
Before the audit, someone assembles evidence packs — usually PDFs combining screenshots with control descriptions. This happens in Word, Google Docs, or (in the best case) a script that pulls screenshots from folders and arranges them.
For a single SOC 2 control like CC6.1 (logical access), the evidence pack needs:
- Screenshots of RBAC configurations showing role definitions
- Screenshots of a test user being denied access to restricted resources
- Screenshots of MFA enforcement settings
- A description of who performed the test, when, and what the expected vs. actual outcomes were
- A mapping to the specific SOC 2 criterion
Multiply that by 30-50 controls across CC6, CC7, CC8, and CC9, and you're looking at 300-500 screenshots organized into 30+ evidence packs, each requiring manual assembly.
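At that volume, even a simple inventory script beats eyeballing folders. A sketch, assuming the folder-per-control layout described above, that maps each control to its screenshots and flags the empty ones:

```python
from pathlib import Path

def build_manifest(evidence_root: Path) -> dict[str, list[str]]:
    """Map each control subfolder (e.g. 'CC6.1/') to its screenshot files."""
    return {
        control_dir.name: sorted(f.name for f in control_dir.glob("*.png"))
        for control_dir in sorted(p for p in evidence_root.iterdir() if p.is_dir())
    }

def missing_evidence(manifest: dict[str, list[str]]) -> list[str]:
    """Controls with a folder but zero screenshots: the gaps an auditor finds first."""
    return [cid for cid, shots in manifest.items() if not shots]
```

This is deliberately dumb. It doesn't validate that a screenshot actually shows what its control requires, but it turns "did we collect everything?" from an afternoon of clicking into one command.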
The Quarterly Scramble
The observation window for SOC 2 Type II requires evidence across multiple time periods. Every quarter, the spreadsheet team repeats the same process: log into each system, take new screenshots, update the evidence folder, rebuild evidence packs. This takes 40-80 hours per quarter for a team using a GRC platform and 80-120 hours for a team on spreadsheets, because every step is manual.
For teams managing multiple frameworks simultaneously — SOC 2 plus ISO 27001, or SOC 2 plus HIPAA — the hours double because cross-framework evidence reuse requires manual mapping that spreadsheets can't automate.
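The mapping itself is just a lookup table; the hard part is building and maintaining it. A sketch of what automated reuse looks like once a crosswalk exists (the control pairings below are a hypothetical subset; verify any real mapping against your auditor's crosswalk):

```python
# Illustrative crosswalk (hypothetical subset, not an authoritative mapping).
CROSSWALK = {
    "SOC2:CC6.1": ["ISO27001:A.5.15"],   # logical access <-> access control
    "SOC2:CC8.1": ["ISO27001:A.8.32"],   # change management
}

def frameworks_covered(evidence_index: dict[str, list[str]]) -> dict[str, list[str]]:
    """Expand a SOC 2 evidence index to every mapped framework control,
    so one evidence pack is stored once and referenced everywhere."""
    expanded: dict[str, list[str]] = {}
    for soc2_id, files in evidence_index.items():
        expanded[soc2_id] = list(files)
        for mapped_id in CROSSWALK.get(soc2_id, []):
            expanded.setdefault(mapped_id, []).extend(files)
    return expanded
```

In a spreadsheet, this expansion is a second tab maintained by hand, and it drifts out of date the first quarter someone forgets to update it.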
The Consultant Problem
228 consultants responded to the survey — 29.1% of all respondents. Each consultant advises multiple organizations on tool selection, implementation, and compliance strategy.
64.9% of those consultants use non-commercial solutions.
When a consultant runs their own compliance work on spreadsheets or open source, that preference bleeds into every client engagement. A consultant who advises 10 clients a year and prefers spreadsheets over commercial tools generates 10 decisions steered away from the commercial market.
The survey estimates this creates approximately 1,480 annual decisions influenced away from commercial GRC products: roughly 148 consultants on non-commercial tooling (64.9% of 228) times 10 client engagements each. The most influential distribution channel in GRC — the advisor network — is actively recommending against adopting the tools that GRC vendors are trying to sell.
This isn't malicious. Consultants recommend what they know, what they trust, and what they can support across diverse client environments. Spreadsheets work everywhere. A platform-specific recommendation creates dependency on a tool the consultant may not be able to support at their next client. For a vCISO managing 10+ clients, the tech stack needs to be portable. Spreadsheets are infinitely portable.
The survey also reveals that consultants who adopt open source carry that preference even more strongly — 17 of the 38 open source users are consultants, and they bring that tooling choice to every engagement. For MSPs scaling compliance across clients, the consultant's tool preference becomes the client's default.
The Auditor Format Problem
There's a second, less obvious reason spreadsheets persist: auditors prefer them.
The survey found that 20% of auditors use spreadsheets as their primary tool — the highest spreadsheet adoption rate of any persona. Auditors have built their workflows around receiving evidence in specific formats: screenshots organized in folders, data in CSV or Excel, narratives in Word docs or PDFs.
When a practitioner invests in a commercial GRC platform, the platform produces evidence in its own structured format — JSON exports, platform-specific report layouts, proprietary dashboards. The auditor on the other end often can't (or won't) work with that format.
The platform's value collapses. Not because the product is bad, but because the person validating the output rejects the format.
This creates a perverse incentive: the more sophisticated your GRC tool, the more reformatting work you create at audit time. A practitioner using spreadsheets delivers evidence in exactly the format auditors expect. A practitioner using a commercial platform may need to export, reformat, and re-upload the same evidence into a format the auditor will accept.
We've written extensively about what auditors actually accept:
- What makes SOC 2 evidence acceptable to auditors
- What SOC 2 auditors actually look for in application evidence
- What auditors still ask for after Drata automation
The consistent finding: auditors want timestamped screenshots with clear labels, PDF evidence packs with control narratives, and structured data they can cross-reference against their own workpapers. Until tools produce evidence that auditors already accept — without a conversion step — spreadsheets retain a format advantage that no amount of platform features can overcome.
What Actually Gets Teams to Move
The survey reveals a clear graduation pattern:
| Team Size | Spreadsheet Adoption | Commercial Adoption |
|---|---|---|
| Solo | 14% | 42% |
| Small (2-4) | 16% | 51% |
| Mid (5-10) | 13% | 58% |
| Enterprise (11+) | 4% | 62% |
Team growth is the primary trigger. When a solo practitioner gets a second team member, the limitations of a shared spreadsheet become obvious: version control breaks, ownership is unclear, and the audit trail lives in someone's email. Commercial tool adoption jumps from 42% to 51% at team size 2-4; spreadsheet adoption peaks there and falls steadily as teams grow.
But the survey also reveals a second trigger: seniority.
The adoption matrix shows that a Senior Manager working solo has an 81% commercial tool adoption rate — compared to 6% for an Intermediate at the same team size. Experience gives practitioners the confidence to evaluate, select, and configure a platform. It also gives them the organizational standing to get budget approved.
The practitioners most likely to switch are those who (a) have enough experience to trust themselves with a new tool, and (b) have grown their team past the point where spreadsheets scale.
A third trigger is visible but less obvious: audit failure or near-miss. The survey doesn't capture this directly, but the correlation between seniority and adoption suggests that many practitioners switch after experiencing the pain of a difficult audit — missing evidence, extended timelines, or auditor pushback that manual processes caused. The switch is reactive, not proactive.
The Three Market Positions
The survey data reveals that the GRC tool market is consolidating around three positions, each serving a different combination of technical skill and organizational scale:
1. The Technical Mid-Market (Drata, Open Source): High-skill users (Drata averages 6.5, Open Source 5.8), balanced team sizes, buyers who evaluate on capability. These practitioners left spreadsheets because they outgrew them technically. Drata's user base has zero practitioners who rate themselves below 4 — the only vendor with that distinction. For a comparison of how Drata works and where it stops, see our detailed breakdown.
2. The Automation-First Startup (Vanta): Strong technical users in small teams. 30% of Vanta users are solo, 36% in teams of 2-4. These practitioners left spreadsheets because they needed leverage — one person doing the work of three. Vanta's overrepresentation in solo teams (7.4x) makes it the default "first GRC tool" for small teams. For teams evaluating Vanta's coverage, we've analyzed what Vanta does and doesn't automate for screenshots.
3. The Enterprise Incumbent (ServiceNow, AuditBoard, Archer IRM): Mid-range skills (ServiceNow averages 5.7), large teams (43% of ServiceNow users are in teams of 11+). These practitioners left spreadsheets because their organization mandated a platform. Archer IRM has the heaviest enterprise skew — 52% of its users sit in teams of 11+.
Notice what's missing: no vendor owns the low-to-mid skill, small team segment — the exact profile of most spreadsheet users. 44% of spreadsheet users sit in teams of 2-4 with average skill 4.9. None of the three market positions above are optimized for them.
How to Actually Leave Spreadsheets Behind
If you're one of the 93, here's what the data suggests.
Don't start with a platform. Start with a bottleneck.
The mistake most teams make is buying a full GRC platform to replace their spreadsheet. That's replacing one system of record with another — and it requires the same configuration, maintenance, and expertise that made the spreadsheet attractive in the first place.
Instead, identify your single biggest time sink. For most small compliance teams, it's evidence collection: the screenshots, the workflow documentation, the PDF formatting that takes 40-80 hours per audit cycle. Solve that first. The rest of the GRC program can stay in a spreadsheet until you're ready.
Match the output format to your auditor
Your auditor accepts screenshots and PDFs. They always have. Any tool you adopt should produce evidence in exactly that format — not in a proprietary format that requires export and reformatting. The best compliance tool is the one whose output your auditor never questions. We covered the specific evidence formats auditors accept and reject in a dedicated guide.
Automate evidence, not governance
Spreadsheets are actually fine for tracking which controls you have, what frameworks they map to, and when they were last reviewed. What spreadsheets can't do is capture the evidence proving those controls work: timestamped screenshots, workflow recordings, metadata-rich PDF evidence packs. That's the gap worth automating.
This distinction matters because it determines what you buy. A GRC platform replaces the spreadsheet. An evidence automation tool works alongside it. For a team of 2-4 on spreadsheets, the second option is less disruptive, less expensive, and addresses the actual bottleneck.
Start with one framework, one control category
Don't try to automate everything at once. Pick the control category that costs you the most hours — usually access controls (CC6.1) or change management (CC8.1) — and automate evidence collection for that category first. Once you see the time savings on 5-10 controls, expanding to the rest is straightforward.
How Screenata Bridges the Gap
Screenata was built for the exact segment the survey identifies as underserved: small teams with mid-range technical skills who need audit-ready evidence without the overhead of an enterprise platform.
The workflow is designed around the bottleneck, not the org chart:
- Record a control test — verifying access restrictions, testing change management workflows, checking encryption settings. Just use the browser extension while you do the work you'd do anyway.
- Automatic evidence generation — Screenata captures screenshots at each step, generates AI-written descriptions, and maps to the relevant framework control. No manual labeling, no file renaming.
- Auditor-ready output — You get a PDF evidence pack in the format auditors already accept. Timestamped screenshots, control narratives, tester identification, and metadata chain. No export. No reformatting. No folder of screenshot_final_v3.png files.
For spreadsheet teams doing 300-500 screenshots per audit cycle, this eliminates the highest-cost manual work without requiring you to abandon the spreadsheet for everything else. You keep your control matrix in Google Sheets and still produce evidence that looks like it came from an enterprise compliance program.
Screenata also handles the full compliance workflow beyond evidence: policy writing grounded in your actual codebase, control mapping, risk assessment, and readiness scoring. For teams ready to move beyond the spreadsheet entirely, Screenata replaces both the GRC platform and the compliance consultant — at $499/month instead of $10,000-$20,000/year plus consultant fees.
For a comparison with other approaches to moving past spreadsheets, see our guide on what tools can replace manual screenshot collection for SOC 2 controls.
Frequently Asked Questions
Are spreadsheets actually bad for compliance?
Not entirely. Spreadsheets are fine for control tracking, risk registers, and compliance calendars — structured data management where the flexibility of a spreadsheet is genuinely useful. They fail at evidence production: capturing timestamped screenshots, generating formatted reports, and maintaining an auditable chain of custody. The problem isn't that spreadsheets are bad at everything. It's that they're being used for things they were never designed to do.
What's the real cost of staying on spreadsheets?
The direct cost is labor: 80-120 hours per audit cycle for evidence collection and formatting, versus under 10 hours with automation. At $150/hour (a typical fully-loaded cost for a compliance or engineering professional), that's $12,000-$18,000 per cycle in labor alone. But the indirect cost is worse: inconsistent evidence quality, audit delays from missing documentation, and the opportunity cost of pulling engineers away from product work to take screenshots.
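The arithmetic behind those figures, using the article's assumed $150/hour fully-loaded rate:

```python
def cycle_labor_cost(hours_low: int, hours_high: int, rate: int = 150) -> tuple[int, int]:
    """Labor cost range for one audit cycle at a fully-loaded hourly rate."""
    return hours_low * rate, hours_high * rate

print(cycle_labor_cost(80, 120))  # (12000, 18000): the spreadsheet range above
```

Run the same calculation with your own team's hours and rate; the spread between manual and automated collection is usually the whole business case.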
Can I keep my spreadsheet and add automation on top?
Yes. That's actually the approach the survey data supports. The practitioners most likely to succeed aren't the ones who do a full platform migration. They're the ones who keep their existing tracking system and add automation for the specific bottleneck — usually evidence collection. Screenata is designed to work this way: you can keep your Google Sheets control matrix and use Screenata only for generating the evidence your auditor reviews.
What if my auditor specifically wants spreadsheet-format evidence?
Some auditors request raw data in CSV or Excel format for their own testing. This is different from the evidence pack format question. AI evidence tools like Screenata can export in multiple formats. The key is that the evidence itself — screenshots, workflow recordings, metadata — is captured automatically, regardless of the output format your auditor prefers.
How do I convince my team to switch when "it's working fine"?
The survey data suggests you don't need to convince them to switch entirely. Frame it as adding capability, not replacing process. Calculate the hours your team spends on evidence collection per audit cycle. Compare that to the cost of automation. Present it as a time-saving investment, not a platform migration. The ROI of compliance evidence automation is the most concrete argument you can make.
The Full Picture
The State of GRC 2026 by Ayoub Fandi at GRC Engineer is worth reading beyond the spreadsheet headline. The report covers the CISO rejection waterfall (73.6% of CISOs use no commercial tool), the technical skills gap (average 5.4 out of 10), and the tool entropy data showing the market consolidating from 21 tools to 13 over four quarters.
The spreadsheet finding isn't an anomaly. It's the surface expression of deeper structural issues in how GRC tools are built, sold, and evaluated. Understanding why spreadsheets win is the first step toward building something that actually replaces them.
Read the full report at grcengineer.com.
Ready to Automate Your Compliance?
Join 50+ companies automating their compliance evidence with Screenata.