AI Visibility Audit

GoGuardian
Visibility Report

Competitive intelligence for AI-mediated buying decisions. Where GoGuardian wins, where it loses, and a prioritized three-layer execution plan — built from 150 buyer queries across ChatGPT + Claude + Gemini.

150 Buyer Queries
5 Personas
8 Buying Jobs
ChatGPT + Claude + Gemini
April 3, 2026

TL;DR

65.3%
Visibility
98 of 150 queries
5.3%
Win Rate
8 wins of 150 queries
52
Invisible
queries where GoGuardian absent
31
Recommendations
targeting 142 gap queries (+ 4 near-rebuild optimizations)
Three things to know
GoGuardian is on every shortlist and chosen from none
GoGuardian appears in 100% of Shortlisting queries (25/25) — the highest visibility of any buying stage — yet wins 0% of them. By contrast, Problem Identification queries yield a 27.3% win rate (3/11 visible queries). GoGuardian performs better at the beginning and loses at the end: the content that creates presence cannot close a recommendation.
0% Shortlisting win rate · 25 queries
A sitemap missing timestamps on 1,027 URLs suppresses every piece of content GoGuardian publishes
GoGuardian's sitemap contains 1,027 URLs with no lastmod, changefreq, or priority attributes. AI crawlers use lastmod to detect freshness — without it, all 1,027 pages are freshness-equivalent to a crawler and must be re-crawled individually to detect updates. Additionally, 10 of 19 blog posts carry dates from 2018-2019; freshness-weighted AI citation algorithms deprioritize content this old regardless of its relevance. L1 fixes resolve both issues before any L2 or L3 content is published.
Technical fix · 1,027 URLs
Five feature areas GoGuardian's product supports are invisible in AI responses — Lightspeed wins them unopposed
CIPA compliance documentation, parent engagement features, off-network device protection, YouTube filtering controls, and edtech usage analytics are GoGuardian product capabilities with no corresponding content GoGuardian owns in AI responses. These 59 L3 gap queries (39.3% of the 150-query set) are entirely uncontested — GoGuardian's product is capable, but AI models cannot cite evidence of it. No product changes are required; only new content creation.
Content void · 59 queries
Section 1
Visible Everywhere, Winning Nowhere: GoGuardian's GEO Visibility Audit

GoGuardian's 65.33% visibility and near-zero win rate are not contradictions — they are symptoms of a single structural failure: content that places GoGuardian in the room but does not earn a recommendation.

Early Funnel — Where GoGuardian is visible but not winning
Requirements Building
46.7%
Solution Exploration
80%
Problem Identification
84.6%
Late Funnel — Where GoGuardian competes
Shortlisting
100%
Validation
83.3%
Consensus Creation
50%
Comparison
40.6%
Artifact Creation
28.6%

[Mechanism] Three compounding gaps create the pattern. First, 59 L3 gap queries are uncontested because GoGuardian lacks content for six feature areas (CIPA compliance, parent engagement, off-network protection, YouTube filtering, BYOD, and edtech ROI) — competitors enter these queries unopposed and define the evaluation criteria before GoGuardian appears. Second, existing pages match 83 buyer queries but lose because they describe product capabilities without providing the structured comparison data, evaluation frameworks, and evidence-backed claims that produce AI recommendations — this is why Shortlisting visibility is 100% while the Shortlisting win rate is 0%.

Third, blog content dated 2018-2019 and a sitemap missing lastmod timestamps on all 1,027 URLs suppress the freshness signals AI citation algorithms use to prioritize content, meaning even good content is deprioritized against fresher competitor pages.
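The lastmod remediation described above can be sketched in a few lines. This is a minimal illustration, not GoGuardian's actual tooling: the sitemap snippet and the `modified_dates` mapping are hypothetical, and in practice the dates would come from the CMS or build pipeline.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output


def add_lastmod(sitemap_xml: str, modified_dates: dict) -> str:
    """Add a <lastmod> element to every <url> entry with a known date."""
    root = ET.fromstring(sitemap_xml)
    for url in root.findall(f"{{{NS}}}url"):
        loc = url.find(f"{{{NS}}}loc").text.strip()
        if loc in modified_dates and url.find(f"{{{NS}}}lastmod") is None:
            # W3C datetime format, e.g. "2026-04-03" or full ISO 8601
            ET.SubElement(url, f"{{{NS}}}lastmod").text = modified_dates[loc]
    return ET.tostring(root, encoding="unicode")
```

Run against the full sitemap, this gives crawlers a per-URL freshness signal instead of forcing them to re-crawl all 1,027 pages to detect changes.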

Layer 1
Fix the Foundation
7 L1 technical fixes address site-wide crawlability, freshness signaling, and schema verification — unblocking AI indexing infrastructure before content investment begins.
5 fixes + 1 check · Days to 2 weeks
Layer 2
Deepen Existing Pages
20 L2 content optimizations add Comparison depth, evaluation frameworks, and evidence-backed claims to 83 existing-page gaps where GoGuardian is visible but fails to earn recommendations.
20 recommendations · 2–6 weeks
Layer 3
Build Missing Content
5 L3 NIOs create new content hubs for 59 queries where GoGuardian is currently invisible — targeting CIPA compliance, family engagement, YouTube/BYOD, edtech ROI, and Comparison architecture.
5 recommendations · 1–3 months

[Synthesis] L1 technical fixes must execute before L2/L3 work because the sitemap lastmod fix directly unblocks AI crawlers' ability to detect when any page — new or refreshed — has been updated. Without lastmod, all 1,027 GoGuardian URLs are equivalent in freshness priority to a crawler; with lastmod, newly published L3 pages and refreshed L2 pages will receive immediate freshness credit. Additionally, adding visible dates to Comparison pages (L1 finding: comparison_pages_undated) directly increases citation confidence for the /admin/vs-competitors and /beacon/vs-competitors pages that L2 recommendations optimize — the date fix and the content fix must ship together to maximize impact.

Reference
How to Read This Report

Visibility

Whether GoGuardian is mentioned at all in an AI response to a buyer query. Being visible does not mean being recommended — it just means GoGuardian appeared somewhere in the answer.

Win Rate

Of the queries where GoGuardian is visible, the percentage where it is the primary recommendation — the vendor the AI tells the buyer to evaluate first.

Share of Voice (SOV)

How often a vendor is mentioned by AI across all 150 buyer queries. Measures brand presence in AI-generated answers, not ad spend or traditional media.

Buying Jobs

The 8 non-linear tasks buyers perform during a purchase: Problem Identification, Solution Exploration, Requirements Building, Shortlisting, Comparison, Validation, Consensus Creation, and Artifact Creation.

NIO

Narrative Intelligence Opportunity — a cluster of related buyer queries where GoGuardian has no content. Each NIO includes a blueprint of on-domain pages and off-domain actions to close the gap.

L1 / L2 / L3

The three execution layers. L1 = technical infrastructure fixes. L2 = optimization of existing pages. L3 = new content creation and off-domain authority building.

Citation

When an AI tool references a specific webpage as its source. AI systems build recommendations from cited pages — if your pages aren't cited, your content didn't influence the answer.

Invisible Query

A buyer query where GoGuardian does not appear in the AI response at all. Distinct from a positioning gap, where GoGuardian appears but is not the recommended vendor.

Gap Query

A query where GoGuardian is either invisible (not mentioned in any AI response) or has a positioning gap (mentioned but not winning the recommendation). Gap queries are the union of invisible queries and positioning gap queries.
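The three core metrics defined above can be made precise with a short sketch. The per-query record shape (`visible`, `winner`, `mentioned`) is illustrative, not the audit's actual export schema; the formulas follow the glossary definitions.

```python
from collections import Counter


def visibility(results):
    """Share of all queries where the vendor appears at all."""
    return sum(r["visible"] for r in results) / len(results)


def win_rate(results, vendor="GoGuardian"):
    """Of the visible queries, the share where the vendor is the primary recommendation."""
    visible = [r for r in results if r["visible"]]
    return sum(r["winner"] == vendor for r in visible) / len(visible)


def share_of_voice(results):
    """Each vendor's share of all vendor mentions across the query set."""
    mentions = Counter(v for r in results for v in r["mentioned"])
    total = sum(mentions.values())
    return {vendor: n / total for vendor, n in mentions.items()}
```

Note the different denominators: visibility and SOV are computed over all queries, while win rate is computed only over queries where the vendor is visible.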
Section 2
Visibility Analysis

Where GoGuardian appears and where it doesn't — across personas, buying jobs, and platforms.

[TL;DR] GoGuardian is visible in 65% of buyer queries but is the primary recommendation in only 8% of those (8 of 98 visible queries). Converting visibility into wins is the primary challenge: 90 visible queries fail to convert. High-intent queries run higher, at 71.6% visibility.

GoGuardian's early-funnel invisibility (30.2% across Problem Identification, Solution Exploration, and Requirements Building) lets competitors define the evaluation criteria before GoGuardian enters — a structural disadvantage that compounds at every subsequent stage.

Platform Visibility

Dimension · Combined
All Queries · 65.3%
By Persona
Chief Technology Officer / Director of Technology · 69.2%
Director of Curriculum & Instruction · 66.7%
Director of Student Services & Safety · 53.8%
Network Administrator / Systems Engineer · 72.4%
Superintendent · 62.5%
By Buying Job
Artifact Creation · 28.6%
Comparison · 40.6%
Consensus Creation · 50%
Problem Identification · 84.6%
Requirements Building · 46.7%
Shortlisting · 100%
Solution Exploration · 80%
Validation · 83.3%

Visibility by Buying Job

Artifact Creation · 28.6% (4/14)
Comparison · 40.6% (13/32)
Consensus Creation · 50% (6/12)
Problem Identification · 84.6% (11/13)
Requirements Building · 46.7% (7/15)
Shortlisting · 100% (25/25)
Solution Exploration · 80% (12/15)
Validation · 83.3% (20/24)
High-intent visibility (Shortlist + Compare + Validate) · 71.6% (58/81)
High-intent win rate · 5.2% (3/58)

Visibility & Win Rate by Persona

Chief Technology Officer / Director of Technology · 69.2% vis · 14.8% win (4/27)
Director of Curriculum & Instruction · 66.7% vis · 6.2% win (1/16)
Director of Student Services & Safety · 53.8% vis · 0% win (0/14)
Network Administrator / Systems Engineer · 72.4% vis · 4.8% win (1/21)
Superintendent · 62.5% vis · 10% win (2/20)
Decision-maker win rate (Chief Technology Officer / Director of Technology + Network Administrator / Systems Engineer + Superintendent) · 10.3% (7/68 visible)
Evaluator win rate (Director of Curriculum & Instruction + Director of Student Services & Safety) · 3.3% (1/30 visible)
Role type gap · 7 percentage points

Visibility by Feature Focus

BYOD Support · 80% vis (4/5) · 0% win (0/4)
CIPA Compliance · 54.5% vis (6/11) · 0% win (0/6)
Classroom Management · 73.7% vis (14/19) · 14.3% win (2/14)
Cross-Platform Support · 66.7% vis (8/12) · 12.5% win (1/8)
Digital Hall Pass · 33.3% vis (2/6) · 0% win (0/2)
Edtech ROI · 16.7% vis (1/6) · 0% win (0/1)
Off-Network Protection · 66.7% vis (6/9) · 33.3% win (2/6)
Parent Engagement · 57.1% vis (4/7) · 25% win (1/4)
Reporting & Analytics · 60% vis (3/5) · 0% win (0/3)
Student Safety Alerting · 57.1% vis (16/28) · 0% win (0/16)
Web Filtering · 83.3% vis (20/24) · 10% win (2/20)
YouTube Filtering · 80% vis (4/5) · 0% win (0/4)

Visibility by Pain Point

Alert Fatigue · 44.4% vis (4/9) · 0% win (0/4)
CIPA Compliance Burden · 55.6% vis (5/9) · 0% win (0/5)
Digital Distraction · 100% vis (8/8) · 12.5% win (1/8)
Edtech Sprawl · 28.6% vis (2/7) · 0% win (0/2)
Filter Bypass Vulnerability · 100% vis (3/3) · 0% win (0/3)
Multi-OS Complexity · 75% vis (6/8) · 0% win (0/6)
Overblocking Educational Content · 85.7% vis (6/7) · 0% win (0/6)
Student Safety Crisis · 69.2% vis (9/13) · 0% win (0/9)
Take-Home Device Gap · 69.2% vis (9/13) · 33.3% win (3/9)
Vendor Fragmentation · 75% vis (6/8) · 16.7% win (1/6)

[Data] Overall visibility: 65.33% (98/150 queries). By buying job: Shortlisting 100% (25/25), Problem Identification 84.62% (11/13), Validation 83.33% (20/24), Solution Exploration 80% (12/15), Requirements Building 46.67% (7/15), Comparison 40.62% (13/32), Artifact Creation 28.57% (4/14). Early-funnel invisibility rate: 30.2% across Problem Identification, Solution Exploration, and Requirements Building (13/43 queries invisible).

[Synthesis] GoGuardian's visibility is strongest at Shortlisting (100%) and Validation (83.33%) and weakest at Requirements Building (46.67%), Comparison (40.62%), and Artifact Creation (28.57%). The early-funnel invisibility rate of 30.2% (13/43 queries across Problem Identification, Solution Exploration, and Requirements Building) means competitors frame the problem and define the evaluation criteria before GoGuardian enters. Because Requirements Building shapes how buyers score vendors at Shortlisting, early-funnel absence compounds into late-funnel conversion losses.

Invisibility Gaps — 52 Queries Where GoGuardian Doesn’t Appear

20 queries won by named competitors · 11 no clear winner · 21 no vendor mentioned

Sorted by competitive damage — competitor-winning queries first.

ID · Query · Persona · Stage · Winner
⚑ Competitor Wins — 20 queries where a named competitor captures the buyer
gg_023 · "What tools exist for tracking which edtech apps and software licenses schools are actually using?" · Chief Technology Officer / Director of Technology · Solution Exploration · Lightspeed Systems
gg_041 · "What should a school district look for in edtech usage analytics to cut wasted software spending?" · Superintendent · Requirements Building · Lightspeed Systems
gg_073 · "Lightspeed vs LanSchool for device monitoring — which handles both Chromebooks and Windows better?" · Network Administrator / Systems Engineer · Comparison · Lightspeed Systems
gg_074 · "We're replacing our firewall-based filter — Lightspeed Systems vs Securly, which cloud web filter is better for a Chromebook-heavy district?" · Network Administrator / Systems Engineer · Comparison · Lightspeed Systems
gg_075 · "Gaggle vs Lightspeed Alert for student safety — how do their alert accuracy and response times compare?" · Director of Student Services & Safety · Comparison · Gaggle
gg_076 · "Dyknow vs Lightspeed Classroom for screen monitoring — which is easier for teachers to use?" · Director of Curriculum & Instruction · Comparison · Dyknow
gg_077 · "Lightspeed Filter vs Securly for YouTube filtering controls in K-12 schools" · Director of Curriculum & Instruction · Comparison · Lightspeed Systems
gg_078 · "Which K-12 web filter has the best CIPA compliance reporting and E-Rate documentation — Lightspeed or Securly?" · Chief Technology Officer / Director of Technology · Comparison · Lightspeed Systems
gg_079 · "Securly vs Linewize for parent engagement and take-home device monitoring — which gives parents better visibility?" · Superintendent · Comparison · Securly
gg_082 · "Comparing Lightspeed, Securly, and Gaggle — which student safety platform is strongest for a mid-size district?" · Chief Technology Officer / Director of Technology · Comparison · Lightspeed Systems

Remaining competitor wins: Lightspeed Systems ×4, Securly ×1, LanSchool ×1, Hāpara ×1, Bark for Schools ×1, Gaggle ×1, Blocksi ×1. 11 queries with no clear winner. 21 queries with no vendor mentioned. Full query-level data available in the analysis export.

Positioning Gaps — 90 Queries Where GoGuardian Appears But Loses

Queries where GoGuardian is mentioned but a competitor is positioned more favorably.

ID · Query · Persona · Buying Job · Winner · GoGuardian Position
gg_001 · "What are the main approaches to keeping students safe online in K-12 school districts?" · Superintendent · Problem Identification · No Vendor Mentioned · Mentioned In List
gg_002 · "How are school districts handling student self-harm detection on school-issued devices?" · Director of Student Services & Safety · Problem Identification · No Clear Winner · Mentioned In List
gg_003 · "Teachers spending half the class chasing students off YouTube and games — what do other districts do?" · Director of Curriculum & Instruction · Problem Identification · No Clear Winner · Mentioned In List
gg_004 · "We have Chromebooks, Windows laptops, and iPads — how do districts enforce consistent web filtering across all of them?" · Chief Technology Officer / Director of Technology · Problem Identification · No Clear Winner · Mentioned In List
gg_005 · "Our filter blocks half the educational sites teachers need — how do we fix overblocking without opening everything up?" · Network Administrator / Systems Engineer · Problem Identification · No Clear Winner · Brief Mention
gg_006 · "E-Rate audit is coming and I can't prove CIPA compliance — what are other districts using for documentation?" · Chief Technology Officer / Director of Technology · Problem Identification · No Vendor Mentioned · Mentioned In List
gg_007 · "Students figured out VPNs to bypass our web filter — what solutions actually stop filter circumvention?" · Network Administrator / Systems Engineer · Problem Identification · No Clear Winner · Mentioned In List
gg_012 · "What do districts do about student devices when kids bring their own phones and laptops to school?" · Director of Student Services & Safety · Problem Identification · No Vendor Mentioned · Mentioned In List
gg_015 · "Difference between agent-based filtering and DNS-based filtering for school devices" · Network Administrator / Systems Engineer · Solution Exploration · No Vendor Mentioned · Mentioned In List
gg_017 · "Should we get one platform for web filtering, classroom management, and safety monitoring or use separate best-of-breed tools?" · Chief Technology Officer / Director of Technology · Solution Exploration · No Clear Winner · Mentioned In List
gg_019 · "How do classroom management platforms integrate with Google Workspace for Education?" · Director of Curriculum & Instruction · Solution Exploration · No Clear Winner · Brief Mention
gg_020 · "What's the difference between human-reviewed safety alerts and fully automated AI detection for student threats?" · Director of Student Services & Safety · Solution Exploration · No Clear Winner · Brief Mention
gg_021 · "We're on an appliance-based filter and thinking about going cloud — what's the real difference for a mixed device school district?" · Network Administrator / Systems Engineer · Solution Exploration · No Clear Winner · Mentioned In List
gg_024 · "Approaches to filtering YouTube in schools — blocking it entirely vs. granular video-level controls" · Director of Curriculum & Instruction · Solution Exploration · No Clear Winner · Brief Mention
gg_025 · "How do school safety platforms handle off-campus monitoring on 1:1 devices?" · Superintendent · Solution Exploration · No Clear Winner · Mentioned In List
gg_026 · "What options exist for monitoring student-owned BYOD devices on a school network without installing agents?" · Network Administrator / Systems Engineer · Solution Exploration · No Vendor Mentioned · Mentioned In List
gg_027 · "How do schools give parents visibility into what their kids are doing on school devices at home?" · Superintendent · Solution Exploration · No Clear Winner · Mentioned In List
gg_028 · "What tools help districts monitor student internet use across apps, not just web browsers?" · Chief Technology Officer / Director of Technology · Solution Exploration · No Clear Winner · Mentioned In List
gg_029 · "What features matter most when evaluating student web filtering platforms for a district with 10,000 students?" · Chief Technology Officer / Director of Technology · Requirements Building · No Clear Winner · Brief Mention
gg_031 · "Must-have vs. nice-to-have features for student safety monitoring software in K-12" · Director of Student Services & Safety · Requirements Building · No Clear Winner · Brief Mention
gg_032 · "Security and privacy requirements checklist for evaluating student monitoring platforms in K-12" · Network Administrator / Systems Engineer · Requirements Building · No Clear Winner · Brief Mention
gg_033 · "What CIPA compliance features should a web filter have to pass an E-Rate audit?" · Chief Technology Officer / Director of Technology · Requirements Building · No Clear Winner · Brief Mention
gg_034 · "We're replacing our current filter — what should I look for in a web filter that works across Chromebooks, iPads, and Windows?" · Network Administrator / Systems Engineer · Requirements Building · Lightspeed Systems · Mentioned In List
gg_035 · "Evaluation criteria for YouTube filtering in schools — how granular should controls be?" · Director of Curriculum & Instruction · Requirements Building · No Clear Winner · Brief Mention
gg_037 · "Our current filter doesn't protect devices off-campus — what requirements should we set for a replacement?" · Chief Technology Officer / Director of Technology · Requirements Building · No Clear Winner · Mentioned In List
gg_044 · "We've outgrown our current web filter — best K-12 web filtering platforms for mid-size districts with mixed device fleets" · Chief Technology Officer / Director of Technology · Shortlisting · Lightspeed Systems · Strong 2nd
gg_045 · "Top student safety monitoring platforms that detect self-harm and violence threats on school devices" · Director of Student Services & Safety · Shortlisting · GoGuardian · Primary Recommendation
gg_046 · "Best classroom management software for K-12 teachers to monitor student screens and keep kids on task" · Director of Curriculum & Instruction · Shortlisting · GoGuardian · Primary Recommendation
gg_047 · "We're running separate filters for each device type — which school web filters work across Chromebooks, iPads, and Windows in one platform?" · Network Administrator / Systems Engineer · Shortlisting · Lightspeed Systems · Mentioned In List
gg_048 · "Top K-12 platforms that combine web filtering, classroom management, and student safety in one tool" · Chief Technology Officer / Director of Technology · Shortlisting · GoGuardian · Mentioned In List
gg_049 · "Best web filtering solutions for CIPA compliance and E-Rate audit documentation" · Superintendent · Shortlisting · Lightspeed Systems · Mentioned In List
gg_050 · "K-12 student safety platforms with the lowest false positive rates for self-harm alerts" · Director of Student Services & Safety · Shortlisting · Gaggle · Mentioned In List
gg_051 · "school web filters that actually stop VPN bypass attempts by students" · Network Administrator / Systems Engineer · Shortlisting · Lightspeed Systems · Primary Recommendation
gg_052 · "Best classroom management tools that teachers with low tech skills can actually learn quickly" · Director of Curriculum & Instruction · Shortlisting · No Clear Winner · Brief Mention
gg_053 · "Our current safety tool only monitors during school hours — which student safety platforms provide 24/7 monitoring including nights and weekends?" · Superintendent · Shortlisting · Gaggle · Mentioned In List
gg_054 · "school web filtering platforms that protect 1:1 take-home devices off-campus" · Chief Technology Officer / Director of Technology · Shortlisting · GoGuardian · Primary Recommendation
gg_055 · "Best YouTube filtering tools for schools that let teachers use educational videos while blocking inappropriate content" · Director of Curriculum & Instruction · Shortlisting · Lightspeed Systems · Strong 2nd
gg_056 · "Top school safety platforms with strong parent communication and take-home device visibility" · Director of Student Services & Safety · Shortlisting · Securly · Strong 2nd
gg_057 · "Best digital hall pass systems for K-12 schools that integrate with classroom management software" · Director of Curriculum & Instruction · Shortlisting · Securly · Strong 2nd
gg_058 · "K-12 edtech usage analytics tools that show which software licenses are actually being used" · Chief Technology Officer / Director of Technology · Shortlisting · Lightspeed Systems · Brief Mention
gg_059 · "Best school web filters with detailed usage reporting for IT administrators" · Network Administrator / Systems Engineer · Shortlisting · GoGuardian · Mentioned In List
gg_060 · "K-12 web filtering platforms that handle BYOD without requiring agents on personal devices" · Network Administrator / Systems Engineer · Shortlisting · Securly · Primary Recommendation
gg_061 · "Looking for a school safety platform that meets CIPA requirements and handles state-level mandates for student internet safety" · Superintendent · Shortlisting · Lightspeed Systems · Strong 2nd
gg_062 · "school web filter shortlist for a district with 8,000 students running mostly Chromebooks plus some Windows and iPad" · Chief Technology Officer / Director of Technology · Shortlisting · Lightspeed Systems · Mentioned In List
gg_063 · "Best student monitoring solutions with off-network protection for 1:1 iPad deployments" · Director of Student Services & Safety · Shortlisting · Lightspeed Systems · Mentioned In List
gg_064 · "Which classroom management platforms let teachers control all student tabs from one screen during lessons?" · Superintendent · Shortlisting · GoGuardian · Primary Recommendation
gg_065 · "recommended student safety platforms for districts with both Google Workspace and Microsoft 365" · Network Administrator / Systems Engineer · Shortlisting · No Clear Winner · Brief Mention
gg_066 · "alternatives to our current web filter that keeps blocking educational sites teachers need" · Chief Technology Officer / Director of Technology · Shortlisting · No Clear Winner · Brief Mention
gg_067 · "student safety monitoring tools with parent notification features for take-home devices" · Director of Student Services & Safety · Shortlisting · Bark for Schools · Mentioned In List
gg_068 · "Is GoGuardian a good choice for a mid-size school district with 12,000 students?" · Superintendent · Shortlisting · GoGuardian · Primary Recommendation
gg_069 · "GoGuardian vs Lightspeed Systems for K-12 web filtering — which is better for a district with 10,000 students?" · Chief Technology Officer / Director of Technology · Comparison · Lightspeed Systems · Strong 2nd
gg_070 · "GoGuardian vs Securly — which student safety platform has better self-harm detection?" · Director of Student Services & Safety · Comparison · GoGuardian · Primary Recommendation
gg_071 · "Dyknow vs LanSchool for classroom management — which do teachers prefer?" · Director of Curriculum & Instruction · Comparison · Dyknow · Brief Mention
gg_080 · "Is it better to get an all-in-one K-12 safety platform or use Gaggle for safety and a separate tool for filtering?" · Superintendent · Comparison · Gaggle · Mentioned In List
gg_085 · "Switching from Gaggle to a platform that also does web filtering — what are the best options?" · Chief Technology Officer / Director of Technology · Comparison · Lightspeed Systems · Strong 2nd
gg_091 · "Our teachers hate our current classroom management tool — is Dyknow actually better for teacher satisfaction?" · Director of Curriculum & Instruction · Comparison · Dyknow · Brief Mention
gg_092 · "LanSchool Air vs Lightspeed Classroom — how do they compare for mixed Chromebook and Windows environments?" · Network Administrator / Systems Engineer · Comparison · No Clear Winner · Brief Mention
gg_093 · "We're unhappy with our current YouTube filtering — which K-12 platforms have the most granular video-level controls?" · Director of Curriculum & Instruction · Comparison · Lightspeed Systems · Mentioned In List
gg_095 · "Which digital hall pass systems integrate with classroom management and web filtering platforms?" · Chief Technology Officer / Director of Technology · Comparison · Securly · Mentioned In List
gg_097 · "How do Bark for Schools, Gaggle, and Securly compare for student suicide prevention monitoring?" · Director of Student Services & Safety · Comparison · Gaggle · Brief Mention
gg_099 · "Which K-12 web filter handles BYOD the best — we need filtering for student personal devices on the school network" · Network Administrator / Systems Engineer · Comparison · Lightspeed Systems · Mentioned In List
gg_101 · "GoGuardian implementation problems for large school districts" · Chief Technology Officer / Director of Technology · Validation · No Clear Winner · Primary Recommendation
gg_102 · "Lightspeed Systems problems and complaints from school districts" · Chief Technology Officer / Director of Technology · Validation · No Clear Winner · Brief Mention
gg_103 · "Securly customer complaints — what do school IT teams not like about it?" · Network Administrator / Systems Engineer · Validation · No Clear Winner · Brief Mention
gg_104 · "Gaggle safety monitoring problems — how often do they miss real threats?" · Director of Student Services & Safety · Validation · No Clear Winner · Brief Mention
gg_105 · "Dyknow reviews and complaints from school districts — what are the downsides?" · Director of Curriculum & Instruction · Validation · No Clear Winner · Brief Mention
gg_107 · "Common complaints about GoGuardian from teachers — is it hard to use?" · Director of Curriculum & Instruction · Validation · No Clear Winner · Primary Recommendation
gg_108 · "Does GoGuardian slow down Chromebooks? Performance issues reported by schools" · Network Administrator / Systems Engineer · Validation · No Clear Winner · Primary Recommendation
gg_109 · "Biggest risks of choosing Lightspeed Systems for web filtering at a mid-size district" · Chief Technology Officer / Director of Technology · Validation · No Clear Winner · Brief Mention
gg_110 · "Hidden costs of GoGuardian that school districts don't expect — licensing, training, add-ons" · Superintendent · Validation · No Clear Winner · Primary Recommendation
gg_111 · "Securly false positive rate for student safety alerts — is it better or worse than competitors?" · Director of Student Services & Safety · Validation · No Clear Winner · Brief Mention
gg_113 · "Gaggle customer support quality — what do school admins say about response times?" · Chief Technology Officer / Director of Technology · Validation · No Clear Winner · Brief Mention
gg_114 · "Student privacy concerns with GoGuardian — do they comply with FERPA and COPPA?" · Superintendent · Validation · No Clear Winner · Mentioned In List
gg_117 · "How long does a typical K-12 web filter implementation take for a district with 8,000+ devices?" · Network Administrator / Systems Engineer · Validation · No Vendor Mentioned · Mentioned In List
gg_118 · "What do schools say about switching from Lightspeed to a different web filter — was the migration worth it?" · Superintendent · Validation · No Clear Winner · Mentioned In List
gg_119 · "LanSchool contract and licensing complaints — are there lock-in issues?" · Superintendent · Validation · No Clear Winner · Brief Mention
gg_121 · "Can students bypass school web filters with VPNs or browser extensions? Which filters are hardest to get around?" · Network Administrator / Systems Engineer · Validation · No Clear Winner · Mentioned In List
gg_122 · "Securly data privacy concerns — how do they handle student monitoring data?" · Superintendent · Validation · No Clear Winner · Brief Mention
gg_124 · "Can K-12 web filters actually track edtech app usage or is that a separate tool? What are the reporting gaps?" · Chief Technology Officer / Director of Technology · Validation · No Clear Winner · Mentioned In List
gg_125 · "LanSchool deployment complexity — is it harder to roll out than cloud-based classroom management alternatives?" · Network Administrator / Systems Engineer · Validation · LanSchool · Mentioned In List
gg_126 · "ROI of implementing a student safety monitoring platform for a mid-size school district" · Superintendent · Consensus Creation · No Vendor Mentioned · Mentioned In List
gg_127 · "How to justify spending on web filtering and classroom management software to a school board" · Superintendent · Consensus Creation · No Clear Winner · Mentioned In List
gg_128 · "Case studies of school districts that reduced student safety incidents after deploying monitoring software" · Director of Student Services & Safety · Consensus Creation · Gaggle · Mentioned In List
gg_132 · "How to convince teachers to adopt classroom management software — what does successful rollout look like?" · Director of Curriculum & Instruction · Consensus Creation · No Clear Winner · Brief Mention
gg_134 · "How do districts justify the cost of CIPA-compliant web filtering to protect E-Rate funding?" · Chief Technology Officer / Director of Technology · Consensus Creation · No Vendor Mentioned · Mentioned In List
gg_136 · "Evidence that classroom management software improves instructional time and student engagement" · Director of Curriculum & Instruction · Consensus Creation · No Vendor Mentioned · Mentioned In List
gg_137 · "Draft an RFP for K-12 web filtering and student safety monitoring for a district with 12,000 students across Chromebooks, Windows, and iPads" · Chief Technology Officer / Director of Technology · Artifact Creation · No Clear Winner · Mentioned In List
gg_139 · "Build a TCO model for implementing a K-12 web filtering and safety platform across a 10,000-student district over 3 years" · Superintendent · Artifact Creation · No Clear Winner · Mentioned In List
gg_146 · "Build a feature comparison spreadsheet for K-12 web filtering platforms including cross-platform support, YouTube controls, BYOD, and CIPA compliance" · Network Administrator / Systems Engineer · Artifact Creation · Lightspeed Systems · Mentioned In List
gg_147 · "Create an executive summary comparing the cost of running separate filtering, classroom management, and safety tools versus consolidating to one platform" · Superintendent · Artifact Creation · No Clear Winner · Brief Mention
Section 3
Competitive Position

Who’s winning when GoGuardian isn’t — and who controls the narrative at each buying stage.

[TL;DR] GoGuardian wins 5.3% of queries (8/150), ranks #3 in SOV — H2H record: 57W–31L across 9 competitors.

GoGuardian's SOV rank of #3 (only 3 mentions behind the leader) masks a win-rate problem: it appears alongside Lightspeed frequently but trails the head-to-head record 14W–16L, and loses to Gaggle 1W–5L in student safety queries despite Beacon being a stronger product.

Share of Voice

Company · Mentions · Share
Lightspeed Systems · 104 · 21.1%
Securly · 103 · 20.8%
GoGuardian · 101 · 20.4%
Linewize · 42 · 8.5%
Bark for Schools · 37 · 7.5%
Gaggle · 31 · 6.3%
Blocksi · 24 · 4.9%
Hāpara · 20 · 4%
LanSchool · 18 · 3.6%
Dyknow · 14 · 2.8%

Head-to-Head Records

When GoGuardian and a competitor both appear in the same response, who gets the recommendation? One query with multiple competitors generates a matchup against each — so H2H totals will exceed the query count.

Win = GoGuardian was the primary recommendation (cross-platform majority). Loss = the competitor was the primary recommendation. Tie = neither was, or a third party won.
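The tally logic described above can be sketched as follows — a minimal illustration assuming a per-query record with `mentioned` and `winner` fields (hypothetical names, not the audit's actual export schema). One query with three co-appearing competitors produces three matchups, which is why H2H totals exceed the query count.

```python
from collections import defaultdict


def h2h(results, vendor="GoGuardian"):
    """Tally W/L/T against each competitor that co-appears with the vendor."""
    records = defaultdict(lambda: {"W": 0, "L": 0, "T": 0})
    for r in results:
        if vendor not in r["mentioned"]:
            continue  # no co-appearance, no matchup
        for rival in r["mentioned"]:
            if rival == vendor:
                continue
            if r["winner"] == vendor:
                records[rival]["W"] += 1
            elif r["winner"] == rival:
                records[rival]["L"] += 1
            else:  # no winner, or a third party won
                records[rival]["T"] += 1
    return dict(records)
```

The large tie counts in the table below fall out of this definition: any co-appearance where neither vendor is the primary recommendation lands in the tie column.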

vs. Lightspeed Systems · 14W – 16L – 52T (82 mentioned together)
vs. Securly · 14W – 5L – 66T (85 mentioned together)
vs. Gaggle · 1W – 5L – 16T (22 mentioned together)
vs. Dyknow · 4W – 2L – 3T (9 mentioned together)
vs. LanSchool · 4W – 1L – 9T (14 mentioned together)
vs. Hāpara · 5W – 0L – 10T (15 mentioned together)
vs. Bark for Schools · 5W – 1L – 28T (34 mentioned together)
vs. Blocksi · 5W – 0L – 18T (23 mentioned together)
vs. Linewize · 5W – 1L – 32T (38 mentioned together)

Invisible Query Winners

For the 52 queries where GoGuardian is completely absent:

Lightspeed Systems · 11 wins (21.2%)
Gaggle · 3 wins (5.8%)
Hāpara · 1 win (1.9%)
LanSchool · 1 win (1.9%)
Bark for Schools · 1 win (1.9%)
Securly · 1 win (1.9%)
Dyknow · 1 win (1.9%)
Blocksi · 1 win (1.9%)
Uncontested (no winner) · 32 queries (61.5%)

Surprise Competitors

Vendors appearing in responses that are not in GoGuardian’s defined competitive set.

Cisco Umbrella — 5.1% SOV
ManagedMethods — 4.0% SOV
Deledao — 2.8% SOV
SmartPass — 2.4% SOV
DNSFilter — 2.2% SOV
ContentKeeper — 1.8% SOV
NetSupport School — 1.6% SOV
iboss — 1.6% SOV
Minga — 1.4% SOV
Raptor Technologies — 1.4% SOV
Cloudflare Gateway — 1.2% SOV
CurrentWare — 1.2% SOV
Netsweeper — 1.2% SOV
WebTitan — 1.2% SOV
Jamf — 1.0% SOV
iBoss — 1.0% SOV
LearnPlatform — 1.0% SOV
Clever — 1.0% SOV
ClassDojo — 1.0% SOV

[Synthesis] GoGuardian's SOV position (rank #3, 20.4% share) understates its competitive disadvantage at the query level. Despite appearing within 3 mentions of the category leader, GoGuardian wins pairwise matchups against Lightspeed only 14 of 30 contested queries (46.7%) and loses to Gaggle 5 of 6 contested queries (83.3% loss rate). Win rate and H2H record diverge because H2H measures co-appearance wins while win rate measures primary recommendation rate — GoGuardian appears alongside competitors frequently but is rarely recommended as the top choice.

Gaggle's 5-1 head-to-head advantage is the most actionable competitive gap: GoGuardian Beacon's content at the Comparison and Consensus stages is losing to Gaggle's outcome-specific case study and methodology content.

Section 4
Citation & Content Landscape

What AI reads and trusts in this category.

[TL;DR] GoGuardian had 31 unique pages cited across buyer queries, ranking #5 among all cited domains. 10 high-authority domains cite competitors but not GoGuardian.

Being cited as a source — not just mentioned — is what drives AI recommendations, and GoGuardian's citation rank of #5 despite SOV rank of #3 signals that AI models reference GoGuardian's content less often than competitors' for the same queries.

Top Cited Domains (citation instances)

lightspeedsystems.com · 149
securly.com · 70
gaggle.net · 62
g2.com · 54
goguardian.com · 47 (#5)
reddit.com · 46
en.wikipedia.org · 35
blocksi.net · 33
capterra.com · 31
linewize.com · 27
managedmethods.com · 22
support.securly.com · 20
sourceforge.net · 19
controld.com · 18
lanschool.com · 18
dyknow.com · 17
hapara.com · 15
blog.securly.com · 15
eff.org · 14
trustradius.com · 14

GoGuardian URL Citations by Page

www.goguardian.com · 7
6
www.goguardian.com/blog/pros-and-cons-of-implem... · 4
www.goguardian.com/admin/vs-competitors · 4
www.goguardian.com/beacon · 3
www.goguardian.com/blog/a-guide-to-web-and-cont... · 3
www.goguardian.com/beacon/vs-competitors · 3
www.goguardian.com/teacher · 2
www.goguardian.com/pricing · 2
www.goguardian.com/privacy-and-trust · 2
support.goguardian.com/s/article/GoGuardian-Par... · 1
www.goguardian.com/blog/how-students-bypass-sch... · 1
www.goguardian.com/product-update/beacon-24-7 · 1
www.goguardian.com/blog/goguardian-parent-helps... · 1
www.goguardian.com/blog/delivering-outcomes-at-... · 1
www.goguardian.com/blog/life-saving-tech-how-gi... · 1
support.goguardian.com/s/article/Troubleshootin... · 1
www.goguardian.com/blog/product-update-beacon-s... · 1
www.goguardian.com/blog/big-updates-for-admin-a... · 1
www.goguardian.com/blog/seven-challenges-shapin... · 1
support.goguardian.com/s/article/About-GoGuardi... · 1
www.goguardian.com/blog/what-is-a-dns-filter-an... · 1
www.goguardian.com/admin · 1
www.goguardian.com/product-update/the-goguardia... · 1
support.goguardian.com/s/article/Scheduling-Set... · 1
www.goguardian.com/blog/how-schools-can-stay-ah... · 1
www.goguardian.com/policies/eula · 1
www.goguardian.com/policies/product-privacy · 1
www.goguardian.com/blog/faqs-from-k-12-leaders-... · 1
www.goguardian.com/policies/coppa-disclosure · 1
www.goguardian.com/blog/ai-in-the-classroom-how... · 1
Total GoGuardian unique pages cited · 31
GoGuardian domain rank · #5

Competitor URL Citations

Note: Domain-level citation counts (above) tally instances per individual domain. Competitor-level counts (below) aggregate across all domains owned by a single vendor, which may include subdomains.

Lightspeed Systems · 161 URL citations
Securly · 121 URL citations
Gaggle · 71 URL citations
LanSchool · 46 URL citations
Blocksi · 40 URL citations
Linewize · 33 URL citations
Dyknow · 20 URL citations
Hāpara · 17 URL citations
Bark for Schools · 10 URL citations

Third-Party Citation Gaps

Non-competitor domains citing other vendors but not GoGuardian — off-domain authority opportunities.

These domains cited competitors but did not cite GoGuardian pages in the queries analyzed. This reflects citation patterns in AI responses, not overall platform presence.

reddit.com · 46 citations · GoGuardian not cited
en.wikipedia.org · 35 citations · GoGuardian not cited
capterra.com · 31 citations · GoGuardian not cited
managedmethods.com · 22 citations · GoGuardian not cited
sourceforge.net · 19 citations · GoGuardian not cited
controld.com · 18 citations · GoGuardian not cited
eff.org · 14 citations · GoGuardian not cited
trustradius.com · 14 citations · GoGuardian not cited
govtech.com · 13 citations · GoGuardian not cited
edtechmagazine.com · 11 citations · GoGuardian not cited

[Synthesis] GoGuardian's citation rank of #5 despite SOV rank of #3 reveals a structural authority gap: GoGuardian is mentioned in AI responses more often than it is cited as a source. The 31 unique pages cited (vs Lightspeed's 161 citation instances across its domains) indicate that GoGuardian's content is less frequently used as a primary reference than its SOV presence would suggest. The concentration of citations on the homepage (7 instances) and comparison pages (4 instances) points to AI models defaulting to surface-level pages rather than feature-specific content — a pattern the L2 and L3 content investments directly address.

Section 5
Prioritized Action Plan

Three layers of recommendations ranked by commercial impact and implementation speed.

[TL;DR] 31 priority recommendations (plus 4 near-rebuild optimizations) targeting 142 gap queries (52 invisible, 90 positioning gaps): 5 L1 technical fixes + 1 verification check, 20 content optimizations (L2), and 5 new content initiatives (L3).

The 31 recommendations follow a strict dependency sequence: L1 technical fixes first (they unblock freshness credit for everything after), then L2 content deepening, then L3 new content creation — skipping the sequence means new content ships without the infrastructure to be indexed at full freshness.

Reading the priority numbers: Recommendations are ranked 1–31 across all three layers by commercial impact × implementation speed. Within each layer, items appear in priority order. Gaps in the sequence (e.g., Layer 1 shows #1, then jumps to #21) mean the intervening priorities belong to a different layer.

Layer 1 Technical Fixes

Configuration and infrastructure changes. Owner: Engineering / DevOps. Timeline: Days to weeks.

Priority · Finding · Impact · Timeline
#1 · Majority of blog content is severely outdated · High · 2-4 weeks

Issue: 10 out of 19 content marketing pages have confirmed publication dates older than 365 days, with the oldest dating to February 2018. Seven blog posts were published in 2018-2019 and have not been visibly updated. Only 2 blog posts (August 2025, May 2025) fall within the past 12 months, and none fall within the 90-day dominant AI citation window.

Fix: Audit all blog posts published before 2024 for accuracy and refresh with current data, updated statistics, and visible 'Last updated' dates. Prioritize the high-intent posts: 'How Students Bypass School Web Filters' (2019), 'Why Not All Filtering Solutions Are Created Equal' (2018), 'Pros and Cons of Implementing School Web Filtering' (2019), and 'To Block or Not to Block' (2018).

#21 · High-value competitor comparison pages have no visible dates · Medium · < 1 day

Issue: All four competitor comparison pages (/admin/vs-competitors, /teacher/vs-competitors, /beacon/vs-competitors, /competitor-Comparison) have no visible publication or last-updated dates. These pages are classified as content_marketing and default to a freshness score of 0.2 under the scoring methodology.

Fix: Add visible 'Last reviewed' or 'Updated' dates to all comparison pages. Implement a quarterly review cadence to verify comparison claims remain accurate and update the visible date on each review.

#22 · Pricing page contains no actionable pricing information · Medium · 1-3 days

Issue: The pricing page at /pricing scores 0.3 on content depth and 0.4 on heading hierarchy. It contains only a lead generation form with four brief paragraphs about pricing factors (enrollment, bundles, contract length, professional services) but no actual pricing tiers, ranges, per-student costs, or plan comparisons.

Fix: Add at minimum: starting price ranges or per-student annual pricing, a plan comparison table showing what is included at each tier, and clear feature differentiation between bundles. Even 'Starting at $X/student/year' provides a citable data point for AI models. Consider adding FAQ schema to address common pricing questions.
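If FAQ schema is added, the JSON-LD itself is mechanical to produce. A minimal sketch in Python; the question/answer text below is illustrative placeholder copy, not actual GoGuardian pricing language:

```python
import json

def faq_jsonld(pairs: list[tuple[str, str]]) -> str:
    """Serialize question/answer pairs as a schema.org FAQPage JSON-LD block."""
    doc = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }
    # Wrap in the script tag a page template would embed in <head> or <body>.
    return ('<script type="application/ld+json">\n'
            + json.dumps(doc, indent=2)
            + "\n</script>")

# Placeholder copy -- real answers must come from the pricing team.
snippet = faq_jsonld([
    ("How is GoGuardian priced?",
     "Pricing is quoted per student per year and varies by bundle and contract length."),
])
```

The resulting block is what a structured-data validator (e.g. Google's Rich Results Test) would then confirm on the live page.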

#23 · Schema markup, meta descriptions, and OG tags require manual verification · Medium · 1-2 weeks

Issue: Our analysis method returns rendered page content as markdown text, which means JSON-LD schema blocks, meta description tags, and Open Graph tags are not visible in the output. We cannot confirm or deny the presence of structured data markup on any of the 41 analyzed pages.

Fix: Verify schema markup implementation using Google Rich Results Test or Schema.org Validator for all commercial pages. Ensure Product schema on product pages, FAQ schema on pages with FAQ sections, Article schema on blog posts, and Organization schema site-wide. Also verify meta descriptions are present and descriptive, and OG tags are properly configured.
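Part of this presence check can be scripted before running pages through a validator. A rough sketch, assuming the pages are fetchable as plain HTML; the regex approach is a heuristic, not a full HTML parser:

```python
import json
import re
import urllib.request

# Matches <script type="application/ld+json"> ... </script> blocks.
JSONLD_RE = re.compile(
    r'<script[^>]*type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    re.DOTALL | re.IGNORECASE,
)

def jsonld_types(html: str) -> list[str]:
    """Return the @type value of every parseable JSON-LD block in a page."""
    types = []
    for raw in JSONLD_RE.findall(html):
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed block -- worth flagging in its own right
        for item in data if isinstance(data, list) else [data]:
            if isinstance(item, dict) and item.get("@type"):
                types.append(item["@type"])
    return types

def audit(url: str) -> list[str]:
    # Plain fetch, no JavaScript execution -- roughly what many AI crawlers see.
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    return jsonld_types(html)
```

An empty list for a commercial page means no structured data reached the crawler, regardless of what the CMS template intends.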

#24 · Sitemap lacks lastmod timestamps on all 1,027 URLs · Medium · 1-3 days

Issue: The sitemap at https://www.goguardian.com/sitemap.xml contains 1,027 URL entries but none include lastmod, changefreq, or priority attributes. Only the required loc element is present for each URL.

Fix: Add accurate lastmod timestamps to all sitemap entries, particularly for product pages, comparison pages, and blog posts. Ensure timestamps update automatically when page content changes. Also consider adding changefreq and priority attributes to guide crawler resource allocation toward high-value commercial pages.
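The mechanical half of this fix is straightforward to script. A sketch using only the standard library; the `lastmod_for_url` callable is a hypothetical hook into whatever CMS field records each page's modification date:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

def add_lastmod(sitemap_xml: str, lastmod_for_url) -> str:
    """Add a <lastmod> child to every <url> entry that lacks one.

    lastmod_for_url: callable mapping a page URL to an ISO-8601 date string
    (e.g. pulled from the CMS), or None to leave that entry untouched.
    """
    root = ET.fromstring(sitemap_xml)
    for url in root.findall(f"{{{NS}}}url"):
        loc = url.find(f"{{{NS}}}loc")
        if loc is None or url.find(f"{{{NS}}}lastmod") is not None:
            continue  # malformed entry, or lastmod already present
        date = lastmod_for_url(loc.text)
        if date:
            lastmod = ET.SubElement(url, f"{{{NS}}}lastmod")
            lastmod.text = date
    return ET.tostring(root, encoding="unicode")
```

The important operational detail is the date source: lastmod values must come from real content-change timestamps, since crawlers that detect inaccurate lastmod data tend to ignore it.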

Verification Checks

Items requiring manual review before determining if action is needed.

Priority · Finding · Impact · Timeline
#31 · Client-side rendering status cannot be confirmed · Low · < 1 day

Issue: All 41 analyzed pages returned substantive content via our rendering pipeline, suggesting server-side rendering or pre-rendering is in place. However, we cannot confirm whether any pages rely on client-side JavaScript rendering that might fail for certain AI crawlers that do not execute JavaScript.

Fix: Verify rendering method by disabling JavaScript in Chrome DevTools and checking that all commercial pages still display full content. A quick test with curl or wget against the top 10 commercial pages will reveal any CSR dependencies.
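The curl-style check can be wrapped in a short script. A sketch with an admittedly arbitrary heuristic: 500 characters of visible text as the "substantive content" cutoff is an assumption to tune, not a standard:

```python
import re
import urllib.request

# Strip scripts, styles, and tags to approximate what a non-JS crawler reads.
STRIP_RE = re.compile(
    r"<script.*?</script>|<style.*?</style>|<[^>]+>",
    re.DOTALL | re.IGNORECASE,
)

def visible_text(html: str) -> str:
    return re.sub(r"\s+", " ", STRIP_RE.sub(" ", html)).strip()

def looks_server_rendered(html: str, threshold: int = 500) -> bool:
    """Heuristic: a CSR shell (empty <div id="root">) has almost no visible text."""
    return len(visible_text(html)) >= threshold

def fetch_without_js(url: str) -> str:
    # Equivalent to curl: plain HTTP, no JavaScript execution.
    req = urllib.request.Request(url, headers={"User-Agent": "render-check/0.1"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", "replace")
```

Running `looks_server_rendered(fetch_without_js(url))` over the top 10 commercial pages flags any page whose content depends on client-side JavaScript.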


Layer 2 Existing Content Optimization

Existing pages that need restructuring or deepening. Owner: Content Team. Timeline: Weeks.

Beacon/vs-competitors page: three-way safety alerting comparison matrices · Near-Rebuild → L3

Priority 4
Currently: covered
The /beacon/vs-competitors page has coverage depth 0.8 for AI-Powered Student Safety & Self-Harm Detection but loses queries where GoGuardian is not named in the buyer's question. All 7 queries in this cluster have winner = 'Gaggle', 'Securly', 'Bark for Schools', or 'No Clear Winner' — GoGuardian wins none of these three-way comparison queries.

The /beacon/vs-competitors page does not include a section addressing the 'Gaggle vs Securly' comparison (gg_072, gg_083) — buyers evaluating these two vendors cannot find GoGuardian's perspective, so GoGuardian is excluded from their decision process. The /beacon/vs-competitors page does not address Gaggle's human-reviewed alert methodology vs Securly's AI detection approach (gg_094) — a critical positioning question where GoGuardian Beacon has a distinct answer that differentiates it from both. The page has no section for 'Bark for Schools vs Gaggle' (gg_088) — a common comparison involving the free-tier Bark option, where GoGuardian's paid Beacon can make a superior commercial case.

Queries affected: gg_072, gg_075, gg_082, gg_083, gg_088, gg_094, gg_097

Competitor-Comparison page: multi-vendor switching ROI and executive cost analysis · Near-Rebuild → L3

Priority 6
Currently: addressed
Coverage depth 0.6 for the 'Districts manage separate vendors for filtering, classroom management, safety monitoring' problem cluster. gg_100 (winner: Lightspeed Systems) is a direct consolidation opportunity — a buyer who already runs Lightspeed and Gaggle separately is being sent to Lightspeed for the answer to switching away from Lightspeed. The page cannot currently compete.

The /competitor-Comparison page does not address the 'We're running Lightspeed and Gaggle separately — would switching to a single platform save money?' scenario (gg_100) — Lightspeed wins this query from a buyer already using its product by providing a consolidation-with-Lightspeed narrative. The page does not contain a 'Business Case for Consolidating to GoGuardian' section (gg_129) with specific ROI arguments for replacing separate filtering, safety, and classroom management vendors with GoGuardian's suite. No executive summary format or cost comparison template exists (gg_147) — a buyer who needs to present a consolidation analysis to a Superintendent cannot construct it from GoGuardian's current content.

Queries affected: gg_100, gg_129, gg_147

Competitor-Comparison page: platform consolidation strategic framework · Near-Rebuild → L3

Priority 7
Currently: addressed
Coverage depth 0.6 for the 'Districts manage separate vendors for filtering, classroom management, safety monitoring' problem cluster. The page is 'addressed' but not 'covered' — it partially responds to consolidation queries but loses gg_080 to Gaggle. GoGuardian wins gg_048, but in a positioning context, not a decision-framework context.

The /competitor-Comparison page does not provide a structured 'one platform vs. best-of-breed' decision framework (gg_017) — a buyer asking 'should we consolidate or use separate tools?' cannot find GoGuardian's reasoned answer on this page. The page does not address the 'is an all-in-one platform like Gaggle better or should we use best-of-breed?' query (gg_080) — it positions GoGuardian but does not directly engage the consolidation architecture question that Gaggle wins by default. Success stories (Pickerington, Camdenton) are linked but do not appear to contain specific vendor consolidation outcome data that would answer 'what did consolidation actually save us?' in a citable format.

Queries affected: gg_017, gg_048, gg_080

Admin page: consensus, ROI, and RFP support content

Priority 10
Currently: covered
The /admin page covers product capabilities but has no ROI section, no payback period data, and no board-justification framing. Blog posts are outdated (2018-2019) and do not contain the current data points these consensus and artifact queries require.

The /admin page has no ROI or payback period section — buyers asking 'typical payback period for web filtering deployment' (gg_133) cannot find a GoGuardian-sourced answer, and Lightspeed wins by default because it publishes ROI data. The /admin page has no board justification content — buyers asking 'how to justify spending on web filtering to a school board' (gg_127) need specific ROI framing, regulatory compliance arguments, and incident prevention data that the page does not provide. The /admin page does not provide an RFP template or feature comparison framework (gg_137), missing the artifact-creation query pattern where buyers want a structured evaluation document they can customize.

Queries affected: gg_085, gg_127, gg_133, gg_137, gg_146

Admin page: evaluation criteria and Shortlisting buyer framework

Priority 11
Currently: covered
The /admin page and 'why-not-all-filtering-solutions-are-created-equal' blog post have relevant content but frame GoGuardian's capabilities as marketing claims rather than buyer evaluation criteria. No structured checklist, evaluation rubric, or capability comparison table exists.

The /admin page lists features but does not frame them as evaluation criteria — buyers asking 'what features matter most for a 10,000-student district?' cannot use the page to build a shortlist or score vendors. The /admin page has no content specifically addressing VPN bypass prevention as a Shortlisting criterion (gg_051), leaving Lightspeed — which explicitly documents bypass resistance — to win this positioning. The /admin page does not address the 'overblocking alternatives' query pattern (gg_066) — buyers seeking an alternative to their current filter that blocks too much educational content cannot find GoGuardian's answer on this page.

Queries affected: gg_029, gg_044, gg_051, gg_066

Admin page: overblocking remediation and VPN bypass prevention content

Priority 12
Currently: covered
The /admin page is marketing-oriented; it describes what the filter does but not how to solve specific operational problems. The 'a-guide-to-web-and-content-filtering-for-schools' blog post is the closest existing asset but does not specifically address overblocking remediation or bypass prevention with extractable claims.

The /admin page has no section addressing overblocking — it describes filtering capabilities but never acknowledges or solves the 'our filter blocks half the educational sites teachers need' problem buyers bring to the evaluation. The /admin page has no content on VPN/proxy bypass prevention — a top-of-funnel problem query (gg_007) that Lightspeed wins because its content explicitly describes bypass-resistance mechanisms. The /admin page does not explain the difference between agent-based and DNS-based filtering in terms buyers can act on — gg_015 queries this distinction and the page provides no citable answer.

Queries affected: gg_001, gg_005, gg_007, gg_015

Admin/vs-competitors page: GoGuardian vs Lightspeed direct comparison depth

Priority 13
Currently: covered
The /admin/vs-competitors page has a comparison page structure but does not win GoGuardian vs Lightspeed queries despite Lightspeed being the primary competitor (SOV rank #1 with 104 mentions). The page likely lacks the structured comparison tables and specific capability claims that win direct comparison queries.

The /admin/vs-competitors page does not contain a dedicated 'GoGuardian Admin vs. Lightspeed Filter' section with a structured feature-by-feature comparison table — the primary comparison query (gg_069) covering a 10,000-student district context is not specifically addressed. The page has no visible publication date (L1 finding), which reduces AI citation confidence for all comparison content — buyers asking 'which cloud web filter is better for a Chromebook-heavy district?' need temporally credible data. The page does not address the budget/scale dimension (gg_096: Blocksi vs Lightspeed for smaller districts on tight budgets) — an opportunity to position GoGuardian's value across district sizes.

Queries affected: gg_069, gg_074, gg_090, gg_096

Admin/vs-competitors page: competitor risk and complaint counter-positioning

Priority 14
Currently: covered
The /admin/vs-competitors page exists with coverage depth 0.7 for K-12 Web Filtering & Content Control, but it does not contain sections specifically addressing known Lightspeed or Securly weaknesses, migration complexity, or bypass vulnerability evidence that these Validation-stage queries require.

The /admin/vs-competitors page presents GoGuardian's strengths but does not address the specific complaints buyers are researching: Lightspeed's reported implementation complexity (gg_109), Securly's customer service issues (gg_103), and bypass vulnerability patterns (gg_121). The page does not contain a 'Migration from Lightspeed/Securly' section — buyers asking 'was the migration worth it?' (gg_118) need to see GoGuardian's specific migration story, not a generic comparison table. The /admin/vs-competitors page has no visible publication date (L1 finding: comparison_pages_undated), which degrades AI citation confidence for any comparison content it does contain.

Queries affected: gg_102, gg_103, gg_109, gg_118, gg_121

Beacon page: alert fatigue, AI detection methodology, and problem identification

Priority 15
Currently: covered
The /beacon page and supporting blog post have depth 0.8 for AI-Powered Student Safety & Self-Harm Detection overall, but early-funnel problem-identification queries require problem-framing content that the current page structure does not provide. Alert fatigue is particularly under-addressed given the Director of Student Services query context.

The /beacon page does not address the '200 alerts a day and counselors are ignoring them' alert fatigue problem (gg_010) — it describes Beacon's detection capabilities but does not explain how GoGuardian reduces noise and prioritizes actionable alerts over raw volume. The /beacon page does not explain how AI-based detection differs from keyword-only monitoring (gg_016) — a solution-exploration query that determines whether a buyer considers moving beyond their current keyword tool. The /beacon page does not address human-reviewed vs. fully automated AI detection (gg_020) — a positioning question that differentiates GoGuardian from Gaggle (human-reviewed) and Securly (automated).

Queries affected: gg_002, gg_010, gg_016, gg_020

Beacon page: consensus building — liability, ROI, and case studies with safety outcomes

Priority 16
Currently: covered
The /beacon page has coverage depth 0.8 for AI-Powered Student Safety & Self-Harm Detection but loses gg_128 (case studies of districts that reduced incidents; winner: Gaggle). Gaggle wins consensus queries by publishing specific outcome data. The /beacon page and blog post do not contain the liability or ROI evidence that gg_131 and gg_126 require.

The /beacon page does not include a liability/duty-of-care section (gg_131) addressing the question 'what is a district's liability if it doesn't deploy student safety monitoring?' — Gaggle wins this framing by publishing governance and legal risk content. The /beacon page lacks case studies with specific, quantified safety outcomes (gg_128) — Gaggle leads this query by publishing district-level incident reduction data. GoGuardian has case study content but it's fragmented across blog posts rather than concentrated on the product page. The /beacon page does not provide ROI data for student safety monitoring investment (gg_126) — superintendents need to justify cost to boards, and the page provides no financial justification framework.

Queries affected: gg_126, gg_128, gg_131, gg_142, gg_143

Beacon page: evaluation criteria, false positive rates, and requirements checklist

Priority 17
Currently: covered
Depth 0.8 for AI-Powered Student Safety & Self-Harm Detection overall, but requirements-building queries need structured criteria content. The existing page lists Beacon's features but does not frame them as evaluation criteria buyers can apply when scoring vendors.

The /beacon page does not include a 'Must-Have Features for Student Safety Platforms' section (gg_031) — buyers building vendor scorecards cannot use the page to develop evaluation criteria, so they rely on competitor pages that do provide this structure. The /beacon page does not address false positive rates as an evaluation criterion (gg_042, gg_050) — Gaggle explicitly markets its human-review approach as a low-false-positive solution, and GoGuardian has no equivalent claim on the /beacon page. The /beacon page does not include a security and privacy requirements checklist (gg_032) — network administrators evaluating student monitoring platforms need to know what data is collected, how it is stored, and what FERPA/COPPA compliance certifications apply.

Queries affected: gg_031, gg_032, gg_042, gg_050

Beacon page: Shortlisting presence — 24/7 monitoring and multi-platform coverage

Priority 18
Currently: covered
GoGuardian wins gg_045 (top student safety platforms for self-harm detection) but loses gg_053 (24/7 monitoring including nights and weekends; winner: Gaggle). The product-update/beacon-24-7 page exists but appears only once in citations — the main /beacon page does not surface 24/7 monitoring as a primary headline claim.

The /beacon page does not prominently feature GoGuardian Beacon's 24/7 off-hours monitoring capability (gg_053) as a top-level claim — the product-update/beacon-24-7 page exists but is buried, and the main /beacon page does not reference it prominently enough for AI extraction. The /beacon page does not explicitly address monitoring coverage for both Google Workspace and Microsoft 365 environments (gg_065) — multi-platform coverage is a Shortlisting requirement for districts running mixed productivity environments. The page wins 'top student safety platforms for self-harm detection' (gg_045) but this win should be reinforced with more extractable claims — the page's win may be fragile if competitors improve their content depth on this query.

Queries affected: gg_045, gg_053, gg_065

Beacon page: Validation queries — privacy, false positives, and vendor performance data

Priority 19
Currently: covered
GoGuardian wins gg_070 (GoGuardian vs Securly self-harm detection; winner: GoGuardian), but the other four Validation queries are 'No Clear Winner' — meaning AI models cannot find sufficient evidence to recommend any vendor. The /beacon page lacks specific performance metrics, and the /privacy-and-trust page (2 citations) is not well-connected to safety-specific Validation queries.

The /beacon page does not include specific performance data (detection rate, response time, false positive rate) that would allow AI models to confidently recommend GoGuardian over Gaggle or Securly in Validation queries (gg_104, gg_111). The /beacon page does not link to or integrate privacy documentation from /privacy-and-trust in a way that addresses 'data privacy concerns' Validation queries (gg_122) — the privacy page gets 2 citations but only from a Validation query about Securly, not GoGuardian. The /beacon/vs-competitors page has no visible date (L1 finding), reducing citation confidence for all competitive claims made on the page, including GoGuardian vs Securly self-harm detection (gg_070).

Queries affected: gg_070, gg_104, gg_111, gg_113, gg_122

Windows and Apple pages: cross-platform unified filtering narrative

Priority 20
Currently: covered
Depth 0.7 for Cross-Platform Device Coverage. The /windows and /apple pages exist as individual platform coverage documentation but are matched to queries requiring a unified 'mixed device environment' decision framework. gg_108 (GoGuardian Chromebook performance issues) is 'No Clear Winner' — GoGuardian's own performance answer is absent.

The /windows page documents Windows support but does not address the 'we have Chromebooks, Windows laptops, and iPads — how do we enforce consistent filtering across all of them?' problem scenario (gg_004) — the cross-platform consistency question requires content that bridges all three OS pages. The /windows and /apple pages do not address the appliance-to-cloud migration scenario (gg_021) — districts switching from an on-premise appliance filter to a cloud-native solution need GoGuardian's cloud architecture described in migration terms, not just feature terms. The /windows page does not address GoGuardian Chromebook performance complaints (gg_108) — a GoGuardian-specific Validation query that should be answered authoritatively on GoGuardian's own site but currently yields no clear winner.

Queries affected: gg_004, gg_021, gg_034, gg_106, gg_108, gg_116

Teacher/vs-competitors page: three-way classroom management comparison matrices · Near-Rebuild → L3

Priority 25
Currently: covered
Depth 0.7 for Real-Time Classroom Management. GoGuardian wins H2H vs Dyknow (4-2) and LanSchool (4-1) but does not appear in queries comparing these two competitors to each other. The winner for gg_076 and gg_087 is Dyknow/Hāpara; for gg_071 and gg_091, Dyknow.

The /teacher/vs-competitors page does not include a 'Dyknow vs LanSchool — and Why GoGuardian Teacher Is the Better Choice' section (gg_071, gg_091) — buyers evaluating these two competitors are unreachable by GoGuardian's current content architecture. The page does not address the 'Hapara vs Dyknow for Google Workspace-heavy districts' comparison (gg_087) — a query where GoGuardian's Chromebook-native approach should be a relevant alternative but is not introduced. The page has no visible date (L1 finding: comparison_pages_undated), reducing all competitive claims to unverifiable historical assertions.

Queries affected: gg_071, gg_076, gg_087, gg_091

Hall-pass page: digital vs paper comparison, integration specs, and disruption data

Priority 26
Currently: covered
All four queries have winner = 'No Clear Winner' — no vendor currently wins the digital hall pass query category. This represents a relatively open competitive landscape compared to web filtering and student safety.

The /hall-pass page does not address the 'how digital hall pass works compared to paper passes' solution-exploration query (gg_022) — a fundamental buyer education question that the page should answer to enter the consideration set for districts not yet using digital passes. The page does not provide a 'what features should a district-wide deployment require' framework (gg_040) — districts replacing paper passes need deployment criteria, not just product marketing. The page does not address known problems and complaints about digital hall pass systems (gg_123) — a Validation-stage query where GoGuardian's transparency would outcompete absent competitor responses.

Queries affected: gg_022, gg_040, gg_123, gg_150

Teacher page: consensus and adoption — implementation evidence, teacher buy-in, and artifact creation

Priority 27
Currently: covered
Depth 0.7 for Real-Time Classroom Management, but consensus and artifact queries require outcome evidence and template content that the /teacher page currently lacks. gg_132, gg_136, gg_140, and gg_148 are all either 'No Clear Winner' or 'No Vendor Mentioned' — meaning no vendor currently wins this content category.

The /teacher page has no section providing evidence that classroom management software improves instructional time or student engagement (gg_136) — curriculum directors and superintendents need this research to justify the investment to skeptical teachers. The /teacher page does not address how to convince teachers to adopt classroom management software (gg_132) — a change-management question that is as important as the product selection decision for districts concerned about teacher resistance. No classroom management rollout template or teacher adoption plan exists (gg_140, gg_148) — artifact creation queries represent the final stage of consensus building, and GoGuardian has no resources to support this buyer workflow.

Queries affected: gg_132, gg_136, gg_140, gg_148

Teacher page: problem framing, Google Workspace integration, and solution exploration

Priority 28
Currently: covered
Depth 0.7 for Real-Time Classroom Management. The /teacher page covers GoGuardian Teacher features but does not frame the 'teachers spending half the class chasing students off YouTube' problem scenario or provide specific Google Workspace integration depth.

The /teacher page does not open with the teacher's problem ('half the class is on YouTube and I can't teach') — it presents GoGuardian's features without connecting them to the specific distraction crisis curriculum directors are researching when they search gg_003. The /teacher page does not provide specific Google Workspace for Education integration details (gg_019) — how GoGuardian Teacher integrates with Google Classroom, which Google APIs it uses, and what data it accesses from Google Workspace. The page does not address the 'questions to ask classroom management vendors about teacher usability and adoption' query (gg_030) — a requirements-framing question that, if answered by GoGuardian's content, would make GoGuardian's evaluation criteria the standard.

Queries affected: gg_003, gg_019, gg_030

Teacher page: Shortlisting — low-tech teacher adoption and tab control prominence

Priority 29
Currently: covered. GoGuardian wins Shortlisting queries about screen monitoring and tab control but does not own the 'easy for non-technical teachers' positioning. Curriculum directors managing mass-rollout decisions weight teacher adoption as a primary criterion.

The /teacher page does not have an explicit 'Easy for Non-Technical Teachers' positioning section — gg_052's 'low tech skills can actually learn quickly' query finds no clear winner because no vendor prominently owns this positioning, leaving an open opportunity for GoGuardian. The page does not quantify teacher onboarding time or adoption metrics — statements like 'easy to use' without evidence are less citable than specific claims like 'teachers reach proficiency in one class session' or 'average setup time: X minutes.' The 'control all student tabs from one screen' capability (gg_064) is a GoGuardian win but may be fragile — the page should reinforce it with more specific, extractable detail to defend the win.

Queries affected: gg_046, gg_052, gg_064

Teacher page: Validation — GoGuardian usability complaints and competitor lock-in counter-positioning

Priority 30
Currently: covered. These three queries are all 'No Clear Winner' — meaning no vendor currently provides strong enough content to win these Validation queries. GoGuardian has a direct opportunity to own 'GoGuardian usability complaints' by providing its own authoritative response to this query before a competitor does.

The /teacher page has no section addressing the 'common complaints about GoGuardian from teachers' query (gg_107) — GoGuardian's own page is silent on this, meaning AI models cannot cite GoGuardian's perspective and must defer to third-party review sites (which may present negative framing). The /teacher page does not document Dyknow's limitations or LanSchool's contract complexity (gg_105, gg_119) — competitors that GoGuardian beats head-to-head (Dyknow 4-2, LanSchool 4-1) but does not proactively position against in specific weakness content. The /teacher/vs-competitors page has no visible date (L1 finding: comparison_pages_undated), reducing AI citation confidence for any Comparison content it contains.

Queries affected: gg_105, gg_107, gg_119
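
The undated-comparison-page finding above has a mechanical fix: expose a machine-readable modification date. A minimal sketch assuming JSON-LD injected into the page head; the URL and dates are placeholders, not GoGuardian's actual values:

```python
import json

# Build Article JSON-LD carrying datePublished/dateModified so crawlers
# can read freshness directly. All values here are placeholders.
def article_jsonld(url: str, published: str, modified: str) -> str:
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "mainEntityOfPage": url,
        "datePublished": published,  # ISO 8601 date
        "dateModified": modified,    # the freshness signal AI crawlers weight
    }
    return json.dumps(data, indent=2)

snippet = article_jsonld(
    "https://example.com/teacher/vs-competitors",  # placeholder URL
    "2024-01-15",
    "2026-03-01",
)
```

Pairing this with a visible 'Last updated' line in the page body addresses the human-readable and crawler-readable signals at once.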

Layer 3 Narrative Intelligence Opportunities

Net new content addressing visibility and positioning gaps. Owner: Content Strategy. Timeline: Months.

NIO #1: CIPA Compliance & Data Privacy Documentation Hub
Gap Type: Content Type Deficit — GoGuardian lacks a dedicated CIPA compliance and data privacy content hub, despite its web filter natively supporting CIPA-mandated controls. 11 of 59 L3 gap queries (18.6%) target CIPA compliance, E-Rate documentation, and FERPA/COPPA adherence — Lightspeed wins the majority by providing structured compliance documentation AI models can extract and cite.
Critical

E-Rate funding is the lifeblood of K-12 technology purchasing, and CIPA compliance is the gate every Superintendent and CTO must pass to access it. GoGuardian's product demonstrably meets CIPA requirements, but its content provides no proof — no compliance guide, no E-Rate audit checklist, no FERPA/COPPA explainer a buyer can cite to a board or auditor. Lightspeed fills this vacuum with structured documentation, winning these queries by default. The commercial cost is direct: every district asking 'will this hold up in an E-Rate audit?' is sent to a competitor before GoGuardian enters the conversation. For districts — including mid-size buyers where GoGuardian's product is well-suited — this content absence functions as a pre-qualification failure.

Query Cluster
IDs: gg_006, gg_033, gg_043, gg_049, gg_061, gg_078, gg_114, gg_115, gg_134, gg_141, gg_144
“E-Rate audit is coming and I can't prove CIPA compliance — what are other districts using for documentation?”
“Which K-12 web filter has the best CIPA compliance reporting and E-Rate documentation — Lightspeed or Securly?”
“Draft a CIPA compliance checklist for evaluating web filtering vendors including E-Rate documentation requirements”
“Student privacy concerns with GoGuardian — do they comply with FERPA and COPPA?”
Blueprint
  • On-Domain: Create a dedicated /cipa-compliance landing page with a structured CIPA requirements checklist, E-Rate audit documentation guide, and GoGuardian's specific compliance certifications (FERPA, COPPA, CIPA) in citation-ready format with explicit regulatory section references.
  • On-Domain: Add a FERPA/COPPA/privacy explainer section to /privacy-and-trust, including data retention policies, student data handling procedures, and a downloadable security questionnaire template addressing FERPA, COPPA, and state-specific requirements.
  • On-Domain: Publish a blog post titled 'How GoGuardian's Web Filter Satisfies CIPA Requirements for E-Rate Funding' with structured headings matching the questions auditors ask (harmful content categories, monitoring requirements, CIPA Section 1709 specifics).
  • On-Domain: Add a state-level compliance appendix covering state-specific student internet safety mandates with explicit statement of GoGuardian's coverage per state.
  • Off-Domain: Seek co-authorship or citation in CoSN, ISTE, or SETDA compliance guidance documents to establish GoGuardian as a reference authority on K-12 regulatory compliance.
  • Off-Domain: Submit to EdTech industry roundup lists that cover 'CIPA-compliant web filters for E-Rate' — these are the third-party sources AI models cite when asked about compliance-certified vendors.
  • Off-Domain: Contribute a case study to AASA or COSN publications describing how GoGuardian helped a district pass an E-Rate audit, establishing third-party proof of compliance track record.
Platform Acuity

  • ChatGPT (high): ChatGPT leads visibility across this audit by 5pp over Claude and Gemini. Compliance queries with structured checklist format match ChatGPT's tendency to cite pages with enumerable, structured claim sets.
  • Claude (medium): Claude favors well-structured, factually grounded content with clear authority signals. A compliance page with regulatory citations (specific CIPA sections, FCC E-Rate program rules) would meet Claude's depth threshold for citation.
  • Gemini (high): Gemini responds strongly to structured data and comprehensive entity coverage. FAQPage and HowTo schema markup on a compliance page combined with explicit regulatory entity references (CIPA, E-Rate, FERPA) would maximize Gemini extraction.
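
The FAQPage markup recommended here can be generated as JSON-LD. A minimal sketch; the question and answer text below is illustrative filler, not proposed page copy:

```python
# Assemble FAQPage JSON-LD from question/answer pairs. The Q&A text is
# illustrative only; real copy would come from the compliance page itself.
def faq_jsonld(qa_pairs):
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }

markup = faq_jsonld([
    ("Does the web filter satisfy CIPA requirements for E-Rate funding?",
     "Illustrative answer text summarizing the page's compliance claims."),
])
```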

NIO #2: Family Engagement & Off-Network Device Protection Hub
Gap Type: Content Type Deficit — GoGuardian has no dedicated content hub for parent communication features or off-campus device protection, two capabilities its product supports but its content does not represent. 13 of 59 L3 gap queries (22%) target these themes, and Securly, Linewize, and Bark for Schools win by default because they publish specific parent-engagement and take-home-device content GoGuardian does not.
Critical

The shift to 1:1 device programs — where students take Chromebooks and iPads home — has created a buying requirement most legacy edtech vendors fail to address: what happens when the device leaves campus? GoGuardian's product provides off-network protection and parent visibility features, but no content page makes this claim in a form AI models can extract and cite. Securly and Bark for Schools win these queries by publishing specific parent-app documentation and off-campus monitoring explainers. For Director of Student Services and Superintendent personas — both veto holders — the absence of this content functions as a deal-disqualifying gap at requirements and Shortlisting stages where competitors are already present.

Query Cluster
IDs: gg_025, gg_027, gg_037, gg_039, gg_054, gg_056, gg_063, gg_067, gg_079, gg_089, gg_120, gg_135, gg_145
“Which K-12 web filter has the best off-network protection for take-home Chromebooks — Securly or Lightspeed?”
“How do schools give parents visibility into what their kids are doing on school devices at home?”
“Top school safety platforms with strong parent communication and take-home device visibility”
“Write a requirements document for off-network device protection covering take-home Chromebooks, parental controls, and off-campus safety monitoring”
Blueprint
  • On-Domain: Create a dedicated /off-network-protection landing page explaining GoGuardian's take-home device filtering architecture, including explicit coverage of Chromebooks, iPads, and Windows devices off school networks, with deployment configuration specifics.
  • On-Domain: Create a /parent-engagement landing page showcasing GoGuardian Parent app features: real-time alerts, browsing visibility, safety notification workflows, and privacy safeguards — written for both parent and administrator audiences.
  • On-Domain: Publish a Comparison blog post titled 'How GoGuardian Protects 1:1 Devices On and Off Campus' addressing the specific buyer question of what changes when a device leaves the network, with architecture diagrams.
  • On-Domain: Develop a board-presentation-ready page framing the liability and duty-of-care argument for off-campus device monitoring, citing relevant incident statistics and governance considerations.
  • Off-Domain: Pursue listing in COSN's Digital Citizenship and Parent Engagement resource guides, which are high-authority third-party sources AI models cite for parent-engagement queries.
  • Off-Domain: Seek media coverage or co-authored pieces in School Administrator Magazine or District Administration Magazine on off-campus student safety — establishing third-party corroboration for GoGuardian's off-network capabilities.
  • Off-Domain: Build or co-sponsor a parent technology survey that GoGuardian can publish as primary research, giving AI models a GoGuardian-sourced citation for parent engagement data.
Platform Acuity

  • ChatGPT (high): Parent engagement and off-network queries are specification-heavy questions. ChatGPT's stronger overall performance in this audit (+5pp vs Claude and Gemini) makes it the highest-impact platform for a structured product feature page in this cluster.
  • Claude (high): Queries like 'board-level argument for off-campus device protection' require nuanced framing and factual depth. A page with specific incident statistics and duty-of-care legal framing would match Claude's citation preferences.
  • Gemini (medium): Gemini responds to structured data. An /off-network-protection page with explicit OS/device coverage tables and integration specs would be extractable, but Gemini's slightly lower overall visibility in this audit signals a need for stronger structured markup.

NIO #3: YouTube Filtering Controls & BYOD Device Coverage
Gap Type: Content Type Deficit — GoGuardian has no dedicated content pages for YouTube filtering granularity or BYOD filtering, two feature areas its product supports but its content inventory classifies as 'thin.' 10 of 59 L3 gap queries (16.9%) target these capabilities, and Lightspeed and Securly win the majority by publishing specific YouTube control documentation and BYOD filtering architecture explainers.
High

YouTube is the single most contested content moderation battleground in K-12 schools — teachers need educational videos while administrators need to block inappropriate content, and buyers ask explicitly who handles this tension best. GoGuardian's web filter has YouTube filtering capabilities, but its content inventory contains no page that answers 'how granular are your YouTube controls?' or 'can I allow educational YouTube without opening everything?' Lightspeed wins this positioning vacuum. Similarly, BYOD filtering — student-owned devices on school networks — is a growing requirement that GoGuardian's DNS-based filtering can address, but no content page makes this claim. These 10 queries are resolved before GoGuardian enters the conversation.

Query Cluster
IDs: gg_012, gg_024, gg_026, gg_035, gg_038, gg_055, gg_060, gg_077, gg_093, gg_099
“Lightspeed Filter vs Securly for YouTube filtering controls in K-12 schools”
“Approaches to filtering YouTube in schools — blocking it entirely vs. granular video-level controls”
“K-12 web filtering platforms that handle BYOD without requiring agents on personal devices”
“Which K-12 web filter handles BYOD the best — we need filtering for student personal devices on the school network”
Blueprint
  • On-Domain: Create a dedicated /youtube-filtering landing page explaining GoGuardian's YouTube control architecture: channel-level blocking, educational-content allow-listing, SafeSearch enforcement, and teacher override capabilities — with a Comparison table against 'block YouTube entirely' approaches.
  • On-Domain: Create a /byod-filtering page explaining GoGuardian DNS's agentless BYOD filtering capability: how DNS-based filtering works for personal devices on school networks, which device types are supported, and how administrators configure network-level policies without installing agents on personal hardware.
  • On-Domain: Publish a blog post 'YouTube in the Classroom: How GoGuardian Lets Teachers Use Educational Videos Without Opening the Floodgates' — directly addressing the overblocking problem in a format curriculum directors share.
  • On-Domain: Add a BYOD architecture diagram and FAQ section to the /dns product page, explicitly addressing the 'without requiring agents on personal devices' query pattern.
  • Off-Domain: Pursue citation in ISTE and EdSurge roundup articles covering YouTube filtering tools and BYOD filtering options — these are the high-authority third-party URLs currently winning many of these queries.
  • Off-Domain: Submit to CoSN's BYOD guidance resources, which are frequently cited by AI models when K-12 administrators ask about personal device management on school networks.
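
The agentless DNS filtering mechanism the /byod-filtering page would document can be shown with a toy resolver check: the network's resolver consults a category policy before answering, so nothing is installed on the personal device. All domains, categories, and addresses below are invented and do not describe GoGuardian DNS's actual implementation:

```python
# Toy DNS-level policy check. A real deployment would sit in the school's
# resolver path; this sketch only illustrates the decision logic.
POLICY = {"blocked_categories": {"adult", "gambling"}}
CATEGORY_DB = {                     # invented categorization data
    "example-casino.test": "gambling",
    "khanacademy.org": "education",
}
SINKHOLE_IP = "0.0.0.0"             # answer returned for blocked domains
UPSTREAM = {"khanacademy.org": "203.0.113.10"}  # stand-in upstream answers

def resolve(domain: str) -> str:
    category = CATEGORY_DB.get(domain, "uncategorized")
    if category in POLICY["blocked_categories"]:
        return SINKHOLE_IP          # filtered at the network, no agent needed
    return UPSTREAM.get(domain, "198.51.100.1")  # default stand-in answer
```

Because the enforcement point is the resolver, the same policy covers school-managed and student-owned devices alike, which is the claim the page needs to make extractable.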
Platform Acuity

  • ChatGPT (high): Technical feature queries (YouTube filtering granularity, agentless BYOD) perform well on ChatGPT, which leads this audit by 5pp. A structured feature page with explicit capability claims and Comparison tables matches ChatGPT's citation pattern.
  • Claude (medium): Claude favors factual depth and authoritative framing. A YouTube filtering page that cites specific methodology (DNS vs agent-based, SafeSearch API integration) would meet Claude's depth requirements.
  • Gemini (high): Gemini responds strongly to structured data. Device compatibility tables, filtering level comparisons, and explicit entity relationships (GoGuardian DNS + BYOD + school network) would make this content highly extractable for Gemini.

NIO #4: Usage Analytics & EdTech ROI Intelligence Hub
Gap Type: Structural Gap — GoGuardian has usage analytics and reporting capabilities across its platform but no content hub that makes these capabilities visible to buyers evaluating edtech spend accountability. 11 of 59 L3 gap queries (18.6%) target edtech usage analytics, ROI measurement, and reporting visibility — and Lightspeed wins the majority by positioning its analytics features as an explicit value proposition for budget-conscious superintendents and CTOs.
High

K-12 technology budgets are under rising board-level scrutiny, and the question 'which of our edtech licenses are actually being used?' has become a deal-stage conversation. GoGuardian's platform generates usage data as a byproduct of its filtering and monitoring functions — data that could directly answer Superintendent and CTO concerns about software ROI and license waste. But GoGuardian's content never frames this as a value proposition. Lightspeed wins these queries by explicitly marketing its analytics as an edtech spend management tool. These are Superintendent and CTO queries at consensus-creation and requirements stages — the final commercial gate before a purchase decision — where GoGuardian's data advantages could close deals that its product capabilities have already earned.

Query Cluster
IDs: gg_009, gg_023, gg_028, gg_036, gg_041, gg_058, gg_059, gg_098, gg_124, gg_130, gg_149
“We're paying for dozens of edtech tools and nobody can tell me which ones teachers actually use”
“K-12 edtech usage analytics tools that show which software licenses are actually being used”
“Lightspeed vs Securly for usage reporting — which gives IT admins better visibility into app and website usage?”
“We found $200K in unused edtech licenses last year — how do other districts use usage analytics to justify cutting shelfware?”
Blueprint
  • On-Domain: Create a /usage-analytics (or /edtech-insights) landing page framing GoGuardian's reporting capabilities as an edtech accountability tool: what data it surfaces (app usage, browsing patterns, engagement signals), how administrators act on it, and what decisions it enables.
  • On-Domain: Publish a blog post 'How GoGuardian Helps Districts Identify Unused EdTech Licenses' — using real or composite customer data to illustrate the ROI case, directly intercepting the 'we found unused licenses' query pattern.
  • On-Domain: Add an IT Admin reporting section to the /admin product page with explicit call-outs to usage analytics, app-level reporting, and district-wide dashboard capabilities — with screenshot or demo embed.
  • On-Domain: Create a 'Justifying EdTech Spend with GoGuardian Usage Data' resource page or downloadable guide, directly addressing the consensus-creation and artifact-creation query patterns for board presentations.
  • Off-Domain: Pursue placement in CoSN annual edtech spending reports or digital equity surveys — high-authority sources AI models cite for K-12 technology ROI queries.
  • Off-Domain: Seek independent case study placement in EdTech Magazine, THE Journal, or eSchool News covering a GoGuardian customer district that used usage data to rationalize its edtech stack.
  • Off-Domain: Submit data or analysis to an edtech research organization for independent Validation of GoGuardian's usage analytics methodology, generating citable third-party corroboration.
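
The shelfware calculation behind these queries is simple enough to sketch. Tool names and figures below are invented; a real page would use actual district usage data:

```python
# Flag tools whose seat utilization falls below a threshold. All inputs
# here are invented illustration data.
def underused(licenses: dict, active_users: dict, threshold: float = 0.3) -> dict:
    flagged = {}
    for tool, seats in licenses.items():
        utilization = active_users.get(tool, 0) / seats if seats else 0.0
        if utilization < threshold:
            flagged[tool] = round(utilization, 2)
    return flagged

report = underused(
    {"MathApp": 10_000, "ReadingTool": 8_000, "QuizTool": 5_000},
    {"MathApp": 9_200, "ReadingTool": 1_200, "QuizTool": 4_800},
)
# ReadingTool, at 15% utilization, is the license-cut candidate
```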
Platform Acuity

  • ChatGPT (high): Data-driven queries ('which licenses are actually being used', 'how to justify edtech spend') match ChatGPT's preference for content with specific, extractable data points. A page with real usage statistics and ROI calculations would be highly citable.
  • Claude (high): Claude favors content with clear causal reasoning. An edtech ROI page that explains the data collection mechanism, what insights it surfaces, and how those insights connect to budget decisions would match Claude's citation criteria.
  • Gemini (medium): Gemini responds to structured data. A usage analytics page with structured data tables (feature comparisons, ROI formulas) would score well, but Gemini's slightly lower overall visibility in this audit requires additional structured markup to maximize extraction.

NIO #5: Comparison Architecture & Self-Positioning Content
Gap Type: Structural Gap — 14 of 59 L3 gap queries (23.7%) involve Comparison or Shortlisting contexts where GoGuardian's existing content uses the wrong page type — product and feature pages matched to queries requiring Comparison pages or case-study-style content — or GoGuardian has no coverage entry at all. Lightspeed wins the majority because it maintains structured Comparison and Shortlisting content architectures that GoGuardian's current site does not replicate.
High

GoGuardian appears in 100% of Shortlisting queries across this audit — but wins 0 of 25 (0% Shortlisting win rate). This paradox has a structural explanation: AI models retrieve GoGuardian's feature and product pages when buyers ask 'best tools for X' or 'compare X vs Y,' but those page types don't contain the third-party-validated, Comparison-structured content that earns a primary recommendation. Meanwhile, 6 queries — including 'GoGuardian implementation problems,' 'hidden costs of GoGuardian,' and 'Is GoGuardian a good choice for a mid-size district?' — have no coverage entry at all, meaning GoGuardian never appears when buyers are actively vetting it. If GoGuardian's own site does not answer 'GoGuardian implementation problems,' a competitor's page will. These 14 queries represent the most structurally tractable gap in the L3 set: new page types (validated Comparison matrices, transparent implementation guides, TCO models) would directly address the Shortlisting win-rate failure.

Query Cluster
IDs: gg_047, gg_057, gg_062, gg_068, gg_073, gg_086, gg_092, gg_095, gg_101, gg_110, gg_117, gg_125, gg_138, gg_139
“Is GoGuardian a good choice for a mid-size school district with 12,000 students?”
“GoGuardian implementation problems for large school districts”
“Hidden costs of GoGuardian that school districts don't expect — licensing, training, add-ons”
“Create a vendor Comparison scorecard for Lightspeed Systems, Securly, and Gaggle focused on web filtering and student safety”
Blueprint
  • On-Domain: Create a 'GoGuardian vs. [Lightspeed | Securly | Dyknow | LanSchool]' structured Comparison series with separate URLs for each head-to-head — each page answering the specific query patterns (cross-platform support, implementation complexity, pricing transparency) that these queries target.
  • On-Domain: Publish a transparent 'GoGuardian Implementation Guide for Mid-Size Districts' covering deployment timelines, known complexity points, training requirements, and migration paths from common competitors — directly answering the 'implementation problems' and 'hidden costs' queries with GoGuardian's own voice.
  • On-Domain: Create a '/tco-calculator' or 'Total Cost of Ownership' resource page with a 3-year TCO model for districts of 5K, 10K, and 15K students, covering licensing, implementation, training, and renewal costs — making GoGuardian's pricing structure citable for board presentations.
  • On-Domain: Build a cross-platform Comparison matrix page (/cross-platform-support) with a structured table showing GoGuardian's coverage across Chromebook, Windows, iPad, Android, and BYOD scenarios — specifically designed for the 'which filter works across all device types' Shortlisting query pattern.
  • Off-Domain: Pursue G2, TrustRadius, and Capterra profile optimization — AI models cite these platforms heavily for 'GoGuardian implementation problems' and 'customer complaints' queries. Complete, well-reviewed profiles with specific implementation guidance are the primary off-domain lever for this cluster.
  • Off-Domain: Submit to EdTech industry analyst reports (EdWeek Market Brief, HolonIQ) that publish K-12 edtech vendor comparisons — these are the authoritative third-party sources AI models cite for 'best platform for mid-size district' queries.
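
The 3-year TCO model the blueprint proposes reduces to a small calculation the page could expose. Every dollar figure below is a placeholder assumption, not GoGuardian pricing:

```python
# 3-year TCO sketch: flat implementation and training costs plus annual
# per-student licensing with a renewal uplift. All rates are invented.
def three_year_tco(students: int,
                   per_student_license: float = 6.00,  # placeholder $/student/yr
                   implementation_flat: float = 5_000.0,
                   training_flat: float = 3_000.0,
                   renewal_uplift: float = 0.03) -> float:
    total = implementation_flat + training_flat
    annual = students * per_student_license
    for year in range(3):  # year 0 at list price, renewals uplifted 3%/yr
        total += annual * (1 + renewal_uplift) ** year
    return round(total, 2)

tco_10k = three_year_tco(10_000)   # the 10K-student district scenario
```

Publishing the model's inputs alongside its outputs is what makes the resulting numbers citable in board presentations.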
Platform Acuity

  • ChatGPT (high): Shortlisting and Comparison queries favor ChatGPT in this audit (it leads by 5pp overall). Structured Comparison content with explicit feature matrices and third-party Validation signals matches ChatGPT's tendency to synthesize competitor comparisons from well-structured source pages.
  • Claude (high): Claude is well-suited to nuanced vendor evaluation queries. A 'GoGuardian for mid-size districts' page with specific customer outcomes, implementation evidence, and balanced discussion of limitations would match Claude's preference for substantive, balanced content.
  • Gemini (medium): Gemini responds to structured entity data. Comparison matrix pages with schema markup (e.g., Table, ItemList) and explicit entity relationships (GoGuardian + district size + feature coverage) would be highly extractable, but Gemini's citation gap in Comparison queries requires stronger structured data implementation.
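
One way to make a comparison matrix extractable as structured data: schema.org has no dedicated comparison-page type, so this sketch uses an ItemList of Product entities, a common but assumed pattern. Vendor names and ordering are illustrative:

```python
# ItemList JSON-LD skeleton for a vendor comparison matrix. Per-vendor
# feature properties would hang off each Product entry.
vendors = ["GoGuardian", "Lightspeed", "Securly"]
matrix = {
    "@context": "https://schema.org",
    "@type": "ItemList",
    "name": "K-12 web filter comparison",
    "itemListElement": [
        {"@type": "ListItem", "position": i + 1,
         "item": {"@type": "Product", "name": vendor}}
        for i, vendor in enumerate(vendors)
    ],
}
```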

Unified Priority Ranking

All recommendations across all three layers, ranked by commercial impact × implementation speed.

  • 1

    Majority of blog content is severely outdated

    10 out of 19 content marketing pages have confirmed publication dates older than 365 days, with the oldest dating to February 2018. Seven blog posts were published in 2018-2019 and have not been visibly updated. Only 2 blog posts (August 2025, May 2025) fall within the past 12 months, and none fall within the 90-day dominant AI citation window.

    Technical Fix · Content · 10+ blog posts targeting commercial-intent queries for web filtering, student safety, and educational technology evaluation
  • 2

    CIPA Compliance & Data Privacy Documentation Hub

    GoGuardian lacks a dedicated CIPA compliance and data privacy content hub, despite its web filter natively supporting CIPA-mandated controls. 11 of 59 L3 gap queries (18.6%) target CIPA compliance, E-Rate documentation, and FERPA/COPPA adherence — Lightspeed wins the majority by providing structured compliance documentation AI models can extract and cite.

    New Content · Content · 11 queries affecting personas: Chief Technology Officer / Director of Technology, Superintendent, Network Administrator / Systems Engineer
  • 3

    Family Engagement & Off-Network Device Protection Hub

    GoGuardian has no dedicated content hub for parent communication features or off-campus device protection, two capabilities its product supports but its content does not represent. 13 of 59 L3 gap queries (22%) target these themes, and Securly, Linewize, and Bark for Schools win by default because they publish specific parent-engagement and take-home-device content GoGuardian does not.

    New Content · Content · 13 queries affecting personas: Director of Student Services & Safety, Superintendent, Chief Technology Officer / Director of Technology, Network Administrator / Systems Engineer
  • 4

    Beacon/vs-competitors page: three-way safety alerting Comparison matrices

    The /beacon/vs-competitors page does not include a section addressing the 'Gaggle vs Securly' Comparison (gg_072, gg_083) — buyers evaluating these two vendors cannot find GoGuardian's perspective, so GoGuardian is excluded from their decision process.

    Content Optimization → New Content · Content · 7 queries, personas: Director of Student Services & Safety, Superintendent, Chief Technology Officer / Director of Technology
  • 5

    Comparison Architecture & Self-Positioning Content

    14 of 59 L3 gap queries (23.7%) involve Comparison or Shortlisting contexts where GoGuardian's existing content uses the wrong page type — product and feature pages matched to queries requiring Comparison pages or case-study-style content — or GoGuardian has no coverage entry at all. Lightspeed wins the majority because it maintains structured Comparison and Shortlisting content architectures that GoGuardian's current site does not replicate.

    New Content · Content · 14 queries affecting personas: Chief Technology Officer / Director of Technology, Superintendent, Network Administrator / Systems Engineer, Director of Curriculum & Instruction
  • 6

    Competitor-Comparison page: multi-vendor switching ROI and executive cost analysis

The /competitor-comparison page does not address the 'We're running Lightspeed and Gaggle separately — would switching to a single platform save money?' scenario (gg_100) — Lightspeed wins this query from a buyer already using its product by providing a consolidation-with-Lightspeed narrative.

    Content Optimization → New Content · Content · 3 queries, personas: Superintendent, Chief Technology Officer / Director of Technology
  • 7

    Competitor-Comparison page: platform consolidation strategic framework

The /competitor-comparison page does not provide a structured 'one platform vs. best-of-breed' decision framework (gg_017) — a buyer asking 'should we consolidate or use separate tools?' cannot find GoGuardian's reasoned answer on this page.

    Content Optimization → New Content · Content · 3 queries, personas: Superintendent, Chief Technology Officer / Director of Technology, Director of Student Services & Safety
  • 8

    Usage Analytics & EdTech ROI Intelligence Hub

    GoGuardian has usage analytics and reporting capabilities across its platform but no content hub that makes these capabilities visible to buyers evaluating edtech spend accountability. 11 of 59 L3 gap queries (18.6%) target edtech usage analytics, ROI measurement, and reporting visibility — and Lightspeed wins the majority by positioning its analytics features as an explicit value proposition for budget-conscious superintendents and CTOs.

    New Content · Content · 11 queries affecting personas: Superintendent, Chief Technology Officer / Director of Technology, Network Administrator / Systems Engineer
  • 9

    YouTube Filtering Controls & BYOD Device Coverage

    GoGuardian has no dedicated content pages for YouTube filtering granularity or BYOD filtering, two feature areas its product supports but its content inventory classifies as 'thin.' 10 of 59 L3 gap queries (16.9%) target these capabilities, and Lightspeed and Securly win the majority by publishing specific YouTube control documentation and BYOD filtering architecture explainers.

    New Content · Content · 10 queries affecting personas: Director of Curriculum & Instruction, Network Administrator / Systems Engineer, Director of Student Services & Safety
  • 10

    Admin page: consensus, ROI, and RFP support content

    The /admin page has no ROI or payback period section — buyers asking 'typical payback period for web filtering deployment' (gg_133) cannot find a GoGuardian-sourced answer, and Lightspeed wins by default because it publishes ROI data.

    Content Optimization · Content · 5 queries, personas: Superintendent, Chief Technology Officer / Director of Technology, Network Administrator / Systems Engineer
  • 11

    Admin page: evaluation criteria and Shortlisting buyer framework

    The /admin page lists features but does not frame them as evaluation criteria — buyers asking 'what features matter most for a 10,000-student district?' cannot use the page to build a shortlist or score vendors.

    Content Optimization · Content · 4 queries, personas: Chief Technology Officer / Director of Technology, Network Administrator / Systems Engineer, Superintendent
  • 12

    Admin page: overblocking remediation and VPN bypass prevention content

    The /admin page has no section addressing overblocking — it describes filtering capabilities but never acknowledges or solves the 'our filter blocks half the educational sites teachers need' problem buyers bring to the evaluation.

    Content Optimization · Content · 4 queries, personas: Network Administrator / Systems Engineer, Superintendent, Chief Technology Officer / Director of Technology
  • 13

    Admin/vs-competitors page: GoGuardian vs Lightspeed direct Comparison depth

    The /admin/vs-competitors page does not contain a dedicated 'GoGuardian Admin vs. Lightspeed Filter' section with a structured feature-by-feature Comparison table — the primary Comparison query (gg_069) covering a 10,000-student district context is not specifically addressed.

    Content Optimization · Content · 4 queries, personas: Chief Technology Officer / Director of Technology, Network Administrator / Systems Engineer, Superintendent
  • 14

    Admin/vs-competitors page: competitor risk and complaint counter-positioning

    The /admin/vs-competitors page presents GoGuardian's strengths but does not address the specific complaints buyers are researching: Lightspeed's reported implementation complexity (gg_109), Securly's customer service issues (gg_103), and bypass vulnerability patterns (gg_121).

    Content Optimization · Content · 5 queries, personas: Chief Technology Officer / Director of Technology, Network Administrator / Systems Engineer, Superintendent
  • 15

    Beacon page: alert fatigue, AI detection methodology, and problem identification

    The /beacon page does not address the '200 alerts a day and counselors are ignoring them' alert fatigue problem (gg_010) — it describes Beacon's detection capabilities but does not explain how GoGuardian reduces noise and prioritizes actionable alerts over raw volume.

    Content Optimization · Content · 4 queries, personas: Director of Student Services & Safety, Chief Technology Officer / Director of Technology
  • 16

    Beacon page: consensus building — liability, ROI, and case studies with safety outcomes

    The /beacon page does not include a liability/duty-of-care section (gg_131) addressing the question 'what is a district's liability if it doesn't deploy student safety monitoring?' — Gaggle wins this framing by publishing governance and legal risk content.

    Content Optimization · Content · 5 queries, personas: Superintendent, Director of Student Services & Safety, Chief Technology Officer / Director of Technology
  • 17

    Beacon page: evaluation criteria, false positive rates, and requirements checklist

    The /beacon page does not include a 'Must-Have Features for Student Safety Platforms' section (gg_031) — buyers building vendor scorecards cannot use the page to develop evaluation criteria, so they rely on competitor pages that do provide this structure.

    Content Optimization · Content · 4 queries, personas: Director of Student Services & Safety, Network Administrator / Systems Engineer
  • 18

    Beacon page: Shortlisting presence — 24/7 monitoring and multi-platform coverage

    The /beacon page does not prominently feature GoGuardian Beacon's 24/7 off-hours monitoring capability (gg_053) as a top-level claim — the product-update/beacon-24-7 page exists but is buried, and the main /beacon page does not reference it prominently enough for AI extraction.

    Content Optimization · Content · 3 queries, personas: Director of Student Services & Safety, Superintendent, Network Administrator / Systems Engineer
  • 19

    Beacon page: Validation queries — privacy, false positives, and vendor performance data

    The /beacon page does not include specific performance data (detection rate, response time, false positive rate) that would allow AI models to confidently recommend GoGuardian over Gaggle or Securly in Validation queries (gg_104, gg_111).

    Content Optimization · Content · 5 queries, personas: Director of Student Services & Safety, Chief Technology Officer / Director of Technology, Superintendent
  • 20

    Windows and Apple pages: cross-platform unified filtering narrative

    The /windows page documents Windows support but does not address the 'we have Chromebooks, Windows laptops, and iPads — how do we enforce consistent filtering across all of them?' problem scenario (gg_004) — the cross-platform consistency question requires content that bridges all three OS pages.

    Content Optimization · Content · 6 queries, personas: Network Administrator / Systems Engineer, Chief Technology Officer / Director of Technology
  • 21

    High-value competitor comparison pages have no visible dates

    All four competitor comparison pages (/admin/vs-competitors, /teacher/vs-competitors, /beacon/vs-competitors, /competitor-comparison) show no visible publication or last-updated dates. These pages are classified as content_marketing and default to a freshness score of 0.2 under the scoring methodology.

    Technical Fix · Marketing · 4 competitor comparison pages covering Admin, Teacher, Beacon, and the main comparison hub
  • 22

    Pricing page contains no actionable pricing information

    The pricing page at /pricing scores 0.3 on content depth and 0.4 on heading hierarchy. It contains only a lead generation form with four brief paragraphs about pricing factors (enrollment, bundles, contract length, professional services) but no actual pricing tiers, ranges, per-student costs, or plan comparisons.

    Technical Fix · Marketing · Pricing page (/pricing) — a critical decision-stage page for buyers evaluating GoGuardian
  • 23

    Schema markup, meta descriptions, and OG tags require manual verification

    Our analysis method returns rendered page content as markdown text, so JSON-LD schema blocks, meta description tags, and Open Graph tags are not visible in the output. We cannot confirm the presence or absence of structured data markup on any of the 41 analyzed pages.

    Technical Fix · Engineering · All 41 analyzed pages; highest priority for product pages, Comparison pages, and blog posts
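    Until the pipeline exposes raw HTML, this check can be run manually against page source. A minimal sketch (assuming Python; the parser class, function name, and sample markup are illustrative, not part of the audit tooling) that scans a page's raw HTML for the three signal types:

    ```python
    from html.parser import HTMLParser

    class HeadAuditParser(HTMLParser):
        """Collects head-level signals an AI crawler can read without executing
        JavaScript: JSON-LD blocks, the meta description, and Open Graph tags."""
        def __init__(self):
            super().__init__()
            self._in_jsonld = False
            self.jsonld_blocks = []
            self.meta_description = None
            self.og_tags = {}

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "script" and a.get("type") == "application/ld+json":
                self._in_jsonld = True
                self.jsonld_blocks.append("")
            elif tag == "meta":
                if a.get("name") == "description":
                    self.meta_description = a.get("content")
                elif (a.get("property") or "").startswith("og:"):
                    self.og_tags[a["property"]] = a.get("content")

        def handle_endtag(self, tag):
            if tag == "script":
                self._in_jsonld = False

        def handle_data(self, data):
            if self._in_jsonld:
                self.jsonld_blocks[-1] += data

    def audit_head(raw_html: str) -> dict:
        """Summarize which structured-data signals are present in raw HTML."""
        p = HeadAuditParser()
        p.feed(raw_html)
        return {
            "jsonld_count": len(p.jsonld_blocks),
            "has_meta_description": p.meta_description is not None,
            "og_tags": sorted(p.og_tags),
        }
    ```

    Run against the raw (pre-render) HTML of each of the 41 pages, this distinguishes "markup absent" from "markup invisible to our pipeline."
    
    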
  • 24

    Sitemap lacks lastmod timestamps on all 1,027 URLs

    The sitemap at https://www.goguardian.com/sitemap.xml contains 1,027 URL entries, but none includes the optional lastmod, changefreq, or priority elements; only the required loc element is present for each URL.

    Technical Fix · Engineering · All 1,027 URLs in sitemap.xml
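    As an illustration of the fix, a minimal sketch (assuming Python; the function name is ours, and the date mapping must come from the CMS's real publish/update records, never be invented) that injects lastmod into sitemap URL entries:

    ```python
    import xml.etree.ElementTree as ET

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    ET.register_namespace("", NS)  # serialize without a namespace prefix

    def add_lastmod(sitemap_xml: str, dates: dict) -> str:
        """Add a <lastmod> child to every <url> whose <loc> has a known
        modification date. `dates` maps loc URL -> ISO-8601 date string,
        sourced from the CMS's actual update records."""
        root = ET.fromstring(sitemap_xml)
        for url in root.findall(f"{{{NS}}}url"):
            loc = url.find(f"{{{NS}}}loc").text.strip()
            if loc in dates and url.find(f"{{{NS}}}lastmod") is None:
                ET.SubElement(url, f"{{{NS}}}lastmod").text = dates[loc]
        return ET.tostring(root, encoding="unicode")
    ```

    The same pass could emit changefreq and priority, but lastmod is the element AI crawlers actually use for freshness, so it is the one that matters here.
    
    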
  • 25

    Teacher/vs-competitors page: three-way classroom management comparison matrices

    The /teacher/vs-competitors page does not include a 'Dyknow vs LanSchool — and Why GoGuardian Teacher Is the Better Choice' section (gg_071, gg_091) — buyers evaluating these two competitors are unreachable by GoGuardian's current content architecture.

    Content Optimization → New Content · Content · 4 queries, personas: Director of Curriculum & Instruction, Network Administrator / Systems Engineer, Chief Technology Officer / Director of Technology
  • 26

    Hall-pass page: digital vs paper comparison, integration specs, and disruption data

    The /hall-pass page does not address the 'how digital hall pass works compared to paper passes' solution-exploration query (gg_022) — a fundamental buyer education question that the page should answer to enter the consideration set for districts not yet using digital passes.

    Content Optimization · Content · 4 queries, personas: Director of Curriculum & Instruction, Chief Technology Officer / Director of Technology
  • 27

    Teacher page: consensus and adoption — implementation evidence, teacher buy-in, and artifact creation

    The /teacher page has no section providing evidence that classroom management software improves instructional time or student engagement (gg_136) — curriculum directors and superintendents need this research to justify the investment to skeptical teachers.

    Content Optimization · Content · 4 queries, personas: Director of Curriculum & Instruction, Superintendent
  • 28

    Teacher page: problem framing, Google Workspace integration, and solution exploration

    The /teacher page does not open with the teacher's problem ('half the class is on YouTube and I can't teach') — it presents GoGuardian's features without connecting them to the specific distraction crisis curriculum directors are researching when they search gg_003.

    Content Optimization · Content · 3 queries, personas: Director of Curriculum & Instruction, Superintendent
  • 29

    Teacher page: Shortlisting — low-tech teacher adoption and tab control prominence

    The /teacher page does not have an explicit 'Easy for Non-Technical Teachers' positioning section — gg_052's 'low tech skills can actually learn quickly' query finds no clear winner because no vendor prominently owns this positioning, leaving an open opportunity for GoGuardian.

    Content Optimization · Content · 3 queries, personas: Director of Curriculum & Instruction, Superintendent
  • 30

    Teacher page: Validation — GoGuardian usability complaints and competitor lock-in counter-positioning

    The /teacher page has no section addressing the 'common complaints about GoGuardian from teachers' query (gg_107) — GoGuardian's own page is silent on this, meaning AI models cannot cite GoGuardian's perspective and must defer to third-party review sites (which may present negative framing).

    Content Optimization · Content · 3 queries, personas: Director of Curriculum & Instruction, Network Administrator / Systems Engineer, Superintendent
  • 31

    Client-side rendering status cannot be confirmed

    All 41 analyzed pages returned substantive content via our rendering pipeline, suggesting server-side rendering or pre-rendering is in place. However, we cannot confirm whether any pages rely on client-side JavaScript rendering that might fail for certain AI crawlers that do not execute JavaScript.

    Technical Fix · Engineering · All pages, but particularly product and Comparison pages that drive conversion
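    One way to close this gap is to diff what a non-JS crawler sees against the rendered output. A minimal sketch (assuming Python; the 50% threshold, class, and function names are illustrative assumptions, not calibrated values):

    ```python
    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        """Extracts visible text from raw HTML, skipping script/style/noscript,
        approximating what a crawler that does not execute JavaScript can read."""
        SKIP = {"script", "style", "noscript"}

        def __init__(self):
            super().__init__()
            self._skip_depth = 0
            self.chunks = []

        def handle_starttag(self, tag, attrs):
            if tag in self.SKIP:
                self._skip_depth += 1

        def handle_endtag(self, tag):
            if tag in self.SKIP and self._skip_depth:
                self._skip_depth -= 1

        def handle_data(self, data):
            if not self._skip_depth and data.strip():
                self.chunks.append(data.strip())

    def csr_risk(raw_html: str, rendered_text_len: int, threshold: float = 0.5) -> bool:
        """True when the no-JS text is under `threshold` of the rendered text
        length, i.e. the page likely depends on client-side rendering."""
        p = TextExtractor()
        p.feed(raw_html)
        raw_len = len(" ".join(p.chunks))
        return raw_len < threshold * rendered_text_len
    ```

    An empty app-shell page (a bare root div plus a script bundle) flags as at-risk; a server-rendered page whose raw text roughly matches the rendered text does not.
    
    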

Workstream Mapping

All three workstreams can start this week.

Engineering / DevOps

Layer 1 — Technical Fixes
Timeline: Days to 2 weeks
  • Majority of blog content is severely outdated
  • Sitemap lacks lastmod timestamps on all 1,027 URLs
  • High-value competitor comparison pages have no visible dates
  • Schema markup, meta descriptions, and OG tags require…

Content Team

Layer 2 — Content Optimization
Timeline: 2–6 weeks
  • Admin page: overblocking remediation and VPN bypass…
  • Admin page: evaluation criteria and Shortlisting buyer…
  • Admin/vs-competitors page: competitor risk and complaint…
  • Admin page: consensus, ROI, and RFP support content

Content Strategy

Layer 3 — NIOs + Off-Domain
Timeline: 1–3 months
  • Create a dedicated /cipa-compliance landing page with a…
  • Create a dedicated /off-network-protection landing page…
  • Create a dedicated /youtube-filtering landing page…
  • Create a /usage-analytics (or /edtech-insights) landing…
  • Create a 'GoGuardian vs. [Lightspeed | Securly | Dyknow |…

[Synthesis] Execution order is not optional: L1 technical fixes — particularly sitemap lastmod deployment and blog content refresh with visible dates — must precede L2 and L3 work because AI crawlers use freshness signals to prioritize re-crawl. New L3 content and refreshed L2 pages will not receive their true freshness credit until sitemap timestamps signal their update dates. L2 content optimizations (20 recommendations) then improve the 83 existing-page gaps before L3 NIOs add new content for the 59 invisible queries.

The 5 NIOs represent the highest structural leverage: these are entirely uncontested query sets where GoGuardian's product is capable but content is absent.

Methodology
Audit Methodology

Query Construction

150 queries constructed from a persona × buying job × feature focus × pain point matrix
Every query carries four metadata fields assigned at creation time
High-intent jobs (Shortlisting + Comparison + Validation): 54% of queries (81 of 150)
Queries span the full buying journey.
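The matrix construction can be sketched as follows (Python; the personas and buying jobs are taken from this report, while the feature focuses, pain points, seed, and function name are illustrative placeholders):

```python
import itertools
import random

# Personas and buying jobs come from this report's methodology; the feature
# focuses and pain points below are illustrative placeholders, not the real set.
PERSONAS = ["Chief Technology Officer / Director of Technology", "Superintendent",
            "Director of Student Services & Safety",
            "Network Administrator / Systems Engineer",
            "Director of Curriculum & Instruction"]
JOBS = ["Artifact Creation", "Comparison", "Consensus Creation",
        "Problem Identification", "Requirements Building",
        "Shortlisting", "Solution Exploration", "Validation"]
FEATURES = ["web filtering", "safety monitoring", "classroom management"]  # placeholder
PAINS = ["overblocking", "alert fatigue", "cross-platform consistency"]    # placeholder

def build_query_set(n: int = 150, seed: int = 7) -> list:
    """Every candidate carries the four metadata fields at creation time;
    a seeded sample trims the full matrix down to the audited n queries."""
    matrix = [{"persona": p, "job": j, "feature": f, "pain": pp}
              for p, j, f, pp in itertools.product(PERSONAS, JOBS, FEATURES, PAINS)]
    return random.Random(seed).sample(matrix, n)
```

The point of the sketch is the guarantee it encodes: no query can exist without all four metadata fields, because they are assigned by construction rather than tagged after the fact.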

Personas

Chief Technology Officer / Director of Technology — Chief Technology Officer / Director of Technology · Decision Maker
Superintendent — Superintendent · Decision Maker
Director of Student Services & Safety — Director of Student Services & Safety · Evaluator
Network Administrator / Systems Engineer — Network Administrator / Systems Engineer · Decision Maker
Director of Curriculum & Instruction — Director of Curriculum & Instruction · Evaluator

Buying Jobs Framework

8 non-linear buying jobs: Artifact Creation, Comparison, Consensus Creation, Problem Identification, Requirements Building, Shortlisting, Solution Exploration, Validation

Competitive Set

Primary: Lightspeed Systems, Securly, Gaggle, Dyknow, LanSchool
Secondary: Hāpara, Bark for Schools, Blocksi, Linewize
Surprise: Cisco Umbrella, ManagedMethods, Deledao — flagged for review

Platforms & Scoring

Platforms: ChatGPT + Claude + Gemini
Platforms were selected based on market share among the client's buyer segment and AI search adoption patterns. This audit deviates from the standard ChatGPT + Perplexity pair: Claude was included as an audited platform. The audit is produced by an independent pipeline; no platform-specific optimization is applied to query construction or result interpretation.
Visibility: Binary — does the client appear in the response?
Win rate: Of visible queries, is the client the primary recommendation?

Cross-Platform Counting (Union Method)

When a query is run on multiple platforms, union logic is applied: a query counts as “visible” if the client appears on any platform, not each platform separately.
Winner resolution: When platforms disagree on the winner, majority vote is used. Vendor names are preferred over meta-values (e.g. “no clear winner”). True ties resolve to “no clear winner.”
Share of Voice: Each entity is counted once per query across platforms (union dedup), preventing double-counting when multiple platforms mention the same company.
This approach ensures headline metrics reflect real buyer-query outcomes rather than inflated per-platform counts.
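The union and winner-resolution rules above can be sketched as follows (Python; the function names are ours, and treating "vendor names preferred over meta-values" as a tie-break among the top vote-getters is our reading of the rule):

```python
from collections import Counter

def union_visibility(platform_results: dict, client: str) -> bool:
    """Visible if the client appears in ANY platform's response (union logic).
    `platform_results` maps platform name -> list of mentioned vendors."""
    return any(client in mentioned for mentioned in platform_results.values())

def resolve_winner(platform_winners: dict) -> str:
    """Majority vote across platforms. Among the top vote-getters, a vendor
    name beats the meta-value 'no clear winner'; a true tie between vendors
    resolves to 'no clear winner'."""
    counts = Counter(platform_winners.values())
    best = max(counts.values())
    leaders = [w for w, c in counts.items() if c == best]
    vendors = [w for w in leaders if w != "no clear winner"]
    return vendors[0] if len(vendors) == 1 else "no clear winner"
```

Under these rules a 2-vs-1 split resolves to the majority vendor, a vendor tied with "no clear winner" resolves to the vendor, and a vendor-vs-vendor tie resolves to "no clear winner."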

Terminology

Mentions: Query-level visibility count. A company receives one mention per query where it appears in any platform response (union-deduped). This is the numerator for Share of Voice.
Unique Pages Cited: Count of distinct client page URLs cited across all platform responses, after URL normalization (stripping tracking parameters). The footer total in the Citation section uses this measure.
Citation Instances (Top Cited Domains): Raw count of citation occurrences per domain across all responses. A single domain can accumulate multiple citation instances from different queries and platforms. The Top Cited Domains table uses this measure.
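
To make the distinction concrete, a minimal sketch (Python; the tracking-parameter list and function names are illustrative assumptions) of how citation instances and unique pages cited diverge after URL normalization:

```python
from collections import Counter
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# Illustrative tracking-parameter prefixes; the audit's real strip list may differ.
TRACKING_PREFIXES = ("utm_", "gclid", "fbclid")

def normalize_url(url: str) -> str:
    """Strip tracking query parameters so the same page cited with different
    campaign tags counts as one unique page."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.startswith(TRACKING_PREFIXES)]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

def citation_metrics(citations: list) -> dict:
    """`citations` is the list of cited URLs across all platform responses.
    Citation instances = raw occurrences; unique pages = distinct normalized URLs."""
    normalized = [normalize_url(u) for u in citations]
    return {"citation_instances": len(normalized),
            "unique_pages_cited": len(set(normalized)),
            "per_domain": Counter(urlsplit(u).netloc for u in normalized)}
```

A page cited twice, once with a utm_source tag and once without, therefore contributes two citation instances but one unique page, which is exactly why the two footer totals differ.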