Competitive intelligence for AI-mediated buying decisions. Where GoGuardian wins, where it loses, and a prioritized three-layer execution plan — built from 150 buyer queries across ChatGPT + Claude + Gemini.
GoGuardian's 65.33% visibility and near-zero win rate are not contradictions — they are symptoms of a single structural failure: content that places GoGuardian in the room but does not earn a recommendation.
[Mechanism] Three compounding gaps create the pattern. First, 52 queries are entirely invisible because GoGuardian lacks content for six feature areas (CIPA compliance, parent engagement, off-network protection, YouTube filtering, BYOD, edtech ROI) — competitors enter these queries unopposed and define the evaluation criteria before GoGuardian appears. Second, existing pages match 90 buyer queries but lose because they describe product capabilities without the structured comparison data, evaluation frameworks, and evidence-backed claims that earn AI recommendations — which is why Shortlisting visibility is 100% while the Shortlisting win rate is only 28% (7/25).
Third, blog content dated 2018-2019 and a sitemap missing lastmod timestamps on all 1,027 URLs suppress the freshness signals AI citation algorithms use to prioritize content, meaning even good content is deprioritized against fresher competitor pages.
[Synthesis] L1 technical fixes must execute before L2/L3 work because the sitemap lastmod fix directly unblocks AI crawlers' ability to detect when any page — new or refreshed — has been updated. Without lastmod, all 1,027 GoGuardian URLs are equivalent in freshness priority to a crawler; with lastmod, newly published L3 pages and refreshed L2 pages will receive immediate freshness credit. Additionally, adding visible dates to Comparison pages (L1 finding: comparison_pages_undated) directly increases citation confidence for the /admin/vs-competitors and /beacon/vs-competitors pages that L2 recommendations optimize — the date fix and the content fix must ship together to maximize impact.
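The lastmod gap described above is mechanically auditable. A minimal sketch of such an audit — the sample entries and URLs below are illustrative, assuming the standard sitemaps.org XML schema, not GoGuardian's actual sitemap:

```python
# Hypothetical sketch: count sitemap URLs missing a <lastmod> timestamp.
import xml.etree.ElementTree as ET

# Default namespace used by the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_lastmod(sitemap_xml: str) -> dict:
    """Return totals and the list of URLs lacking a <lastmod> element."""
    root = ET.fromstring(sitemap_xml)
    urls = root.findall("sm:url", NS)
    missing = [u.findtext("sm:loc", namespaces=NS)
               for u in urls if u.find("sm:lastmod", NS) is None]
    return {"total": len(urls), "missing_lastmod": len(missing), "missing": missing}

# Illustrative two-entry sitemap: one dated page, one undated page.
sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/admin/vs-competitors</loc>
    <lastmod>2025-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/old-post</loc>
  </url>
</urlset>"""

print(audit_lastmod(sample))
# → {'total': 2, 'missing_lastmod': 1, 'missing': ['https://example.com/blog/old-post']}
```

Run against a production sitemap, `missing_lastmod == total` is exactly the condition this report flags: every URL equally stale in a crawler's freshness ranking.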
Where GoGuardian appears and where it doesn't — across personas, buying jobs, and platforms.
[TL;DR] GoGuardian is visible in 65% of buyer queries but wins only 8% of the queries it appears in. Converting visibility into wins is the primary challenge — a 57-point gap between appearing and winning. Visibility on high-intent queries runs higher, at 72%.
GoGuardian's early-funnel invisibility (30.2% across Problem Identification, Solution Exploration, and Requirements Building) lets competitors define the evaluation criteria before GoGuardian enters — a structural disadvantage that compounds at every subsequent stage.
| Dimension | Visibility |
|---|---|
| All Queries | 65.3% |
| By Persona | |
| Chief Technology Officer / Director of Technology | 69.2% |
| Director of Curriculum & Instruction | 66.7% |
| Director of Student Services & Safety | 53.8% |
| Network Administrator / Systems Engineer | 72.4% |
| Superintendent | 62.5% |
| By Buying Job | |
| Artifact Creation | 28.6% |
| Comparison | 40.6% |
| Consensus Creation | 50% |
| Problem Identification | 84.6% |
| Requirements Building | 46.7% |
| Shortlisting | 100% |
| Solution Exploration | 80% |
| Validation | 83.3% |
[Data] Overall visibility: 65.33% (98/150 queries). By buying job: Shortlisting 100% (25/25), Problem Identification 84.62% (11/13), Validation 83.33% (20/24), Solution Exploration 80% (12/15), Requirements Building 46.67% (7/15), Comparison 40.62% (13/32), Artifact Creation 28.57% (4/14). Early-funnel invisibility rate: 30.2% across Problem Identification, Solution Exploration, and Requirements Building (13/43 queries invisible).
[Synthesis] GoGuardian's visibility is strongest at late-funnel stages (Shortlisting: 100%, Validation: 83.33%) and weakest at Requirements Building (46.67%), Comparison (40.62%), and Artifact Creation (28.57%). The early-funnel invisibility rate of 30.2% (13/43 queries across Problem Identification, Solution Exploration, and Requirements Building) means competitors frame the problem and define the evaluation criteria before GoGuardian enters. Because Requirements Building shapes how buyers score vendors at Shortlisting, early-funnel absence compounds into late-funnel conversion losses.
20 queries won by named competitors · 11 no clear winner · 21 no vendor mentioned
Sorted by competitive damage — competitor-winning queries first.
| ID | Query | Persona | Stage | Winner |
|---|---|---|---|---|
| ⚑ Competitor Wins — 20 queries where a named competitor captures the buyer | ||||
| gg_023 | "What tools exist for tracking which edtech apps and software licenses schools are actually using?" | Chief Technology Officer / Director of Technology | Solution Exploration | Lightspeed Systems |
| gg_041 | "What should a school district look for in edtech usage analytics to cut wasted software spending?" | Superintendent | Requirements Building | Lightspeed Systems |
| gg_073 | "Lightspeed vs LanSchool for device monitoring — which handles both Chromebooks and Windows better?" | Network Administrator / Systems Engineer | Comparison | Lightspeed Systems |
| gg_074 | "We're replacing our firewall-based filter — Lightspeed Systems vs Securly, which cloud web filter is better for a Chromebook-heavy district?" | Network Administrator / Systems Engineer | Comparison | Lightspeed Systems |
| gg_075 | "Gaggle vs Lightspeed Alert for student safety — how do their alert accuracy and response times compare?" | Director of Student Services & Safety | Comparison | Gaggle |
| gg_076 | "Dyknow vs Lightspeed Classroom for screen monitoring — which is easier for teachers to use?" | Director of Curriculum & Instruction | Comparison | Dyknow |
| gg_077 | "Lightspeed Filter vs Securly for YouTube filtering controls in K-12 schools" | Director of Curriculum & Instruction | Comparison | Lightspeed Systems |
| gg_078 | "Which K-12 web filter has the best CIPA compliance reporting and E-Rate documentation — Lightspeed or Securly?" | Chief Technology Officer / Director of Technology | Comparison | Lightspeed Systems |
| gg_079 | "Securly vs Linewize for parent engagement and take-home device monitoring — which gives parents better visibility?" | Superintendent | Comparison | Securly |
| gg_082 | "Comparing Lightspeed, Securly, and Gaggle — which student safety platform is strongest for a mid-size district?" | Chief Technology Officer / Director of Technology | Comparison | Lightspeed Systems |
Remaining competitor wins: Lightspeed Systems ×4, Securly ×1, LanSchool ×1, Hāpara ×1, Bark for Schools ×1, Gaggle ×1, Blocksi ×1. 11 queries with no clear winner. 21 queries with no vendor mentioned. Full query-level data available in the analysis export.
Queries where GoGuardian is mentioned but a competitor is positioned more favorably.
| ID | Query | Persona | Buying Job | Winner | GoGuardian Position |
|---|---|---|---|---|---|
| gg_001 | "What are the main approaches to keeping students safe online in K-12 school districts?" | Superintendent | Problem Identification | No Vendor Mentioned | Mentioned In List |
| gg_002 | "How are school districts handling student self-harm detection on school-issued devices?" | Director of Student Services & Safety | Problem Identification | No Clear Winner | Mentioned In List |
| gg_003 | "Teachers spending half the class chasing students off YouTube and games — what do other districts do?" | Director of Curriculum & Instruction | Problem Identification | No Clear Winner | Mentioned In List |
| gg_004 | "We have Chromebooks, Windows laptops, and iPads — how do districts enforce consistent web filtering across all of them?" | Chief Technology Officer / Director of Technology | Problem Identification | No Clear Winner | Mentioned In List |
| gg_005 | "Our filter blocks half the educational sites teachers need — how do we fix overblocking without opening everything up?" | Network Administrator / Systems Engineer | Problem Identification | No Clear Winner | Brief Mention |
| gg_006 | "E-Rate audit is coming and I can't prove CIPA compliance — what are other districts using for documentation?" | Chief Technology Officer / Director of Technology | Problem Identification | No Vendor Mentioned | Mentioned In List |
| gg_007 | "Students figured out VPNs to bypass our web filter — what solutions actually stop filter circumvention?" | Network Administrator / Systems Engineer | Problem Identification | No Clear Winner | Mentioned In List |
| gg_012 | "What do districts do about student devices when kids bring their own phones and laptops to school?" | Director of Student Services & Safety | Problem Identification | No Vendor Mentioned | Mentioned In List |
| gg_015 | "Difference between agent-based filtering and DNS-based filtering for school devices" | Network Administrator / Systems Engineer | Solution Exploration | No Vendor Mentioned | Mentioned In List |
| gg_017 | "Should we get one platform for web filtering, classroom management, and safety monitoring or use separate best-of-breed tools?" | Chief Technology Officer / Director of Technology | Solution Exploration | No Clear Winner | Mentioned In List |
| gg_019 | "How do classroom management platforms integrate with Google Workspace for Education?" | Director of Curriculum & Instruction | Solution Exploration | No Clear Winner | Brief Mention |
| gg_020 | "What's the difference between human-reviewed safety alerts and fully automated AI detection for student threats?" | Director of Student Services & Safety | Solution Exploration | No Clear Winner | Brief Mention |
| gg_021 | "We're on an appliance-based filter and thinking about going cloud — what's the real difference for a mixed device school district?" | Network Administrator / Systems Engineer | Solution Exploration | No Clear Winner | Mentioned In List |
| gg_024 | "Approaches to filtering YouTube in schools — blocking it entirely vs. granular video-level controls" | Director of Curriculum & Instruction | Solution Exploration | No Clear Winner | Brief Mention |
| gg_025 | "How do school safety platforms handle off-campus monitoring on 1:1 devices?" | Superintendent | Solution Exploration | No Clear Winner | Mentioned In List |
| gg_026 | "What options exist for monitoring student-owned BYOD devices on a school network without installing agents?" | Network Administrator / Systems Engineer | Solution Exploration | No Vendor Mentioned | Mentioned In List |
| gg_027 | "How do schools give parents visibility into what their kids are doing on school devices at home?" | Superintendent | Solution Exploration | No Clear Winner | Mentioned In List |
| gg_028 | "What tools help districts monitor student internet use across apps, not just web browsers?" | Chief Technology Officer / Director of Technology | Solution Exploration | No Clear Winner | Mentioned In List |
| gg_029 | "What features matter most when evaluating student web filtering platforms for a district with 10,000 students?" | Chief Technology Officer / Director of Technology | Requirements Building | No Clear Winner | Brief Mention |
| gg_031 | "Must-have vs. nice-to-have features for student safety monitoring software in K-12" | Director of Student Services & Safety | Requirements Building | No Clear Winner | Brief Mention |
| gg_032 | "Security and privacy requirements checklist for evaluating student monitoring platforms in K-12" | Network Administrator / Systems Engineer | Requirements Building | No Clear Winner | Brief Mention |
| gg_033 | "What CIPA compliance features should a web filter have to pass an E-Rate audit?" | Chief Technology Officer / Director of Technology | Requirements Building | No Clear Winner | Brief Mention |
| gg_034 | "We're replacing our current filter — what should I look for in a web filter that works across Chromebooks, iPads, and Windows?" | Network Administrator / Systems Engineer | Requirements Building | Lightspeed Systems | Mentioned In List |
| gg_035 | "Evaluation criteria for YouTube filtering in schools — how granular should controls be?" | Director of Curriculum & Instruction | Requirements Building | No Clear Winner | Brief Mention |
| gg_037 | "Our current filter doesn't protect devices off-campus — what requirements should we set for a replacement?" | Chief Technology Officer / Director of Technology | Requirements Building | No Clear Winner | Mentioned In List |
| gg_044 | "We've outgrown our current web filter — best K-12 web filtering platforms for mid-size districts with mixed device fleets" | Chief Technology Officer / Director of Technology | Shortlisting | Lightspeed Systems | Strong 2nd |
| gg_045 | "Top student safety monitoring platforms that detect self-harm and violence threats on school devices" | Director of Student Services & Safety | Shortlisting | GoGuardian | Primary Recommendation |
| gg_046 | "Best classroom management software for K-12 teachers to monitor student screens and keep kids on task" | Director of Curriculum & Instruction | Shortlisting | GoGuardian | Primary Recommendation |
| gg_047 | "We're running separate filters for each device type — which school web filters work across Chromebooks, iPads, and Windows in one platform?" | Network Administrator / Systems Engineer | Shortlisting | Lightspeed Systems | Mentioned In List |
| gg_048 | "Top K-12 platforms that combine web filtering, classroom management, and student safety in one tool" | Chief Technology Officer / Director of Technology | Shortlisting | GoGuardian | Mentioned In List |
| gg_049 | "Best web filtering solutions for CIPA compliance and E-Rate audit documentation" | Superintendent | Shortlisting | Lightspeed Systems | Mentioned In List |
| gg_050 | "K-12 student safety platforms with the lowest false positive rates for self-harm alerts" | Director of Student Services & Safety | Shortlisting | Gaggle | Mentioned In List |
| gg_051 | "school web filters that actually stop VPN bypass attempts by students" | Network Administrator / Systems Engineer | Shortlisting | Lightspeed Systems | Primary Recommendation |
| gg_052 | "Best classroom management tools that teachers with low tech skills can actually learn quickly" | Director of Curriculum & Instruction | Shortlisting | No Clear Winner | Brief Mention |
| gg_053 | "Our current safety tool only monitors during school hours — which student safety platforms provide 24/7 monitoring including nights and weekends?" | Superintendent | Shortlisting | Gaggle | Mentioned In List |
| gg_054 | "school web filtering platforms that protect 1:1 take-home devices off-campus" | Chief Technology Officer / Director of Technology | Shortlisting | GoGuardian | Primary Recommendation |
| gg_055 | "Best YouTube filtering tools for schools that let teachers use educational videos while blocking inappropriate content" | Director of Curriculum & Instruction | Shortlisting | Lightspeed Systems | Strong 2nd |
| gg_056 | "Top school safety platforms with strong parent communication and take-home device visibility" | Director of Student Services & Safety | Shortlisting | Securly | Strong 2nd |
| gg_057 | "Best digital hall pass systems for K-12 schools that integrate with classroom management software" | Director of Curriculum & Instruction | Shortlisting | Securly | Strong 2nd |
| gg_058 | "K-12 edtech usage analytics tools that show which software licenses are actually being used" | Chief Technology Officer / Director of Technology | Shortlisting | Lightspeed Systems | Brief Mention |
| gg_059 | "Best school web filters with detailed usage reporting for IT administrators" | Network Administrator / Systems Engineer | Shortlisting | GoGuardian | Mentioned In List |
| gg_060 | "K-12 web filtering platforms that handle BYOD without requiring agents on personal devices" | Network Administrator / Systems Engineer | Shortlisting | Securly | Primary Recommendation |
| gg_061 | "Looking for a school safety platform that meets CIPA requirements and handles state-level mandates for student internet safety" | Superintendent | Shortlisting | Lightspeed Systems | Strong 2nd |
| gg_062 | "school web filter shortlist for a district with 8,000 students running mostly Chromebooks plus some Windows and iPad" | Chief Technology Officer / Director of Technology | Shortlisting | Lightspeed Systems | Mentioned In List |
| gg_063 | "Best student monitoring solutions with off-network protection for 1:1 iPad deployments" | Director of Student Services & Safety | Shortlisting | Lightspeed Systems | Mentioned In List |
| gg_064 | "Which classroom management platforms let teachers control all student tabs from one screen during lessons?" | Superintendent | Shortlisting | GoGuardian | Primary Recommendation |
| gg_065 | "recommended student safety platforms for districts with both Google Workspace and Microsoft 365" | Network Administrator / Systems Engineer | Shortlisting | No Clear Winner | Brief Mention |
| gg_066 | "alternatives to our current web filter that keeps blocking educational sites teachers need" | Chief Technology Officer / Director of Technology | Shortlisting | No Clear Winner | Brief Mention |
| gg_067 | "student safety monitoring tools with parent notification features for take-home devices" | Director of Student Services & Safety | Shortlisting | Bark for Schools | Mentioned In List |
| gg_068 | "Is GoGuardian a good choice for a mid-size school district with 12,000 students?" | Superintendent | Shortlisting | GoGuardian | Primary Recommendation |
| gg_069 | "GoGuardian vs Lightspeed Systems for K-12 web filtering — which is better for a district with 10,000 students?" | Chief Technology Officer / Director of Technology | Comparison | Lightspeed Systems | Strong 2nd |
| gg_070 | "GoGuardian vs Securly — which student safety platform has better self-harm detection?" | Director of Student Services & Safety | Comparison | GoGuardian | Primary Recommendation |
| gg_071 | "Dyknow vs LanSchool for classroom management — which do teachers prefer?" | Director of Curriculum & Instruction | Comparison | Dyknow | Brief Mention |
| gg_080 | "Is it better to get an all-in-one K-12 safety platform or use Gaggle for safety and a separate tool for filtering?" | Superintendent | Comparison | Gaggle | Mentioned In List |
| gg_085 | "Switching from Gaggle to a platform that also does web filtering — what are the best options?" | Chief Technology Officer / Director of Technology | Comparison | Lightspeed Systems | Strong 2nd |
| gg_091 | "Our teachers hate our current classroom management tool — is Dyknow actually better for teacher satisfaction?" | Director of Curriculum & Instruction | Comparison | Dyknow | Brief Mention |
| gg_092 | "LanSchool Air vs Lightspeed Classroom — how do they compare for mixed Chromebook and Windows environments?" | Network Administrator / Systems Engineer | Comparison | No Clear Winner | Brief Mention |
| gg_093 | "We're unhappy with our current YouTube filtering — which K-12 platforms have the most granular video-level controls?" | Director of Curriculum & Instruction | Comparison | Lightspeed Systems | Mentioned In List |
| gg_095 | "Which digital hall pass systems integrate with classroom management and web filtering platforms?" | Chief Technology Officer / Director of Technology | Comparison | Securly | Mentioned In List |
| gg_097 | "How do Bark for Schools, Gaggle, and Securly compare for student suicide prevention monitoring?" | Director of Student Services & Safety | Comparison | Gaggle | Brief Mention |
| gg_099 | "Which K-12 web filter handles BYOD the best — we need filtering for student personal devices on the school network" | Network Administrator / Systems Engineer | Comparison | Lightspeed Systems | Mentioned In List |
| gg_101 | "GoGuardian implementation problems for large school districts" | Chief Technology Officer / Director of Technology | Validation | No Clear Winner | Primary Recommendation |
| gg_102 | "Lightspeed Systems problems and complaints from school districts" | Chief Technology Officer / Director of Technology | Validation | No Clear Winner | Brief Mention |
| gg_103 | "Securly customer complaints — what do school IT teams not like about it?" | Network Administrator / Systems Engineer | Validation | No Clear Winner | Brief Mention |
| gg_104 | "Gaggle safety monitoring problems — how often do they miss real threats?" | Director of Student Services & Safety | Validation | No Clear Winner | Brief Mention |
| gg_105 | "Dyknow reviews and complaints from school districts — what are the downsides?" | Director of Curriculum & Instruction | Validation | No Clear Winner | Brief Mention |
| gg_107 | "Common complaints about GoGuardian from teachers — is it hard to use?" | Director of Curriculum & Instruction | Validation | No Clear Winner | Primary Recommendation |
| gg_108 | "Does GoGuardian slow down Chromebooks? Performance issues reported by schools" | Network Administrator / Systems Engineer | Validation | No Clear Winner | Primary Recommendation |
| gg_109 | "Biggest risks of choosing Lightspeed Systems for web filtering at a mid-size district" | Chief Technology Officer / Director of Technology | Validation | No Clear Winner | Brief Mention |
| gg_110 | "Hidden costs of GoGuardian that school districts don't expect — licensing, training, add-ons" | Superintendent | Validation | No Clear Winner | Primary Recommendation |
| gg_111 | "Securly false positive rate for student safety alerts — is it better or worse than competitors?" | Director of Student Services & Safety | Validation | No Clear Winner | Brief Mention |
| gg_113 | "Gaggle customer support quality — what do school admins say about response times?" | Chief Technology Officer / Director of Technology | Validation | No Clear Winner | Brief Mention |
| gg_114 | "Student privacy concerns with GoGuardian — do they comply with FERPA and COPPA?" | Superintendent | Validation | No Clear Winner | Mentioned In List |
| gg_117 | "How long does a typical K-12 web filter implementation take for a district with 8,000+ devices?" | Network Administrator / Systems Engineer | Validation | No Vendor Mentioned | Mentioned In List |
| gg_118 | "What do schools say about switching from Lightspeed to a different web filter — was the migration worth it?" | Superintendent | Validation | No Clear Winner | Mentioned In List |
| gg_119 | "LanSchool contract and licensing complaints — are there lock-in issues?" | Superintendent | Validation | No Clear Winner | Brief Mention |
| gg_121 | "Can students bypass school web filters with VPNs or browser extensions? Which filters are hardest to get around?" | Network Administrator / Systems Engineer | Validation | No Clear Winner | Mentioned In List |
| gg_122 | "Securly data privacy concerns — how do they handle student monitoring data?" | Superintendent | Validation | No Clear Winner | Brief Mention |
| gg_124 | "Can K-12 web filters actually track edtech app usage or is that a separate tool? What are the reporting gaps?" | Chief Technology Officer / Director of Technology | Validation | No Clear Winner | Mentioned In List |
| gg_125 | "LanSchool deployment complexity — is it harder to roll out than cloud-based classroom management alternatives?" | Network Administrator / Systems Engineer | Validation | LanSchool | Mentioned In List |
| gg_126 | "ROI of implementing a student safety monitoring platform for a mid-size school district" | Superintendent | Consensus Creation | No Vendor Mentioned | Mentioned In List |
| gg_127 | "How to justify spending on web filtering and classroom management software to a school board" | Superintendent | Consensus Creation | No Clear Winner | Mentioned In List |
| gg_128 | "Case studies of school districts that reduced student safety incidents after deploying monitoring software" | Director of Student Services & Safety | Consensus Creation | Gaggle | Mentioned In List |
| gg_132 | "How to convince teachers to adopt classroom management software — what does successful rollout look like?" | Director of Curriculum & Instruction | Consensus Creation | No Clear Winner | Brief Mention |
| gg_134 | "How do districts justify the cost of CIPA-compliant web filtering to protect E-Rate funding?" | Chief Technology Officer / Director of Technology | Consensus Creation | No Vendor Mentioned | Mentioned In List |
| gg_136 | "Evidence that classroom management software improves instructional time and student engagement" | Director of Curriculum & Instruction | Consensus Creation | No Vendor Mentioned | Mentioned In List |
| gg_137 | "Draft an RFP for K-12 web filtering and student safety monitoring for a district with 12,000 students across Chromebooks, Windows, and iPads" | Chief Technology Officer / Director of Technology | Artifact Creation | No Clear Winner | Mentioned In List |
| gg_139 | "Build a TCO model for implementing a K-12 web filtering and safety platform across a 10,000-student district over 3 years" | Superintendent | Artifact Creation | No Clear Winner | Mentioned In List |
| gg_146 | "Build a feature Comparison spreadsheet for K-12 web filtering platforms including cross-platform support, YouTube controls, BYOD, and CIPA compliance" | Network Administrator / Systems Engineer | Artifact Creation | Lightspeed Systems | Mentioned In List |
| gg_147 | "Create an executive summary comparing the cost of running separate filtering, classroom management, and safety tools versus consolidating to one platform" | Superintendent | Artifact Creation | No Clear Winner | Brief Mention |
Who’s winning when GoGuardian isn’t — and who controls the narrative at each buying stage.
[TL;DR] GoGuardian wins 5.3% of queries (8/150), ranks #3 in SOV — H2H record: 57W–31L across 9 competitors.
GoGuardian's SOV rank of #3 (only 3 mentions behind the leader) masks a win-rate problem: it appears alongside Lightspeed frequently but loses the head-to-head 16-14, and loses to Gaggle 5-1 in student safety queries despite Beacon being a stronger product.
| Company | Mentions | Share |
|---|---|---|
| Lightspeed Systems | 104 | 21.1% |
| Securly | 103 | 20.8% |
| GoGuardian | 101 | 20.4% |
| Linewize | 42 | 8.5% |
| Bark for Schools | 37 | 7.5% |
| Gaggle | 31 | 6.3% |
| Blocksi | 24 | 4.9% |
| Hāpara | 20 | 4% |
| LanSchool | 18 | 3.6% |
| Dyknow | 14 | 2.8% |
When GoGuardian and a competitor both appear in the same response, who gets the recommendation? One query with multiple competitors generates a matchup against each — so H2H totals will exceed the query count.
Win = GoGuardian was the primary recommendation (by cross-platform majority). Loss = the competitor was the primary recommendation. Tie = neither was, or a third party won.
For the 52 queries where GoGuardian is completely absent, the breakdown is: 20 won by named competitors, 11 with no clear winner, and 21 with no vendor mentioned.
Vendors appearing in responses that are outside GoGuardian's defined competitive set.
[Synthesis] GoGuardian's SOV position (rank #3, 20.4% share) understates its competitive disadvantage at the query level. Despite appearing within 3 mentions of the category leader, GoGuardian wins pairwise matchups against Lightspeed only 14 of 30 contested queries (46.7%) and loses to Gaggle 5 of 6 contested queries (83.3% loss rate). Win rate and H2H record diverge because H2H measures co-appearance wins while win rate measures primary recommendation rate — GoGuardian appears alongside competitors frequently but is rarely recommended as the top choice.
Gaggle's 5-1 head-to-head advantage is the most actionable competitive gap: Beacon's content at the Comparison and Consensus Creation stages is losing to Gaggle's outcome-specific case studies and methodology content.
What AI reads and trusts in this category.
[TL;DR] 31 unique GoGuardian pages were cited across buyer queries, ranking the domain #5 among all cited domains. 10 high-authority domains cite competitors but not GoGuardian.
Being cited as a source — not just mentioned — is what drives AI recommendations, and GoGuardian's citation rank of #5 despite SOV rank of #3 signals that AI models reference GoGuardian's content less often than competitors' for the same queries.
Note: Domain-level citation counts (above) tally instances per individual domain. Competitor-level counts (below) aggregate across all domains owned by a single vendor, which may include subdomains.
Non-competitor domains citing other vendors but not GoGuardian — off-domain authority opportunities.
These domains cited competitors but did not cite GoGuardian pages in the queries analyzed. This reflects citation patterns in AI responses, not overall platform presence.
[Synthesis] GoGuardian's citation rank of #5 despite SOV rank of #3 reveals a structural authority gap: GoGuardian is mentioned in AI responses more often than it is cited as a source. The 31 unique pages cited (vs Lightspeed's 161 citation instances across its domains) indicates GoGuardian's content is less frequently used as a primary reference than its SOV presence would suggest. The concentration of citations on the homepage (7 instances) and comparison pages (4 instances) points to AI models defaulting to surface-level pages rather than feature-specific content — a pattern the L2 and L3 content investments directly address.
Three layers of recommendations ranked by commercial impact and implementation speed.
[TL;DR] 31 priority recommendations (plus 4 near-rebuild optimizations) targeting 142 gap queries (52 invisible, 90 positioning gaps): 5 L1 technical fixes + 1 verification check, 20 content optimizations (L2), and 5 new content initiatives (L3).
The 31 recommendations follow a strict dependency sequence: L1 technical fixes first (they unblock freshness credit for everything after), then L2 content deepening, then L3 new content creation — skipping the sequence means new content ships without the infrastructure to be indexed at full freshness.
Reading the priority numbers: Recommendations are ranked 1–31 across all three layers by commercial impact × implementation speed. Within each layer, items appear in priority order. Gaps in the sequence (e.g., L1 shows #1, then jumps to #21) mean the intervening higher-priority items belong to a different layer.
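The ranking rule can be sketched as a simple composite score. The weights and scores below are illustrative placeholders, not the report's actual scoring model:

```python
# Hypothetical sketch of "commercial impact x implementation speed" ranking.
# Scores (1-3) are invented for illustration only.
RECS = [
    {"finding": "Blog content severely outdated",         "impact": 3, "speed": 3},
    {"finding": "Comparison pages have no visible dates", "impact": 2, "speed": 3},
    {"finding": "Sitemap lacks lastmod timestamps",       "impact": 2, "speed": 3},
    {"finding": "Pricing page lacks actionable info",     "impact": 2, "speed": 2},
]

def rank(recs):
    # Higher impact x speed sorts first; ties broken by impact alone.
    return sorted(recs, key=lambda r: (r["impact"] * r["speed"], r["impact"]),
                  reverse=True)

for i, r in enumerate(rank(RECS), start=1):
    print(f"#{i} {r['finding']}")
```

The same composite would then be computed across all three layers at once, which is what produces the non-contiguous priority numbers within any single layer.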
Configuration and infrastructure changes. Owner: Engineering / DevOps. Timeline: Days to weeks.
| Priority | Finding | Impact | Timeline |
|---|---|---|---|
| #1 | Majority of blog content is severely outdated | High | 2-4 weeks |
| #21 | High-value competitor comparison pages have no visible dates | Medium | < 1 day |
| #22 | Pricing page contains no actionable pricing information | Medium | 1-3 days |
| #23 | Schema markup, meta descriptions, and OG tags require manual verification | Medium | 1-2 weeks |
| #24 | Sitemap lacks lastmod timestamps on all 1,027 URLs | Medium | 1-3 days |
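Fixes #21 and #24 pair naturally: a visible on-page date plus machine-readable freshness signals. A minimal sketch, assuming Schema.org JSON-LD is the markup in use — the URL and dates are placeholders, not GoGuardian's actual publish history:

```html
<!-- On a comparison page (#21): visible date + machine-readable dateModified -->
<p class="last-updated">Last updated: June 1, 2025</p>
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebPage",
  "url": "https://www.goguardian.com/admin/vs-competitors",
  "datePublished": "2024-01-15",
  "dateModified": "2025-06-01"
}
</script>

<!-- In sitemap.xml (#24): the matching lastmod entry for the same URL -->
<url>
  <loc>https://www.goguardian.com/admin/vs-competitors</loc>
  <lastmod>2025-06-01</lastmod>
</url>
```

Keeping the visible date, `dateModified`, and sitemap `lastmod` in agreement is the point — conflicting dates across the three surfaces would undercut the citation-confidence gain.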
Items requiring manual review before determining if action is needed.
| Priority | Finding | Impact | Timeline |
|---|---|---|---|
| #31 | Client-side rendering status cannot be confirmed | Low | < 1 day |
Existing pages that need restructuring or deepening. Owner: Content Team. Timeline: Weeks.
The /beacon/vs-competitors page does not include a section addressing the 'Gaggle vs Securly' comparison (gg_072, gg_083) — buyers evaluating these two vendors cannot find GoGuardian's perspective, so GoGuardian is excluded from their decision process. It also does not address Gaggle's human-reviewed alert methodology versus Securly's AI detection approach (gg_094) — a critical positioning question where GoGuardian Beacon has a distinct answer that differentiates it from both. Finally, the page has no section for 'Bark for Schools vs Gaggle' (gg_088) — a common comparison involving Bark's free tier, where GoGuardian's paid Beacon can make a superior commercial case.
Queries affected: gg_072, gg_075, gg_082, gg_083, gg_088, gg_094, gg_097
The /competitor-Comparison page does not address the 'We're running Lightspeed and Gaggle separately — would switching to a single platform save money?' scenario (gg_100) — Lightspeed wins this query from a buyer already using its product by offering a consolidate-on-Lightspeed narrative. The page also lacks a 'Business Case for Consolidating to GoGuardian' section (gg_129) with specific ROI arguments for replacing separate filtering, safety, and classroom management vendors with GoGuardian's suite. No executive summary format or cost comparison template exists (gg_147) — a buyer who needs to present a consolidation analysis to a Superintendent cannot construct it from GoGuardian's current content.
Queries affected: gg_100, gg_129, gg_147
The /competitor-Comparison page does not provide a structured 'one platform vs. best-of-breed' decision framework (gg_017) — a buyer asking 'should we consolidate or use separate tools?' cannot find GoGuardian's reasoned answer on this page. The page does not address the 'is an all-in-one platform like Gaggle better or should we use best-of-breed?' query (gg_080) — it positions GoGuardian but does not directly engage the consolidation architecture question that Gaggle wins by default. Success stories (Pickerington, Camdenton) are linked but do not appear to contain specific vendor consolidation outcome data that would answer 'what did consolidation actually save us?' in a citable format.
Queries affected: gg_017, gg_048, gg_080
The /admin page has no ROI or payback period section — buyers asking 'typical payback period for web filtering deployment' (gg_133) cannot find a GoGuardian-sourced answer, and Lightspeed wins by default because it publishes ROI data. The /admin page has no board justification content — buyers asking 'how to justify spending on web filtering to a school board' (gg_127) need specific ROI framing, regulatory compliance arguments, and incident prevention data that the page does not provide. The /admin page does not provide an RFP template or feature Comparison framework (gg_137), missing the artifact-creation query pattern where buyers want a structured evaluation document they can customize.
Queries affected: gg_085, gg_127, gg_133, gg_137, gg_146
The /admin page lists features but does not frame them as evaluation criteria — buyers asking 'what features matter most for a 10,000-student district?' cannot use the page to build a shortlist or score vendors. The /admin page has no content specifically addressing VPN bypass prevention as a Shortlisting criterion (gg_051), leaving Lightspeed — which explicitly documents bypass resistance — to win this positioning. The /admin page does not address the 'overblocking alternatives' query pattern (gg_066) — buyers seeking an alternative to their current filter that blocks too much educational content cannot find GoGuardian's answer on this page.
Queries affected: gg_029, gg_044, gg_051, gg_066
The /admin page has no section addressing overblocking — it describes filtering capabilities but never acknowledges or solves the 'our filter blocks half the educational sites teachers need' problem buyers bring to the evaluation. The /admin page has no content on VPN/proxy bypass prevention — a top-of-funnel problem query (gg_007) that Lightspeed wins because its content explicitly describes bypass-resistance mechanisms. The /admin page does not explain the difference between agent-based and DNS-based filtering in terms buyers can act on — gg_015 queries this distinction and the page provides no citable answer.
Queries affected: gg_001, gg_005, gg_007, gg_015
The /admin/vs-competitors page does not contain a dedicated 'GoGuardian Admin vs. Lightspeed Filter' section with a structured feature-by-feature Comparison table — the primary Comparison query (gg_069) covering a 10,000-student district context is not specifically addressed. The page has no visible publication date (L1 finding), which reduces AI citation confidence for all Comparison content — buyers asking 'which cloud web filter is better for a Chromebook-heavy district?' need temporally credible data. The page does not address the budget/scale dimension (gg_096: Blocksi vs Lightspeed for smaller districts on tight budgets) — an opportunity to position GoGuardian's value across district sizes.
Queries affected: gg_069, gg_074, gg_090, gg_096
The /admin/vs-competitors page presents GoGuardian's strengths but does not address the specific complaints buyers are researching: Lightspeed's reported implementation complexity (gg_109), Securly's customer service issues (gg_103), and bypass vulnerability patterns (gg_121). The page does not contain a 'Migration from Lightspeed/Securly' section — buyers asking 'was the migration worth it?' (gg_118) need to see GoGuardian's specific migration story, not a generic Comparison table. The /admin/vs-competitors page has no visible publication date (L1 finding: comparison_pages_undated), which degrades AI citation confidence for any Comparison content it does contain.
Queries affected: gg_102, gg_103, gg_109, gg_118, gg_121
The /beacon page does not address the '200 alerts a day and counselors are ignoring them' alert fatigue problem (gg_010) — it describes Beacon's detection capabilities but does not explain how GoGuardian reduces noise and prioritizes actionable alerts over raw volume. The /beacon page does not explain how AI-based detection differs from keyword-only monitoring (gg_016) — a solution-exploration query that determines whether a buyer considers moving beyond their current keyword tool. The /beacon page does not address human-reviewed vs. fully automated AI detection (gg_020) — a positioning question that differentiates GoGuardian from Gaggle (human-reviewed) and Securly (automated).
Queries affected: gg_002, gg_010, gg_016, gg_020
The /beacon page does not include a liability/duty-of-care section (gg_131) addressing the question 'what is a district's liability if it doesn't deploy student safety monitoring?' — Gaggle wins this framing by publishing governance and legal risk content. The /beacon page lacks case studies with specific, quantified safety outcomes (gg_128) — Gaggle leads this query by publishing district-level incident reduction data. GoGuardian has case study content but it's fragmented across blog posts rather than concentrated on the product page. The /beacon page does not provide ROI data for student safety monitoring investment (gg_126) — superintendents need to justify cost to boards, and the page provides no financial justification framework.
Queries affected: gg_126, gg_128, gg_131, gg_142, gg_143
The /beacon page does not include a 'Must-Have Features for Student Safety Platforms' section (gg_031) — buyers building vendor scorecards cannot use the page to develop evaluation criteria, so they rely on competitor pages that do provide this structure. The /beacon page does not address false positive rates as an evaluation criterion (gg_042, gg_050) — Gaggle explicitly markets its human-review approach as a low-false-positive solution, and GoGuardian has no equivalent claim on the /beacon page. The /beacon page does not include a security and privacy requirements checklist (gg_032) — network administrators evaluating student monitoring platforms need to know what data is collected, how it is stored, and what FERPA/COPPA compliance certifications apply.
Queries affected: gg_031, gg_032, gg_042, gg_050
The /beacon page does not prominently feature GoGuardian Beacon's 24/7 off-hours monitoring capability (gg_053) as a top-level claim — the product-update/beacon-24-7 page exists but is buried, and the main /beacon page does not reference it prominently enough for AI extraction. The /beacon page does not explicitly address monitoring coverage for both Google Workspace and Microsoft 365 environments (gg_065) — multi-platform coverage is a Shortlisting requirement for districts running mixed productivity environments. The page wins 'top student safety platforms for self-harm detection' (gg_045) but this win should be reinforced with more extractable claims — the page's win may be fragile if competitors improve their content depth on this query.
Queries affected: gg_045, gg_053, gg_065
The /beacon page does not include specific performance data (detection rate, response time, false positive rate) that would allow AI models to confidently recommend GoGuardian over Gaggle or Securly in Validation queries (gg_104, gg_111). The /beacon page does not link to or integrate privacy documentation from /privacy-and-trust in a way that addresses 'data privacy concerns' Validation queries (gg_122) — the privacy page gets 2 citations but only from a Validation query about Securly, not GoGuardian. The /beacon/vs-competitors page has no visible date (L1 finding), reducing citation confidence for all competitive claims made on the page, including GoGuardian vs Securly self-harm detection (gg_070).
Queries affected: gg_070, gg_104, gg_111, gg_113, gg_122
The /windows page documents Windows support but does not address the 'we have Chromebooks, Windows laptops, and iPads — how do we enforce consistent filtering across all of them?' problem scenario (gg_004) — the cross-platform consistency question requires content that bridges all three OS pages. The /windows and /apple pages do not address the appliance-to-cloud migration scenario (gg_021) — districts switching from an on-premise appliance filter to a cloud-native solution need GoGuardian's cloud architecture described in migration terms, not just feature terms. The /windows page does not address GoGuardian Chromebook performance complaints (gg_108) — a GoGuardian-specific Validation query that should be answered authoritatively on GoGuardian's own site but currently yields no clear winner.
Queries affected: gg_004, gg_021, gg_034, gg_106, gg_108, gg_116
The /teacher/vs-competitors page does not include a 'Dyknow vs LanSchool — and Why GoGuardian Teacher Is the Better Choice' section (gg_071, gg_091) — buyers evaluating these two competitors are unreachable by GoGuardian's current content architecture. The page does not address the 'Hapara vs Dyknow for Google Workspace-heavy districts' Comparison (gg_087) — a query where GoGuardian's Chromebook-native approach should be a relevant alternative but is not introduced. The page has no visible date (L1 finding: comparison_pages_undated), reducing all competitive claims to unverifiable historical assertions.
Queries affected: gg_071, gg_076, gg_087, gg_091
The /hall-pass page does not address the 'how digital hall pass works compared to paper passes' solution-exploration query (gg_022) — a fundamental buyer education question that the page should answer to enter the consideration set for districts not yet using digital passes. The page does not provide a 'what features should a district-wide deployment require' framework (gg_040) — districts replacing paper passes need deployment criteria, not just product marketing. The page does not address known problems and complaints about digital hall pass systems (gg_123) — a Validation-stage query where GoGuardian's transparency would outcompete absent competitor responses.
Queries affected: gg_022, gg_040, gg_123, gg_150
The /teacher page has no section providing evidence that classroom management software improves instructional time or student engagement (gg_136) — curriculum directors and superintendents need this research to justify the investment to skeptical teachers. The /teacher page does not address how to convince teachers to adopt classroom management software (gg_132) — a change-management question that is as important as the product selection decision for districts concerned about teacher resistance. No classroom management rollout template or teacher adoption plan exists (gg_140, gg_148) — artifact creation queries represent the final stage of consensus building, and GoGuardian has no resources to support this buyer workflow.
Queries affected: gg_132, gg_136, gg_140, gg_148
The /teacher page does not open with the teacher's problem ('half the class is on YouTube and I can't teach') — it presents GoGuardian's features without connecting them to the specific distraction crisis curriculum directors are researching when they search gg_003. The /teacher page does not provide specific Google Workspace for Education integration details (gg_019) — how GoGuardian Teacher integrates with Google Classroom, which Google APIs it uses, and what data it accesses from Google Workspace. The page does not address the 'questions to ask classroom management vendors about teacher usability and adoption' query (gg_030) — a requirements-framing question that, if answered by GoGuardian's content, would make GoGuardian's evaluation criteria the standard.
Queries affected: gg_003, gg_019, gg_030
The /teacher page does not have an explicit 'Easy for Non-Technical Teachers' positioning section — gg_052's 'low tech skills can actually learn quickly' query finds no clear winner because no vendor prominently owns this positioning, leaving an open opportunity for GoGuardian. The page does not quantify teacher onboarding time or adoption metrics — statements like 'easy to use' without evidence are less citable than specific claims like 'teachers reach proficiency in one class session' or 'average setup time: X minutes.' The 'control all student tabs from one screen' capability (gg_064) is a GoGuardian win but may be fragile — the page should reinforce this specific claim with more extractable specifics to defend the win.
Queries affected: gg_046, gg_052, gg_064
The /teacher page has no section addressing the 'common complaints about GoGuardian from teachers' query (gg_107) — GoGuardian's own page is silent on this, meaning AI models cannot cite GoGuardian's perspective and must defer to third-party review sites (which may present negative framing). The /teacher page does not document Dyknow's limitations or LanSchool's contract complexity (gg_105, gg_119) — competitors that GoGuardian beats head-to-head (Dyknow 4-2, LanSchool 4-1) but does not proactively position against in specific weakness content. The /teacher/vs-competitors page has no visible date (L1 finding: comparison_pages_undated), reducing AI citation confidence for any Comparison content it contains.
Queries affected: gg_105, gg_107, gg_119
Net new content addressing visibility and positioning gaps. Owner: Content Strategy. Timeline: Months.
E-Rate funding is the lifeblood of K-12 technology purchasing, and CIPA compliance is the gate every Superintendent and CTO must pass to access it. GoGuardian's product demonstrably meets CIPA requirements, but its content provides no proof — no compliance guide, no E-Rate audit checklist, no FERPA/COPPA explainer a buyer can cite to a board or auditor. Lightspeed fills this vacuum with structured documentation, winning these queries by default. The commercial cost is direct: every district asking 'will this hold up in an E-Rate audit?' is sent to a competitor before GoGuardian enters the conversation. For districts — including mid-size buyers where GoGuardian's product is well-suited — this content absence functions as a pre-qualification failure.
ChatGPT (high): ChatGPT leads visibility across this audit by 5pp over Claude and Gemini. Compliance queries with structured checklist format match ChatGPT's tendency to cite pages with enumerable, structured claim sets. Claude (medium): Claude favors well-structured, factually grounded content with clear authority signals. A compliance page with regulatory citations (specific CIPA sections, FCC E-Rate program rules) would meet Claude's depth threshold for citation. Gemini (high): Gemini responds strongly to structured data and comprehensive entity coverage. FAQPage and HowTo schema markup on a compliance page combined with explicit regulatory entity references (CIPA, E-Rate, FERPA) would maximize Gemini extraction.
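The FAQPage markup recommended above for Gemini extraction can be generated mechanically. A minimal sketch in Python's standard library, assuming illustrative placeholder questions and answers (the copy below is hypothetical, not GoGuardian content):

```python
import json

# Hypothetical FAQ entries for a CIPA compliance page -- placeholder copy,
# not sourced from GoGuardian's site.
faqs = [
    ("Does the web filter satisfy CIPA filtering requirements?",
     "It blocks obscene and harmful-to-minors content as CIPA requires."),
    ("What documentation does an E-Rate audit request?",
     "An internet safety policy, proof of a public hearing, and evidence "
     "of a filtering technology in use."),
]

def faq_jsonld(pairs):
    """Build schema.org FAQPage JSON-LD with enumerable Question/Answer pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }

markup = faq_jsonld(faqs)
# Embed the output in the page head as:
#   <script type="application/ld+json"> ... </script>
print(json.dumps(markup, indent=2))
```

The enumerable Question/Answer structure doubles as the "structured claim set" the ChatGPT note above describes, so one markup investment serves both platforms.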
The shift to 1:1 device programs — where students take Chromebooks and iPads home — has created a buying requirement most legacy edtech vendors fail to address: what happens when the device leaves campus? GoGuardian's product provides off-network protection and parent visibility features, but no content page makes this claim in a form AI models can extract and cite. Securly and Bark for Schools win these queries by publishing specific parent-app documentation and off-campus monitoring explainers. For Director of Student Services and Superintendent personas — both veto holders — the absence of this content functions as a deal-disqualifying gap at requirements and Shortlisting stages where competitors are already present.
ChatGPT (high): Parent engagement and off-network queries are specification-heavy questions. ChatGPT's stronger overall performance in this audit (+5pp vs Claude and Gemini) makes it the highest-impact platform for a structured product feature page in this cluster. Claude (high): Queries like 'board-level argument for off-campus device protection' require nuanced framing and factual depth. A page with specific incident statistics and duty-of-care legal framing would match Claude's citation preferences. Gemini (medium): Gemini responds to structured data. An /off-network-protection page with explicit OS/device coverage tables and integration specs would be extractable, but Gemini's slightly lower overall visibility in this audit signals a need for stronger structured markup.
YouTube is the single most contested content moderation battleground in K-12 schools — teachers need educational videos while administrators need to block inappropriate content, and buyers ask explicitly who handles this tension best. GoGuardian's web filter has YouTube filtering capabilities, but its content inventory contains no page that answers 'how granular are your YouTube controls?' or 'can I allow educational YouTube without opening everything?' Lightspeed wins this positioning vacuum. Similarly, BYOD filtering — student-owned devices on school networks — is a growing requirement that GoGuardian's DNS-based filtering can address, but no content page makes this claim. These 10 queries are resolved before GoGuardian enters the conversation.
ChatGPT (high): Technical feature queries (YouTube filtering granularity, agentless BYOD) perform well on ChatGPT, which leads this audit by 5pp. A structured feature page with explicit capability claims and Comparison tables matches ChatGPT's citation pattern. Claude (medium): Claude favors factual depth and authoritative framing. A YouTube filtering page that cites specific methodology (DNS vs agent-based, SafeSearch API integration) would meet Claude's depth requirements. Gemini (high): Gemini responds strongly to structured data. Device compatibility tables, filtering level comparisons, and explicit entity relationships (GoGuardian DNS + BYOD + school network) would make this content highly extractable for Gemini.
K-12 technology budgets are under rising board-level scrutiny, and the question 'which of our edtech licenses are actually being used?' has become a deal-stage conversation. GoGuardian's platform generates usage data as a byproduct of its filtering and monitoring functions — data that could directly answer Superintendent and CTO concerns about software ROI and license waste. But GoGuardian's content never frames this as a value proposition. Lightspeed wins these queries by explicitly marketing its analytics as an edtech spend management tool. These are Superintendent and CTO queries at consensus-creation and requirements stages — the final commercial gate before a purchase decision — where GoGuardian's data advantages could close deals that its product capabilities have already earned.
ChatGPT (high): Data-driven queries ('which licenses are actually being used', 'how to justify edtech spend') match ChatGPT's preference for content with specific, extractable data points. A page with real usage statistics and ROI calculations would be highly citable. Claude (high): Claude favors content with clear causal reasoning. An edtech ROI page that explains the data collection mechanism, what insights it surfaces, and how those insights connect to budget decisions would match Claude's citation criteria. Gemini (medium): Gemini responds to structured data. A usage analytics page with structured data tables (feature comparisons, ROI formulas) would score well, but Gemini's slightly lower overall visibility in this audit requires additional structured markup to maximize extraction.
GoGuardian appears in 100% of Shortlisting queries across this audit — but wins 0 of 25 (0% Shortlisting win rate). This paradox has a structural explanation: AI models retrieve GoGuardian's feature and product pages when buyers ask 'best tools for X' or 'compare X vs Y,' but those page types don't contain the third-party-validated, Comparison-structured content that earns a primary recommendation. Meanwhile, 6 queries — including 'GoGuardian implementation problems,' 'hidden costs of GoGuardian,' and 'Is GoGuardian a good choice for a mid-size district?' — have no coverage entry at all, meaning GoGuardian never appears when buyers are actively vetting it. A competitor's page can answer 'GoGuardian implementation problems' if GoGuardian's own site cannot. These 14 queries represent the most structurally tractable gap in the L3 set: new page types (validated Comparison matrices, transparent implementation guides, TCO models) would directly address the Shortlisting win-rate failure.
ChatGPT (high): Shortlisting and Comparison queries favor ChatGPT in this audit (it leads by 5pp overall). Structured Comparison content with explicit feature matrices and third-party Validation signals match ChatGPT's tendency to synthesize competitor comparisons from well-structured source pages. Claude (high): Claude is well-suited to nuanced vendor evaluation queries. A 'GoGuardian for mid-size districts' page with specific customer outcomes, implementation evidence, and balanced discussion of limitations would match Claude's preference for substantive, balanced content. Gemini (medium): Gemini responds to structured entity data. Comparison matrix pages with schema markup (Table, ComparisonPage) and explicit entity relationships (GoGuardian + district size + feature coverage) would be highly extractable, but Gemini's citation gap in Comparison queries requires stronger structured data implementation.
All recommendations across all three layers, ranked by commercial impact × implementation speed.
10 of 19 content marketing pages have confirmed publication dates older than 365 days, the oldest dating to February 2018. Seven blog posts were published in 2018-2019 and have not been visibly updated. Only 2 blog posts (August 2025, May 2025) fall within the past 12 months, and none fall within the dominant 90-day AI citation window.
GoGuardian lacks a dedicated CIPA compliance and data privacy content hub, despite its web filter natively supporting CIPA-mandated controls. 11 of 59 L3 gap queries (18.6%) target CIPA compliance, E-Rate documentation, and FERPA/COPPA adherence — Lightspeed wins the majority by providing structured compliance documentation AI models can extract and cite.
GoGuardian has no dedicated content hub for parent communication features or off-campus device protection, two capabilities its product supports but its content does not represent. 13 of 59 L3 gap queries (22%) target these themes, and Securly, Linewize, and Bark for Schools win by default because they publish specific parent-engagement and take-home-device content GoGuardian does not.
The /beacon/vs-competitors page does not include a section addressing the 'Gaggle vs Securly' Comparison (gg_072, gg_083) — buyers evaluating these two vendors cannot find GoGuardian's perspective, so GoGuardian is excluded from their decision process.
14 of 59 L3 gap queries (23.7%) involve Comparison or Shortlisting contexts where GoGuardian's existing content uses the wrong page type — product and feature pages matched to queries requiring Comparison pages or case-study-style content — or GoGuardian has no coverage entry at all. Lightspeed wins the majority because it maintains structured Comparison and Shortlisting content architectures that GoGuardian's current site does not replicate.
The /competitor-Comparison page does not address the 'We're running Lightspeed and Gaggle separately — would switching to a single platform save money?' scenario (gg_100) — Lightspeed wins this query from a buyer already using its product by providing a consolidation-with-Lightspeed narrative.
The /competitor-Comparison page does not provide a structured 'one platform vs. best-of-breed' decision framework (gg_017) — a buyer asking 'should we consolidate or use separate tools?' cannot find GoGuardian's reasoned answer on this page.
GoGuardian has usage analytics and reporting capabilities across its platform but no content hub that makes these capabilities visible to buyers evaluating edtech spend accountability. 11 of 59 L3 gap queries (18.6%) target edtech usage analytics, ROI measurement, and reporting visibility — and Lightspeed wins the majority by positioning its analytics features as an explicit value proposition for budget-conscious superintendents and CTOs.
GoGuardian has no dedicated content pages for YouTube filtering granularity or BYOD filtering, two feature areas its product supports but its content inventory classifies as 'thin.' 10 of 59 L3 gap queries (16.9%) target these capabilities, and Lightspeed and Securly win the majority by publishing specific YouTube control documentation and BYOD filtering architecture explainers.
The /admin page has no ROI or payback period section — buyers asking 'typical payback period for web filtering deployment' (gg_133) cannot find a GoGuardian-sourced answer, and Lightspeed wins by default because it publishes ROI data.
The /admin page lists features but does not frame them as evaluation criteria — buyers asking 'what features matter most for a 10,000-student district?' cannot use the page to build a shortlist or score vendors.
The /admin page has no section addressing overblocking — it describes filtering capabilities but never acknowledges or solves the 'our filter blocks half the educational sites teachers need' problem buyers bring to the evaluation.
The /admin/vs-competitors page does not contain a dedicated 'GoGuardian Admin vs. Lightspeed Filter' section with a structured feature-by-feature Comparison table — the primary Comparison query (gg_069) covering a 10,000-student district context is not specifically addressed.
The /admin/vs-competitors page presents GoGuardian's strengths but does not address the specific complaints buyers are researching: Lightspeed's reported implementation complexity (gg_109), Securly's customer service issues (gg_103), and bypass vulnerability patterns (gg_121).
The /beacon page does not address the '200 alerts a day and counselors are ignoring them' alert fatigue problem (gg_010) — it describes Beacon's detection capabilities but does not explain how GoGuardian reduces noise and prioritizes actionable alerts over raw volume.
The /beacon page does not include a liability/duty-of-care section (gg_131) addressing the question 'what is a district's liability if it doesn't deploy student safety monitoring?' — Gaggle wins this framing by publishing governance and legal risk content.
The /beacon page does not include a 'Must-Have Features for Student Safety Platforms' section (gg_031) — buyers building vendor scorecards cannot use the page to develop evaluation criteria, so they rely on competitor pages that do provide this structure.
The /beacon page does not prominently feature GoGuardian Beacon's 24/7 off-hours monitoring capability (gg_053) as a top-level claim — the product-update/beacon-24-7 page exists but is buried, and the main /beacon page does not reference it prominently enough for AI extraction.
The /beacon page does not include specific performance data (detection rate, response time, false positive rate) that would allow AI models to confidently recommend GoGuardian over Gaggle or Securly in Validation queries (gg_104, gg_111).
The /windows page documents Windows support but does not address the 'we have Chromebooks, Windows laptops, and iPads — how do we enforce consistent filtering across all of them?' problem scenario (gg_004) — the cross-platform consistency question requires content that bridges all three OS pages.
All four competitor Comparison pages (/admin/vs-competitors, /teacher/vs-competitors, /beacon/vs-competitors, /competitor-Comparison) have no visible publication or last-updated dates. These pages are classified as content_marketing and default to a freshness score of 0.2 under the scoring methodology.
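Shipping the date fix means pairing a visible "Last updated" line with machine-readable dates. A sketch of the JSON-LD half, using one of the pages named in the finding and placeholder dates (real publication dates must be substituted):

```python
import json
from datetime import date

def article_dates_jsonld(url, published, modified):
    """Minimal schema.org Article block carrying the datePublished and
    dateModified signals the audit found missing on Comparison pages."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "mainEntityOfPage": url,
        "datePublished": published.isoformat(),
        "dateModified": modified.isoformat(),
    }

block = article_dates_jsonld(
    "https://www.goguardian.com/admin/vs-competitors",  # page from the finding
    published=date(2024, 1, 15),  # placeholder -- use the real publish date
    modified=date(2025, 9, 1),    # placeholder -- use the real update date
)
print(json.dumps(block, indent=2))
```

The visible on-page date should match `dateModified` exactly; a mismatch between the two is itself a credibility signal some extraction pipelines penalize.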
The pricing page at /pricing scores 0.3 on content depth and 0.4 on heading hierarchy. It contains only a lead generation form with four brief paragraphs about pricing factors (enrollment, bundles, contract length, professional services) but no actual pricing tiers, ranges, per-student costs, or plan comparisons.
Our analysis method returns rendered page content as markdown text, so JSON-LD schema blocks, meta description tags, and Open Graph tags are not visible in the output. We cannot confirm the presence or absence of structured data markup on any of the 41 analyzed pages.
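The manual verification can be scripted against raw HTML source instead of rendered markdown. A rough check, assuming simple regex heuristics (unusual attribute ordering or whitespace may defeat them, so treat misses as prompts for manual review):

```python
import re

def audit_head_markup(html: str) -> dict:
    """Check raw HTML for the three markup types the rendered-markdown
    pipeline cannot see: JSON-LD, meta description, Open Graph tags."""
    return {
        "json_ld": bool(re.search(
            r'<script[^>]+type=["\']application/ld\+json["\']', html, re.I)),
        "meta_description": bool(re.search(
            r'<meta[^>]+name=["\']description["\']', html, re.I)),
        "og_tags": bool(re.search(
            r'<meta[^>]+property=["\']og:', html, re.I)),
    }

# Hypothetical head fragment; in practice fetch each of the 41 URLs' raw HTML.
sample = """<head>
<meta name="description" content="...">
<meta property="og:title" content="...">
</head>"""
print(audit_head_markup(sample))
# -> json_ld False, meta_description True, og_tags True for this sample
```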
The sitemap at https://www.goguardian.com/sitemap.xml contains 1,027 URL entries but none include lastmod, changefreq, or priority attributes. Only the required loc element is present for each URL.
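Once per-URL modification dates are known, emitting the missing element is mechanical. A sketch using Python's standard library, with a hypothetical URL/date pair standing in for the real 1,027-entry inventory:

```python
import xml.etree.ElementTree as ET
from datetime import date

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: iterable of (url, last_modified_date) pairs. Emits <loc>
    plus the <lastmod> element the audit found missing on every URL."""
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        # Sitemaps accept W3C datetime; a plain YYYY-MM-DD date is valid.
        ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod.isoformat()
    return ET.tostring(urlset, encoding="unicode")

xml_out = build_sitemap([
    ("https://www.goguardian.com/admin", date(2025, 8, 1)),  # placeholder date
])
print(xml_out)
```

The harder problem is sourcing honest dates: `lastmod` should reflect real content changes (e.g. from the CMS revision history), since crawlers discount sitemaps whose timestamps churn without corresponding page changes.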
The /teacher/vs-competitors page does not include a 'Dyknow vs LanSchool — and Why GoGuardian Teacher Is the Better Choice' section (gg_071, gg_091) — buyers evaluating these two competitors are unreachable by GoGuardian's current content architecture.
The /hall-pass page does not address the 'how digital hall pass works compared to paper passes' solution-exploration query (gg_022) — a fundamental buyer education question that the page should answer to enter the consideration set for districts not yet using digital passes.
The /teacher page has no section providing evidence that classroom management software improves instructional time or student engagement (gg_136) — curriculum directors and superintendents need this research to justify the investment to skeptical teachers.
The /teacher page does not open with the teacher's problem ('half the class is on YouTube and I can't teach') — it presents GoGuardian's features without connecting them to the specific distraction crisis curriculum directors are researching when they search gg_003.
The /teacher page does not have an explicit 'Easy for Non-Technical Teachers' positioning section — gg_052's 'low tech skills can actually learn quickly' query finds no clear winner because no vendor prominently owns this positioning, leaving an open opportunity for GoGuardian.
The /teacher page has no section addressing the 'common complaints about GoGuardian from teachers' query (gg_107) — GoGuardian's own page is silent on this, meaning AI models cannot cite GoGuardian's perspective and must defer to third-party review sites (which may present negative framing).
All 41 analyzed pages returned substantive content via our rendering pipeline, suggesting server-side rendering or pre-rendering is in place. However, we cannot confirm whether any pages rely on client-side JavaScript rendering that might fail for certain AI crawlers that do not execute JavaScript.
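One quick heuristic for confirming rendering mode: fetch the raw HTML without executing JavaScript and check whether phrases visible on the rendered page are present. A sketch with a hypothetical client-side-rendered shell page (in practice `raw_html` would come from `urllib.request.urlopen(url).read().decode()`):

```python
def content_in_raw_html(raw_html: str, key_phrases: list) -> dict:
    """If a phrase visible in the rendered page is absent from the raw HTML,
    that section likely depends on client-side JavaScript rendering, which
    some AI crawlers do not execute."""
    return {phrase: (phrase in raw_html) for phrase in key_phrases}

# Hypothetical CSR shell: an empty root div that JavaScript fills in later.
raw_html = "<html><body><div id='root'></div></body></html>"
checks = content_in_raw_html(raw_html, ["GoGuardian Beacon", "24/7 monitoring"])
print(checks)  # both False -> page body is built client-side by JavaScript
```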
All three workstreams can start this week.
[Synthesis] Execution order is not optional: L1 technical fixes — particularly sitemap lastmod deployment and blog content refresh with visible dates — must precede L2 and L3 work because AI crawlers use freshness signals to prioritize re-crawling. New L3 content and refreshed L2 pages will not receive their true freshness credit until sitemap timestamps signal their update dates. L2 content optimizations (20 recommendations) then improve the 83 existing-page gaps before L3 NIOs add new content for the 59 invisible queries.
The 5 NIOs represent the highest structural leverage: these are entirely uncontested query sets where GoGuardian's product is capable but its content is absent.