Understanding Crawl Results
A comprehensive guide to interpreting crawl results, including pages, accessibility, SEO, security, and performance reports.
After a crawl completes, AegisRunner provides detailed results about every page discovered, forms detected, and issues found. Understanding these results helps you make the most of your automated testing.
Accessing Crawl Results
View crawl results in several ways:
- Crawl History - Click any crawl in the history list on the Crawl page
- Dashboard - Click a crawl in the Recent Crawls section
- Test Suites - Filter suites by crawl session to see related tests
Crawl Summary
The crawl summary shows high-level statistics:
| Metric | Description |
|---|---|
| Status | Completed, Running, or Failed |
| Pages Discovered | Total unique pages found during the crawl |
| Forms Detected | Number of HTML forms found across all pages |
| Duration | Total time taken to complete the crawl |
| Errors | Count of pages that failed to load or had issues |
Discovered Pages
For each page discovered, you can see:
- URL - Full path to the page
- Title - The page's HTML title
- Status Code - HTTP response code (200, 404, 500, etc.)
- Load Time - How long the page took to load
- Forms - Number of forms on the page
- Links - Number of links found on the page
Status codes fall into these ranges:
- 2xx - Success (page loaded correctly)
- 3xx - Redirect (page redirects elsewhere)
- 4xx - Client error (page not found, forbidden)
- 5xx - Server error (server-side issue)
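If you export the discovered-pages list (for example as JSON), grouping pages by status class is a quick way to spot problem areas. The sketch below is illustrative only; the `pages.json` file name and the `url`/`status_code` field names are assumptions, not a documented export format.

```python
import json
from collections import defaultdict

# Group exported page records by HTTP status class (2xx, 3xx, 4xx, 5xx).
# "pages.json" and the field names are assumptions for illustration only.
with open("pages.json") as f:
    pages = json.load(f)

by_class = defaultdict(list)
for page in pages:
    status_class = f"{page['status_code'] // 100}xx"
    by_class[status_class].append(page["url"])

for status_class in sorted(by_class):
    print(f"{status_class}: {len(by_class[status_class])} pages")
    for url in by_class[status_class]:
        print(f"  {url}")
```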
Detected Forms
Forms are automatically detected and categorized. For each form, you'll see:
- Form Name/ID - Identifier for the form
- Action URL - Where the form submits to
- Method - GET or POST
- Fields - List of input fields with their types
- Form Type - Login, registration, search, contact, etc.
Forms are grouped by crawl session in the Test Data → Detected Forms tab, making it easy to configure test data for each form.
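Conceptually, form detection walks the DOM of each rendered page and records the action, method, and fields of every `<form>` element. The sketch below shows the general idea using Playwright; it is not AegisRunner's actual implementation, and the target URL is a placeholder.

```python
from playwright.sync_api import sync_playwright

# Illustrative DOM-based form detection; not AegisRunner's internal code.
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://example.com/contact")  # placeholder URL

    for form in page.query_selector_all("form"):
        action = form.get_attribute("action") or "(same page)"
        method = (form.get_attribute("method") or "GET").upper()
        fields = [
            (el.get_attribute("name"), el.get_attribute("type"))
            for el in form.query_selector_all("input, select, textarea")
        ]
        print(f"{method} {action}: {fields}")

    browser.close()
```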
Accessibility Results
When accessibility testing is enabled, each page includes:
- Accessibility Score - Overall score (0-100)
- Violations - List of accessibility issues found
- Severity - Critical, serious, moderate, or minor
- WCAG Level - The conformance level of the violated criterion (A, AA, or AAA)
- Element - The specific HTML element with the issue
- How to Fix - Recommendations for resolving the issue
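If you export the violation list as JSON, a per-severity summary takes only a few lines. The file name and the `impact` field below are assumptions for illustration, not a documented export schema.

```python
import json
from collections import Counter

# Count exported accessibility violations by severity.
# "a11y_violations.json" and the "impact" field are illustrative assumptions.
with open("a11y_violations.json") as f:
    violations = json.load(f)

counts = Counter(v["impact"] for v in violations)
for severity in ("critical", "serious", "moderate", "minor"):
    print(f"{severity}: {counts.get(severity, 0)}")
```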
SEO Audit Results (Pro+)
Pro and Business plans include SEO analysis:
- Title Tag - Presence, length, and content analysis
- Meta Description - Presence and length check
- Headings - H1-H6 structure and hierarchy
- Images - Alt text presence on images
- Links - Internal vs external link ratio
- Mobile Friendly - Viewport and responsive checks
- Page Speed - Load time analysis
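Most of these checks boil down to inspecting the page's HTML. The sketch below reproduces a few of them (title length, meta description, image alt text) with requests and BeautifulSoup; the length guideline in the comment is a common SEO rule of thumb, not a threshold AegisRunner necessarily uses, and the URL is a placeholder.

```python
import requests
from bs4 import BeautifulSoup

# Minimal sketch of a few SEO checks; not AegisRunner's exact rule set.
html = requests.get("https://example.com", timeout=10).text
soup = BeautifulSoup(html, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
print("Title present:", bool(title), "| length:", len(title))  # ~50-60 chars is a common guideline

meta = soup.find("meta", attrs={"name": "description"})
desc = meta.get("content", "").strip() if meta else ""
print("Meta description present:", bool(desc), "| length:", len(desc))

missing_alt = [img.get("src") for img in soup.find_all("img") if not img.get("alt")]
print("Images missing alt text:", len(missing_alt))
```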
Security Audit Results (Business)
Business plans include security scanning:
- HTTPS - SSL/TLS configuration check
- Headers - Security headers analysis (CSP, HSTS, X-Frame-Options)
- Cookies - Secure and HttpOnly flags
- Mixed Content - HTTP resources on HTTPS pages
- Form Security - CSRF tokens, password field security
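Header and cookie checks of this kind can be reproduced with a single HTTP request, as in the hedged sketch below. It is a general illustration, not AegisRunner's exact rule set, and the URL is a placeholder.

```python
import requests

# Sketch of a security-header and cookie-flag check; illustrative only.
resp = requests.get("https://example.com", timeout=10)

for header in ("Strict-Transport-Security", "Content-Security-Policy", "X-Frame-Options"):
    print(f"{header}: {'present' if header in resp.headers else 'MISSING'}")

# The Secure flag is exposed on the parsed cookie objects; HttpOnly would
# need to be read from the raw Set-Cookie headers.
for cookie in resp.cookies:
    print(f"Cookie {cookie.name}: Secure={'yes' if cookie.secure else 'no'}")
```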
Error Analysis
When errors occur during crawling, the results include:
- Error Type - Network error, timeout, JavaScript error, etc.
- Error Message - Specific error details
- Page URL - Where the error occurred
- Screenshot - Visual capture at time of error (if Error Snapshots enabled)
HAR Files (Pro+)
When HAR recording is enabled, you can download HTTP Archive files containing:
- All network requests made during the crawl
- Request and response headers
- Timing information
- Resource sizes
HAR files can be opened in Chrome DevTools or any dedicated HAR viewer.
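Because HAR is plain JSON (entries live under `log.entries`), you can also script over a downloaded archive directly. The sketch below lists the slowest requests; the file name is a placeholder, and the structure follows the standard HAR 1.2 format.

```python
import json

# List the ten slowest requests in a downloaded HAR file.
# "crawl.har" is a placeholder file name; "time" is total duration in ms.
with open("crawl.har") as f:
    har = json.load(f)

entries = sorted(har["log"]["entries"], key=lambda e: e["time"], reverse=True)
for entry in entries[:10]:
    req = entry["request"]
    status = entry["response"]["status"]
    print(f"{entry['time']:.0f} ms  {status}  {req['method']} {req['url']}")
```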
Using Crawl Results
Generate Tests
Crawl results feed into AI test generation. The more pages and forms discovered, the more comprehensive your test suites will be.
Filter Test Suites
In the Test Suites page, filter by crawl session to see only the suites generated from a specific crawl.
Configure Test Data
Use detected forms to set up test data for form filling during test runs.
Troubleshooting Crawl Issues
Crawl found fewer pages than expected
- Check your max pages limit
- Increase crawl depth
- Review include/exclude patterns
- Some pages may require authentication
- JavaScript-rendered content may need longer wait times
Many pages showing errors
- Check if your server is blocking the crawler (rate limiting)
- Verify the base URL is correct
- Some pages may require login - use pre-authentication cookies (see the sketch after this list)
- Check for JavaScript errors in your application
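As a general illustration of the pre-authentication cookie technique (not AegisRunner's configuration mechanism), a browser-automation tool can seed a session cookie into the browser context before navigation. The cookie name, value, and domain below are placeholders.

```python
from playwright.sync_api import sync_playwright

# Illustrative sketch: seed a session cookie so protected pages load as an
# authenticated user. Cookie name, value, and domain are placeholders.
with sync_playwright() as p:
    browser = p.chromium.launch()
    context = browser.new_context()
    context.add_cookies([{
        "name": "session_id",
        "value": "your-session-token",
        "domain": "example.com",
        "path": "/",
    }])
    page = context.new_page()
    page.goto("https://example.com/account")  # now loads with the session cookie
    browser.close()
```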
Forms not being detected
- Forms may be dynamically rendered - ensure JavaScript executes
- Forms might be in iframes (not currently supported)
- Shadow DOM forms may not be detected
Related Documentation
- Starting a New Crawl - Configure crawl settings
- AI Test Generation - Generate tests from results
- Test Data Management - Configure form data