Can AI see it

Know what AI sees. Measure what it's worth.

What is Qwantbot?

Direct Answer: Qwantbot is the web crawler operated by Qwant, a privacy-focused European search engine. It is used for search indexing and web content discovery.

Operator: Qwant Type: Search Engine Crawler Purpose: Search indexing and web content discovery

Qwantbot is a web crawler operated by Qwant, a European search engine focused on protecting user privacy. The bot crawls the web under several user-agent variants, each announcing itself with the identifier 'Qwantbot' in its user-agent string. Qwantbot respects the robots.txt standard, and Qwant provides methods for verifying its crawlers, including reverse and forward DNS lookups and IP range matching.

User-Agent Identification

The following user-agent strings identify Qwantbot in your live traffic data:

  • Mozilla/5.0 (compatible; Qwantbot/2.4w; +https://www.qwant.com/)
  • Mozilla/5.0 (compatible; Qwantbot-junior/1.0; +https://www.qwant.com/)
  • Mozilla/5.0 (compatible; Qwantbot-news/2.0; +https://www.qwant.com/)
  • Mozilla/5.0 (compatible; Qwantbot-official/1.0; +https://www.qwant.com/)
  • Mozilla/5.0 (compatible; Qwantbot-wikidata/1.0; +https://www.qwant.com/)
  • Mozilla/5.0 (compatible; Qwantbot-opt/1.0; +Qwantbot@qwant.com)
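If you want to flag these variants in your own log processing, a minimal sketch is a single regex that covers the `Qwantbot`, `Qwantbot-junior`, `Qwantbot-news`, and similar suffixed forms. This helper is illustrative, not an official Qwant tool:

```python
import re

# Matches any documented Qwantbot variant (Qwantbot, Qwantbot-junior,
# Qwantbot-news, Qwantbot-wikidata, ...) followed by a version token.
# Hypothetical helper for log filtering; pattern inferred from the
# user-agent strings listed above.
QWANTBOT_RE = re.compile(r"\bQwantbot(?:-[a-z]+)?/\d+(?:\.\d+)?", re.IGNORECASE)

def is_qwantbot_ua(user_agent: str) -> bool:
    """Return True if the user-agent string claims to be a Qwantbot variant."""
    return bool(QWANTBOT_RE.search(user_agent))
```

Note that a matching user-agent only proves the *claim*, not the origin; spoofed traffic must still be verified via DNS, as described below.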

robots.txt Rules for Qwantbot

Respects robots.txt: Yes

Use the following robots.txt rules to control Qwantbot access:

# Block Qwantbot
User-agent: Qwantbot
Disallow: /

# Allow Qwantbot
User-agent: Qwantbot
Allow: /

Robots.txt is a directive, not a barrier

Qwant states that Qwantbot respects robots.txt. However, configuration mistakes, caching delays, and edge cases mean your directives may not always be followed as expected. Live traffic verification confirms whether Qwantbot actually obeys your rules in practice.

Need continuous verification across 500+ bots? Can AI See It automates this.

Crawl Behavior

Frequency: Not documented

Request Pattern: Not documented

Official Documentation Quotes

"The crawler respects the robots rules standard described at https://www.robotstxt.org/orig.html"

"To check if a web crawler accessing your server is from Qwant, perform a reverse DNS lookup and verify that it resolves to a name ending with “qwant.com”."

Crawl Activity Index

Relative crawl activity for Qwantbot over the past 28 days. Higher values indicate increased crawling intensity compared to the period baseline.

View recent activity data (last 7 days)
Date Activity Index
Mar 28, 2026 29.1
Mar 29, 2026 28.3
Mar 30, 2026 20.9
Mar 31, 2026 25.7
Apr 1, 2026 27.7
Apr 2, 2026 17.9
Apr 3, 2026 15.2

Source: Cloudflare Radar

Why track Qwantbot traffic?

Measure what Qwant gives back. Qwantbot crawls thousands of your pages — but how much traffic does Qwant actually send in return? Track referral visits from Qwant's search products relative to crawl volume.

Monitor crawl budget and indexation health. Qwantbot determines which of your pages appear in Qwant's search results. Tracking its crawl patterns reveals how often your key pages are visited, what gets ignored, and where crawl budget is wasted.

Detect crawl anomalies early. A sudden drop in Qwantbot activity can signal indexation problems — before they show up as organic traffic losses.

Catch 4XX and 5XX errors before they hurt rankings. If Qwantbot hits broken pages or server errors during crawling, those URLs may be dropped from the index. Early detection in your logs lets you fix the issue before it impacts your organic visibility.

Validate that your robots.txt rules are enforced. Configuring robots.txt is one thing — confirming that Qwantbot actually respects your directives is another. Live traffic validation is the only way to verify.

Why live traffic verification instead of Search Console? Search Console shows what Qwant tells you. Live traffic verification shows what actually happened — including AI-related crawling that Search Console doesn't report.

Read: Live traffic verification vs Search Console for crawl monitoring →

Log Verification

To verify Qwantbot traffic in your live traffic data:

  1. Search access logs for the user-agent strings listed above
  2. Check if the IP addresses match documented ranges (if provided by Qwant)
  3. Verify the crawl pattern matches documented behavior
  4. Use reverse DNS lookup for additional verification if available
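Step 1 can be sketched as a small log filter that collects the client IPs of requests claiming to be Qwantbot, assuming the common Apache/Nginx "combined" log format (the regex and sample lines below are illustrative, not from Qwant's documentation):

```python
import re

# Fields of the combined log format: ip, ident, user, [time], "request",
# status, bytes, "referer", "user-agent". We capture the client IP and
# the user-agent string.
LOG_RE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d{3} \S+ "[^"]*" "([^"]*)"'
)

def qwantbot_ips(log_lines):
    """Return the set of client IPs whose user-agent claims to be Qwantbot."""
    ips = set()
    for line in log_lines:
        m = LOG_RE.match(line)
        if m and "Qwantbot" in m.group(2):
            ips.add(m.group(1))
    return ips

# Illustrative sample lines; 192.0.2.1 is from the documentation IP range,
# not a real Qwant address.
sample = [
    '192.0.2.1 - - [01/Apr/2026:12:00:00 +0000] "GET / HTTP/1.1" 200 512 '
    '"-" "Mozilla/5.0 (compatible; Qwantbot/2.4w; +https://www.qwant.com/)"',
    '203.0.113.5 - - [01/Apr/2026:12:00:01 +0000] "GET / HTTP/1.1" 200 512 '
    '"-" "Mozilla/5.0"',
]
```

The resulting IP set is the input for steps 2 and 4: each address should then be checked against published ranges or verified via DNS.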

IP Verification: Qwant provides official IP verification via reverse DNS and published IP ranges. View verification instructions →

Perform a reverse DNS lookup to verify that the crawler's IP address resolves to a name ending with 'qwant.com'. Optionally, perform a forward DNS lookup to confirm the IP address.
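The reverse-plus-forward lookup described above might be sketched as follows. This is network-dependent (real verification requires live DNS), and the suffix check simply applies Qwant's stated rule that the name must end with 'qwant.com':

```python
import socket

def verify_qwantbot_ip(ip: str) -> bool:
    """Verify a claimed Qwantbot IP using reverse then forward DNS.

    1. PTR lookup: the resolved hostname must end with 'qwant.com'.
    2. Forward lookup of that hostname must return the original IP.
    Sketch only: assumes working DNS resolution at runtime.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse (PTR) lookup
    except OSError:
        return False
    if hostname != "qwant.com" and not hostname.endswith(".qwant.com"):
        return False
    try:
        _, _, addresses = socket.gethostbyname_ex(hostname)  # forward lookup
    except OSError:
        return False
    return ip in addresses  # forward-confirmed: PTR name maps back to the IP
```

The forward confirmation matters because PTR records are controlled by whoever owns the IP block: anyone can point a PTR record at a qwant.com name, but only Qwant can make that name resolve back to the same IP.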

Note: Observed behavior in production environments may differ from official documentation. Live traffic monitoring provides the only reliable verification of actual bot behavior.

Undocumented Information

The following information is not officially documented for Qwantbot:

  • crawl frequency
  • request pattern
  • JavaScript rendering

Monitor Qwantbot alongside 500+ other bots

Track crawl health, detect anomalies, and measure how AI features are changing your referral traffic — all from your live traffic data.

  • Crawl frequency, coverage, and error monitoring for Qwantbot
  • Compare traditional organic referrals vs AI-generated referrals
  • Detect fake Qwantbot traffic (user-agent spoofing)

Measure business impact from Qwantbot

Crawl activity directly impacts organic visibility. The question is: is Qwantbot crawling the right pages at the right frequency?

  • Crawl coverage: which paths and page types Qwantbot is actually crawling
  • Crawl freshness: how recently Qwantbot visited key URLs
  • Health: response code distribution (2xx, 3xx, 4xx, 5xx) with alerts when failed crawls spike
  • Referral tracking: measure what Qwant gives back. Track actual visits arriving from Qwant's products to your site.
Monitor Qwantbot crawl health →

Based on your live traffic data and analytics — not synthetic prompt tests.

Official Documentation

View Official Qwantbot Documentation →

Information sourced from official documentation. Content generated with AI assistance.