Parsera is a lightweight Python library for scraping websites using large language models (LLMs). The following 13 tools are available:
- Tool to create a new empty scraper for your account. Returns a scraper_id that can be used with the generate endpoint to generate scraping code.
- Tool to delete an existing scraper by its ID. Use when you need to remove a scraper that was created through the /v1/scrapers/new endpoint.
- Tool to perform LLM-powered data extraction from a live webpage URL with specified attributes. Use when you need to extract structured data from web pages based on field descriptions.
- Tool to extract markdown content from a file or URL.
- Tool to retrieve standardized LLM capabilities and pricing specifications. Use to get up-to-date information about models from various providers.
- Tool to retrieve the list of available proxy countries for web scraping requests. Use when you need to know which countries are supported for proxy-based scraping.
- Tool to verify API availability and operational status. Use to check if the Parsera service is accessible before making other API calls.
- Tool to retrieve all available agents for the authenticated user. Use when you need to list agents that can be used for scraping tasks.
- Tool to list all templates and old scrapers for the authenticated user. Use when you need to retrieve available scraper configurations.
- Tool to extract structured data from raw HTML or text content using AI with advanced options. Use when you have content already loaded and need to extract specific fields with pagination or different extraction modes.
- Tool to delete an existing agent by name. Use when you need to remove a previously created agent from the Parsera platform.
- Tool to run a scraper template on a specified URL with optional proxy and cookies. Use when you need to execute a pre-defined scraper template to extract structured data from web pages.
- Tool to run a previously generated scraper agent on a specific URL to extract structured data. Use when you need to apply an existing scraper to a webpage.
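The extraction tools above take a target URL plus a set of attribute descriptions, with optional proxy and cookie settings. A minimal sketch of assembling such a request body in Python (the field names `url`, `attributes`, `proxy_country`, and `cookies` are assumptions for illustration, not confirmed API parameter names):

```python
# Hedged sketch: build a request body for an LLM-powered URL extraction.
# Field names here are illustrative assumptions, not confirmed API parameters.

def build_extract_payload(url, attributes, proxy_country=None, cookies=None):
    """Assemble the JSON body for an extraction request.

    `attributes` maps each desired output field to a plain-language
    description of what to extract from the page.
    """
    payload = {"url": url, "attributes": attributes}
    if proxy_country is not None:
        # e.g. a code from the supported proxy-country list
        payload["proxy_country"] = proxy_country
    if cookies is not None:
        # cookies to send along with the page request
        payload["cookies"] = cookies
    return payload


payload = build_extract_payload(
    "https://example.com/products",
    {"name": "Product name", "price": "Price with currency"},
    proxy_country="US",
)
```

Optional parameters are omitted from the body when unset, so the same helper serves both plain and proxy-based requests.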
Wire it up in minutes. No coding required.
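For programmatic use, the Parsera Python library mentioned above exposes a similar extraction interface. A hedged sketch follows; the `Parsera().run(url=..., elements=...)` call shape reflects the library's published usage but should be checked against the current documentation, and actually running it requires the library installed, an LLM API key, and network access:

```python
# Hedged sketch of the Parsera Python library's extraction interface.
# Verify the import path and run() signature against the current docs.
import os

# Field descriptions: each key is an output field, each value tells the
# LLM what to pull from the page.
ELEMENTS = {
    "Title": "News title",
    "Points": "Number of points",
}


def main():
    from parsera import Parsera  # requires `pip install parsera`
    scraper = Parsera()  # default LLM; expects an API key in the environment
    return scraper.run(url="https://news.ycombinator.com/", elements=ELEMENTS)


if __name__ == "__main__" and os.environ.get("OPENAI_API_KEY"):
    for row in main():
        print(row)
```

The network call is guarded behind the API-key check, so the module imports cleanly even where no credentials are configured.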