Parser API FAQ
Welcome to the ScrapeOps Parser API FAQ page. Here you will find answers to the most common questions users have about the ScrapeOps Parser API.
Frequently Asked Questions
Overview
- What Is The ScrapeOps Parser API?
- How Is The Parser API Different From A Web Scraping API?
- Which Websites Are Supported?
Getting Started & Usage
- How To Use The Parser API?
- What Data Fields Are Extracted?
- Can I Use The Parser API With My Existing Scraping Setup?
- Can I Use The Parser API Through The Proxy API Aggregator?
Data Quality & Maintenance
- What Happens When A Website Changes Its Page Structure?
- How Is Data Quality Maintained?
Plans & Billing
- How Much Does A Parse Request Cost?
- What Is The Free Trial?
- What Happens If I Run Out Of Credits Before The End Of My Current Subscription?
- How Do I Cancel A Plan?
- How Do I Get A Refund?
Overview
What Is The ScrapeOps Parser API?
The ScrapeOps Parser API is a service that converts raw HTML into clean, structured JSON data. You send HTML from pages you've scraped, and the API extracts and returns the relevant data fields in a consistent JSON format.
This means you don't need to write or maintain your own parsing code - simply POST the HTML to our endpoint and receive structured data back.
How Is The Parser API Different From A Web Scraping API?
Web scraping APIs (like our Proxy API Aggregator) fetch pages and return HTML, handling proxies and anti-bot measures for you.
The Parser API is specifically for parsing: you provide the HTML, and we extract the structured data. This means you can use any scraping method or proxy provider you prefer, then send the HTML to our Parser API for extraction.
| Feature | Parser API | Proxy API Aggregator |
|---|---|---|
| Input | Raw HTML you provide | URL you want to scrape |
| Output | Structured JSON data | Raw HTML response |
| Use Case | Data extraction from HTML | Fetching pages with proxies |
| Proxy Handling | N/A - you handle proxies | Built-in proxy management |
Which Websites Are Supported?
We currently have managed parsers for:
Ecommerce
- Amazon - Product pages
- Walmart - Product pages
- eBay - Product pages
- Target - Product pages
Real Estate
- Redfin - Property listings
- Zillow - Property listings
Job Portals
- Indeed - Job listings
Search Engines
- Google - Search results, Maps, Patents, Scholar
- Bing - Search results
- Yandex - Search results
We're continuously adding new parsers based on user demand. Each parser is maintained by our team to ensure consistent data extraction even when page structures change.
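In code, the list above maps naturally to a site-to-endpoint lookup. Only the Amazon endpoint (`/v1/amazon`) appears in this FAQ; the other paths below are assumptions that follow the same pattern, so confirm them against the Parser API documentation before use.

```python
# Map each supported site to its likely parser endpoint. Only the Amazon
# path appears in this FAQ; the others are assumptions that follow the
# same naming pattern - verify them in the Parser API documentation.
PARSER_ENDPOINTS = {
    "amazon": "https://parser.scrapeops.io/v1/amazon",
    "walmart": "https://parser.scrapeops.io/v1/walmart",
    "ebay": "https://parser.scrapeops.io/v1/ebay",
    "indeed": "https://parser.scrapeops.io/v1/indeed",
}

def endpoint_for(site: str) -> str:
    """Return the parser endpoint for a supported site, or raise ValueError."""
    try:
        return PARSER_ENDPOINTS[site]
    except KeyError:
        raise ValueError(f"no managed parser configured for {site!r}")
```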
Getting Started & Usage
How To Use The Parser API?
To use the Parser API, you need an API key that you can obtain by signing up for a free account here.
You then send a POST request to the appropriate parser endpoint with the HTML you want to parse:
```python
import requests

# First, get the HTML (using your preferred method)
response = requests.get('https://www.amazon.com/dp/B08WM3LMJF')

if response.status_code == 200:
    html = response.text

    # Send to Parser API
    data = {
        'url': 'https://www.amazon.com/dp/B08WM3LMJF',
        'html': html,
    }

    parse_response = requests.post(
        url='https://parser.scrapeops.io/v1/amazon',
        params={'api_key': 'YOUR_API_KEY'},
        json=data,
    )

    print(parse_response.json())
```
For more information, check out our Parser API documentation.
What Data Fields Are Extracted?
Each parser extracts the most relevant data fields for that page type. For example:
Amazon product pages return:
- Title, price, rating, review count
- Images (main and gallery)
- Product specifications
- Availability status
- Seller information
- And more...
Check our documentation for the complete schema for each supported website.
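As a sketch of consuming the parsed JSON, the snippet below reads a few of the fields listed above. The exact key names come from each parser's schema documentation, so the ones used here (`title`, `price`, `rating`) are illustrative assumptions.

```python
# Hypothetical shape of a parsed Amazon product response - the exact field
# names come from the schema documentation; those below are assumptions.
parsed = {
    "title": "Example Product",
    "price": 19.99,
    "rating": 4.5,
    "review_count": 1234,
}

def summarize_product(data: dict) -> str:
    """Build a one-line summary from a few common fields, tolerating gaps."""
    title = data.get("title", "unknown")
    price = data.get("price")
    rating = data.get("rating")
    return f"{title} - price={price}, rating={rating}"
```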
Can I Use The Parser API With My Existing Scraping Setup?
Yes! The Parser API works independently of your scraping infrastructure. You can use:
- Any proxy provider
- Any scraping framework (Scrapy, Puppeteer, Selenium, Playwright, etc.)
- Any programming language that can make HTTP requests
Simply POST the HTML to our API endpoint and receive JSON data in response.
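The "any framework, any language" point boils down to one POST per page. A minimal sketch, reusing the Amazon endpoint from the example above (swap in the endpoint for your target site):

```python
import requests

PARSER_ENDPOINT = "https://parser.scrapeops.io/v1/amazon"  # parser for your target site

def build_parse_payload(page_url: str, html: str) -> dict:
    """Package scraped HTML plus its source URL, as the Parser API expects."""
    return {"url": page_url, "html": html}

def parse_html(page_url: str, html: str, api_key: str) -> dict:
    """POST HTML fetched by any scraper (Scrapy, Playwright, ...) and get JSON back."""
    response = requests.post(
        PARSER_ENDPOINT,
        params={"api_key": api_key},
        json=build_parse_payload(page_url, html),
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```

Because the payload is just a URL and an HTML string, this drops into a Scrapy pipeline, a Playwright script, or any other setup without changes to how you fetch pages.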
Can I Use The Parser API Through The Proxy API Aggregator?
Yes! If you're using our Proxy API Aggregator, you can enable Auto Extract functionality which automatically parses supported pages and returns JSON data at no extra cost.
This combines fetching and parsing in a single API call - the Proxy API Aggregator fetches the page and automatically runs it through the Parser API before returning the structured data.
For more details, see the Auto Extract documentation.
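A sketch of that single fetch-and-parse call is below. Note the `auto_extract` parameter name and the aggregator endpoint are assumptions here; confirm the exact names and accepted values in the Auto Extract documentation.

```python
import requests

PROXY_ENDPOINT = "https://proxy.scrapeops.io/v1/"  # assumed aggregator endpoint

def build_auto_extract_params(api_key: str, target_url: str, parser: str) -> dict:
    """Build query params for a fetch-and-parse call.

    NOTE: 'auto_extract' is an assumed parameter name - confirm the exact
    name and accepted values in the Auto Extract documentation.
    """
    return {"api_key": api_key, "url": target_url, "auto_extract": parser}

def fetch_and_parse(api_key: str, target_url: str, parser: str) -> dict:
    """One call: the aggregator fetches the page, parses it, returns JSON."""
    response = requests.get(
        PROXY_ENDPOINT,
        params=build_auto_extract_params(api_key, target_url, parser),
        timeout=60,
    )
    response.raise_for_status()
    return response.json()  # structured JSON instead of raw HTML
```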
Data Quality & Maintenance
What Happens When A Website Changes Its Page Structure?
Our team continuously monitors data coverage for all parsers. When a website updates its page structure, we update our parsers to maintain data coverage.
You don't need to do anything - the Parser API continues to return consistent data without any changes on your end. This is one of the main benefits of using managed parsers.
How Is Data Quality Maintained?
We continuously monitor the data coverage performance of all our parsers to ensure you always get the data you need. Our monitoring includes:
- Automatic detection of page structure changes
- Data field coverage tracking
- Parser performance metrics
If a parser starts returning incomplete data due to website changes, our team updates it promptly.
Plans & Billing
How Much Does A Parse Request Cost?
The Parser API uses a credit-based system where you are only charged for successful parse requests.
Each successful parse request consumes 1 API credit. Failed requests (4xx or 5xx errors) are not charged.
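Because only successful parses are billed, a client can mirror that rule locally when tracking spend. A minimal sketch (the counting helper is illustrative, not part of the API):

```python
def record_usage(status_code: int, usage: dict) -> bool:
    """Count 1 credit for a successful (2xx) parse; 4xx/5xx cost nothing."""
    if 200 <= status_code < 300:
        usage["credits_used"] = usage.get("credits_used", 0) + 1
        return True
    return False
```

After each call, feeding `parse_response.status_code` into `record_usage` keeps a local tally that matches what the API actually charges.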
What Is The Free Trial?
ScrapeOps offers 1,000 free API credits to test the Parser API with your target websites. No credit card required.
What Happens If I Run Out Of Credits Before The End Of My Current Subscription?
If you run out of API Credits before your plan's renewal date, one of three things can happen depending on your settings:
- Auto-Renew: If you enable the auto-renew setting, your plan will be automatically renewed when you consume all your API credits. Your used credits will be reset to zero and the renewal date will be reset.
- Auto-Upgrade: If you enable the auto-upgrade setting, your plan will be upgraded to the next plan up when you consume all your credits.
- Blocked Account: If you enable the do-nothing setting, your account will be blocked and will return 401 error codes until your plan is renewed or upgraded.
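With the do-nothing setting, the only signal a client sees is that 401 status, so it is worth distinguishing it from transient errors rather than retrying. A small illustrative helper:

```python
def is_plan_exhausted(status_code: int) -> bool:
    """A blocked account returns 401 until the plan is renewed or upgraded -
    treat it as a stop signal, not something to retry."""
    return status_code == 401
```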
How Do I Cancel A Plan?
You can cancel your subscription at any time in your dashboard by going to the settings tab.
How Do I Get A Refund?
We offer a 7-day refund policy. If you are unhappy with the service, contact support and we will refund you right away.