Using a "Scraper" as a Remote Control: Build Your Exclusive Intelligence Agent with n8n + Novada Without Writing a Single Line of Code
"Scraper," "API," "Automation"...
To be honest, did your eyes glaze over the first time you heard these words? They sound like something from another world: incantations recited by programmers in flannel shirts, late at night, to the clatter of mechanical keyboards. Cool, but far removed from the lives of us ordinary people.
What if I told you that these things, which sound profound and mysterious, are essentially just tools that everyone can easily use—as simple as your TV remote control or the Lego bricks your kids play with? Would you believe me?
Today, I want to peel away the intimidating outer shell of these technical terms and hand the truly useful core directly to you. By the end of this article, you will be able to build an automated information intelligence agent that works for you around the clock, without writing a single line of code.
Your "Clairvoyance" is Not Some Hacker Technique
Let’s talk about the word "scraper" (crawler) first. It sounds a bit sneaky, even slightly aggressive. Many people immediately associate it with hacking behaviors like stealing data or attacking websites.
This is truly a huge misunderstanding.
The web scrapers we are talking about only collect publicly available information from websites. Their working principle is essentially no different from you opening a webpage in a browser and reading its content. The only difference is that a scraper is ten thousand times faster and ten thousand times more diligent than you are.
You can imagine it as a scout with "clairvoyant" abilities.
For example, if you want to know the prices of all cameras on an e-commerce site, doing it manually requires you to click each product page one by one and then copy and paste the price into Excel. For thousands of products, this might take you days and nights without eating or drinking.
But for the "scraper" scout, you only need to give it one command: "Go to this website and copy back all the camera prices for me." Then, it will instantly send out countless "clones" to visit those thousands of pages simultaneously and deliver all the prices to you neatly within minutes.
It is not a thief; it is just an extremely efficient information porter, replacing countless instances of repetitive, boring browsing and copy-pasting for you.
A Powerful "Data Excavator": You Only Need a Remote Control
Okay, we know a scraper is a very powerful tool. But here’s the problem: for such a powerful tool, do I have to learn complex programming languages to control it?
In the past, yes. You had to set up your own environment, write your own code to simulate browser visits, and fight a battle of wits against various complex anti-scraping strategies, such as handling captchas and changing IP addresses. This process was enough to discourage 99% of non-professionals.
But now, times have changed.
A class of service called the "Scraper API" has appeared on the market, and one of our main characters today, the Novada Scraper API, is exactly that.
What is this? Don’t let the word "API" scare you.
You can imagine Novada as a heavy machinery company that has already built a powerful, fully intelligent "data excavator." This excavator can automatically handle various complex geological conditions (anti-scraping strategies), accurately mine the ore you want (data), and even process the extracted ore directly into standard-sized metal ingots (structured data).
And the API is the "remote control" for this super excavator.
This remote control is very simple, perhaps with just a few buttons:
1. Target Address button: tell the excavator which mountain to dig (the URL you want to scrape).
2. Excavation Target button: tell the excavator exactly which ore to dig (for example, only the price and the reviews).
3. Start button: begin excavation.
As the user, you don’t need to understand how the excavator's engine is built, what material the tracks are made of, or how complex the hydraulic system is. The only thing you need to learn is how to press this remote control.
The Novada Scraper API is just such a remote control. With one simple command, you can mobilize the powerful scraper cluster behind it to work for you, and it will return clean, well-structured data directly to you. You are completely freed from the tedious technical details.
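To make the "remote control" metaphor concrete, here is a minimal sketch in Python of what pressing it amounts to: a single HTTP request. The endpoint and field names below are assumptions based on this tutorial's example; Novada's own API documentation is the authority on the real ones.

```python
import json

# A sketch of "pressing the remote control": one HTTP request.
# The endpoint and field names are illustrative assumptions.
API_URL = "https://api.novada.vn/v1/crawler"  # assumed endpoint

def build_remote_press(api_key: str, target_url: str) -> dict:
    """Describe the single HTTP request that asks the scraper service
    to fetch one page on your behalf."""
    return {
        "method": "POST",
        "url": API_URL,
        "headers": {
            "Authorization": f"Bearer {api_key}",  # your exclusive "remote control key"
            "Content-Type": "application/json",
        },
        "body": json.dumps({"url": target_url}),  # which "mountain" to dig
    }
```

Sending a request like this with any HTTP client returns the scraped data; all the hard parts (proxies, captchas, retries) happen on the service side.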
"Lego-style" Automation: Letting the Remote Control Move Itself
Now, we have a "clairvoyant" scout (scraper) and a "remote control" (Novada API) to easily command it. But currently, we still need to manually press the remote control once for it to work once.
Can we make it start automatically at 8:00 AM every morning, perform a scout mission, and then automatically organize the results into my spreadsheet?
Of course. This requires our third protagonist: n8n.
If Novada API is the remote control, then n8n is a magical "Lego toy table."
On this table, there are various "functional building blocks."
One block is called "Scheduled Alarm"; you can set it to ring once every day, every week, or every month at a certain time.
One block is called "Press Remote Control"; it is specifically used to press the button of the Novada API remote control we just mentioned.
There is also a block called "Write to Excel/Google Sheets," which can automatically fill the results obtained into your electronic spreadsheet.
There are even blocks for "Send Email," "Send DingTalk/Feishu Message," etc.
And "automation" here no longer means writing complex programs. All you need to do is connect these functional blocks with lines, in whatever order you want, just like snapping Lego bricks together.
For example, you can connect them like this:
"Scheduled Alarm" -> "Press Novada Remote Control" -> "Write results to Google Sheets"
Once connected, a complete, fully automatic workflow is born. It will automatically command Novada to scrape data at the time you set and then automatically save the data to your spreadsheet. No intervention from you is needed throughout the entire process.
This is the so-called "workflow automation," and in the world of n8n, it is just that intuitive and simple.
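If you prefer to see the Lego metaphor in code, a workflow is nothing more than steps chained together. The functions below are illustrative stand-ins for n8n's nodes, not real n8n internals, and the price and comment are made-up sample data.

```python
# Each "building block" is just a step; the workflow is the steps chained together.

def scheduled_alarm():
    # Stand-in for the "Scheduled Alarm" trigger: it only kicks things off.
    return {"fired_at": "08:00"}

def press_novada_remote(event):
    # Stand-in for the node that calls the scraper API;
    # the values here are made-up sample data.
    return {"price": "199.00", "comment": "Great camera!"}

def write_to_sheet(row, sheet):
    # Stand-in for the Google Sheets node: append one row of results.
    sheet.append(row)

# "Connecting the blocks with lines" is ordinary composition:
sheet = []
event = scheduled_alarm()
data = press_novada_remote(event)
write_to_sheet(data, sheet)
```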
Pro-level Tutorial: Build Your E-commerce Price Monitor
Enough talk; are your hands itching to try it? Let's get to work and combine the "Clairvoyance," the "Remote Control," and the "Lego Table" into a truly automated tool of your own.
Our Goal: Automatically scrape the price and the latest user review of a specified product from an e-commerce website and save it to a file on your computer.
Preparation:
1. Register a Novada account and get your "remote control key"
Visit Novada's official website and register an account. They provide a free trial quota, which is enough for today's exercise. After logging in, find your API Key on the dashboard or API documentation page. This is a string of characters that acts as your exclusive "remote control key"; you will need it later. Please keep it safe and do not disclose it.
2. Install n8n and bring the "Lego Table" home
For beginners, I strongly recommend installing the n8n Desktop version directly on your computer. It is completely free, and the installation process is as simple as installing any ordinary software. Go to the n8n website to download the installer for your computer system (Windows/Mac) and double-click to install.
After installing and opening it, you will see a clean canvas—this is your "Lego Table."
Start Building!
Step 1: Place the first block, Manual Trigger
Every n8n workflow needs a "trigger" to start. We’ll use the simplest "Manual Trigger" first.
Click the "+" sign in the middle of the canvas, type "Manual" in the search box, and select it. This block is now added to your canvas. It represents "I manually click once, and the workflow starts running."
Step 2: Place the core block, "Press the Remote Control"
This is the most critical step. We want to add a block that can "press the Novada remote control." In n8n, this operation is usually performed by a block called "HTTP Request."
1. Click the "+" sign to the right of the "Manual" node, search for "HTTP Request," and add it.
2. Now we need to configure this "remote control button" and tell it exactly how to press. Click the "HTTP Request" node you just added, and its settings panel will pop up on the right.
Please fill it out strictly according to the following instructions:
●Method: Select POST. We are sending instructions in the request body, not just fetching a page, so the default GET will not work here.
●Authentication: Select Header Auth.
●Name: Enter Authorization.
●Value: Enter Bearer followed by a space, then paste the API key you just copied from the Novada website. It should look like this: Bearer sk-xxxxxxxxxx.
●URL: This is the address of the Novada "remote control receiver." According to Novada's documentation, it is usually https://api.novada.vn/v1/crawler.
●Options -> Add Option: Click to add an option, select Body Content Type, and then choose JSON from the dropdown menu.
●Body Parameters: This is where you tell the "excavator" specifically where to dig and what to dig.
Click Add Parameter.
○Name: Enter url. Value: Enter the full URL of the e-commerce product you want to scrape, e.g., https://item.jd.com/1000xxxxxx.html.
Click Add Parameter again.
○Name: Enter element_selectors. Value: This one is slightly more involved; we need to switch it to "Expression" mode. Click the icon to the right of the Value input box.
In the expression editor that pops up, enter the "incantation" that specifies the excavation targets. Its format is fixed, but the selector strings themselves depend on the HTML of the page you are scraping; the ones below are illustrative, and you would find the real ones with your browser's "Inspect Element" tool:
[
  {
    "name": "price",
    "selector": ".price_color"
  },
  {
    "name": "comment",
    "selector": ".comment-item .comment-content"
  }
]
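For the curious, here is the request body that the node assembles from the parameters above, sketched in Python. The product URL is this tutorial's placeholder, and the selectors must match the actual HTML of the page you are scraping.

```python
import json

# Assemble the same JSON body the configured node sends to the scraper API.
def build_scrape_body(product_url: str) -> str:
    element_selectors = [
        {"name": "price", "selector": ".price_color"},
        {"name": "comment", "selector": ".comment-item .comment-content"},
    ]
    return json.dumps({"url": product_url, "element_selectors": element_selectors})

body = build_scrape_body("https://item.jd.com/1000xxxxxx.html")
```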
Step 3: Test it! See what the "Excavator" brought back
After all settings are complete, click the "Execute Node" button below the "HTTP Request" node. n8n will immediately press the "remote control button." Wait a few seconds, and if everything goes well, you will see the results returned by Novada in the "Output" area on the right.
Step 4: Place the last block, save the results
We don't want the results to just stay in n8n. Let’s save them to a file on your computer.
1. Click the "+" sign to the right of the "HTTP Request" node, search for "Write to File," and add it.
2. Click this new node to configure it:
○File Name: Give the file a name, for example, ~/Desktop/price_watch.txt.
○Content: Price: {{ $json.body.data[0].price }} Comment: {{ $json.body.data[0].comment }}
○Append: Turn this switch on.
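If you are wondering what those double-curly-brace expressions do, here is the same extraction sketched in Python. The response shape (body, then data, then the first item's price and comment) is an assumption inferred from those expressions; check a real response in n8n's Output panel and adjust.

```python
# What the "Write to File" node does, sketched in Python.
# The response shape is assumed, not guaranteed by the API.

def format_result(response: dict) -> str:
    item = response["body"]["data"][0]
    return f"Price: {item['price']} Comment: {item['comment']}\n"

def append_result(response: dict, path: str) -> None:
    # Opening in "a" mode is the file-level equivalent of the Append switch.
    with open(path, "a", encoding="utf-8") as f:
        f.write(format_result(response))
```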
Final Step: Make it fully automatic
1. Delete the initial "Manual" node.
2. Click the "+" sign, search for "Cron," and add it.
3. Click the Cron node and set Mode to Every Day and Hour to 9.
4. Connect the Cron node to the HTTP Request node.
5. Click the "Active" switch in the upper right corner.
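For reference, "Every Day, Hour: 9" corresponds to the classic cron expression 0 9 * * *. The decision such a trigger makes on every tick can be sketched as a tiny function; this is an illustration, not n8n's actual implementation.

```python
import datetime

# Fire once per day, the first time the clock reaches the target hour.
def should_fire(now: datetime.datetime, hour: int, last_fired_date) -> bool:
    return now.hour == hour and last_fired_date != now.date()
```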
Success! You now have your own automated, around-the-clock "information intelligence agent." At the times you set, it will silently monitor the information you care about and record it for you.