Learn how to build AI agents as I share my progress towards my 100% AI business
I thought automating a business end-to-end with AI agents might just be straightforward in 2025. It’s already hitting walls 😅. I’ve tried 3 ways to scrape Google Trends, but it’s clear Google does not approve. So now I have to get creative.

In this post I’ll share my plan for my first AI agent: AI Directory Prospector, as well as the early hiccups. This agent will need to imagine and validate ideas for new web directories, returning me the cream of the crop. Let’s learn how to build AI agents!
New here? Welcome! This is the journey of building a 100% automated AI business in 2025. You’re jumping in after we’ve already kicked things off, so you might want to catch up first.
Check out these key posts to get the full story—and don’t forget to subscribe for updates and exclusive perks:
AI Directory Maker: Directory Prospector Agent
- How to Build AI Agents: Directory Prospector
- Before We Get Lost In the Workflows…
- Directory Prospector Workflow
- Directory Prospector API Calls
- Directory Prospector Browser Agent
- Directory Prospector PHP routines
- Directory Prospector: Reusable Logic
- Directory Prospector: Reliance & Risk
- Follow my Work on Directory Prospector
- Do you want the code?
How to Build AI Agents: Directory Prospector
For this part of our AI-automated directory business, we'll need a few different tools that I think will be crucial for your business too.
Based on my recent reviews of n8n and Gumloop, and some other research rabbit holes, here’s what I think the flow for this first AI agent will look like:

Breaking that down into processes & associated AI tools:
| Step | Primary Tool | Secondary Tools |
| --- | --- | --- |
| Input | Human (for now) | |
| Ideate | AI (GPT) via Workflow tool | API (Search engine data) |
| Explore Trends | API (Google Trends data) | AI Browser Agent |
| Niche Research | API (Niche Report) | NicheReport.io |
| Make Decision | AI (GPT) via Workflow tool | |
| Domain Prospecting | API (Woody's Domain Prospector) | |
| Output | JSON to feed next agent | |
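That last row – JSON handed to the next agent – is worth picturing. Here's a rough sketch of what that payload could look like; every field name here is my assumption, not a final schema.

```php
<?php
// A guess at the JSON the Prospector could hand to the next agent.
// Every field name here is an assumption, not a final schema.

$output = [
    'idea'    => 'AI tools for accountants',   // the candidate directory idea
    'verdict' => 'viable',                     // the GPT decision step's call
    'trend'   => ['direction' => 'up'],        // summarised Google Trends signal
    'niche'   => ['report_url' => null],       // link to the generated niche report
    'domains' => ['example-domain.test'],      // placeholder domain candidates
];

echo json_encode($output, JSON_PRETTY_PRINT);
```

The point of a stable shape like this is that the next agent in the chain never needs to know how the data was gathered.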
So fundamentally to complete this agent, we’re going to need:
- Workflow tool (Gumloop or n8n)
- ProfitSwarm Architect API
- Some middleman API calls (e.g. calling RapidAPI for search engine data)
- Some Browser agent runs (e.g. retrieving Google Trends data)
- Some PHP routines (e.g. domain prospecting)
I completed part of this agent last week, so I can share a lot of how this’ll work – but there are still unknowns. How reliable will my browser agent automations be? Can I find an AI workflow tool which works within my long execution time windows?
Let’s dig in.
Before We Get Lost In the Workflows…
I’m aware some of these posts get a little technical – I’m still working out the best way of sharing this stuff, so if you just want the gist, read the TL;DR below and skip ahead 😀. (Please let me know in the comments if you did, so I can tweak how I share this with you.)
Too long; didn’t read:
I’m using a workflow tool (Gumloop) as the ‘bones’ of this agent, and my own code as the muscles. I’ll use existing APIs to grab data, as well as some tools I made before. I do this to keep the responsibility for most of the ‘work’ the agent does within my control.

I’m part way through building this agent! Skip the technical and click here to read more.
Directory Prospector Workflow
It can be daunting to learn about all the different AI tools that might make your business more profitable – imagine trying to build the whole thing out of them! Seriously though, if you’re here looking for ways to bring AI benefits to your company, I think you should start with a workflow tool.

The main ‘bones’ of this agent will be built in an AI workflow tool. Right now I’m split between n8n and Gumloop, but which workflow tool we use isn’t so important; all we need here is:
- Ability to design cascading sequences of processes
- Ability to nest workflows (so later we can combine all of these processes with the other processes of the business to make a single ‘team’ of agents)
- Ability to use AI ‘tool agents’ to process / analyse / decide
- Ability to call my API
This being the first proper experiment of my challenge, I’m going with a hybrid approach: part paid tools (workflow tools/APIs), part self-written code. I reserve the right to think this was madness later, or to write my own workflow tool if need be.
In any case, for now let’s work on the basis that we use a workflow tool to orchestrate the various operations.
The first thing the Prospector agent will need to do is call APIs…
Directory Prospector API Calls
APIs are the backbone of the web. They let us ask remote services to give us specific data or perform specific actions. My prospector agent will need to GET and POST data, wait for the response, and then act on what comes back.
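If it helps to make that concrete, a GET-then-act round trip can be as small as this. This is just a sketch in PHP (my language of choice for this build); the helper name and the endpoint URLs are placeholders, not my real API.

```php
<?php
// Minimal JSON API helper — a sketch, not production code.
// The endpoint URLs in the usage comments are hypothetical.

function apiRequest(string $method, string $url, ?array $payload = null): array {
    $options = [
        'http' => [
            'method'        => $method,
            'header'        => "Content-Type: application/json\r\n",
            'ignore_errors' => true,   // still read the body on 4xx/5xx
        ],
    ];
    if ($payload !== null) {
        $options['http']['content'] = json_encode($payload);
    }
    $context = stream_context_create($options);
    $body    = file_get_contents($url, false, $context);
    return json_decode($body, true) ?? [];
}

// Usage (hypothetical endpoints):
// $trends = apiRequest('GET', 'https://example.test/api/trends?keyword=ai+tools');
// $report = apiRequest('POST', 'https://example.test/api/niche-report', ['niche' => 'ai tools']);
```

In practice the workflow tool will often make these calls itself via its HTTP node, but it's useful to have the same thing available in plain code.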

Q: What APIs do you use? Let me know in the comments!
For Directory Prospector I’ve spun up a simplistic PHP API using old code and vibe coding.

I’ll host this API on my own server as it gives me the power to use workflow designers to do pretty much anything.
My API will need the following endpoints:
- Retrieve search engine data
- Retrieve Google Trends data
- Build and return a Niche Report
- Generate and return domain opportunities
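For a feel of how those four endpoints might hang together in one small PHP entry point, here's a sketch. The route names and response shapes are my guesses, and every handler is a stub to be filled in with the real tooling.

```php
<?php
// Sketch of the Prospector API (a single index.php).
// Route names and response shapes are assumptions; handlers are stubs.
declare(strict_types=1);

$handlers = [
    'search-data'  => fn(array $in) => ['results' => []],   // search engine data
    'trends'       => fn(array $in) => ['series'  => []],   // Google Trends (via local agent)
    'niche-report' => fn(array $in) => ['report'  => null], // NicheReport.io wrapper
    'domains'      => fn(array $in) => ['domains' => []],   // domain opportunities
];

function route(array $handlers, string $action, array $input): array {
    if (!isset($handlers[$action])) {
        return ['error' => 'unknown action'];
    }
    return $handlers[$action]($input);
}

// When served over HTTP, dispatch on ?action=… and emit JSON.
if (PHP_SAPI !== 'cli') {
    header('Content-Type: application/json');
    $input = json_decode(file_get_contents('php://input'), true) ?? $_GET;
    echo json_encode(route($handlers, $_GET['action'] ?? '', $input));
}
```

Keeping all four behind one dispatcher means the workflow tool only ever needs one base URL and an `action` parameter.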
The workflow tool will then use AI to iterate and think over what’s returned, ultimately doing its best to discern whether an idea (generated by me, or itself) is a viable one.
Directory Prospector Browser Agent
You’d think scraping Google Trends data would be easy.
As I explored the different needs of this agent I could already vouch for a few being attainable. I’ve already got scripts to pull search engine data. I’ve already made NicheReport.io which generates niche reports, and it’s largely automated. I’ve got a very good domain name prospecting tool I wrote last year.
… but Google Trends? I hadn’t ever pulled data from it programmatically.
So I tried it.
1. Established ‘Black Market’ APIs
RapidAPI isn’t really a ‘black market’, I joke. It is kind of a grey area sometimes though, as it’s a platform where individuals have made ‘middleman’ APIs which collect data, often from public sources like Google or social media platforms.
Sometimes these are super reliable and performant.
Sometimes they don’t work at all.

I tried 8 different Google Trends middlemen. None of them worked.

… onto the next option.
2. Online AI browser agents
AI browser agents are useful if you want to simulate a user doing something in a browser (in this case, we want to pretend we’re a user searching on the Google Trends page for keywords, and export the data there).
Effectively you give the agent a prompt, and it uses a given LLM to carry out what you’ve asked it to do in a browser (often headless), and then return you the data.
I tried this for Google Trends while I was reviewing Gumloop – you can read about the whole process here – but in essence, GOOGLE SAYS NO.

Google is not daft. Google knows people want to do things like this, using their services in a way they don’t want you to. Likely, Gumloop & other automation services are all blocked by Google precisely for this reason.
Proxies? I tried a few variants of these online AI browser agents – some looked super promising – but every one hit this Google block, proxied or not.
So…
3. Local AI browser agent
If it’s possible to grab Google Trends data programmatically, it should be doable via a local AI browser agent.
I absolutely should be able to simulate myself visiting Trends in my own browser, and then have it grab the results.

I’m not certain this’ll work, because Google is smart.
But this is what I’m going to run with.
I’ll set up my MacBook to accept incoming job requests (via my API), then go through the steps needed to retrieve the data. Later I can move this onto another computer if it proves valuable.
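The Mac-side piece is really just a polling worker: ask the server API for a queued job, run the local browser agent, post the result back. A rough sketch of that loop – the URLs, the job shape, and `runBrowserAgent()` are all hypothetical stand-ins for the real automation:

```php
<?php
// Sketch of the MacBook-side worker. The queue URL, job shape, and
// runBrowserAgent() are hypothetical — the real browser automation
// would drive an actual browser session against Google Trends.

function nextJob(string $queueUrl): ?array {
    $job = json_decode(@file_get_contents($queueUrl), true);
    return (is_array($job) && isset($job['id'])) ? $job : null;
}

function runBrowserAgent(array $job): array {
    // Placeholder: here the local AI browser agent would visit Google Trends
    // in a real browser and extract the series for $job['keyword'].
    return ['keyword' => $job['keyword'], 'series' => []];
}

// Main loop: poll, work, report back, sleep.
// while (true) {
//     if ($job = nextJob('https://example.test/api/jobs/next')) {
//         $result = runBrowserAgent($job);
//         // POST $result back to https://example.test/api/jobs/result
//     }
//     sleep(30);
// }
```

Polling (rather than pushing to the Mac) keeps the laptop off the public internet, which matters when it's sitting behind a home router.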
This will be useful for other aspects of this challenge, so 🤞🏼.
Directory Prospector PHP routines
Ah old familiar PHP.
I show my age, but PHP is my groove. I’ve done so many different things in PHP. I still use it as my go-to code language. If any of you are happier in Python or JS, you’ll probably have an easier time doing this sort of thing.
Making an API which leverages various tooling took less than an hour.
Directory Prospector: Reusable Logic
For me the value of using PHP is that I have a lot of existing code that I can reuse.

For example I can pretty much plug and play different tools I’ve built before into this agent:
- Niche Report.io to build a snapshot report of the whole niche
- Domain Prospector to generate and check availability for thousands of domain names
- My search engine keyword research tool
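To show what "plug and play" might look like in practice, here's one way the existing tools could sit behind a common interface. The class and method names are invented stand-ins for my real NicheReport.io and Domain Prospector code, not the actual implementations.

```php
<?php
// How existing tools might plug into the agent behind one interface.
// Class and method names are invented stand-ins, not the real code.

interface ProspectorTool {
    public function run(array $input): array;
}

class NicheReportTool implements ProspectorTool {
    public function run(array $input): array {
        // would call the existing NicheReport.io routines
        return ['niche' => $input['niche'] ?? '', 'report' => null];
    }
}

class DomainProspectorTool implements ProspectorTool {
    public function run(array $input): array {
        // would generate and availability-check candidate domains
        return ['domains' => []];
    }
}

// An API endpoint then just picks a tool and runs it:
function runTool(ProspectorTool $tool, array $input): array {
    return $tool->run($input);
}
```

Swapping a tool out later (say, a different niche-report source) then only means writing one new class.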
My own API, written in PHP and hosted on my server, gives me some protection against changes in external systems (e.g. workflow tools) and will be quick to build.
Directory Prospector: Reliance & Risk
The only real risk I see in making this agent is its reliance on external services.
It looks as though you’ll be able to recreate these workflow agents in basically any of the AI workflow tools, which makes the agents fairly portable.

There are several services we’ll have to rely on here, namely APIs like those on RapidAPI, and any services accessed by browser agents I write (e.g. Google Trends). So far I have backup plans for all of these; if my search engine data API goes down, for example, there are several others (though more expensive).
By building a chunk of this agent within my own API I take ownership of more of the process, pushing the reliance on external components to a minimum. I hope that’ll help me mitigate as much risk as I can.
It’ll also keep the value my side of the fence.
Follow my Work on Directory Prospector
If you’d like to see where I’ve got to with my Directory Prospector agent, I’ve set up an ‘experiment page’ here. Hopefully there’ll be ideas or automations you can use in your own business.

I’ll be updating that page as I complete or fail at each stage of the workflow. But if you’re not into the minutiae, no worries, I’ll give you summaries here when anything important happens 😅.
Next week I’ll share whether or not I managed to get this AI agent running.
Do you want the code?
Thanks for reading my directory prospector agent plan!
Quick question: do you want me to share the code I use for this challenge? I want to share as much as is useful, but I don’t want to flood you with stuff. If you do, please leave a comment below.
Never stop building!