How to Build AI Agents: AI Directory Maker Part 1

How to Make an AI Agent for an AI Directory Business

Learn how to build AI agents as I share my progress towards my 100% AI business

I thought automating a business end-to-end with AI agents might just be straightforward in 2025. I’m already hitting walls 😅. I’ve tried three ways to scrape Google Trends, and it’s clear Google does not approve. So now I have to get creative.

Me trying to retrieve Google Trends data with AI Agent

In this post I’ll share my plan for my first AI agent: AI Directory Prospector, as well as the early hiccups. This agent will need to imagine and validate ideas for new web directories, returning me the cream of the crop. Let’s learn how to build AI agents!

New here? Welcome! This is the journey of building a 100% automated AI business in 2025. You’re jumping in after we’ve already kicked things off, so you might want to catch up first.

Check out these key posts to get the full story—and don’t forget to subscribe for updates and exclusive perks:

AI Directory Maker: Directory Prospector Agent

How to Build AI Agents: Directory Prospector

For this part of our AI-automated directory business, we’ll need a few different tools which I think will be crucial for your business too.

Based on my recent reviews of n8n and Gumloop, and some other research rabbit holes, here’s what I think the flow for this first AI agent will look like:

AI Directory Maker: Part 1: Directory Prospector Agent

Breaking that down into processes & associated AI tools:

Step               | Primary Tool                    | Secondary Tools
-------------------|---------------------------------|--------------------------
Input              | Human (for now)                 |
Ideate             | AI (GPT) via Workflow tool      | API (Search engine data)
Explore Trends     | API (Google Trends data)        | AI Browser Agent
Niche Research     | API (Niche Report)              | NicheReport.io
Make Decision      | AI (GPT) via Workflow tool      |
Domain Prospecting | API (Woody’s Domain Prospector) |
Output             | JSON to feed next agent         |

So fundamentally to complete this agent, we’re going to need:

  • Workflow tool (Gumloop or n8n)
  • ProfitSwarm Architect API
    • Some middleman API calls (e.g. calling rapidapi for search engine data)
    • Some Browser agent runs (e.g. retrieving Google Trends data)
    • Some PHP routines (e.g. domain prospecting)

I completed part of this agent last week, so I can share a lot of how this’ll work – but there are still unknowns. How reliable will my browser agent automations be? Can I find an AI workflow tool which works within my long execution time windows?

Let’s dig in.

Before We Get Lost In the Workflows…

I’m aware some of these posts get a little technical. I’m still working out the best way of sharing this stuff, so if you just want the summary, read the TL;DR below and then skip ahead 😀. (Please let me know in the comments if you did, so I can tweak how I share this with you.)

Too long; didn’t read:

I’m using a workflow tool (Gumloop) as the ‘bones’ of this agent, and my own code as the muscles. I’ll use existing APIs to grab data, as well as some tools I made before. I do this to keep the responsibility for most of the ‘work’ the agent does within my control.

AI workflow is bones, my code is the muscles

I’m part way through building this agent! Skip the technical and click here to read more.

Directory Prospector Workflow

It might be daunting to learn about all the different AI tools that could make your business more profitable – imagine trying to build the whole thing out of them! Seriously though, if you’re here looking for ways to bring the benefits of AI to your company, I think you should start with a workflow tool.

AI tools everywhere, can you just imagine the future, entrepreneur!

The main ‘bones’ of this agent will be built in an AI workflow tool. Right now I’m torn between n8n and Gumloop, but it’s not so important which workflow tool we use; all we need here is:

  • Ability to design cascading sequences of processes
  • Ability to nest workflows (so later we can combine all of these processes with the other processes of the business to make a single ‘team’ of agents)
  • Ability to use AI ‘tool agents’ to process / analyse / decide
  • Ability to call my API

This being the first proper experiment of my challenge, I’m going with a hybrid approach: part paid tools (workflow tools/APIs), part self-written code. I reserve the right to decide this was madness later, or to write my own workflow tool if needs be.

In any case, for now let’s work on the basis that we use a workflow tool to orchestrate the various operations.

The first thing the Prospector agent will need to do is call APIs…

Directory Prospector API Calls

APIs are the backbone of the web. They let us ask remote services to return specific data or perform specific actions. My Prospector agent will need to GET and POST data, waiting for each response and then acting on what comes back.
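To make that concrete, here’s the pattern in a minimal Python sketch (my own code is PHP, but the shape is identical – and the URLs below are placeholders, not real endpoints):

```python
# Minimal GET/POST helpers using only the standard library.
# The URLs passed in are placeholders, not real services.
import json
import urllib.parse
import urllib.request


def build_get_url(base, params):
    """Append query parameters to a base URL for a GET request."""
    return base + "?" + urllib.parse.urlencode(params)


def api_get(base, params):
    """GET JSON from an endpoint and decode the response."""
    with urllib.request.urlopen(build_get_url(base, params)) as resp:
        return json.loads(resp.read())


def api_post(url, payload):
    """POST a JSON payload and decode the JSON response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Everything the agent does downstream is just branching on whatever these calls return.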

API calls are the backbone of an automated internet

Q: What APIs do you use? Let me know in the comments!


For Directory Prospector I’ve spun up a simple PHP API using old code and vibe coding.

Vibe Coding - Rick Rubin

I’ll host this API on my own server as it gives me the power to use workflow designers to do pretty much anything.

My API will need the following endpoints:

  • Retrieve search engine data
  • Retrieve Google Trends data
  • Build and return a Niche Report
  • Generate and return domain opportunities

The workflow tool will then use AI to iterate over what’s returned, ultimately doing its best to discern whether an idea (generated by me, or by the agent itself) is viable.
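The end product of that deliberation is the JSON handed to the next agent. I haven’t settled on a schema yet, but I’m picturing something like this (every field here is a guess at this stage):

```json
{
  "idea": "dog-friendly campsites directory",
  "viable": true,
  "confidence": 0.78,
  "trend": "rising",
  "niche_report": "https://example.com/report/123",
  "domain_candidates": ["example-domain-one.com", "example-domain-two.com"]
}
```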

Directory Prospector Browser Agent 

You’d think scraping Google Trends data would be easy.

As I explored the different needs of this agent, I could already vouch for a few being attainable. I’ve got scripts to pull search engine data. I’ve made NicheReport.io, which generates niche reports and is largely automated. And I’ve got a very good domain name prospecting tool I wrote last year.

… but Google Trends? I hadn’t ever pulled data from it programmatically. 

So I tried it.

1. Established ‘Black Market’ APIs

RapidAPI isn’t really a ‘black market’ – I joke. It is a grey area sometimes though: it’s a platform where individuals publish ‘middleman’ APIs which collect data, often from public sources like Google or social media platforms.

Sometimes these are super reliable and performant. 

Sometimes they don’t work at all.

There are many grey-hat Google Trends APIs

I tried 8 different Google Trends middlemen. None of them worked.

Google Trends API error 500 when AI Agent tries to call it

… onto the next option.

2. Online AI browser agents

AI browser agents are useful if you want to simulate a user doing something in a browser (in this case, we want to pretend we’re a user searching the Google Trends page for keywords, then export the data we find there).

Effectively, you give the agent a prompt; it uses a given LLM to carry out what you’ve asked in a browser (often headless), then returns you the data.

I tried this for Google Trends while I was reviewing Gumloop (you can read about the whole process here), but in essence: GOOGLE SAYS NO.

Google Blocked my Gumloop AI Agent

Google is not daft. Google knows people want to do things like this – to use its services in ways it doesn’t want you to. Gumloop and other automation services are likely all blocked by Google for precisely this reason.

Proxies? I tried a few variants of these online AI browser agents, and some looked super promising – but every one hit this Google block, proxied or not.

So…

3. Local AI browser agent

If it’s possible to grab Google Trends data programmatically, it should be doable via a local AI browser agent. 

I absolutely should be able to simulate myself visiting Trends in my own browser, and then have it grab the results.

I will use Browser Use as my local AI Browser Agent

I’m not certain this’ll work, because Google is smart. 

But this is what I’m going to run with.

I’ll set up my MacBook to accept incoming job requests (via my API), then go through the steps needed to retrieve the data. Later I can move this onto another computer if it proves valuable.
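Sketched in Python, that job loop looks something like this (the endpoint paths and the run_browser_agent callable are my assumptions – in practice that callable would drive the local browser agent):

```python
# A sketch of the MacBook-side job loop: poll my API for pending Trends
# jobs, run the local browser agent, and post the results back.
# Endpoint paths and run_browser_agent are illustrative assumptions.

def work_loop(api_get, api_post, run_browser_agent, max_cycles=3):
    """Poll for jobs, run each through the browser agent, report back."""
    done = []
    for _ in range(max_cycles):
        job = api_get("/jobs/next")      # ask the server for a waiting job
        if not job:
            continue                     # in practice: sleep 30-60s here
        result = run_browser_agent(job["keyword"])
        api_post("/jobs/%s/result" % job["id"], result)
        done.append(job["id"])
    return done
```

Injecting api_get, api_post and the agent as callables keeps the loop trivially testable without any network access.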

This will be useful for other aspects of this challenge, so 🤞🏼.

Directory Prospector PHP routines

Ah old familiar PHP. 

I show my age, but PHP is my groove. I’ve done so many different things in PHP. I still use it as my go-to code language. If any of you are happier in Python or JS, you’ll probably have an easier time doing this sort of thing.

Making an API which leverages various tooling took less than an hour.

Directory Prospector: Reusable Logic

For me the value of using PHP is that I have a lot of existing code that I can reuse.

Reusable AI Agent Code

For example I can pretty much plug and play different tools I’ve built before into this agent:

  • NicheReport.io to build a snapshot report of the whole niche
  • Domain Prospector to generate and check availability for thousands of domain names
  • My search engine keyword research tool

My own API, written in PHP and hosted on my server, gives me some protection against changes in external systems (e.g. workflow tools) – and it will be quick to build.

Directory Prospector: Reliance & Risk

The only real risk I see in making this agent is its reliance on external services.

It looks as though these workflow agents can be recreated in basically any of the AI workflow tools, which makes the agents themselves fairly portable.

How to build AI Agents: By building my own AI Agent API I keep more of the dependencies within my control

There are several services we’ll have to rely on here – namely APIs like those on RapidAPI, and any services accessed by the browser agents I write (e.g. Google Trends). So far I have backup plans for all of these: if my search engine data API goes down, for example, there are several others (though more expensive).
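That backup-plan idea is simple to sketch: try the primary provider first, and fall back down the list if it fails. The provider names and fetch functions here are invented stand-ins:

```python
# Try each data provider in order; return the first success.
# Provider names and fetch functions are illustrative stand-ins.

def fetch_with_fallback(providers, keyword):
    """providers is a list of (name, fetch) pairs, cheapest first."""
    errors = []
    for name, fetch in providers:
        try:
            return {"provider": name, "data": fetch(keyword)}
        except Exception as exc:  # one provider's outage shouldn't kill the run
            errors.append((name, str(exc)))
    raise RuntimeError("all providers failed: %r" % errors)
```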

By building a chunk of this agent within my own API, I take ownership of more of the process, pushing reliance on external components to a minimum. I hope that’ll help me mitigate as much risk as I can.

It’ll also keep the value my side of the fence.

Follow my Work on Directory Prospector

If you’d like to see where I’ve got to with my Directory Prospector agent, I’ve set up an ‘experiment page’ here. Hopefully there’ll be ideas or automations you can use in your own business.

How to Build AI Agents: Come see my AI Agent Experiments

I’ll be updating that page as I succeed or fail at each stage of the workflow. But if you’re not into the minutiae, no worries – I’ll give you summaries here when anything important happens 😅.

Next week I’ll share whether or not I managed to get this AI agent running.

Do you want the code?

Thanks for reading my directory prospector agent plan!

Quick question: do you want me to share the code I use for this challenge? I want to share as much as is useful, but I don’t want to flood you with stuff. Please do leave a comment below if you do.

Never stop building!

Comments

2 responses to “How to Build AI Agents: AI Directory Maker Part 1”

  1. Ray Clanan

    Yes, as a developer myself, I would love to see the code you’re using for all of this. I’m not much of a PHP developer, though I have done some in previous years. Overall, this experiment is sparking all sorts of ideas for me – including some things I’d already had ideas about – as well as expanding my knowledge around all of this. I really love how you’re breaking everything down and explaining it; it makes these concepts, new to some of us, very easy to understand.

    1. Woody Hayday

      Thanks for the reply Ray.

      Will share code once I get it in tidy order 😀

      I’m grateful that this experiment is sparking up ideas, that’s the best place to be – a world of opportunities! How do you keep track of yours? (I have to keep a spreadsheet of them for my sanity).

      Thanks for your kind words, I’m doing my best to balance high-level/practical and keep it all pragmatic so we can learn what’s what out of this challenge.

      Appreciate the feedback, will keep at it.
