A single n8n workflow that listens for a keyword, pulls the most relevant YouTube videos and Google Trends of the last 48 hours, writes an SEO-ready blog post and a matching X post with GPT, and publishes both automatically through Blogger and Buffer.
Three independent mini-flows live in the same canvas. Each one can be triggered on its own — so you can scout YouTube on demand, pull Google Trends on a schedule, and publish drafts once a human has marked them as ready.
The chat trigger accepts a keyword from you. n8n searches YouTube for the most relevant videos from the last 48 hours in Spain and writes each one as a new row in the youtube tab of your Google Sheet.
A separate branch hits Google Trends' public RSS feed for Spain, parses the XML, and appends the trending queries to the google trends tab of the same sheet — useful as a complementary signal to what's moving on YouTube.
After you tick Publish = yes on a row, the manual trigger reads it back and asks GPT-4.1-mini to draft (a) an SEO-optimised HTML blog post and (b) a short X post. The blog goes straight to Blogger; the X post is queued through Buffer.
Live blog output · lovepoemscreator.blogspot.com — every post was drafted and published by this workflow.
Template spreadsheet · Click to make a copy — opens the reference sheet with the two tabs (youtube, google trends) and column schema already in place.
Because they share the same Google Sheet as a lightweight database. The sheet lets a non-technical editor curate topics between the automation stages — decide which videos become posts, review the draft copy, rewrite if needed, and only then set the Publish column to yes.
This is what the three flows look like after you import the JSON. Each sticky-note band groups the nodes of one step, and you can execute any band independently while building.
Before you import the workflow, create the accounts below and have each one's API key or OAuth login on hand. None require a credit card to get started.
Sign up at n8n.cloud for the 14-day free trial, or run npx n8n locally. Every node you see in this guide runs inside n8n.
Enable the API in Google Cloud Console, create an OAuth 2.0 client, and authorise it against your Google account. n8n will handle the token refresh.
Use the reference spreadsheet template — click the link, Make a copy to your Drive, and the two tabs (youtube and google trends) with all the right columns are already there. Grab your copy's ID from the URL.
Sign up at platform.openai.com, add pay-as-you-go billing (a few cents per post), and copy your API key. The workflow uses gpt-4.1-mini.
Create a blog at blogger.com. Enable the Blogger API in Google Cloud Console, create an OAuth 2.0 client with the blogger scope, and grab the numeric blog ID from the Blogger dashboard URL. Live demo blog produced by this workflow: lovepoemscreator.blogspot.com — every post there was drafted and published end-to-end by the nodes below.
Sign up at buffer.com, connect your X (Twitter) account as a channel, then go to Account → Apps & Extras to create an API access token. Copy the channel ID from the channel's URL.
You'll paste them into the workflow once: the Google Sheets spreadsheet ID, the Blogger blog ID, the Buffer access token, the Buffer channel ID, and the numeric gid of your google trends tab (it's in the sheet's URL after #gid=).
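Both sheet identifiers live in the spreadsheet's URL. A minimal sketch of where each one sits (the URL below is a made-up example, not a real sheet):

```javascript
// Pull the spreadsheet ID and tab gid out of a Google Sheets URL.
// The ID sits between /spreadsheets/d/ and the next slash; the gid
// follows #gid= at the end.
function parseSheetUrl(url) {
  const idMatch = url.match(/\/spreadsheets\/d\/([a-zA-Z0-9_-]+)/);
  const gidMatch = url.match(/#gid=(\d+)/);
  return {
    spreadsheetId: idMatch ? idMatch[1] : null,
    gid: gidMatch ? gidMatch[1] : null,
  };
}

const example = parseSheetUrl(
  "https://docs.google.com/spreadsheets/d/1AbC_dEf-123/edit#gid=987654321"
);
// example.spreadsheetId → "1AbC_dEf-123", example.gid → "987654321"
```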
If you just want to see it running, follow these four steps. The step-by-step build from scratch is in §05.
In n8n: Workflows → Import from File → pick the JSON. All 17 nodes land on the canvas, pre-wired and annotated.
Open each node that shows a red warning and swap the REPLACE_WITH_* placeholders for your real IDs and tokens — spreadsheet ID (from your copy of the template), Blogger blog ID (see the live demo blog), Buffer token + channel, and the trends tab gid.
Attach your YouTube, Google Sheets, OpenAI and Blogger OAuth credentials in the nodes that ask for them. Hit Execute workflow on either branch to run it.
Fourteen functional nodes plus three sticky notes. Read the three flows top-to-bottom and you can spot exactly where each piece of data comes from and where it's going.
Opens a hosted chat UI where you type a keyword. Whatever the user types lands in {{ $json.chatInput }} for the rest of the flow to consume. Great for on-demand scouting — replace with a Schedule Trigger if you want a nightly crawl.
Searches YouTube for videos matching chatInput, restricted to uploads from the last 48 h in Spain (regionCode: ES), ordered by relevance. Returns one item per video with snippet + id fields.
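The "last 48 h" window is just an expression evaluated at run time. A sketch of the timestamp it produces and the query parameters the node effectively sends (the parameter names follow the YouTube Data API v3 `search.list` endpoint; the keyword is a placeholder for whatever arrived in chatInput):

```javascript
// The same expression the node evaluates: now minus 48 hours, as ISO 8601.
const publishedAfter = new Date(Date.now() - 2 * 24 * 60 * 60 * 1000).toISOString();

// Effective search.list parameters (q comes from {{ $json.chatInput }}):
const params = {
  part: "snippet",
  q: "example keyword",   // placeholder
  regionCode: "ES",
  order: "relevance",
  publishedAfter,
};
```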
Writes one row per video to the youtube tab of your spreadsheet, mapping id.videoId → ID, snippet.title → Title, snippet.description → Description, snippet.channelTitle → Channel, snippet.publishTime → Publish Time, snippet.thumbnails.high.url → Thumbnail URL. The Publish column is left empty so an editor can tick it later.
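To make the column mapping concrete, here is a sketch of one API item flowing into one sheet row (the sample values are invented; the field paths match the mapping above):

```javascript
// One item as returned by the YouTube search node (sample values).
const item = {
  id: { videoId: "abc123" },
  snippet: {
    title: "Video title",
    description: "Video description",
    channelTitle: "Some channel",
    publishTime: "2024-05-01T10:00:00Z",
    thumbnails: { high: { url: "https://i.ytimg.com/vi/abc123/hqdefault.jpg" } },
  },
};

// The row the Sheets node appends. Publish stays empty so the
// editor can flip it to "yes" later.
const row = {
  ID: item.id.videoId,
  Title: item.snippet.title,
  Description: item.snippet.description,
  Channel: item.snippet.channelTitle,
  "Publish Time": item.snippet.publishTime,
  "Thumbnail URL": item.snippet.thumbnails.high.url,
  Publish: "",
};
```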
Holds three tuneable parameters: min_traffic (500), max_results (3), and jina_key (empty for now). Acts as a single source of truth so you can change the thresholds without touching the RSS or Sheets nodes.
Fetches Spain's public trending-searches RSS feed. executeOnce: true means it fires once per workflow run, not once per upstream item. retryOnFail is enabled because the endpoint is sometimes flaky.
Turns the RSS payload into a JSON tree. explicitArray: false and normalize: true keep the output shallow and trim whitespace — easier to reference downstream via $json.rss.channel.item.
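A sketch of what that flattened output looks like and why the downstream path works. The item fields shown (beyond title) are assumptions about the Trends feed, not something the workflow relies on:

```javascript
// Assumed shape after the XML node with explicitArray: false —
// single-child elements become plain values instead of one-element
// arrays, so the items sit directly at rss.channel.item.
const parsed = {
  rss: {
    channel: {
      title: "Daily Search Trends",
      item: [
        { title: "trend one", "ht:approx_traffic": "500+" },   // assumed field
        { title: "trend two", "ht:approx_traffic": "1000+" },  // assumed field
      ],
    },
  },
};

// The exact path the downstream Sheets node references:
const trends = parsed.rss.channel.item.map((i) => i.title);
// trends → ["trend one", "trend two"]
```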
Writes the trending queries to the google trends tab. The Title column receives the whole rss.channel.item array so you can see every trend per run on a single row (or split it with a downstream Item Lists node if you prefer one-row-per-trend).
Manual kick-off for the publishing branch. In production, swap this for a Schedule Trigger firing every morning, or a Google Sheets Trigger that fires the moment a row's Publish column flips to yes.
Reads every row from the youtube tab where the Publish column equals yes. That's your curation gate — any row an editor hasn't marked is skipped.
The orchestrator. Its prompt template reads "Based on this title {{ Title }} and description {{ Description }}, generate a blog post optimised for SEO and an X post. Output minimised HTML without break lines." The node connects to two sub-nodes: the OpenAI model (C4) and a structured output parser (C5) that enforces a strict JSON shape.
Sub-node wired into the AI Agent's ai_languageModel input. Uses the cheap-and-fast gpt-4.1-mini; swap for gpt-4.1 if you want richer copy at higher cost.
Forces the model to return a strict JSON object with two branches: blog: { network, blog_title, blog_content } and x: { network, x_content }. Everything downstream references the same predictable shape.
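A sketch of that enforced shape, with a minimal check of the fields the downstream nodes actually read (the sample values are placeholders):

```javascript
// The JSON shape the output parser enforces (sample values).
const sample = {
  blog: { network: "blogger", blog_title: "Post title", blog_content: "<p>Body</p>" },
  x: { network: "x", x_content: "Short X post" },
};

// Downstream, Blogger reads blog.blog_title / blog.blog_content and
// Buffer reads x.x_content — so those are the fields worth checking.
function hasRequiredShape(o) {
  return Boolean(
    o && o.blog &&
    typeof o.blog.blog_title === "string" &&
    typeof o.blog.blog_content === "string" &&
    o.x && typeof o.x.x_content === "string"
  );
}
```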
Sends a signed OAuth2 request to Blogger with a JSON body containing title and content from the agent's blog output. Uses n8n's generic OAuth2 credential — so the same credential works for any Google API, not just Blogger.
Calls Buffer's GraphQL createPost mutation with the X post content, the target channelId, and mode: shareNow. Buffer then fans it out to the connected X account. Authorisation is a Bearer token in the header.
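For orientation, a sketch of the kind of request body that node builds. The exact GraphQL input type and field nesting here are assumptions — the workflow only specifies a createPost mutation carrying channelId, the trimmed text, and mode: shareNow:

```javascript
// Raw agent output often has stray whitespace, hence the trim.
const xContent = "  Draft X post from the agent  ";

// Assumed request-body shape; input type name is hypothetical.
const body = {
  query:
    "mutation createPost($input: CreatePostInput!) { createPost(input: $input) { id } }",
  variables: {
    input: {
      channelId: "<YOUR_BUFFER_CHANNEL_ID>", // from the channel's URL
      text: (xContent || "").trim(),
      mode: "shareNow",
    },
  },
};
```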
Skip this section if you imported the JSON. If you're building it to learn the n8n UI, follow the steps below. Everywhere it says Add node, press Tab inside n8n and type the name.
Add node: YouTube. Resource: Video. Operation: Get Many. Return All: true. Filters — Q: {{ $json.chatInput }}, Region Code: ES, Published After: {{ new Date(Date.now() - 2 * 24 * 60 * 60 * 1000).toISOString() }}. Options → Order: relevance. Create a YouTube OAuth2 credential and link it. Connect Chat Trigger → YouTube.
Add node: Google Sheets. Operation: Append. Document: pick your spreadsheet. Sheet: youtube. Mapping: ID={{ $json.id.videoId }}, Title={{ $json.snippet.title }}, Description={{ $json.snippet.description }}, Channel={{ $json.snippet.channelTitle }}, Publish Time={{ $json.snippet.publishTime }}, Thumbnail URL={{ $json.snippet.thumbnails.high.url }}. Leave Publish blank. Link a Google Sheets OAuth2 credential.
Add node: Set, renamed CONFIG. Add three fields: min_traffic (Number, 500), max_results (Number, 3), jina_key (String, empty).
Add node: HTTP Request. URL: https://trends.google.com/trending/rss?geo=ES. Node settings → Execute Once: on, Retry On Fail: on. Connect CONFIG → HTTP Request.
Add node: XML. Mode: XML to JSON. Options → Explicit Array: false, Normalize: true. Connect HTTP Request → XML.
Add node: Google Sheets. Operation: Append. Same document, sheet google trends. Mapping: Title = {{ $json.rss.channel.item }}. Connect XML → this new Google Sheets node.
Add node: Google Sheets. Operation: Get Row(s), same document, sheet youtube. Filters → Lookup Column: Publish, Lookup Value: yes. Wire Manual Trigger → Google Sheets.
Add node: AI Agent. Prompt: define. Text: Based on this title "{{ $json.Title }}" and description "{{ $json.Description }}" generate a Blog Post optimized for SEO and an X post. The output will be minimized HTML without break lines. Has Output Parser: on. Then add two sub-nodes: an OpenAI Chat Model (model gpt-4.1-mini) wired into the agent's ai_languageModel slot, and a Structured Output Parser with a JSON schema example describing { blog: { network, blog_title, blog_content }, x: { network, x_content } }, wired into ai_outputParser.
Add node: HTTP Request (Blogger). URL: https://www.googleapis.com/blogger/v3/blogs/<YOUR_BLOG_ID>/posts. Authentication: Generic Credential Type → OAuth2 API; create a new OAuth2 credential with the Blogger scope. Header: Content-Type: application/json. Body: JSON, expression: {"kind":"blogger#post","title":{{ JSON.stringify($('AI Agent').item.json.output.blog.blog_title) }},"content":{{ JSON.stringify($('AI Agent').item.json.output.blog.blog_content) }}}. Wire AI Agent → this node.
Add node: HTTP Request (Buffer). URL: https://api.buffer.com/graphql. Headers: Authorization: Bearer <YOUR_BUFFER_TOKEN>, Content-Type: application/json. Body: JSON, an expression that wraps a GraphQL createPost mutation with channelId, text: ($('AI Agent').first().json.output.x.x_content || '').trim(), mode: shareNow, schedulingType: automatic. Wire AI Agent → this node as well (the agent has a second outgoing connection).
Run each branch on its own first: Flow A via the chat trigger, Flow B by clicking Execute on the CONFIG node, Flow C by flipping Publish = yes on a test row and clicking Execute. Only activate the workflow once all three branches pass.
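The JSON.stringify wrapping in the Blogger body is what keeps the request valid when the model's output contains quotes or newlines. The same escaping, sketched outside n8n with a made-up title:

```javascript
// Sample agent output containing characters that would break naive
// string interpolation into a JSON body.
const output = {
  blog: { blog_title: 'A "quoted" title', blog_content: "<p>line1</p>" },
};

// Build the body the way the Blogger node's expression does:
// JSON.stringify adds the surrounding quotes AND escapes the content.
const body =
  '{"kind":"blogger#post","title":' + JSON.stringify(output.blog.blog_title) +
  ',"content":' + JSON.stringify(output.blog.blog_content) + "}";

const parsedBody = JSON.parse(body); // round-trips cleanly despite the quotes
```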
The workflow's shape generalises well. Below are the most useful tweaks — each is a single-node change.
Swap ES in the YouTube node's regionCode and in the Google Trends URL (?geo=ES) for any ISO country code — Mexico: MX, Argentina: AR.
The YouTube filter currently looks back 48 hours. Change 2 * 24 * 60 * 60 * 1000 to 7 * 24 * 60 * 60 * 1000 for a weekly window.
Replace gpt-4.1-mini with gpt-4.1 for richer copy — roughly 4–8× the cost per post, but noticeably better long-form structure.
Replace the Chat Trigger with a Schedule Trigger and hard-code the keyword in a Set node upstream of YouTube. Same for Flow C — fire it every morning instead of manually.
Buffer's createPost mutation works the same for every connected channel. Clone the Buffer HTTP Request node, swap the channelId, and tweak the agent's output schema to include an ig_content or linkedin_content branch.
Insert a Wait + Send email after the AI Agent, CC'ing the editor with the draft + an "approve / reject" link. The publishing nodes fire only after a resume webhook is hit.