Welcome to this step-by-step guide on creating engaging "Would You Rather" videos completely automatically. We will build a powerful workflow that takes a simple topic, uses AI to generate questions, and then programmatically creates a polished video, ready for social media.

This tutorial is designed for users with minimal technical experience. We will be using three powerful tools:

  • Airtable — to store the topic, settings, and result for each video.
  • N8N — to orchestrate the workflow and connect the services.
  • JSON2Video — to render the final video from a template.

Prerequisites

Before you begin, make sure you have accounts for the following services:

  • Airtable
  • OpenAI (with API access)
  • JSON2Video
  • N8N (cloud or self-hosted)

1. Setting up Your Airtable Base

Airtable will be our command center. Instead of building a table from scratch, you can simply clone our pre-configured template.

1.1. Clone the Airtable Base

  1. Open the following link in a new tab: "Entertainment" Airtable Base Template.
  2. In the top-left corner of the page, click the "Copy base" button.
  3. Airtable will ask you to select a workspace to add the new base to. Choose your desired workspace.
  4. The base, named "Entertainment", is now copied to your account and ready to use.

This base is shared with other automation tutorials, like Quizzes. Everything related to the "Would You Rather" videos is in the table named "Would you rather".

Airtable base for would you rather videos

1.2. Get Your Airtable Personal Access Token

N8N needs permission to access your new Airtable base. We'll do this using a Personal Access Token.

  1. Go to your Airtable developer hub.
  2. Click "Create new token".
  3. Give your token a name, like "N8N Integration".
  4. For Scopes, you must add data.records:read, data.records:write, and schema.bases:read.
  5. For Access, click "Add a base" and select the "Entertainment" base you just cloned.
  6. Click "Create token". Copy this token and save it somewhere safe. You will need it in Part 3.
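Under the hood, N8N sends this token as a Bearer header to the Airtable REST API. As a quick sanity check, here is a minimal Python sketch of the request it builds; the token and base ID shown are placeholders, not real values:

```python
from urllib.parse import quote

AIRTABLE_API = "https://api.airtable.com/v0"

def build_list_records_request(token: str, base_id: str, table_name: str) -> dict:
    """Return the URL and headers for an Airtable 'list records' call.

    The token is a Personal Access Token (PAT) sent as a Bearer header;
    spaces in table names must be URL-encoded.
    """
    return {
        "url": f"{AIRTABLE_API}/{base_id}/{quote(table_name)}",
        "headers": {"Authorization": f"Bearer {token}"},
    }

# Placeholder token and base ID for illustration only:
req = build_list_records_request("patXXXX.secret", "appXXXX", "Would you rather")
print(req["url"])  # https://api.airtable.com/v0/appXXXX/Would%20you%20rather
```

If a quick test with your real token and base ID returns records, your scopes and base access are configured correctly.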

2. Getting Your API Keys

Next, we need to gather the API keys for OpenAI and JSON2Video.

2.1. Get Your OpenAI API Key

  1. Log in to your OpenAI account and go to the API Keys page.
  2. Click the "+ Create new secret key" button.
  3. Give the key a name, for example, "N8N Key".
  4. Click "Create secret key".
  5. Important: OpenAI will only show you this key once. Copy it immediately and save it somewhere secure.

2.2. Get Your JSON2Video API Key

  1. Log in to your JSON2Video Dashboard.
  2. Go to the "API Keys" section.
  3. Your primary API key is available there. For better security, it's a good practice to create a secondary key for N8N with "Render" permissions, but for this tutorial, the primary key will work fine.
  4. Copy the API key and save it securely.

3. Building the N8N Workflow

Now, let's set up the automation in N8N. We will use the provided workflow file to get started quickly.

N8N workflow for would you rather videos

3.1. Import the Workflow

  1. Download the workflow file by right-clicking and saving this link: would-you-rather-01-workflow.json.
  2. In your N8N canvas, go to File > Import from file... and select the JSON file you just downloaded.
  3. The complete workflow will appear on your canvas.

3.2. Update node settings

The imported workflow needs a few tweaks to work with your accounts.

Airtable nodes

We need to update the credentials for the two Airtable nodes: the search node and the update node.

Airtable search node: open the node and, under Credentials, select (or create) an Airtable Personal Access Token credential using the token you saved in Part 1.2. Confirm that the Base is set to "Entertainment" and the Table to "Would you rather".

Airtable update node: select the same Airtable credential here and confirm the same Base and Table settings.

OpenAI node

Now we need to update the OpenAI node. Open it and, under Credentials, select (or create) an OpenAI credential using the API key you saved in Part 2.1.

HTTP Request Nodes

We need to update the HTTP Request nodes to use your JSON2Video API key.

Submit a new job node: open the node and replace the placeholder API key in the request headers with the JSON2Video key you saved in Part 2.2.

Check status node: do the same here, using the same JSON2Video API key.
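For reference, here is a hedged Python sketch of the two requests these nodes make. The endpoint and the x-api-key header name reflect the JSON2Video API as documented at the time of writing; verify them against the current API reference before relying on this:

```python
# Sketch of the two JSON2Video calls the HTTP Request nodes make.
# Endpoint and header names are assumptions based on the JSON2Video docs;
# check the current API reference if something fails.
JSON2VIDEO_API = "https://api.json2video.com/v2/movies"

def submit_request(api_key: str, body: dict) -> dict:
    """Describe the 'Submit a new job' POST request."""
    return {
        "method": "POST",
        "url": JSON2VIDEO_API,
        "headers": {"x-api-key": api_key, "Content-Type": "application/json"},
        "json": body,
    }

def status_request(api_key: str, project_id: str) -> dict:
    """Describe the 'Check status' GET request for a submitted project."""
    return {
        "method": "GET",
        "url": f"{JSON2VIDEO_API}?project={project_id}",
        "headers": {"x-api-key": api_key},
    }

req = submit_request("YOUR_KEY", {"template": "GSUiFX8nSbXwhWDHFWGp"})
print(req["method"], req["url"])
```

The key detail to carry back into N8N is that both nodes authenticate with the same x-api-key header.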

4. Run Your First Automated Video Creation!

Everything is now set up. Let's create a video.

  1. Go to your "Entertainment" base in Airtable.
  2. Create a new row in the "Would you rather" table.
    • In the Topic field, enter "Superpowers".
    • In the Language field, enter "English".
    • In the Voice Name field, enter en-US-BrianMultilingualNeural.
    • In the Voice Model field, enter azure.
    • In the Font field, enter Oswald Bold.
    • Set the Status to Todo.
  3. Go back to your N8N workflow.
  4. Click the "Test workflow" button in the bottom-left corner.

The workflow will now execute. You will see green checkmarks appear on each node as it successfully completes. The workflow will pause for 15 seconds at a time while it waits for the video to render. A typical video from this template takes about 3-4 minutes to complete.
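The wait-and-check pattern the workflow uses can be sketched in a few lines of Python. This illustrates the loop's logic only, not the exact N8N implementation; check_status here is a stand-in for the "Check status" HTTP call:

```python
import time

def wait_for_render(check_status, interval_seconds=15, max_attempts=40):
    """Poll check_status() until it reports 'done', pausing between attempts.

    check_status is any callable returning a dict like {"status": "running"}
    or {"status": "done", "url": "..."} -- a stand-in here for the
    JSON2Video 'Check status' request.
    """
    for _ in range(max_attempts):
        result = check_status()
        if result.get("status") == "done":
            return result
        time.sleep(interval_seconds)
    raise TimeoutError("Render did not finish in time")

# Simulated render that finishes on the third check:
responses = iter([
    {"status": "running"},
    {"status": "running"},
    {"status": "done", "url": "https://example.com/video.mp4"},
])
result = wait_for_render(lambda: next(responses), interval_seconds=0)
print(result["status"])  # done
```

A 3–4 minute render at 15-second intervals means roughly 12–16 status checks before the workflow proceeds to the Airtable update node.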

Once finished, the "Airtable update record" node will execute, and you can check your Airtable base. The status will be updated to "Done", and the "Result" field will contain a URL to your freshly created video!

Localizing Your Videos into Other Languages

One of the most powerful features of this workflow is its ability to generate videos in multiple languages automatically. The AI can write scripts in Spanish, Korean, Japanese, and many other languages, and JSON2Video can render them with the correct characters. Here's how to set it up.

Step 1: Set the Target Language in Airtable

The entire localization process starts with a single field in your Airtable base. In the row for the video you want to create, simply enter the desired language into the Language column.

The N8N workflow sends this value to the OpenAI node, which will then generate all the questions, options, and voiceover text in that target language.

Step 2: Choose a Compatible Font

Standard fonts like "Oswald Bold" do not contain the characters needed for languages like Korean or Japanese. If you use a font that doesn't support the language, the text in your video will appear as empty squares (□□□) or will simply be missing.

To fix this, you must specify a font that supports your target language in the Font column in Airtable. You can use any font from Google Fonts.

Here are some recommended fonts for different languages:

  • Korean: Noto Sans KR
  • Japanese: Noto Sans JP
  • Chinese (Simplified): Noto Sans SC
  • Thai: Noto Sans Thai
  • Arabic: Noto Sans Arabic

Simply type the correct font name into the Font column in your Airtable row.

Step 3: Select a Matching Voice

Finally, the voiceover must match the language of the script. In the Voice Name column, you need to provide a voice code that corresponds to your target language.

You can find a complete list of available voices in the JSON2Video Azure Voice Catalog. For example:

  • English (US): en-US-BrianMultilingualNeural
  • Korean: ko-KR-SunHiNeural
  • Spanish (Spain): es-ES-ElviraNeural
  • Japanese: ja-JP-NanamiNeural
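Putting the font and voice choices together, it can help to keep a small language → (font, voice) lookup. The Korean and English pairs below come from this tutorial; the other entries are illustrative and should be verified against Google Fonts and the JSON2Video Azure Voice Catalog:

```python
# Illustrative language -> (font, Azure voice) lookup. The Korean and
# English entries come from this tutorial; verify the rest against
# Google Fonts and the Azure voice catalog before using them.
LOCALE_SETTINGS = {
    "English":  {"font": "Oswald Bold",  "voice": "en-US-BrianMultilingualNeural"},
    "Korean":   {"font": "Noto Sans KR", "voice": "ko-KR-SunHiNeural"},
    "Japanese": {"font": "Noto Sans JP", "voice": "ja-JP-NanamiNeural"},
    "Spanish":  {"font": "Noto Sans",    "voice": "es-ES-ElviraNeural"},
}

def settings_for(language: str) -> dict:
    """Return the font/voice pair for a language, falling back to English."""
    return LOCALE_SETTINGS.get(language, LOCALE_SETTINGS["English"])

print(settings_for("Korean")["font"])  # Noto Sans KR
```

You could paste values from a table like this straight into the Font and Voice Name columns in Airtable.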

Example: Creating a Video in Korean

To create a "Would you rather" video about "K-Pop" in Korean, your Airtable row would look like this:

Input variables:
  • Topic: K-Pop
  • Language: Korean
  • Voice Name: ko-KR-SunHiNeural
  • Voice Model: azure
  • Font: Noto Sans KR
  • Status: Todo

After filling out this row, running the N8N workflow will produce a complete video in Korean with the correct text characters and a native-sounding voiceover.

Using Alternative AI Models

This tutorial is pre-configured to use AI models that do not consume extra JSON2Video credits: Azure for voice generation and Flux Schnell for image generation.

However, you have the flexibility to use other, more advanced models like ElevenLabs for voice or Flux Pro for images.

Important: Using these alternative models will consume additional credits from your JSON2Video account. Please review the Credit Consumption page for detailed pricing before proceeding.

Switching to ElevenLabs for Voiceovers

If you prefer the high-quality voices from ElevenLabs, you can easily switch the voice model directly in Airtable.

  1. In your Airtable row, go to the Voice Model column and change the value from azure to elevenlabs.
  2. Next, you must update the Voice Name column to a valid ElevenLabs voice. These are typically simple names like Rachel, Daniel, or Adam. You can find available pre-made voices in the ElevenLabs Voice Library.

That's it! When you run the workflow, it will now use your selected ElevenLabs voice to generate the audio, and the corresponding credits will be deducted from your account.

Switching to Flux Pro for Image Generation

If you prefer the high-quality images from Flux Pro, you can easily switch the image model directly in Airtable as well.

  1. In your Airtable row, go to the Image Model column and change the value from flux-schnell to flux-pro.

That's it! When you run the workflow, it will now use your selected Flux Pro model to generate the images, and the corresponding credits will be deducted from your account.

Customizing Your 'Would You Rather' Videos

The provided N8N workflow and JSON2Video template are designed to work perfectly out of the box, but their real power lies in customization. You can easily change the look, feel, and content of your videos. This section covers three levels of customization, from simple variable changes to advanced template editing.

Method 1: Easy Customization with Template Variables

This is the simplest way to change your video's appearance. The "Would you rather" template (ID: GSUiFX8nSbXwhWDHFWGp) has been built with several variables that act as easy-to-use switches for colors and fonts. You don't need to edit the template itself; you just need to add these variables to your N8N workflow.

To do this, open the "Submit a new job" (HTTP Request) node in N8N. In the "Body" tab, you can add new key-value pairs to the variables object.

Changing Colors

You can control the three main colors used in the rotating background and the central "OR" ball.

Changing Fonts

You can specify any Google Font for the different text elements.

Example N8N Body Configuration

Here is how you would modify the JSON body in the "Submit a new job" node to create a video with a new color scheme and font.


{
    "template": "GSUiFX8nSbXwhWDHFWGp",
    "variables": {
        "voice_name": "{{ $('Airtable').item.json['Voice Name'] }}",
        "voice_model": "{{ $('Airtable').item.json['Voice Model'] }}",
        "like_and_subscribe_voiceover_text": "{{ $json.message.content.like_and_subscribe_voiceover_text }}",
        "questions": {{ JSON.stringify($json.message.content.questions) }},
        "background_color1": "#0D3B66",
        "background_color2": "#FAF0CA",
        "background_color3": "#F95738",
        "options_font_family": "Bangers",
        "options_text_color": "#FFFFFF",
        "result_font_family": "Bangers",
        "result_text_color": "#F95738"
    }
}
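Because template variables are easy to mistype, a quick pre-flight check can save a wasted render. Here is a small, hypothetical validation sketch for the color variables (a malformed hex value is a plausible cause of the template falling back to defaults; verify the template's actual behavior):

```python
import re

# A valid 6-digit hex color like "#0D3B66".
HEX_COLOR = re.compile(r"^#[0-9A-Fa-f]{6}$")

def validate_color_variables(variables: dict) -> list:
    """Return the names of any *color* variables that are not valid
    6-digit hex colors. A purely illustrative pre-flight check."""
    return [
        name for name, value in variables.items()
        if "color" in name and not HEX_COLOR.match(str(value))
    ]

variables = {
    "background_color1": "#0D3B66",
    "background_color2": "#FAF0CA",
    "options_text_color": "FFFFFF",  # missing the leading '#'
}
print(validate_color_variables(variables))  # ['options_text_color']
```

Running a check like this before the "Submit a new job" node catches typos without spending render time.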

Method 2: Refining the AI-Generated Content

If you want to change the *substance* of the videos—the tone, style, or number of questions—you should edit the prompt sent to the AI. This is done directly within the "OpenAI" node in your N8N workflow.

Click the node and find the "Messages" parameter. The "System" message contains the core instructions for the AI. You can modify this prompt to:

  • Change the tone or style of the questions (for example, funnier, darker, or more family-friendly).
  • Adjust the number of questions generated per video.
  • Target a specific audience, niche, or difficulty level.

Important: When editing the prompt, be careful not to change the instructions related to the JSON output format. The workflow relies on receiving the data in that exact structure.
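To guard against prompt edits breaking the output format, you can sanity-check the AI response before it reaches JSON2Video. The field names in this sketch mirror the variables used in the "Submit a new job" body shown earlier; adjust them if your prompt produces a different schema:

```python
def check_ai_output(payload: dict) -> list:
    """Return a list of problems with the AI response.

    Field names mirror the 'Submit a new job' body in this tutorial
    ('questions' and 'like_and_subscribe_voiceover_text'); treat them
    as assumptions and adapt to your own prompt's schema.
    """
    problems = []
    if not isinstance(payload.get("questions"), list) or not payload["questions"]:
        problems.append("missing or empty 'questions' list")
    if not payload.get("like_and_subscribe_voiceover_text"):
        problems.append("missing 'like_and_subscribe_voiceover_text'")
    return problems

good = {
    "questions": [{"option1": "Fly", "option2": "Be invisible"}],
    "like_and_subscribe_voiceover_text": "Like and subscribe!",
}
print(check_ai_output(good))                 # []
print(check_ai_output({"questions": []}))    # two problems reported
```

A check like this could live in a small Code node between the OpenAI node and the submit request.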

Method 3: Advanced Customization by Editing the Template

For complete control over animations, timing, sound effects, and layout, you will need to edit the JSON2Video template itself. This is an advanced method and requires care.

  1. Duplicate the Public Template: First, you need your own editable copy.
    • Go to the JSON2Video Visual Editor.
    • In the top menu, select Template > Open template by ID.
    • Enter the template ID: GSUiFX8nSbXwhWDHFWGp and click "Open".
    • Save your own copy by going to Template > Save template as. Give it a new name.
  2. Edit the Template JSON:
    • With your new template open, go to the top menu and select Template > Edit as JSON.
    • Here you can change core properties of the video. For example:
      • Timing: Find the variables inside the scene like timer_length or result_duration. Changing "timer_length": 4 to "timer_length": 6 will give the viewer two extra seconds to decide.
      • Sound Effects: Find the audio elements with IDs like tic_tac or tada. You can change their src URL to a different sound file or remove the element entirely to silence it.
      • Animations: The animations are controlled by keyframes and animate blocks. For instance, in the image1 element, you can change the easing property from "ease-out-elastic" to "ease-in-out-back" for a different visual effect.
    • Click "Apply" to save your JSON changes. The visual preview will update.
  3. Update Your N8N Workflow:
    • Go to the Template menu in the editor and select Show template ID to get the ID of your newly modified template.
    • In your N8N workflow, open the "Submit a new job" node and replace the old template ID with your new one.

Conclusion and Next Steps

You have now successfully built a powerful, fully automated video production pipeline. By combining the data management of Airtable, the workflow logic of N8N, and the rendering power of JSON2Video, you can generate an endless stream of content with minimal effort.

This is just the beginning. The skills you've learned in this guide can be adapted to create many other types of automated videos. Ready for your next project? Explore our other tutorials to learn how to create social media reels, dynamic slideshows, and more.

Happy automating!

Published on July 3rd, 2025

Author
Joaquim Cardona
Senior Internet business executive with more than 20 years of broad experience in Internet business, media sector, digital marketing, online video and mobile technologies.