Welcome to this step-by-step guide on creating engaging "Would You Rather" videos completely automatically. We will build a powerful workflow that takes a simple topic, uses AI to generate questions, and then programmatically creates a polished video, ready for social media.
This tutorial is designed for users with minimal technical experience. We will be using three powerful tools:
- Airtable: To manage our video ideas and track their status.
- N8N: A workflow automation tool that will act as the brain of our operation, connecting all the services.
- JSON2Video: The video rendering engine that will take our script and turn it into a final MP4 video.
Prerequisites
Before you begin, make sure you have accounts for the following services:
- N8N (You can use the cloud version or self-host it).
- Airtable (A free account is sufficient).
- OpenAI (You will need an API key).
- JSON2Video (The free plan is perfect for starting out).
1. Setting up Your Airtable Base
Airtable will be our command center. Instead of building a table from scratch, you can simply clone our pre-configured template.
1.1. Clone the Airtable Base
- Open the following link in a new tab: "Entertainment" Airtable Base Template.
- In the top-left corner of the page, click the "Copy base" button.
- Airtable will ask you to select a workspace to add the new base to. Choose your desired workspace.
- The base, named "Entertainment", is now copied to your account and ready to use.
This base is shared with other automation tutorials, like Quizzes. Everything related to the "Would You Rather" videos is in the table named "Would you rather".

1.2. Get Your Airtable Personal Access Token
N8N needs permission to access your new Airtable base. We'll do this using a Personal Access Token.
- Go to your Airtable developer hub.
- Click "Create new token".
- Give your token a name, like "N8N Integration".
- For Scopes, you must add `data.records:read`, `data.records:write`, and `schema.bases:read`.
- For Access, click "Add a base" and select the "Entertainment" base you just cloned.
- Click "Create token". Copy this token and save it somewhere safe. You will need it in Part 3.
2. Getting Your API Keys
Next, we need to gather the API keys for OpenAI and JSON2Video.
2.1. Get Your OpenAI API Key
- Log in to your OpenAI account and go to the API Keys page.
- Click the "+ Create new secret key" button.
- Give the key a name, for example, "N8N Key".
- Click "Create secret key".
- Important: OpenAI will only show you this key once. Copy it immediately and save it somewhere secure.
2.2. Get Your JSON2Video API Key
- Log in to your JSON2Video Dashboard.
- Go to the "API Keys" section.
- Your primary API key is available there. For better security, it's a good practice to create a secondary key for N8N with "Render" permissions, but for this tutorial, the primary key will work fine.
- Copy the API key and save it securely.
3. Building the N8N Workflow
Now, let's set up the automation in N8N. We will use the provided workflow file to get started quickly.

3.1. Import the Workflow
- Download the workflow file by right-clicking and saving this link: would-you-rather-01-workflow.json.
- In your N8N canvas, go to File > Import from file... and select the JSON file you just downloaded.
- The complete workflow will appear on your canvas.
3.2. Update node settings
The imported workflow needs a few tweaks to work with your accounts.
Airtable nodes
We need to update the credentials for the two Airtable nodes: the search node and the update node.
Airtable search node:
- Double-click on the Airtable search record node (the one on the left):
- In the "Credentials" dropdown, select "Create New".
- Give it a name (e.g., "My Airtable Token").
- Paste the Airtable Personal Access Token you saved in step 1.2.
- Click "Save".
Airtable update node:
- Double-click on the Airtable update record node (the one on the right):
- In the "Credentials" dropdown, select the one you created above.
- Under "Base", select your "Entertainment" base.
- Under "Table", select the "Would you rather" table.
OpenAI node
Now we need to update the OpenAI node.
- Double-click on the "OpenAI" node.
- In the "Credentials" dropdown, select "Create New".
- Give it a name (e.g., "My OpenAI Key").
- Paste in the OpenAI API key you saved in step 2.1.
- Click "Save".
HTTP Request Nodes
We need to update the HTTP Request nodes to use your JSON2Video API key.
Submit a new job node:
- Double-click on the node.
- In the "Headers Parameters" section, update the value of the `x-api-key` header with the JSON2Video API key you saved in step 2.2.
Check status node:
- Double-click on the node.
- In the "Headers Parameters" section, update the value of the `x-api-key` header with the same JSON2Video API key.
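Together, these two nodes implement a simple submit-then-poll pattern against the JSON2Video API. If you ever want to reproduce it outside N8N (for debugging, say), a minimal sketch looks like the following. The endpoint and response fields shown follow the JSON2Video v2 API, but double-check them against the current API documentation:

```python
import time
import requests

API_KEY = "your-json2video-api-key"  # from step 2.2
HEADERS = {"x-api-key": API_KEY}

# Submit a render job from the template (the role of the "Submit a new job" node)
payload = {"template": "GSUiFX8nSbXwhWDHFWGp", "variables": {}}  # fill in variables as in the node's Body
job = requests.post("https://api.json2video.com/v2/movies", headers=HEADERS, json=payload).json()
project_id = job["project"]

# Poll every 15 seconds until the render finishes (the role of the "Check status" node)
while True:
    movie = requests.get(
        "https://api.json2video.com/v2/movies",
        headers=HEADERS,
        params={"project": project_id},
    ).json()["movie"]
    if movie["status"] in ("done", "error"):
        break
    time.sleep(15)

print(movie.get("url"))  # URL of the final MP4 once status is "done"
```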
4. Run Your First Automated Video Creation!
Everything is now set up. Let's create a video.
- Go to your "Entertainment" base in Airtable.
- Create a new row in the "Would you rather" table.
- In the Topic field, enter "Superpowers".
- In the Language field, enter "English".
- In the Voice Name field, enter `en-US-BrianMultilingualNeural`.
- In the Voice Model field, enter `azure`.
- In the Font field, enter `Oswald Bold`.
- Set the Status to `Todo`.
- Go back to your N8N workflow.
- Click the "Test workflow" button in the bottom-left corner.
The workflow will now execute. You will see green checkmarks appear on each node as it successfully completes. The workflow will pause for 15 seconds at a time while it waits for the video to render. A typical video from this template takes about 3-4 minutes to complete.
Once finished, the "Airtable update record" node will execute, and you can check your Airtable base. The status will be updated to "Done", and the "Result" field will contain a URL to your freshly created video!
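If you batch up several rows, you may eventually want to pull the finished videos down programmatically rather than clicking each Result link. A small sketch, assuming the same placeholder base ID as before and that the Result field holds the video URL:

```python
import requests

BASE_ID = "appXXXXXXXXXXXXXX"  # hypothetical -- replace with your own base ID
TOKEN = "patXXXXXXXXXXXXXX"

# Fetch rows whose Status is "Done" and download each finished video
resp = requests.get(
    f"https://api.airtable.com/v0/{BASE_ID}/Would you rather",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"filterByFormula": "{Status}='Done'"},
)
for record in resp.json()["records"]:
    fields = record["fields"]
    url = fields.get("Result")
    if url:
        with open(f"{fields.get('Topic', record['id'])}.mp4", "wb") as f:
            f.write(requests.get(url).content)
```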
Localizing Your Videos into Other Languages
One of the most powerful features of this workflow is its ability to generate videos in multiple languages automatically. The AI can write scripts in Spanish, Korean, Japanese, and many other languages, and JSON2Video can render them with the correct characters. Here's how to set it up.
Step 1: Set the Target Language in Airtable
The entire localization process starts with a single field in your Airtable base. In the row for the video you want to create, simply enter the desired language into the Language column.
- For a Spanish video, you would type: `Spanish`
- For a Korean video, you would type: `Korean`
- For a Japanese video, you would type: `Japanese`
The N8N workflow sends this value to the OpenAI node, which will then generate all the questions, options, and voiceover text in that target language.
Step 2: Choose a Compatible Font
Standard fonts like "Oswald Bold" do not contain the characters needed for languages like Korean or Japanese. If you use a font that doesn't support the language, the text in your video will appear as empty squares (□□□) or simply be missing.
To fix this, you must specify a font that supports your target language in the Font column in Airtable. You can use any font from Google Fonts.
Here are some recommended fonts for different languages:
- Korean: `Noto Sans KR`
- Japanese: `Noto Sans JP`
- Chinese (Simplified): `Noto Sans SC`
- Chinese (Traditional): `Noto Sans TC`
- Thai: `Noto Sans Thai`
- Arabic: `Noto Sans Arabic`
- Spanish, French, German, etc.: Most standard Google Fonts will work, such as `Roboto` or `Open Sans`.
Simply type the correct font name into the Font column in your Airtable row.
Step 3: Select a Matching Voice
Finally, the voiceover must match the language of the script. In the Voice Name column, you need to provide a voice code that corresponds to your target language.
You can find a complete list of available voices in the JSON2Video Azure Voice Catalog. For example:
- For Spanish (Mexico), you might choose: `es-MX-DaliaNeural`
- For Japanese, you might choose: `ja-JP-NanamiNeural`
Example: Creating a Video in Korean
To create a "Would you rather" video about "K-Pop" in Korean, your Airtable row would combine the settings from the three steps above. For example (`ko-KR-SunHiNeural` is one Korean Azure voice; any Korean voice from the catalog will do):

| Field | Value |
|---|---|
| Topic | K-Pop |
| Language | Korean |
| Font | Noto Sans KR |
| Voice Name | ko-KR-SunHiNeural |
| Voice Model | azure |
| Status | Todo |
After filling out this row, running the N8N workflow will produce a complete video in Korean with the correct text characters and a native-sounding voiceover.
Using Alternative AI Models
This tutorial is pre-configured to use AI models that do not consume extra JSON2Video credits: Azure for voice generation and Flux Schnell for image generation.
However, you have the flexibility to use other, more advanced models like ElevenLabs for voice or Flux Pro for images.
Important: Using these alternative models will consume additional credits from your JSON2Video account. Please review the Credit Consumption page for detailed pricing before proceeding.
Switching to ElevenLabs for Voiceovers
If you prefer the high-quality voices from ElevenLabs, you can easily switch the voice model directly in Airtable.
- In your Airtable row, go to the Voice Model column and change the value from `azure` to `elevenlabs`.
- Next, you must update the Voice Name column to a valid ElevenLabs voice. These are typically simple names like `Rachel`, `Daniel`, or `Adam`. You can find available pre-made voices in the ElevenLabs Voice Library.
That's it! When you run the workflow, it will now use your selected ElevenLabs voice to generate the audio, and the corresponding credits will be deducted from your account.
Switching to Flux Pro for Image Generation
If you prefer the high-quality images from Flux Pro, you can easily switch the image model directly in Airtable as well.
- In your Airtable row, go to the Image Model column and change the value from `flux-schnell` to `flux-pro`.
That's it! When you run the workflow, it will now use your selected Flux Pro model to generate the images, and the corresponding credits will be deducted from your account.
Customizing Your 'Would You Rather' Videos
The provided N8N workflow and JSON2Video template are designed to work perfectly out of the box, but their real power lies in customization. You can easily change the look, feel, and content of your videos. This section covers three levels of customization, from simple variable changes to advanced template editing.
Method 1: Easy Customization with Template Variables
This is the simplest way to change your video's appearance. The "Would you rather" template (ID: `GSUiFX8nSbXwhWDHFWGp`) has been built with several variables that act as easy-to-use switches for colors and fonts. You don't need to edit the template itself; you just need to add these variables to your N8N workflow.
To do this, open the "Submit a new job" (HTTP Request) node in N8N. In the "Body" tab, you can add new key-value pairs to the `variables` object.
Changing Colors
You can control the three main colors used in the rotating background and the central "OR" ball, as well as the text colors:
- `background_color1`: The first primary color.
- `background_color2`: The second primary color.
- `background_color3`: The color of the circle behind the "OR" text.
- `options_text_color`: The color of the option text (e.g., "Fast Food" / "Slow Food").
- `or_text_color`: The color of the "OR" text.
- `result_text_color`: The color of the percentage result (e.g., "15%").
Changing Fonts
You can specify any Google Font for the different text elements.
- `options_font_family`: The font for the main choices.
- `or_font_family`: The font for the "OR" text.
- `result_font_family`: The font for the percentage result.
Example N8N Body Configuration
Here is how you would modify the JSON body in the "Submit a new job" node to create a video with a new color scheme and font.
```json
{
  "template": "GSUiFX8nSbXwhWDHFWGp",
  "variables": {
    "voice_name": "{{ $('Airtable').item.json['Voice Name'] }}",
    "voice_model": "{{ $('Airtable').item.json['Voice Model'] }}",
    "like_and_subscribe_voiceover_text": "{{ $json.message.content.like_and_subscribe_voiceover_text }}",
    "questions": {{ JSON.stringify($json.message.content.questions) }},
    "background_color1": "#0D3B66",
    "background_color2": "#FAF0CA",
    "background_color3": "#F95738",
    "options_font_family": "Bangers",
    "options_text_color": "#FFFFFF",
    "result_font_family": "Bangers",
    "result_text_color": "#F95738"
  }
}
```
Method 2: Refining the AI-Generated Content
If you want to change the *substance* of the videos—the tone, style, or number of questions—you should edit the prompt sent to the AI. This is done directly within the "OpenAI" node in your N8N workflow.
Click the node and find the "Messages" parameter. The "System" message contains the core instructions for the AI. You can modify this prompt to:
- Change the number of questions: Find the line that says "Create 5 questions" and change the number to your preference.
- Make it funnier or more serious: Modify the instructions about tone. For example, change "You can be funny and introduce randomly a joke" to "The tone should be serious and thought-provoking."
- Target a niche audience: Add instructions to focus on a specific topic, like "All questions should be related to 90s video games."
- Change the "OR" text: You can change the word used between options by modifying the
or_text
variable.
Important: When editing the prompt, be careful not to change the instructions related to the JSON output format. The workflow relies on receiving the data in that exact structure.
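The exact schema is defined by the prompt, but from the expressions in the "Submit a new job" node you can see the workflow reads `message.content.questions` and `message.content.like_and_subscribe_voiceover_text` from the AI's reply. If you edit the prompt, a quick sanity check like this sketch can save you a broken run (the check only covers the two top-level keys the node references; the fields inside each question depend on your prompt):

```python
import json

# Replace with an actual reply from the OpenAI node after a test run
raw_reply = '{"questions": [], "like_and_subscribe_voiceover_text": ""}'

data = json.loads(raw_reply)  # fails loudly if the model broke the JSON format

# The "Submit a new job" node reads exactly these two top-level keys
assert isinstance(data.get("questions"), list)
assert "like_and_subscribe_voiceover_text" in data

print(f"OK: {len(data['questions'])} questions")
```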
Method 3: Advanced Customization by Editing the Template
For complete control over animations, timing, sound effects, and layout, you will need to edit the JSON2Video template itself. This is an advanced method and requires care.
- Duplicate the Public Template: First, you need your own editable copy.
  - Go to the JSON2Video Visual Editor.
  - In the top menu, select Template > Open template by ID.
  - Enter the template ID `GSUiFX8nSbXwhWDHFWGp` and click "Open".
  - Save your own copy by going to Template > Save template as. Give it a new name.
- Edit the Template JSON:
  - With your new template open, go to the top menu and select Template > Edit as JSON.
  - Here you can change core properties of the video. For example:
    - Timing: Find the variables inside the scene like `timer_length` or `result_duration`. Changing `"timer_length": 4` to `"timer_length": 6` will give the viewer two extra seconds to decide.
    - Sound Effects: Find the `audio` elements with IDs like `tic_tac` or `tada`. You can change their `src` URL to a different sound file or remove the element entirely to silence it.
    - Animations: The animations are controlled by `keyframes` and `animate` blocks. For instance, in the `image1` element, you can change the `easing` property from `"ease-out-elastic"` to `"ease-in-out-back"` for a different visual effect.
  - Click "Apply" to save your JSON changes. The visual preview will update.
- Update Your N8N Workflow:
  - Go to the Template menu in the editor and select Show template ID to get the ID of your newly modified template.
  - In your N8N workflow, open the "Submit a new job" node and replace the old template ID with your new one.
Conclusion and Next Steps
You have now successfully built a powerful, fully automated video production pipeline. By combining the data management of Airtable, the workflow logic of N8N, and the rendering power of JSON2Video, you can generate an endless stream of content with minimal effort.
This is just the beginning. The skills you've learned in this guide can be adapted to create many other types of automated videos. Ready for your next project? Explore our other tutorials to learn how to create social media reels, dynamic slideshows, and more.
Happy automating!
Published on July 3rd, 2025
