Introduction
In the digital age, short, engaging videos dominate social media feeds. Among the most captivating are quiz videos, which challenge viewers' knowledge and encourage interaction. These videos are incredibly popular because they offer a quick, fun way to learn new facts, test oneself, and even compete with friends. From "General Knowledge" to "Pop Culture Trivia," quiz videos keep audiences hooked, boosting engagement rates and making content highly shareable across platforms like TikTok, Instagram Reels, and YouTube Shorts.
Examples
[Examples table: sample input variables alongside the resulting quiz videos.]
Overview of the automation
This tutorial outlines an automated workflow that generates quiz videos using n8n and JSON2Video. The process begins in Airtable, where you define quiz topics, difficulty, and language. n8n then triggers an OpenAI node to generate the quiz questions, answers, and accompanying voiceover scripts. This content is then sent to JSON2Video, which uses a pre-designed template to render a high-quality video. Finally, n8n updates your Airtable base with the URL of the newly created quiz video, making the entire process seamless and efficient.

Prerequisites
To follow this tutorial, you will need accounts and API keys for the following services:
- Airtable: We chose Airtable for data management because its robust API makes integration with no-code tools like n8n and Make.com significantly easier compared to Google Sheets. Airtable also offers a generous free tier, making it accessible without requiring a paid subscription.
- n8n: The open-source workflow automation platform.
- OpenAI: For generating quiz content and voiceover scripts using AI.
- JSON2Video: Our powerful API for programmatic video creation.
Build the automation
This section will guide you through setting up the necessary components and building the n8n workflow to automate your quiz video creation.
Setting up the Airtable base
Clone the Airtable base
To begin, you'll need a structured Airtable base to store your quiz details. You can easily clone our pre-built template:
- Open the Airtable template.
- Click on the "Copy base" button located beside the base name at the top of the page.
- A new window will open. Select the destination workspace in your Airtable account where you'd like to copy the base.

The Airtable base includes the following fields:
Field | Description |
---|---|
ID | Auto-generated unique ID for each quiz. |
Topic | The subject of the quiz (e.g., "Everyday science"). |
Difficulty | The difficulty level of the quiz (e.g., "Easy", "Average", "Expert"). |
Language | The language for the quiz content and voiceover (e.g., "English", "Spanish"). |
Voice Name | The specific AI voice to use for the quiz voiceover (e.g., "en-US-AmandaMultilingualNeural"). |
Voice Model | The AI model for voice generation (e.g., "azure", "elevenlabs"). |
Font | The font family to use for text in the video (e.g., "Oswald"). |
Status | Current status of the video creation process ("Todo", "In progress", "Done"). |
Result | The URL of the generated video once completed. |
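
For reference, a filled-in row read through the Airtable API (which is what the workflow's Airtable node does under the hood) looks roughly like the sketch below. The record ID and values are illustrative; only the field names come from the table above, and empty fields such as "Result" are simply omitted until they are set.
```json
{
  "id": "recXXXXXXXXXXXXXX",
  "fields": {
    "ID": 1,
    "Topic": "Everyday science",
    "Difficulty": "Easy",
    "Language": "English",
    "Voice Name": "en-US-AmandaMultilingualNeural",
    "Voice Model": "azure",
    "Font": "Oswald",
    "Status": "Todo"
  }
}
```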
Get your Airtable personal access token
To allow n8n to connect with your Airtable base, you'll need a Personal Access Token (PAT). Follow these steps to obtain it:
- Go to your Airtable developer hub.
- Click "Create new token."
- Give your token a name (e.g., "n8n JSON2Video demos").
- Under "Scopes," add the following permissions:
data.records:read
data.records:write
schema.bases:read
- Under "Access," select "Add a base" and choose the "Entertainment" base or the name you gave to the base when you cloned it for this tutorial.
- Click "Create token" and copy the generated token. Keep it safe, as you won't be able to see it again.
Getting your API keys
Get your OpenAI API key
To enable n8n to communicate with OpenAI for content generation, you'll need an API key:
- Log in to your OpenAI platform account.
- Navigate to the API keys section, typically found under "API keys" in the sidebar or your profile menu.
- Click on "Create new secret key."
- Give your key a name (e.g., "n8n Quiz Videos") for identification.
- Copy the generated key immediately. You will not be able to view it again after closing the window. Store it securely.
Get your JSON2Video API key
To allow n8n to send video rendering requests to JSON2Video, you'll need an API key:
- Go to the JSON2Video API keys dashboard page. If you don't have an account, you can get a free one.
- For this tutorial, using your "Primary API key" is sufficient. However, for best security practices in a production environment, we recommend creating a "Secondary API key" with specific permissions (e.g., "Render" permission).
- Copy your desired API key. Keep it safe, as it grants access to your JSON2Video account.
Create the workflow
Import the workflow
To simplify the setup, you can import a pre-built n8n workflow:
- Download the workflow definition file: workflow.json.
- Open your n8n instance and navigate to the Workflows section.
- Click on "New" (or the plus icon) to create a new workflow.
- In the top right corner, click on the three dots icon (More options) and select "Import from File...".
- Choose the downloaded workflow.json file.
- The workflow will now appear in your n8n editor.
Update the node settings
After importing the workflow, you need to configure the credentials and settings for each service:
Update the Airtable nodes
The workflow contains two Airtable nodes: "Airtable - Read" and "Airtable - Update". Both need your Airtable Personal Access Token:
- Double-click the "Airtable - Read" node (or "Airtable - Update" node).
- Under the "Credential to connect with" field, click "+ Create new credential".
- Choose "Access Token" as the authentication method.
- In the "Access Token" field, paste the Personal Access Token you obtained earlier in the "Get your Airtable personal access token" section.
- Give the credential a descriptive name (e.g., "My Airtable PAT").
- Click "Create" or "Save" to securely store your credential.
Repeat these steps for the other Airtable node in the workflow.
Update the OpenAI nodes
The workflow uses an OpenAI node to generate quiz content. You need to connect it with your OpenAI API key:
- Double-click the "OpenAI" node.
- Under the "Credential to connect with" field, click "+ Create new credential".
- Choose "API Key" as the authentication method.
- In the "API Key" field, paste your OpenAI API key you obtained earlier.
- Give the credential a descriptive name (e.g., "My OpenAI API Key").
- Click "Create" or "Save".
Update the JSON2Video nodes
The workflow includes two JSON2Video HTTP Request nodes: "Submit a new job" and "Check status". Both require your JSON2Video API key:
- Double-click the "Submit a new job" node.
- Under "Headers", locate the "x-api-key" parameter.
- Replace the -- YOUR API KEY HERE -- placeholder with your JSON2Video API key.
- Repeat these steps for the "Check status" node.

The JSON payload passed to the JSON2Video API in the "Submit a new job" node uses a pre-designed JSON2Video template with ID cSTYFRZhXeBZotbwcjuM. It passes dynamic content as variables, including the voice name, voice model, topic, intro, outro, questions, and font from your Airtable data and OpenAI's output. The background video and color scheme are set statically in this template for a consistent quiz aesthetic.
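To make that concrete, the request body has roughly the following shape. The values shown are illustrative placeholders; at runtime n8n fills them in from the Airtable row and the OpenAI output, and the actual node body may include additional variables (such as the color and duration settings described later in this tutorial).
```json
{
  "template": "cSTYFRZhXeBZotbwcjuM",
  "variables": {
    "voiceName": "en-US-JennyMultilingualNeural",
    "voiceModel": "azure",
    "topic": "Everyday science",
    "intro_voiceover": "Welcome to today's quiz!",
    "like_and_subscribe_voiceover": "Like and subscribe for more quizzes!",
    "fontFamily": "Oswald",
    "questions": [
      {
        "question": "What planet is known as the Red Planet?",
        "answer1": "Earth",
        "answer2": "Mars",
        "answer3": "Jupiter",
        "answer4": "Venus",
        "correct_answer": 2
      }
    ]
  }
}
```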
Run your first automated video creation
Once all credentials are configured, you're ready to run your first automated quiz video:
- In your Airtable base, go to the "Quizzes" table.
- Add a new row and enter values for "Topic" (e.g., "Sports Trivia"), "Difficulty" (e.g., "Average"), "Language" (e.g., "English"), "Voice Name" (e.g., "en-US-JennyMultilingualNeural"), "Voice Model" (e.g., "azure"), and "Font" (e.g., "Oswald"). Set the "Status" to "Todo".
- In your n8n workflow, click on the "Test workflow" button in the bottom-center of the screen.
- The workflow will execute, reading the Airtable row, generating content with OpenAI, submitting the video job to JSON2Video, and then waiting for it to complete.
- Monitor the execution progress in n8n.
- Once the workflow finishes, check your Airtable base. The "Status" for your quiz entry should now be "Done", and the "Result" column should be populated with the URL to your new video. Click the URL to watch your automated quiz video!
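While the workflow waits, the "Check status" node polls the JSON2Video API until rendering completes. A simplified, assumed sketch of a successful response is shown below; the exact field layout and URL format may differ, so consult the JSON2Video API documentation for the precise schema. The workflow reads the final video URL from this response and writes it into the "Result" column.
```json
{
  "success": true,
  "movie": {
    "status": "done",
    "url": "https://assets.json2video.com/.../quiz-video.mp4"
  }
}
```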
Localizing your videos into other languages
One of the significant advantages of this automated setup is the ease with which you can localize your quiz videos into different languages. By simply adjusting a few fields in your Airtable base, you can generate content, voiceovers, and even select appropriate fonts for a global audience.
The steps are straightforward:
- Set the target language in Airtable: Update the "Language" column in your Airtable row to the desired language (e.g., "Japanese", "Spanish", "Arabic").
- Choose a compatible font: Ensure the "Font" column specifies a font that supports the characters of your chosen language. The JSON2Video template uses Google Fonts, so you can select one that supports your target language (e.g., "Noto Sans KR" for Korean, "Noto Sans JP" for Japanese).
- Select a matching voice: Update the "Voice Name" column with an appropriate voice for your chosen language and "Voice Model" (e.g., a Japanese voice for Japanese content). You can find supported voices in the JSON2Video documentation: Azure voices by language or ElevenLabs voices by language.
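As a quick illustration (before the full Arabic walkthrough below), here is how those language-related fields might be set for a Japanese quiz. The voice and font shown are just one valid combination, not the only option.
```json
{
  "Language": "Japanese",
  "Voice Name": "ja-JP-NanamiNeural",
  "Voice Model": "azure",
  "Font": "Noto Sans JP"
}
```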
Example: creating a video in Arabic
Let's create a quiz video in Arabic to demonstrate the localization process:
- In your Airtable "Quizzes" table, create a new row.
- Set "Topic" to something suitable for Arabic (e.g., "تاريخ الشرق الأوسط" - Middle East History).
- Set "Difficulty" to "Average".
- Set "Language" to "Arabic".
- For "Voice Name", choose an Arabic voice supported by Azure, like
ar-SA-ZariyahNeural
. - For "Voice Model", keep it as
azure
. - For "Font", select
Noto Sans Arabic
, which is a Google Font with excellent Arabic script support. - Set "Status" to "Todo".
- Run your n8n workflow.
The OpenAI node will generate quiz questions and voiceover scripts in Arabic. JSON2Video will then render the video using the specified Arabic font and voice, producing a fully localized quiz video.
[Table: the Arabic example's input variables and the resulting video.]
Using alternative AI models
By default, the workflow is configured to use Azure for voice generation, which is free to use with your JSON2Video plan. However, you have the flexibility to switch to other AI voice models like ElevenLabs, which offers different voice qualities and characteristics, though it consumes extra credits. For more details, refer to the Credit consumption documentation.
Using ElevenLabs
To use ElevenLabs for your voiceovers, you just need to modify the corresponding fields in your Airtable base:
- In your Airtable "Quizzes" table, find the row for the quiz you want to generate.
- Change the "Voice Model" column to
elevenlabs
. - Update the "Voice Name" column with a supported ElevenLabs voice (e.g.,
Adam
,Rachel
,Bella
). You can find a full list of supported ElevenLabs voices on the JSON2Video AI voices page. - Ensure the "Status" is "Todo" and then run your n8n workflow.
The workflow will now use ElevenLabs to generate the voiceovers for your video.
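In terms of the template variables sent to JSON2Video, that change simply maps through to the two voice-related variables, roughly as sketched below (the voice name is purely an example pulled from the list mentioned above).
```json
{
  "voiceModel": "elevenlabs",
  "voiceName": "Rachel"
}
```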
Customizing your videos
The provided JSON2Video template is designed for flexibility, allowing you to customize various aspects of your quiz videos without diving deep into the JSON structure. This is achieved through the use of template variables and the underlying AI models.
Using template variables
The JSON2Video movie template (cSTYFRZhXeBZotbwcjuM) defines multiple variables that allow for easy customization. You can modify these values in the n8n "Submit a new job" node to change your video's appearance and content:
- voiceName: The specific AI voice used for voiceovers (e.g., "en-US-JennyMultilingualNeural").
- voiceModel: The AI model for voice generation ("azure" or "elevenlabs").
- topic: The main subject of the quiz, displayed in the intro.
- intro_voiceover: The spoken introduction for the quiz video.
- like_and_subscribe_voiceover: The spoken call to action at the end of the video.
- questions: An array containing the quiz questions, answers, and correct answer index.
- fontFamily: The font family used for text in the video (e.g., "Oswald", "Luckiest Guy").
- primary_color: A primary color used for elements like answer backgrounds (e.g., "#8d6ad9").
- secondary_color: A secondary color used for elements like incorrect answer backgrounds (e.g., "#d0bef7").
- title_color: The color of the main "Trivia Time!" title (e.g., "#FF0000").
- answers_bgcolor: Background color for answer options.
- answers_fgcolor: Foreground color (text color) for answer options.
- correct_bgcolor: Background color for the correct answer highlight.
- correct_fgcolor: Foreground color for the correct answer text.
- incorrect_bgcolor: Background color for the incorrect answer highlight.
- incorrect_fgcolor: Foreground color for incorrect answer text.
- background_video: URL to the looping background video for the quiz.
- intro_duration: Duration of the introductory scene in seconds.
- question_duration: Duration of each question scene in seconds.
- outro_duration: Duration of the outro scene in seconds.
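For instance, to restyle the video without touching the template itself, you could override a handful of these variables in the "Submit a new job" JSON body, alongside the content variables the workflow already sets. All values below are illustrative, including the placeholder background video URL.
```json
{
  "variables": {
    "fontFamily": "Luckiest Guy",
    "primary_color": "#8d6ad9",
    "secondary_color": "#d0bef7",
    "title_color": "#FF0000",
    "correct_bgcolor": "#2e7d32",
    "incorrect_bgcolor": "#c62828",
    "background_video": "https://example.com/backgrounds/loop.mp4",
    "intro_duration": 5,
    "question_duration": 8,
    "outro_duration": 5
  }
}
```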
Refining the AI-generated content
The quiz questions, answers, and voiceover scripts are dynamically generated by OpenAI based on a "system prompt" and your Airtable inputs. You can modify the system prompt within the OpenAI node in n8n to customize the resulting content:
You are an entertainment expert.
Create a quiz video script on the given topic and difficulty.
* Include a list of short, challenging multiple-choice questions.
* DO NOT create more than 5 questions.
* Each question must have 4 short answer options (preferably one word each).
* Only one answer is correct.
* Keep the questions direct and concise (no more than 7 words per question).
* The quiz should be engaging for a broad audience.
* Topic, questions, answers and voiceovers must be in given language.
* If necessary, translate and improve the provided "intro_voiceover" and "like_and_subscribe_voiceover".
Return the output in this exact JSON format:
```json
{
  "topic": "Everyday science",
  "intro_voiceover": "",
  "like_and_subscribe_voiceover": "",
  "questions": [
    {
      "question": "What planet is known as the Red Planet?",
      "answer1": "Earth",
      "answer2": "Mars",
      "answer3": "Jupiter",
      "answer4": "Venus",
      "correct_answer": 2
    }
  ]
}
```
Only return the JSON. Do not add explanations or introductions.
By adjusting this prompt, you can influence the tone, style, and specific requirements for the AI-generated quiz content.
Editing the movie template
For advanced customization of the video's structure, timing, and animations, you can duplicate and directly edit the JSON2Video movie template. This requires a deeper understanding of the JSON2Video API and its documentation.
Follow these steps:
- Open the provided movie template in the JSON2Video Visual Editor.
- From the top bar "Template" menu, click "Save template as...". This will create a copy of the template in your JSON2Video account.
- You can now edit this new template to make deep changes to its structure, add or remove elements, modify animations, and adjust timings.
- Once you're satisfied with your changes, go to "Template > Show Template ID" to get the new unique ID of your duplicated and modified template.
- In your n8n workflow, double-click the "Submit a new job" node.
- In the "JSON Body" field, locate the
"template": "cSTYFRZhXeBZotbwcjuM"
line. - Replace
cSTYFRZhXeBZotbwcjuM
with the ID of your newly created template.
Now, every time your n8n workflow runs, it will use your customized video template.
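After the swap, the relevant line of the "Submit a new job" JSON body simply references your own template ID (shown here as a placeholder, not a real ID); the rest of the body, including the variables, stays unchanged.
```json
{
  "template": "YOUR_NEW_TEMPLATE_ID"
}
```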
Conclusion and next steps
Congratulations! You've successfully built an automated system to generate engaging quiz videos using Airtable, OpenAI, n8n, and JSON2Video. You've learned how to:
- Set up your data in Airtable.
- Connect your API keys for seamless integration.
- Automate content generation with OpenAI.
- Render dynamic videos using JSON2Video templates.
- Localize your videos for different languages.
- Customize video elements and templates.
This powerful automation not only saves significant time but also allows for rapid, scalable content creation. You can now effortlessly produce a high volume of personalized and engaging quiz videos for your audience. Explore other tutorials to discover more ways to leverage JSON2Video for diverse video content needs.
Published on July 7th, 2025
