Overview
Gravity Forms OpenAI is a free plugin that integrates Gravity Forms with OpenAI – the leading provider of cutting-edge AI language models.
This plugin allows you to send prompts constructed from your form data to OpenAI and capture its responses alongside the submission. You can also utilize the power of AI to edit submitted data for grammar, spelling, word substitutions, and even full rewrites for readability or tone. Lastly, GF OpenAI allows you to moderate submissions and flag, block, or spam undesirable content.
Getting Started
Install plugin
Get an OpenAI API key
Assuming you’re already using Gravity Forms, there’s only one thing you’ll need to get started with GF OpenAI: an OpenAI API key!
We’ll let ChatGPT (powered by OpenAI) explain how to get your own. 😉
Here’s a shortcut if you already have an OpenAI account.
Once you have your API key, copy it and paste it into the GF OpenAI plugin settings by navigating to Forms › Settings › OpenAI.
Workshop Crash Course
Before we dig into the instructions for using this plugin, you may find our crash course workshop to be a useful starting point. Take a look as we explore some popular use cases, fine-tune models, explore how a marketing agency can use OpenAI with Gravity Forms, and answer questions from the audience!
Don't miss a thing. Get our weekly updates by owl or email. Ok, we're still working on the owl service, so just email for now.
Using the Plugin
Our plugin works with Gravity Forms in a few different ways to make it easy to use OpenAI’s powerful AI capabilities with your forms.
Chat Completions
Chat Completions is OpenAI’s newest endpoint and, by far, the cheapest and fastest. It’s also the most robust option, able to handle and respond accurately to the largest variety of prompt types. In fact, it’s the same model that powers ChatGPT.
Some great use cases for Chat Completions include: debugging code, generating documentation, and answering user questions!
Chat Completion Options
OpenAI Model – Chat Completions uses the gpt-3.5-turbo model. gpt-4 is also supported, but gpt-3.5-turbo is the fastest and most cost-effective option from OpenAI.
Prompt – Any combination of form data (represented as merge tags) and static text that will be sent to the selected OpenAI model and to which the model will respond.
Merge Tag – Enable merge tags to output the result of an OpenAI feed in form confirmations, notifications, or even live in your form fields (via Populate Anything’s Live Merge Tags).
Map Results to Field – All OpenAI responses will be captured as an entry note. Use this setting to optionally map the response to a form field.
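Conceptually, a Chat Completions feed replaces the merge tags in your prompt with submitted values, then wraps the finished prompt in the messages array the endpoint expects. Here’s a simplified sketch of that flow — not the plugin’s actual implementation, and the field label and merge tag are hypothetical:

```python
def build_chat_completion_body(prompt_template, field_values, model="gpt-3.5-turbo"):
    """Replace {Label:ID}-style merge tags with submitted values, then
    wrap the finished prompt in a Chat Completions request body."""
    prompt = prompt_template
    for tag, value in field_values.items():
        prompt = prompt.replace(tag, value)
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Hypothetical form: field 1 is a Paragraph field labeled "Question".
body = build_chat_completion_body(
    "Answer the user's question: {Question:1}",
    {"{Question:1}": "How do I export my entries?"},
)
```

The resulting body is what gets sent to OpenAI; the response comes back as an entry note and, optionally, a mapped field value.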
Completions
Completions provide an easy way to interact with OpenAI’s versatile language models. Send any combination of form data as a prompt and whatever model you’ve selected will generate a text completion with its best attempt to match the intent of your prompt.
For example, if you give the API the prompt, “Write a slogan for Gravity Forms”, it will return a completion like “Effortlessly capture and manage your data with Gravity Forms.”
Completion Options
OpenAI Model – Select the model that is the best fit for your needs. OpenAI offers a variety of models, each with its own strengths and capabilities. This plugin supports gpt-3.5-turbo and gpt-4, among others. The key differentiators between each model will be speed, quality, and cost.
Prompt – Any combination of form data (represented as merge tags) and static text that will be sent to the selected OpenAI model and to which the model will respond.
Merge Tag – Enable merge tags to output the result of an OpenAI feed in form confirmations, notifications, or even live in your form fields (via Populate Anything’s Live Merge Tags).
Map Results to Field – All OpenAI responses will be captured as an entry note. Use this setting to optionally map the response to a form field.
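The key difference from Chat Completions is the shape of the request: a Completions request carries a flat prompt string rather than a messages array. A minimal sketch (the max_tokens default here is an illustrative assumption, not a plugin setting):

```python
def build_completion_body(prompt, model="gpt-3.5-turbo", max_tokens=256):
    """A Completions request sends a flat prompt string rather than
    the messages array used by Chat Completions."""
    return {"model": model, "prompt": prompt, "max_tokens": max_tokens}

# The slogan example from above, as a request body.
body = build_completion_body("Write a slogan for Gravity Forms")
```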
Edits
Edits makes it easy to generate and capture suggestions for text edits to submitted form data. Provide an input and an instruction for how that input should be edited, and OpenAI will return an edited version of the input.
This can be useful for a variety of purposes, such as improving the grammar and clarity of user-generated content, or providing automated copyediting services.
OpenAI Model – Select the model that is the best fit for your needs. OpenAI offers two models here – one focused on editing text and the other on code.
Input – Any combination of form data (represented as merge tags) and static text that will be sent to the selected OpenAI model and to which the model will apply the Instruction.
Instruction – Explain to the model how you would like the input edited. Example: “Fix spelling mistakes and format as a sentence.”
Merge Tag – Enable merge tags to output the result of an OpenAI feed in form confirmations, notifications, or even live in your form fields (via Populate Anything’s Live Merge Tags).
Map Results to Field – All OpenAI responses will be captured as an entry note. Use this setting to optionally map the response to a form field.
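Unlike Completions, an Edits request pairs the input with a separate instruction. A sketch of the request body (the model name reflects OpenAI’s legacy text-editing model; check which models are available to your account):

```python
def build_edit_body(input_text, instruction, model="text-davinci-edit-001"):
    """An Edits request sends the text to edit plus a separate
    instruction describing the desired change."""
    return {"model": model, "input": input_text, "instruction": instruction}

body = build_edit_body(
    "teh quick brwon fox",
    "Fix spelling mistakes and format as a sentence.",
)
```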
Moderations
Moderations allow you to check whether submitted data complies with OpenAI’s content policy. This will flag inappropriate or harmful content, and our plugin will let you decide what to do with that submission. You can block the submission (returning a validation error), mark the entry as spam, or simply record OpenAI’s response and do nothing.
OpenAI Model – Select the model that is the best fit for your needs. If you want the latest AI technology, use text-moderation-latest. Otherwise, stick with text-moderation-stable.
Input – Any combination of form data (represented as merge tags) and static text that will be sent to the selected OpenAI model for evaluation.
Behavior – Decide what to do if the submitted input fails validation. You can:
- Prevent submission by showing a validation error – The submission will be blocked and a validation error will be displayed. Due to the nature of how data is sent to OpenAI, specific fields cannot be highlighted as having failed validation.
- Mark entry as spam – The submission will be allowed but the entry will be marked as spam. Gravity Forms does not process notifications or feeds for spammed entries.
- Do nothing – The submission will be allowed and the failed validation will be logged as a note on the entry.
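The behavior options above boil down to a small decision: if OpenAI flags the input, act according to the feed’s Behavior setting. A sketch of that logic (the setting keys here are illustrative, not the plugin’s internal values):

```python
def moderation_outcome(flagged, behavior):
    """Map OpenAI's flagged/not-flagged verdict plus the feed's
    Behavior setting to what happens to the submission."""
    if not flagged:
        return "accept"
    return {
        "validation_error": "block submission",
        "spam": "accept and mark as spam",
        "do_nothing": "accept and log a note",
    }[behavior]
```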
Live Results with Live Merge Tags
If you’d like to show live results from OpenAI directly in your form, Populate Anything’s Live Merge Tags (LMTs) make this possible. LMTs provide the ability to automatically replace a merge tag with live content as soon as its associated field is updated. LMTs work anywhere inside your form (field values, labels, HTML content) and can be used to process GF OpenAI feeds.
You can get the “live” version of your GF OpenAI feed merge tag by enabling the Enable Merge Tag setting on your GF OpenAI feed and copying the desired merge tag.
@{:1:openai_feed_2}
→ Will process the specified feed anytime the value in field ID 1 is changed. Update the “1” to any field that you would like to trigger the Live Merge Tag.
@{all_fields:openai_feed_2}
→ This LMT will be processed when any field value changes. This is useful if your prompt contains more than one field value.
Trigger Live Results by Button Click
When you use Live Merge Tags with an OpenAI feed, by default, the feed processes as soon as the merge tag value changes. This may not be ideal if the user is still entering data.
Thankfully, there’s a snippet that solves this problem. This snippet changes the default behavior to only trigger the feed when a button is clicked instead of when the input field is changed.
Merge Tag Modifiers
Include Line Breaks in Text
If you’re looking to include line breaks when using an OpenAI merge tag with an HTML field, you can include the :nl2br modifier in your merge tag code. Here’s an example of what the output looks like with and without the modifier:
Without the modifier, all results are on one line:
@{:1:openai_feed_2}
1. Mars 2. Venus 3. Pluto
With the modifier added, each result gets separated on its own line:
@{:1:openai_feed_2,nl2br}
1. Mars
2. Venus
3. Pluto
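Under the hood, nl2br is a simple transformation: an HTML `<br>` tag is inserted at each newline in the response so the browser renders the line breaks. A sketch of the equivalent logic:

```python
def nl2br(text):
    """Insert an HTML <br> before each newline, mirroring PHP's nl2br()."""
    return text.replace("\n", "<br>\n")

html = nl2br("1. Mars\n2. Venus\n3. Pluto")
```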
Integrations
Populate Anything
With the power of Live Merge Tags, you can process OpenAI feeds as your users type and display the response live in an HTML field or capture it in any text-based field type (e.g., Single Line Text, Paragraph, etc.).
Gravity Flow
GF OpenAI is fully integrated with Gravity Flow. While you’re building your workflow, choose OpenAI as the step type and select the OpenAI feed you’d like to use:
Here’s a quick example of what a workflow in Gravity Flow could look like:
- A user submits a form designed to create an outline for their article.
- GF OpenAI handles the outline.
- A Gravity Flow approval step requires an editor to approve the outline.
- After approval, a rough article draft is generated based on that outline via GF OpenAI.
- A Notification step brings users back to review the generated article, make edits, and submit a final version.
- Post Creation generates an article and publishes it.
FAQs
How much does this cost?
Gravity Forms OpenAI (this plugin) is free! OpenAI (the service this plugin integrates with) also gives you $18 in free credit when you sign up. After that, OpenAI is still remarkably inexpensive. As of Dec 15th, 2022, you’ll pay about $0.02 per 1,000 tokens (~750 words) generated by OpenAI. Full pricing details here.
I need some inspiration… what can I do with OpenAI?
OpenAI has an amazing list of examples here. If that doesn’t whet your appetite, try asking ChatGPT itself. You’ll be surprised with how clever it can be. 😉
I want to contribute to this plugin!
If you’re looking to contribute to the codebase, PRs are welcome! Come work with us on GitHub. If you want to contribute financially, pick up a Gravity Perks license. It’s a win-win. 😄
Did this resource help you do something awesome with Gravity Forms?
Then you'll absolutely love Gravity Perks, a suite of 47+ essential add-ons for Gravity Forms with support you can count on.
Hi
Sorry if this question is too easy. I have installed this plugin and configured it. I now see GPT answers in my entry list. How could the user see the GPT answer of his question in the confirmation page?
Thanks
Hi Karl,
If you’ve mapped the result to a field, enter the merge tag of that field in the confirmation page. If not, enter the OpenAI feed merge tag in the confirmation page.
Best,
First of all, I would like to apologize for asking so many questions. At the same time, I would like to thank you for promptly answering all of them. I find so many useful possibilities for this tool.
How difficult/impossible would it be to integrate it into GravityCharts? I was wondering if it could be used to generate (different kinds of charts) charts based on the prompts.
Hi Claudio,
This isn’t currently planned, but I’ll pass it along to our product manager.
Hi,
Is GPT-4 already usable in the plugin?
Hi Harry,
Not yet. We plan on adding support for GPT-4, but we’re still waiting for access.
Hi @GravityWiz, we can provide you with a GPT-4 API key to help you add GPT-4 support. Just contact us.
We have one now and are working on adding GPT-4 support to the plugin as we speak. 😄
Hi GravityWiz, thanks for this great plugin, very interesting to explore the possibilities of AI within Gravity Forms.
I recently watched your ‘OpenAI Unleashed’ workshop and was particularly interested in the marketing and AI section. There was one idea in here where Cole had a mock-up of a form for content optimisation, whereby the user inputs a piece of content into one field and then some checkboxes underneath for suggestions such as ‘Readability’, ‘Wordiness’, ‘Clarity’, ‘Tone’, ‘Keyword Density’ etc. Initially I have tried the following prompt (and multiple variations of the prompt):
“Provide suggestions to optimise the below content. Separate into a list with the following sections: Readability, Wordiness, Clarity, Tone, Keyword Density
{Content to Optimise:1}”
But it is just outputting them as a list with generic explanations for those sections, rather than it being specific to the content from the field. The above prompt works fine within ChatGPT itself.
It may be that I am just missing something obvious, but I have tried many different prompts and can’t seem to get it to work for, wondering if you have any suggestions?
Also the next step would be replace the hardcoded sections with values from checkboxes. Assuming this works, would it be a case of using the merge tag for the checkbox? What about the case where a checkbox hasn’t been checked but the merge tag is still passed to the prompt?
Any help/advice would be really appreciated, thanks!
Kind regards, Adam
Hey Adam,
Thanks for reaching out.
If the prompt works as intended in ChatGPT, this sounds like an issue with plugin configuration and/or making sure the inputted field values (i.e. your content yet to be optimized) are being sent to OpenAI and back.
Have you ensured your OpenAI feeds are configured correctly by testing that they work with other, simpler prompts and field inputs?
After that, I would recommend moving the merge tag around within the prompt and reducing it to a single request first (i.e. just requesting clarity, or tone, etc) to see if this provides a result relevant to the text in {Content to Optimize:1}. Once you’ve confirmed this works, you can start adding more.
Let me know how this goes and I can help further.
Regarding your second question — have you explored using this in part with Populate Anything’s Live Merge Tags?
https://gravitywiz.com/documentation/gravity-forms-populate-anything/#live-merge-tags:~:text=The%20Frontend-,Live%20Merge%20Tags,-When%20GF%20Populate
Thanks Adam!
PS — Feel free to reach out directly at support@gravitywiz.com.
Hi Cole,
Many thanks for your detailed response! Yes I have a few other test forms such as a blog post generator (from certain input fields e.g. keywords, target audience, tone, etc.), one for sentiment analysis like in the workshop and another for content summarisation.
These are all working correctly, so I think you’re right about it being the prompt/placement of the merge tag in the prompt that is the issue. I have tried a few combinations and reducing it to request just one optimisation and it has been a bit hit and miss still. It has output something I would expect at times and not at others.
Therefore, I would say it’s a case of trial and error before getting the right prompt. Given that it is working for other feeds, I don’t think it is a configuration problem with the plugin specifically. Maybe I also need to try adjusting the other settings like Max Tokens, but for now I will try a few more iterations of the prompt and see what happens.
It is also not something essential I am trying to do, as it is more experimental at the moment. I work at a University in the UK and we are currently researching various AI tools for our department. We already use many Gravity Forms across the site, so I thought this would be a good use case to trial how this works, and maybe we will go further with it; in that case, I think looking into something like the Populate Anything extension could be another potential use case for us.
Many thanks again for the reply, your advice is appreciated and if I have anything else, I will reach out to your support!
Hey Adam,
Thanks for the response. Sounds like an exciting use case. I’ll second that — it sounds like you may have to tinker a bit more with your prompting and configuration. Wishing you the best of luck though, we’re always here otherwise.
And one other thought — if you do find yourself creating something you’re excited about and want to share it with us, we’re always open to feature customers in Gravity Wiz Weekly! Just drop us a line if so.
Cheers!
Hi guys – loving this plugin! Having a little issue though and not sure if anyone else has the same when trying to view paragraphs of text from AI on a results page…
I am using Completions and mapping the result to a field (in my case a Paragraph field). I can see on the entry details that OpenAI has created 3 separate paragraphs, but in the field itself there are no paragraph breaks at all, so the output is one continuous block of text (see what I mean on https://smadigital.app/6steps-actioncoach/quiz-complete-thankyou/?geid=93&eid=02a465 towards the middle of the page).
Any help is super appreciated!
Hi Steve,
To keep the paragraph breaks, enable the option to use the Rich Text editor under the Advanced tab of the Paragraph field setting.
Best,
Something I noticed is that I cannot change the order of a feed once it is created. If that feed depends on the value of a different feed that was created after it, it will not show the result. It would be good to implement a “click and drag” option to change the order of the feeds.
Thanks for the feedback, Claudio. I saw that you submitted an issue as well. I’ll link that here for other folks to weigh in on and we’ll prioritize accordingly.
https://github.com/gravitywiz/gravityforms-openai/issues/10
I was wondering when GravityWiz was going to officially add DALL E support to this plugin.
It looks like it’s in the code last I looked, but isn’t an option to use, unless I missed that announcement. I thought it was to be integrated with another GravityWiz add-on related to uploads. This would be very useful!
Hey Alexander, we do have plans to add DALL·E support but it hasn’t arrived just yet. We did start to explore it but the code that’s currently in place does not work. No definite ETA but customer demand is hugely motivational for us so thanks for letting us know you’re waiting for it. 🙂
Hi, I have installed the main Gravity Forms plugin and also the OpenAI integration plugin. I tried to use Edits mode. Here are my questions: 1) Should I always write {Prompt:1} in the input? What exactly does {Prompt:1} mean? 2) Can my users correct their mistakes with this program?
Hello! The input for the Edit endpoint would be the merge tag for whatever field you want to be edited by the AI. You can use the merge tag selector (the {..} icon directly above the field, top right) to select the desired field and input its merge tag into the setting.
If you want to overwrite the user’s original input with the AI-edited input, you could use the “Map Result to Field” setting to save over the original input. If you want to present the suggested edit to the user, you could use an HTML field and the Live Merge Tag version of the GF OpenAI feed (looks like
@{:FIELDID:openai_feed_123}
) to show the suggestion and let the user decide whether to implement it. Please note, live results require Populate Anything.

I am facing a problem: it’s giving the same answer every time. I made a simple form with one text input field and one output field. I am using Chat Completions with gpt-3.5-turbo. Here is what I wrote in the message requirement section: “Write SEO optimized title only with this keyword ‘{Input:1}’”. I am always getting the same title with the same keyword.
Hey Philip, results are cached for 5 minutes by default. I’ve sent you an experimental build of GF OpenAI with support for a filter to disable this caching (check your email).
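To illustrate the caching behavior mentioned above: identical request bodies within a 5-minute window return the stored response instead of calling the API again. Here’s a conceptual sketch in Python — the plugin’s actual PHP cache keying may differ:

```python
import hashlib
import json
import time

_cache = {}

def cached_fetch(body, fetch, ttl=300):
    """Return a cached response for an identical request body made
    within `ttl` seconds; otherwise call `fetch` and store the result."""
    key = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    hit = _cache.get(key)
    if hit and time.time() - hit[0] < ttl:
        return hit[1]
    response = fetch(body)
    _cache[key] = (time.time(), response)
    return response
```

This is why a prompt that only varies by a single field value can appear “stuck”: as long as the final request body is byte-identical, the cached response is reused.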
Is it possible to include the values of each field from my form in my prompt? For example, “The user {user} is {age} years old and needs a health plan. The chosen plan is {chosen plan}”…
Hi Claudio,
Definitely, you can combine static text with merge tags in the prompt. The entire prompt is sent to OpenAI when the form is submitted or live merge tags are processed.
Is there a way to persist a chat in Turbo? ex: I have a workflow that has an initial prompt that is fairly detailed, then use the results of that to generate several others. Rather than sending the results as part of each subsequent prompt, I’d like to just add additional queries.
Sounds like you’re looking for something more ChatGPT-like? If so, we don’t currently have a solution for that, but it’s on our radar.
Something like that yes. Using a fairly complex prompt as a setup, then generating several different types of content based on the results of the first prompt. Asking for it all at once is more tokens than allowed.
I’m pretty sure you could accomplish this using Copy Cat: copy the response to another field, which then runs multiple prompts from the response.
This is doable with a combination of CopyCat, Form-Pass-Thru, and Live Merge Tags. You might be able to do it with less than these 3 components as well.
I know some people had ideas of how to get this working. But has anyone done it? Where they submit a request to the AI, get a response, and then send a follow up request and the AI takes into account the original prompt, it’s first response, and the second prompt?
I use Live Merge Tags in my Gravity Forms to populate some text fields with OpenAI. Unfortunately, when I click the submit button of my form, all the text fields with a live OpenAI merge tag are submitted to OpenAI a second time (so I’m billed twice). Is there a workaround to prevent this from happening? Thank you for your amazing plugin and free support.
Hi Sara,
This sounds like a configuration issue. Do note that while the plugin is free, support is not. We’re happy to answer questions here, but help with specific configurations requires a Gravity Perks license.
If you have one already, we’ll be happy to help if you contact us via the support form.
Hi, I am using the plugin with Completions. With {openai_feed_1} I am getting my answer back, and it works fine when I check it in Notes! But I also created 2 notifications, and when I add {all_fields:openai_feed_1} in the notifications, the email arrives with a different answer than the one in Notes. The one in Notes is better!
Why is that? Does it call the API more than once when I have notifications with {all_fields:openai_feed_1} too?
Thank you in advance
Hi Antonis, when testing locally, it seems to work as expected. I always get the same output. This will probably require some digging into your setup. If you have an Advanced or Pro license, you can reach out via our support form, and we should be able to help.
Hello, a multiline result is saved to a text field. Unfortunately, this is then output as a single line in notifications. Is there a solution for this?
Hi Oliver, we’ve already followed up via email to help you here.
The upcoming version introduces the :nl2br modifier for better control over when new lines are honored or ignored.
Hi, any plans to integrate the new model: gpt-3.5-turbo
Thanks
Best, Oliver
Hi Oliver,
Support was added in the latest release 1.0-beta-1. You can download it using the Download Plugin button above.
You must select the new Chat Completions endpoint for the model to be available.
Lifesaving plugin! For me, with Live Merge Tags, if I exceed OpenAI token limits, no error messages are displayed to the end user (and of course, no text is generated). Is there a hook in your OpenAI plugin, Gravity Forms, or Live Merge Tags that I can connect to so I can write custom code to send my long text in chunks to OpenAI and then return all the processed chunks in one piece to the user? If this is insane, are there any other alternatives (such as embeddings) that can somehow be used with your extension? Thank you
Hi Sara, I’ll ping our product manager to investigate this feature.
Cool idea but no solution for this just yet.
Unless there is a better way, I got embeddings to work by hooking into this plugin’s gf_openai_request_body filter: gather the prompt text, use curl to get the prompt’s embedding vector from OpenAI, send it to the vector DB, get results back and format them as appropriate, then edit the request body in that filter so that what it sends to OpenAI is your embedding results + your prompt.
Thanks for the feedback Alex!
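The flow Alex describes — intercept the request, retrieve related context via an embeddings search, and prepend it to the prompt — can be sketched language-agnostically. This Python sketch stubs out the embeddings/vector-DB lookup; in the plugin itself you would perform these steps inside the gf_openai_request_body PHP filter:

```python
def augment_request_body(body, retrieve_context):
    """Prepend retrieved context to the prompt before it is sent to
    OpenAI. `retrieve_context` stands in for the embeddings + vector
    database lookup, which is external to this sketch."""
    context = retrieve_context(body["prompt"])
    augmented = dict(body)
    augmented["prompt"] = f"Context:\n{context}\n\nQuestion: {body['prompt']}"
    return augmented

# Stubbed lookup standing in for a real embeddings search.
def fake_lookup(question):
    return "Gravity Forms is a WordPress form builder plugin."

out = augment_request_body(
    {"model": "gpt-3.5-turbo", "prompt": "What is Gravity Forms?"},
    fake_lookup,
)
```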
Can we connect this to the GPT-3.5 Turbo API that came out this week? Or DALL·E 2 for images?
Do you have plans to implement the new ChatGPT API to the plugin? Currently using text-davinci-003, but I think that gpt-3.5-turbo can help me create a better crafted output.
Heck yeah, Freddy. It’s available in the latest version above. 😉
Do you have more in depth documentation on how to use this plugin? I’m pretty lost… haha I know what I’m trying to build, but got very lost trying to get there ;-}
Hi Michael, if you have an Advanced or Pro license, you can reach out via our support form and we should be able to help. We will also have a workshop about OpenAI on March 2nd that you can join.
Best,
Is there any way to do conditional logic inside the actual prompt? Do the Gravity Forms notification/confirmation shortcodes work?
Hi Lewis,
There isn’t, but we’d love to hear more about your use case. Can you reach out to us via our support form? We’ll see if we can come up with a solution for you.
Will do!

With regards to the fine-tuned model that you have added support for, how exactly do we get that to work? I can’t see any documentation. Thanks :-)
Hi Lewis,
You’ll need to train the fine-tuned model. Once you have a trained model in your account, it will show up as an available model in the feed settings.