Prompts
LLMs receive a prompt and return a reply accordingly. Let's go deeper into how to configure a prompt and what can be achieved with it.
We have seen in the Quickstart/Concepts section that an Interaction's prompt is made of one or many parametrized Prompt Segments.

A Prompt Segment has a role, such as System or User.
System - This role is used to define the context of the interaction. For example, the system prompt can be used to define the persona of the LLM ("You are a seasoned expert in legal affairs").
User - This role is used to define the user input ("Does this contract comply with corporate rules?").
Several Prompt Segments can thus be combined to make a consistent global prompt.
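For instance, the System and User segments above can be combined in order into one global prompt. The following is an illustrative sketch only; the field names and combination logic are assumptions, not Vertesia's internal structure:

```javascript
// Illustrative sketch: a System and a User Prompt Segment combined
// into one global prompt (field names are assumptions, not Vertesia's API).
const promptSegments = [
  { role: "system", content: "You are a seasoned expert in legal affairs." },
  { role: "user", content: "Does this contract comply with corporate rules?" }
];

// The global prompt is the ordered combination of its segments.
const globalPrompt = promptSegments
  .map(segment => `[${segment.role}] ${segment.content}`)
  .join("\n");

console.log(globalPrompt);
```

The ordering matters: the System segment sets the context before the User segment supplies the actual request.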
Each Prompt Segment can define a Prompt Schema made of zero or many Input Parameters.
An Input Parameter can have one of the following types:
string
number
integer
boolean
object
any
text
media
document
- and the equivalent arrays, such as string[]
Example: definition of a structured table of movies
movie_input_table : object[]
├─ movie_title : string
├─ movie_director : string
└─ movie_synopsis : string
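A value conforming to this schema is an array of objects, each carrying the three declared string fields. The data below is purely illustrative:

```javascript
// An illustrative value matching the movie_input_table schema above:
// an object[] whose items each hold the three declared string fields.
const movie_input_table = [
  {
    movie_title: "Metropolis",
    movie_director: "Fritz Lang",
    movie_synopsis: "A futuristic city is sharply divided between workers and planners."
  },
  {
    movie_title: "Modern Times",
    movie_director: "Charlie Chaplin",
    movie_synopsis: "A factory worker struggles to keep up with an industrialized world."
  }
];

console.log(movie_input_table.length); // 2 entries
```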
A Prompt Segment may refer to Input Parameters by injecting their values into the prompt's message.
Defining and using simple prompt segment input parameters
In the following example we add two Input Parameters to a Prompt Segment intended to request a summary of a document's text in a target language.
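Conceptually, such a segment injects both parameters into its message. The sketch below assumes the parameter names document_text and target_language; the original example's names are not shown here:

```javascript
// Hypothetical sketch of the segment body. `document_text` and
// `target_language` are assumed Input Parameter names, injected
// into the message via standard string interpolation.
const renderSegment = ({ document_text, target_language }) =>
  `Summarize the following document in ${target_language}:\n\n${document_text}`;

console.log(renderSegment({
  target_language: "French",
  document_text: "This agreement is entered into by and between the parties listed below."
}));
```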
Reusing prompt segments
Vertesia's Prompt Segment Library makes it easy to reuse Prompt Segments to serve new needs and use cases. In the following example, we fork an existing Prompt related to generic contracts in order to adapt it slightly to supplier contracts, and then add it to a relevant Interaction.
Adding dynamicity and conditionality to a prompt
Vertesia supports plain text as well as JavaScript-based Prompt Templates. JavaScript allows adding dynamicity and conditionality to the prompt, which is extremely convenient and powerful.
Only simple JavaScript instructions are involved: there is no need to be a developer to use them.
The JavaScript template engine runs in a sandboxed (jailed) environment. You can use the standard JavaScript string interpolation syntax (${my_input_parameter}), as well as control blocks (for, if, else, etc.) and array functions (map, reduce, filter, etc.). In any case, the prompt segment must return a string.
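As an illustration, a segment body might combine interpolation, a conditional, and an array function to build its string. This is a sketch; the parameter name movies and the wording are assumptions:

```javascript
// Hypothetical sketch: a prompt segment body combining a conditional
// with `map` over an assumed array parameter `movies`.
function renderSegment({ movies }) {
  if (movies.length === 0) {
    return "No movies were provided.";
  }
  const lines = movies
    .map(m => `- ${m.movie_title} (directed by ${m.movie_director})`)
    .join("\n");
  // Whatever the branch taken, the segment always returns a string.
  return `Please review the following movies:\n${lines}`;
}

console.log(renderSegment({
  movies: [{ movie_title: "Metropolis", movie_director: "Fritz Lang" }]
}));
```

The conditional keeps the prompt coherent even when the input array is empty, which is exactly the kind of dynamicity plain-text templates cannot express.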
Accessing properties of a stored document from a prompt
Vertesia Content Store is a convenient way to store content within the platform - typically your knowledge bases, such as corporate policies, operational procedures, standards, or supplier contracts.
It makes RAG (Retrieval Augmented Generation) much easier and also allows content metadata to be generated and stored in the platform. For example, a first interaction could extract metadata from raw contracts, and a second could perform a specific analysis on only a subset of relevant contracts.
Stored Content Objects may thus contain not only text but also metadata (properties). Should you need to access such metadata from a prompt, here is how to achieve it.
The following example illustrates how to retrieve the effective_date of a stored contract from a Prompt Segment.
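The idea can be sketched as follows. This assumes the stored contract is exposed to the template as a parameter whose metadata lives under a properties field; that shape is an assumption for illustration, so refer to the example for the exact accessor Vertesia exposes:

```javascript
// Hypothetical sketch: `contract` is assumed to be an input parameter bound
// to a stored Content Object, with its metadata under `properties` (assumed shape).
const renderSegment = ({ contract }) =>
  `This contract took effect on ${contract.properties.effective_date}. ` +
  `Check whether its clauses are still compliant today.`;

console.log(renderSegment({
  contract: { properties: { effective_date: "2023-01-15" } }
}));
```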
Getting a suggestion of improvement for a prompt
What if you could get suggestions for improving your prompts in a snap? Here we go.