Deploying APIs
After building queries in Infactory, you can deploy them as API endpoints that applications can use. This guide explains how to deploy queries, manage deployed APIs, and integrate them into your applications.
The Deployment Process
Deploying a query transforms it from a development-stage query to a production-ready API endpoint that can be called from your applications.
Deploying Your First Query
Select a query to deploy
From the Build tab, select the query you want to deploy.
Click the Deploy button
Click the Deploy button at the top of the query editor.
Review deployment details
Review the deployment summary, which shows:
Query name and description
Required parameters (slots)
Estimated performance characteristics
Confirm deployment
Click Deploy to confirm and deploy your query as an API endpoint.
The Deploy Page
The Deploy page is your command center for managing deployed queries and APIs.
The Deploy page consists of these main components:
Deployed Queries List: All queries that have been deployed as API endpoints
API Documentation: Interactive Swagger documentation for your APIs
API Key Management: Tools for creating and managing API keys
Usage Metrics: Statistics on API usage and performance
Testing Deployed Queries
Navigate to the Deploy tab
Click on the Deploy tab in your Infactory project.
Find your deployed query
Locate your deployed query in the list or search for it by name.
Open the API documentation
Click on the Test or API Docs button for your query.
Enter parameter values
Fill in values for any required parameters (slots).
Execute the API call
Click Execute to run a test call to your API endpoint.
View the response
See the structured data response from your API endpoint. The same check can also be made from code; see the sketch below.
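If you would rather verify a deployed query programmatically than through the API docs page, here is a minimal sketch using fetch. The query name average_by_category, its parameters, and YOUR_API_KEY are placeholders for your own deployed query and key (see API Key Management below).

// Minimal sketch: call a deployed query endpoint directly and log the result.
// "average_by_category", its parameters, and YOUR_API_KEY are placeholders.
async function testDeployedQuery() {
  const response = await fetch('https://api.infactory.ai/v1/queries/average_by_category', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer YOUR_API_KEY'
    },
    body: JSON.stringify({ category: "position", metric: "height" })
  });
  console.log(response.status, await response.json());
}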
Understanding the API Architecture
Infactory provides two ways to interact with your deployed queries:
1. Unified API Endpoint
The unified endpoint allows you to send natural language questions and have Infactory automatically route them to the appropriate query:
POST https://api.infactory.ai/v1/query

Request body:

{
  "query": "What is the average height for each position?",
  "project_id": "your_project_id",
  "stream": false
}

Response:

{
  "data": [
    { "position": "Quarterback", "average_height": 75.2 },
    { "position": "Running Back", "average_height": 70.8 },
    { "position": "Wide Receiver", "average_height": 72.5 }
    // ... more results
  ],
  "query_used": "average_by_category",
  "parameters": {
    "category": "position",
    "metric": "height"
  },
  "execution_time_ms": 124
}
2. Direct Query Endpoints
You can also call specific query endpoints directly with their parameters:
POST https://api.infactory.ai/v1/queries/{query_name}

Request body:

{
  "category": "position",
  "metric": "height"
}

Response:

{
  "data": [
    { "position": "Quarterback", "average_height": 75.2 },
    { "position": "Running Back", "average_height": 70.8 },
    { "position": "Wide Receiver", "average_height": 72.5 }
    // ... more results
  ],
  "execution_time_ms": 118
}
API Key Management
To secure your API endpoints, Infactory uses API keys:
Creating an API Key
Navigate to API Keys section
In the Deploy tab, click on the API Keys section.
Create new key
Click the Create New Key button.
Configure key details
Enter a name for the key (e.g., “Development”, “Production”)
Set permissions
Select which projects and environments this key should have access to.
Create key
Click Create to generate the API key.
Save your key
Copy and securely store the displayed API key; it will only be shown once.
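Since the key cannot be retrieved again, store it outside your source code, for example in an environment variable, and read it at runtime. A minimal sketch (the variable name INFACTORY_API_KEY is just a convention, not something Infactory requires):

// Read the API key from the environment instead of hard-coding it.
// Use this value wherever the examples below show YOUR_API_KEY.
const apiKey = process.env.INFACTORY_API_KEY;
if (!apiKey) {
  throw new Error('INFACTORY_API_KEY is not set');
}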
Using API Keys
Include your API key in all requests to the Infactory API:
const response = await fetch('https://api.infactory.ai/v1/query', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer YOUR_API_KEY'
  },
  body: JSON.stringify({
    query: "What is the average height for each position?",
    project_id: "your_project_id"
  })
});
Managing Deployed Queries
Updating a Deployed Query
Navigate to the Build tab
Go back to the Build tab to make changes to your query.
Make your changes
Edit the query code or parameters as needed.
Test your changes
Click Run to test your updated query.
Deploy updates
Click Deploy to update the deployed version of your query.
Confirm update
In the confirmation dialog, you’ll see that this is an update to an existing query.
Click Deploy to confirm.
Undeploying a Query
Navigate to the Deploy tab
Go to the Deploy tab in your Infactory project.
Find the query
Locate the query you want to undeploy in the deployed queries list.
Undeploy the query
Click the Undeploy button or menu option for that query.
Confirm undeployment
In the confirmation dialog, click Undeploy to confirm.
Integrating with Your Applications
Here are examples of how to integrate with Infactory APIs in different languages:
JavaScript/TypeScript
Unified Endpoint

// Using the unified API endpoint
async function askQuestion(question) {
  const response = await fetch('https://api.infactory.ai/v1/query', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer YOUR_API_KEY'
    },
    body: JSON.stringify({
      query: question,
      project_id: "your_project_id"
    })
  });
  return await response.json();
}

// Example usage
async function displayResults() {
  const results = await askQuestion("What is the average height by position?");
  console.log(results.data);
  // Process and display the results
}
Direct Endpoint

// Using a direct query endpoint
async function getAverageByCategory(category, metric) {
  const response = await fetch('https://api.infactory.ai/v1/queries/average_by_category', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer YOUR_API_KEY'
    },
    body: JSON.stringify({
      category: category,
      metric: metric
    })
  });
  return await response.json();
}

// Example usage
async function displayHeightByPosition() {
  const results = await getAverageByCategory("position", "height");
  console.log(results.data);
  // Process and display the results
}
Python
Unified Endpoint

import requests

# Using the unified API endpoint
def ask_question(question, project_id):
    url = "https://api.infactory.ai/v1/query"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY"
    }
    payload = {
        "query": question,
        "project_id": project_id
    }
    response = requests.post(url, headers=headers, json=payload)
    return response.json()

# Example usage
def display_results():
    results = ask_question("What is the average height by position?", "your_project_id")
    print(results["data"])
    # Process and display the results
Direct Endpoint

import requests

# Using a direct query endpoint
def get_average_by_category(category, metric):
    url = "https://api.infactory.ai/v1/queries/average_by_category"
    headers = {
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY"
    }
    payload = {
        "category": category,
        "metric": metric
    }
    response = requests.post(url, headers=headers, json=payload)
    return response.json()

# Example usage
def display_height_by_position():
    results = get_average_by_category("position", "height")
    print(results["data"])
    # Process and display the results
Node.js
Unified Endpoint

const axios = require('axios');

// Using the unified API endpoint
async function askQuestion(question, projectId) {
  const response = await axios.post(
    'https://api.infactory.ai/v1/query',
    {
      query: question,
      project_id: projectId
    },
    {
      headers: {
        'Content-Type': 'application/json',
        'Authorization': 'Bearer YOUR_API_KEY'
      }
    }
  );
  return response.data;
}

// Example usage
async function displayResults() {
  try {
    const results = await askQuestion("What is the average height by position?", "your_project_id");
    console.log(results.data);
    // Process and display the results
  } catch (error) {
    console.error('Error querying Infactory:', error);
  }
}
Direct Endpoint

const axios = require('axios');

// Using a direct query endpoint
async function getAverageByCategory(category, metric) {
  const response = await axios.post(
    'https://api.infactory.ai/v1/queries/average_by_category',
    {
      category: category,
      metric: metric
    },
    {
      headers: {
        'Content-Type': 'application/json',
        'Authorization': 'Bearer YOUR_API_KEY'
      }
    }
  );
  return response.data;
}

// Example usage
async function displayHeightByPosition() {
  try {
    const results = await getAverageByCategory("position", "height");
    console.log(results.data);
    // Process and display the results
  } catch (error) {
    console.error('Error querying Infactory:', error);
  }
}
Advanced API Features
Streaming Responses
For queries that might return large datasets, Infactory supports streaming responses:
async function streamResults() {
  const response = await fetch('https://api.infactory.ai/v1/query', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer YOUR_API_KEY',
      'Accept': 'text/event-stream'
    },
    body: JSON.stringify({
      query: "List all players and their statistics",
      project_id: "your_project_id",
      stream: true
    })
  });

  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;

    const chunk = decoder.decode(value);
    // Process each chunk as it arrives
    processChunk(chunk);
  }
}
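Note that the chunks returned by the reader do not necessarily line up with event boundaries, so it helps to buffer incomplete lines before handing them to processChunk. A minimal sketch of one possible processChunk implementation, assuming the stream delivers newline-delimited server-sent events with JSON payloads (an assumption about the wire format, not a documented guarantee):

// Hypothetical helper: buffers raw chunks and emits complete "data:" events.
// Assumes newline-delimited SSE with JSON payloads; adjust if the actual format differs.
function createChunkBuffer(onEvent) {
  let buffer = '';
  return function processChunk(chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // keep any incomplete trailing line for the next chunk
    for (const line of lines) {
      if (line.startsWith('data:')) {
        onEvent(JSON.parse(line.slice(5).trim()));
      }
    }
  };
}

// Example usage
const processChunk = createChunkBuffer(event => console.log(event));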
Error Handling
Always implement proper error handling in your API integrations:
try {
  const response = await fetch('https://api.infactory.ai/v1/query', {
    // ... request details
  });

  if (!response.ok) {
    const errorData = await response.json();
    console.error("API Error:", errorData);

    // Handle specific error types
    if (errorData.error === "invalid_query") {
      // Handle invalid query error
    } else if (errorData.error === "authentication_failed") {
      // Handle authentication error
    } else {
      // Handle other errors
    }
    return;
  }

  const data = await response.json();
  // Process successful response
} catch (error) {
  console.error("Network Error:", error);
  // Handle network errors
}
Rate Limiting
Infactory implements rate limiting to ensure fair usage of the platform. Rate limits vary by plan and are applied on a per-API-key basis.
If you exceed your rate limit, you’ll receive a 429 Too Many Requests response with information about when you can retry:
{
  "error": "rate_limit_exceeded",
  "message": "Rate limit exceeded. Please retry after 134 seconds.",
  "retry_after": 134
}
Implement a backoff-and-retry strategy in your applications to handle rate limits gracefully. The helper below waits for the interval suggested by the server before retrying:
async function callWithRetry(apiCall, maxRetries = 3) {
  let retries = 0;
  while (true) {
    try {
      return await apiCall();
    } catch (error) {
      if (error.response && error.response.status === 429 && retries < maxRetries) {
        const retryAfter = error.response.headers['retry-after'] || 1;
        console.log(`Rate limited. Retrying after ${retryAfter} seconds...`);
        await new Promise(resolve => setTimeout(resolve, retryAfter * 1000));
        retries++;
      } else {
        throw error;
      }
    }
  }
}
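The helper above assumes an axios-style error object (error.response carrying the status and headers); plain fetch does not throw on HTTP error statuses, so you would need to convert those into exceptions yourself. A possible usage with the axios-based askQuestion function from the Node.js example:

// Example usage, assuming askQuestion throws axios errors on 429 responses
async function displayWithRetry() {
  const results = await callWithRetry(() =>
    askQuestion("What is the average height by position?", "your_project_id")
  );
  console.log(results.data);
}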
Monitoring and Analytics
Infactory provides analytics and monitoring for your deployed APIs:
Available Metrics
Usage Over Time: Track API calls over hours, days, or months
Average Response Time: Monitor performance and identify slow queries
Error Rate: Keep track of errors and their causes
Popular Queries: See which queries are used most frequently
User Insights: Understand what questions users are asking most often
Setting Up Alerts
You can set up alerts for important API events:
Navigate to the Alerts section in the Deploy tab
Click Create Alert
Choose the alert type (e.g., “Error Rate Above Threshold”)
Configure alert parameters and notification methods
Click Save
Best Practices
Unified vs Direct: Use the unified endpoint for conversational interfaces and direct endpoints for specific operations.
Proper Error Handling: Implement robust error handling in your API integrations to provide a good user experience.
Rate Limiting: Implement backoff strategies to handle rate limits gracefully.
Caching: Cache common query results to improve performance and reduce API calls.
Security: Never expose your API keys in client-side code. Use a backend service to proxy requests when needed, as sketched after this list.
Monitoring: Regularly check your API usage metrics to identify optimization opportunities.
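To keep keys off the client, a small server-side proxy can forward requests and attach the API key from an environment variable. A minimal sketch using Express (an illustrative choice, not a requirement); the /ask route name is hypothetical, and global fetch assumes Node 18 or later:

const express = require('express');
const app = express();
app.use(express.json());

// Hypothetical proxy route: the browser calls /ask, the server adds the key.
app.post('/ask', async (req, res) => {
  try {
    const response = await fetch('https://api.infactory.ai/v1/query', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${process.env.INFACTORY_API_KEY}` // key stays server-side
      },
      body: JSON.stringify({
        query: req.body.query,
        project_id: process.env.INFACTORY_PROJECT_ID
      })
    });
    res.status(response.status).json(await response.json());
  } catch (error) {
    res.status(502).json({ error: 'upstream_error' });
  }
});

app.listen(3000);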
Next Steps
Now that you know how to deploy queries and integrate with the Infactory API, learn how to explore and test your queries in the Exploring Queries guide.