Azure Service (src/azure/azure.service.ts)¶
Overview¶
The AzureService is the core service for Azure OpenAI integration in the BidScript backend. It provides methods for AI-powered text generation, document analysis, summarization, and question answering using the Azure OpenAI service, and it underpins the application's intelligent document processing capabilities.
Dependencies¶
import { Injectable, Logger } from '@nestjs/common';
import { ConfigService } from '@nestjs/config';
import { OpenAIClient, AzureKeyCredential } from '@azure/openai';
import { ChatCompletionRequestMessage } from 'openai';
import { BlobService } from './blob.service';
Key Features¶
- AI-powered document summarization
- Question answering based on document content
- Text completion and generation
- Custom prompt construction
- Stream-based responses for real-time updates
- System message management
- Error handling and retry logic
Core Methods¶
completions¶
async completions(
  messages: ChatCompletionRequestMessage[],
  options?: {
    temperature?: number;
    maxTokens?: number;
    systemMessage?: string;
    stream?: boolean;
    streamCallback?: (chunk: string) => void;
  }
): Promise<string>
Generates completions using Azure OpenAI based on provided messages.
Parameters:
- messages: Array of message objects representing the conversation history
- options: Configuration options for the completion
  - temperature: Controls randomness (0-1)
  - maxTokens: Maximum tokens to generate
  - systemMessage: System instructions for the AI
  - stream: Whether to stream responses
  - streamCallback: Callback for streamed chunks
Returns: Generated text response
Example:
const response = await this.azureService.completions([
  { role: 'user', content: 'Summarize this document for me.' }
], {
  temperature: 0.3,
  maxTokens: 500,
  systemMessage: 'You are a helpful assistant specialized in document analysis.'
});
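For reference, a minimal sketch of how completions could be implemented over the SDK is shown below. It assumes a deployment name read from an illustrative AZURE_OPENAI_DEPLOYMENT setting, uses the SDK's getChatCompletions call, and defers streaming to streamCompletion; the actual implementation may differ.
async completions(
  messages: ChatCompletionRequestMessage[],
  options: {
    temperature?: number;
    maxTokens?: number;
    systemMessage?: string;
  } = {},
): Promise<string> {
  // Prepend the system message so it always leads the conversation.
  const fullMessages = [
    { role: 'system', content: this.constructSystemMessage(options.systemMessage) },
    ...messages,
  ];

  // AZURE_OPENAI_DEPLOYMENT is an illustrative configuration key, not confirmed by the service code.
  const deployment = this.configService.get<string>('AZURE_OPENAI_DEPLOYMENT');

  // Streaming (options.stream / streamCallback) is omitted here; see streamCompletion below.
  const result = await this.executeWithRetry(() =>
    this.openAIClient.getChatCompletions(deployment, fullMessages, {
      temperature: options.temperature,
      maxTokens: options.maxTokens,
    }),
  );

  return result.choices[0]?.message?.content ?? '';
}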
summarizeDocument¶
async summarizeDocument(
  documentText: string,
  options?: {
    maxLength?: number;
    format?: 'bullets' | 'paragraphs' | 'sections';
  }
): Promise<string>
Generates a summary of the provided document text.
Parameters:
- documentText: The document text to summarize
- options: Summarization options
  - maxLength: Maximum summary length
  - format: Summary format (bullets, paragraphs, sections)
Returns: Document summary text
Example:
const summary = await this.azureService.summarizeDocument(documentText, {
  maxLength: 500,
  format: 'bullets'
});
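A possible implementation simply builds a summarization prompt from the options and delegates to completions; the prompt wording below is illustrative, not the service's actual prompt.
async summarizeDocument(
  documentText: string,
  options: { maxLength?: number; format?: 'bullets' | 'paragraphs' | 'sections' } = {},
): Promise<string> {
  // Translate the requested format into a prompt hint (wording is an assumption of this sketch).
  const formatHint =
    options.format === 'bullets'
      ? 'as a bulleted list'
      : options.format === 'sections'
        ? 'organized into titled sections'
        : 'as short paragraphs';

  // maxLength is interpreted here as a rough word budget.
  const lengthHint = options.maxLength ? ` Keep the summary under ${options.maxLength} words.` : '';

  return this.completions(
    [{ role: 'user', content: `Summarize the following document ${formatHint}.${lengthHint}\n\n${documentText}` }],
    { temperature: 0.3, systemMessage: 'You are a helpful assistant specialized in document analysis.' },
  );
}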
answerQuestion¶
async answerQuestion(
  question: string,
  context: string,
  options?: {
    detailed?: boolean;
    includeReferences?: boolean;
  }
): Promise<{ answer: string; references?: string[] }>
Answers a question based on provided context.
Parameters:
- question: The question to answer
- context: The context information to use for answering
- options: Answer options
  - detailed: Whether to provide a detailed answer
  - includeReferences: Whether to include source references
Returns: Object containing the answer and optional references
Example:
const { answer, references } = await this.azureService.answerQuestion(
  'What are the main requirements in this document?',
  documentText,
  { detailed: true, includeReferences: true }
);
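One way this could be built on top of completions is sketched below; the grounding instructions and the REF: marker used to extract references are assumptions of this sketch, not the service's confirmed behavior.
async answerQuestion(
  question: string,
  context: string,
  options: { detailed?: boolean; includeReferences?: boolean } = {},
): Promise<{ answer: string; references?: string[] }> {
  const instructions = [
    'Answer the question using only the provided context.',
    options.detailed ? 'Provide a detailed answer.' : 'Answer concisely.',
    options.includeReferences
      ? 'After the answer, list the context passages you relied on, one per line, prefixed with "REF:".'
      : '',
  ].join(' ');

  const raw = await this.completions(
    [{ role: 'user', content: `${instructions}\n\nContext:\n${context}\n\nQuestion: ${question}` }],
    { temperature: 0.2 },
  );

  if (!options.includeReferences) {
    return { answer: raw };
  }

  // Split out "REF:" lines when references were requested.
  const lines = raw.split('\n');
  const references = lines.filter(l => l.startsWith('REF:')).map(l => l.slice(4).trim());
  const answer = lines.filter(l => !l.startsWith('REF:')).join('\n').trim();
  return { answer, references };
}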
streamCompletion¶
async streamCompletion(
  messages: ChatCompletionRequestMessage[],
  streamCallback: (chunk: string) => void,
  options?: {
    temperature?: number;
    maxTokens?: number;
    systemMessage?: string;
  }
): Promise<void>
Streams completion responses chunk by chunk.
Parameters:
- messages: Array of message objects
- streamCallback: Callback function for processing chunks
- options: Completion options
Example:
await this.azureService.streamCompletion(
  messages,
  (chunk) => {
    // Process each chunk as it arrives
    console.log('Received chunk:', chunk);
    // Send chunk to client
    socket.emit('message:chunk', { messageId, chunk });
  },
  { temperature: 0.7 }
);
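Internally, streaming could be driven by the SDK's streamChatCompletions iterator (named listChatCompletions in older 1.0.0-beta releases of @azure/openai); the sketch below assumes a recent beta and the same illustrative AZURE_OPENAI_DEPLOYMENT setting as above.
async streamCompletion(
  messages: ChatCompletionRequestMessage[],
  streamCallback: (chunk: string) => void,
  options: { temperature?: number; maxTokens?: number; systemMessage?: string } = {},
): Promise<void> {
  const fullMessages = [
    { role: 'system', content: this.constructSystemMessage(options.systemMessage) },
    ...messages,
  ];
  const deployment = this.configService.get<string>('AZURE_OPENAI_DEPLOYMENT');

  const events = await this.openAIClient.streamChatCompletions(deployment, fullMessages, {
    temperature: options.temperature,
    maxTokens: options.maxTokens,
  });

  // Each streamed event carries incremental deltas; forward non-empty content to the callback.
  for await (const event of events) {
    for (const choice of event.choices) {
      const delta = choice.delta?.content;
      if (delta) {
        streamCallback(delta);
      }
    }
  }
}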
Implementation Details¶
OpenAI Client Initialization¶
private openAIClient: OpenAIClient;

constructor(
  private configService: ConfigService,
  private blobService: BlobService,
) {
  const endpoint = this.configService.get<string>('AZURE_OPENAI_ENDPOINT');
  const key = this.configService.get<string>('AZURE_OPENAI_API_KEY');

  if (!endpoint || !key) {
    throw new Error('Azure OpenAI configuration missing');
  }

  this.openAIClient = new OpenAIClient(
    endpoint,
    new AzureKeyCredential(key)
  );
}
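Note that calls against this client target an Azure deployment name rather than a raw model name. A minimal example call, again using the illustrative AZURE_OPENAI_DEPLOYMENT key, might look like this:
// AZURE_OPENAI_DEPLOYMENT is an assumed configuration key, not confirmed by the service code.
const deployment = this.configService.get<string>('AZURE_OPENAI_DEPLOYMENT');
const result = await this.openAIClient.getChatCompletions(deployment, [
  { role: 'user', content: 'Ping' },
]);
this.logger.log(result.choices[0]?.message?.content ?? '');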
System Message Construction¶
private constructSystemMessage(customMessage?: string): string {
  const defaultMessage =
    'You are a helpful assistant specialized in document analysis and understanding.';
  return customMessage || defaultMessage;
}
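Usage is straightforward: callers either rely on the default instructions or pass their own, for example:
const defaultSystem = this.constructSystemMessage();
// => 'You are a helpful assistant specialized in document analysis and understanding.'

const customSystem = this.constructSystemMessage('You are a strict compliance reviewer.');
// => 'You are a strict compliance reviewer.'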
Error Handling and Retries¶
private async executeWithRetry<T>(
  operation: () => Promise<T>,
  maxRetries = 3
): Promise<T> {
  let lastError: Error;

  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await operation();
    } catch (error) {
      lastError = error;
      this.logger.warn(`Attempt ${attempt} failed: ${error.message}`);

      // Check if error is retryable
      if (!this.isRetryableError(error) || attempt === maxRetries) {
        break;
      }

      // Exponential backoff
      await new Promise(resolve => setTimeout(resolve, 2 ** attempt * 100));
    }
  }

  throw lastError;
}
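executeWithRetry depends on an isRetryableError helper that is not shown here; a plausible sketch, assuming the Azure SDK surfaces an HTTP status code and/or a Node error code on thrown errors, is:
private isRetryableError(error: any): boolean {
  // Azure SDK errors generally expose a numeric statusCode and/or a string code;
  // treating these as present is an assumption of this sketch.
  const status = error?.statusCode ?? error?.status;
  if (status === 429 || (typeof status === 'number' && status >= 500)) {
    return true; // rate limiting and transient server errors
  }
  // Network-level failures (no response received)
  return ['ETIMEDOUT', 'ECONNRESET', 'ENOTFOUND'].includes(error?.code);
}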
Integration with Other Services¶
The AzureService integrates with:
- BlobService: For document retrieval when working with document references
- ChatService: For generating AI responses in chat conversations
- RAG Service: For enhancing responses with relevant document information
Error Handling¶
The service provides comprehensive error handling for various scenarios:
- Authentication errors: Invalid API key or credentials
- Rate limiting: Exceeding API quotas
- Timeout errors: Service taking too long to respond
- Model errors: Issues with specific prompts or inputs
- Network errors: Connection issues with Azure services
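As a sketch only (not the service's confirmed behavior), these scenarios could be translated into caller-facing NestJS exceptions along the following lines:
import {
  BadGatewayException,
  HttpException,
  HttpStatus,
  RequestTimeoutException,
  UnauthorizedException,
} from '@nestjs/common';

private translateError(error: any): HttpException {
  const status = error?.statusCode ?? error?.status;
  if (status === 401 || status === 403) {
    return new UnauthorizedException('Invalid Azure OpenAI credentials');
  }
  if (status === 429) {
    return new HttpException('Azure OpenAI rate limit exceeded', HttpStatus.TOO_MANY_REQUESTS);
  }
  if (error?.code === 'ETIMEDOUT' || status === 408) {
    return new RequestTimeoutException('Azure OpenAI request timed out');
  }
  // Model/prompt errors and network failures fall through as a gateway problem.
  return new BadGatewayException(error?.message ?? 'Azure OpenAI request failed');
}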
Logging¶
The service uses NestJS Logger for detailed logging:
private readonly logger = new Logger(AzureService.name);
// Usage
this.logger.log(`Generating completion with ${messages.length} messages`);
this.logger.error(`OpenAI error: ${error.message}`, error.stack);
Configuration¶
Required environment variables:
- AZURE_OPENAI_ENDPOINT: Endpoint URL of the Azure OpenAI resource
- AZURE_OPENAI_API_KEY: API key used to authenticate against the resource
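For local development, the corresponding .env entries might look like the following (placeholder values only). The implementation sketches above additionally assume a deployment-name setting such as AZURE_OPENAI_DEPLOYMENT, which is illustrative rather than confirmed.
AZURE_OPENAI_ENDPOINT=https://<your-resource-name>.openai.azure.com/
AZURE_OPENAI_API_KEY=<your-api-key>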