Expanding event information access
A leading technology and broadcast team set out to improve the attendee experience at its flagship events. These high-profile events showcase technological advancements and drive industry innovation, attracting tens of thousands of participants. Making it easy to find relevant sessions, speakers, partners, and logistical information was crucial to maximizing engagement and satisfaction.
The issue at hand
The existing process for accessing event information was inefficient and time-consuming: attendees had to navigate numerous documents and platforms, which caused delays and made it hard to find relevant sessions and locations. This fragmented approach also overwhelmed support staff with tickets, resulting in slow response times.
Our approach
Figure 1: Solution architecture
To streamline information access and accelerate digital transformation with AI and large language models (LLMs), we proposed an integrated Copilot platform built on OpenAI GPT-4 Turbo and Azure AI services. The solution aimed to improve user satisfaction, reduce response times, and increase efficiency in accessing event-related information.
Diving deeper into the solution
Here are the detailed steps of our implementation:
Data ingestion
1. Ingested data files from multiple sources, such as Excel files, SharePoint sites, and the event website, into Parquet files using Databricks.
2. Extracted and filtered relevant columns and content from 22 different file formats, including CSV, DOC, HTML, JPG, MSG, PDF, PPTX, TXT, XLSX, and ZIP (a sketch of this step follows the list).
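The case study does not include the pipeline code itself; the following is a minimal PySpark sketch of the Databricks ingestion step, with hypothetical mount paths and column names standing in for the real sources.

```python
# Minimal Databricks/PySpark sketch of the ingestion step.
# Paths, source file, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read one raw export (e.g., a sessions CSV pulled from the event website)
sessions_df = (
    spark.read.option("header", True)
    .csv("/mnt/raw/event/sessions.csv")
)

# Keep only the columns the downstream enrichment steps need
sessions_df = sessions_df.select(
    "SessionId", "Title", "Abstract", "Speakers", "Room", "StartTime"
)

# Write to Parquet so later stages can read the data efficiently
sessions_df.write.mode("overwrite").parquet("/mnt/curated/event/sessions")
```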
Data preparation
1. Cleaned data by removing signatures and noise using Python libraries.
2. Chunked file contents into smaller token-sized pieces for OpenAI processing (see the sketch after this list).
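The exact chunking parameters are not stated in the case study; the sketch below assumes tiktoken for token counting, with illustrative chunk-size and overlap values.

```python
# Token-based chunking sketch; chunk size and overlap are illustrative,
# not the project's actual settings.
import tiktoken


def chunk_text(text: str, max_tokens: int = 512, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks that fit the model's context budget."""
    enc = tiktoken.get_encoding("cl100k_base")  # encoding used by GPT-4 Turbo
    tokens = enc.encode(text)
    chunks = []
    start = 0
    while start < len(tokens):
        chunks.append(enc.decode(tokens[start:start + max_tokens]))
        start += max_tokens - overlap
    return chunks


# Hypothetical cleaned file produced by the previous step
chunks = chunk_text(open("cleaned_session_notes.txt").read())
```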
Feature extraction
1. Used OpenAI to extract features from files using prompts.
2. Redacted PII data and detected non-English content.
3. Extracted key phrases, titles, summaries, and Q&A data from the content (a sketch follows this list).
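The prompts themselves are not published; the sketch below shows one plausible shape of prompt-based feature extraction with the Azure OpenAI Python SDK. The deployment name, prompt wording, and output fields are assumptions.

```python
# Hedged sketch of prompt-based feature extraction with the Azure OpenAI SDK.
# Deployment name, prompt wording, and output fields are assumptions.
import json
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)


def extract_features(chunk: str) -> dict:
    """Ask the model for a title, summary, and key phrases for one chunk."""
    response = client.chat.completions.create(
        model="gpt-4-turbo",  # Azure deployment name (assumption)
        messages=[
            {
                "role": "system",
                "content": (
                    "Extract a title, a two-sentence summary, and up to five key "
                    "phrases from the text. Respond as JSON with keys 'title', "
                    "'summary', and 'key_phrases'."
                ),
            },
            {"role": "user", "content": chunk},
        ],
        response_format={"type": "json_object"},
        temperature=0,
    )
    return json.loads(response.choices[0].message.content)
```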
Search index ingestion
1. Ingested extracted features and references into an Azure Search index.
2. Created indexes for sessions, speakers, partners, FAQs, and logistics (sketched below).
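The index schemas are not described in detail; a minimal push-style upload with the azure-search-documents SDK could look like the sketch below, where the index name and field names are assumptions.

```python
# Minimal sketch of pushing enriched records into one of the search indexes.
# The index name and field names are assumptions, not the project's schema.
import os

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(
    endpoint=os.environ["AZURE_SEARCH_ENDPOINT"],
    index_name="sessions-index",  # one index per content type (assumption)
    credential=AzureKeyCredential(os.environ["AZURE_SEARCH_KEY"]),
)

documents = [
    {
        "id": "session-001",
        "title": "Keynote: Opening Session",
        "summary": "Two-sentence summary produced by the enrichment step.",
        "key_phrases": ["keynote", "day 1"],
        "source_url": "https://example.com/sessions/001",
    }
]

results = search_client.upload_documents(documents=documents)
print(all(r.succeeded for r in results))
```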
ML prompt flow and RAG
1. Established a prompt flow to generate responses.
2. Passed user queries through Azure Content Safety for filtering.
3. Searched multiple data indices to find relevant context based on the query.
4. Passed the retrieved context to the OpenAI LLM to generate a response displayed in the web app (a simplified end-to-end sketch follows this list).
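The production logic lives in an Azure ML prompt flow, which is not reproduced here; the sketch below condenses the same RAG steps into plain Python, reusing the AzureOpenAI `client` and the `search_client` from the earlier sketches. The severity threshold, prompt wording, and field names are assumptions.

```python
# Simplified RAG sketch (not the actual prompt flow definition). Reuses the
# AzureOpenAI `client` and the `search_client` from the earlier sketches.
# The severity threshold, prompt wording, and field names are assumptions.
import os

from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeTextOptions
from azure.core.credentials import AzureKeyCredential

safety_client = ContentSafetyClient(
    endpoint=os.environ["CONTENT_SAFETY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["CONTENT_SAFETY_KEY"]),
)


def answer(query: str) -> str:
    # 1. Filter the query with Azure Content Safety
    analysis = safety_client.analyze_text(AnalyzeTextOptions(text=query))
    if any(c.severity and c.severity >= 2 for c in analysis.categories_analysis):
        return "Sorry, I can't help with that request."

    # 2. Retrieve relevant context from the search index
    hits = search_client.search(search_text=query, top=5)
    context = "\n\n".join(hit["summary"] for hit in hits)

    # 3. Generate a grounded response with the retrieved context
    response = client.chat.completions.create(
        model="gpt-4-turbo",  # Azure deployment name (assumption)
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer the attendee's question using only the provided event "
                    "context. If the answer is not in the context, say you don't know."
                ),
            },
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {query}"},
        ],
        temperature=0,
    )
    return response.choices[0].message.content
```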
Copilot web application
1. The Azure web app interacts with the prompt flow through an ML endpoint (see the sketch after this list).
2. A unified platform provides access to all Copilots.
3. User activity and feedback are stored in Application Insights.
4. Additional features include suggested questions, dark mode, time-based searches, and folder-level searches.
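How the web app calls the deployed flow is not shown in the case study; one common pattern is a plain HTTPS call to the managed online endpoint's scoring URI, as in the sketch below. The scoring URI, key, and payload shape are assumptions.

```python
# Hedged sketch of a web backend calling the deployed prompt-flow endpoint.
# The scoring URI, key, and request/response payload shape are assumptions.
import os

import requests


def ask_copilot(question: str, chat_history: list | None = None) -> str:
    response = requests.post(
        os.environ["ML_ENDPOINT_SCORING_URI"],
        headers={
            "Authorization": f"Bearer {os.environ['ML_ENDPOINT_KEY']}",
            "Content-Type": "application/json",
        },
        json={"question": question, "chat_history": chat_history or []},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()["answer"]


print(ask_copilot("Where is the day 1 keynote being held?"))
```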
Solution highlights
1. Robust data ingestion and enrichment: Established pipelines to ingest data from multiple sources and formats, enriched using OpenAI capabilities.
2. Advanced security measures: Implemented entitlement-based access control to safeguard sensitive data.
3. Improved user interface: Features like folder-level search, suggested questions, and chat autocomplete significantly improved the user experience.
4. Responsible AI: Used Azure Content Safety to filter harmful content and support responsible AI practices.
5. Feedback mechanism: Captured user feedback for continuous improvement.
Benefits of the solution
1. Efficient information retrieval: Streamlined access to event information, significantly reducing search time.
2. Copilot capabilities: Assisted attendees with event logistics, session schedules, speaker information, and partner locations.
3. Swift inquiry handling: Immediate responses to high volumes of concurrent user inquiries ensured bot performance and user satisfaction.
4. Up-to-date information: Centralized documentation with incremental pipelines ensured access to current and accurate information.
5. Elevated user experience: UI features such as time intelligence, folder-level search, suggested questions, and chat autocomplete made the copilot easier to use.
The project in numbers
· 2 events covered
· 150K+ questions answered during event days
· 70% reduction in support requests
· 20K+ concurrent users supported
· Over 1M interactions handled throughout the event lifecycle
· 3.8M OpenAI calls and 2.5B tokens processed
· Data processed from 22 different file formats and 15 languages
· 6.8 GB of search context data available
For any further inquiries, contact Sales@MAQSoftware.com to see how copilots powered by Gen AI can transform your business, improve customer satisfaction, and accelerate your delivery.