by Joche Ojeda | Oct 8, 2024 | A.I, Blazor, Semantic Kernel
Are you excited to bring powerful AI chat completions to your web application? I sure am! In this post, we’ll walk through how to integrate the DevExpress Chat component with the Semantic Kernel using OpenAI. This combination can make your app more interactive and intelligent, and it’s surprisingly simple to set up. Let’s dive in!
Step 1: Adding NuGet Packages
First, let’s ensure we have all the necessary packages. Open your DevExpress.AI.Samples.Blazor.csproj file and add the following NuGet references:
<ItemGroup>
<PackageReference Include="Microsoft.KernelMemory.Abstractions" Version="0.78.241007.1" />
<PackageReference Include="Microsoft.KernelMemory.Core" Version="0.78.241007.1" />
<PackageReference Include="Microsoft.SemanticKernel" Version="1.21.1" />
</ItemGroup>
This will bring in the core components of Semantic Kernel to power your chat completions.
Step 2: Setting Up Your Kernel in Program.cs
Next, we’ll configure the Semantic Kernel and OpenAI integration. Add the following code to your Program.cs to create the kernel and set up the chat completion service:
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using OpenAI;

// Create your OpenAI client
string openAiKey = Environment.GetEnvironmentVariable("OpenAiTestKey");
var client = new OpenAIClient(new System.ClientModel.ApiKeyCredential(openAiKey));

// Add Semantic Kernel and register the OpenAI chat completion connector
var kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.AddOpenAIChatCompletion("gpt-4o", client);
var kernel = kernelBuilder.Build();

// Expose the chat completion service through dependency injection
var chatService = kernel.GetRequiredService<IChatCompletionService>();
builder.Services.AddSingleton<IChatCompletionService>(chatService);
This step is crucial because it connects your app to OpenAI via the Semantic Kernel and sets up the chat completion service that will drive the AI responses in your chat.
Step 3: Creating the Chat Component
Now that we’ve got our services ready, it’s time to set up the chat component. We’ll define the chat interface in our Razor page. Here’s how you can do that:
Razor Section:
@page "/sk"
@using DevExpress.AIIntegration.Blazor.Chat
@using AIIntegration.Services.Chat;
@using Microsoft.SemanticKernel.ChatCompletion
@using System.Diagnostics
@using System.Text.Json
@using System.Text
@inject IChatCompletionService chatCompletionsService;
@inject IJSRuntime JSRuntime;
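The directives above only wire up the page; the DxAIChat markup itself isn’t reproduced in this snippet, so here is a minimal sketch of what it could look like (the CssClass value is an illustrative assumption, but the MessageSent callback matches the handler defined in the code section below):

<DxAIChat CssClass="my-chat" MessageSent="MessageSent" />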
This UI will render a clean chat interface using DevExpress’s DxAIChat component, which is connected to our Semantic Kernel chat completion service.
Code Section:
Now, let’s handle the interaction logic. Here’s the code that powers the chat backend:
@code {
    ChatHistory chatHistory = new ChatHistory();

    async Task MessageSent(MessageSentEventArgs args)
    {
        // Add the user's message to the chat history
        chatHistory.AddUserMessage(args.Content);

        // Get a response from the chat completion service
        var result = await chatCompletionsService.GetChatMessageContentAsync(chatHistory);

        // Extract the response text
        string messageContent = result.Content ?? string.Empty;
        Debug.WriteLine("Message from chat completion service: " + messageContent);

        // Add the assistant's message to the history
        chatHistory.AddAssistantMessage(messageContent);

        // Send the response to the UI
        var message = new Message(MessageRole.Assistant, messageContent);
        args.SendMessage(message);
    }
}
With this in place, every time the user sends a message, the chat completion service will process the conversation history and generate a response from OpenAI. The result is then displayed in the chat window.
Step 4: Run Your Project
Before running the project, ensure that the correct environment variable for the OpenAI key is set (OpenAiTestKey). This key is necessary for the integration to communicate with OpenAI’s API.
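If you want the application to fail fast when the key is missing, a small guard like this (a sketch, not part of the original sample) can replace the plain Environment.GetEnvironmentVariable call in Program.cs:

// Optional guard (not in the original sample): fail fast if the key is missing
string openAiKey = Environment.GetEnvironmentVariable("OpenAiTestKey")
    ?? throw new InvalidOperationException(
        "Set the OpenAiTestKey environment variable to your OpenAI API key before running.");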
Now, you’re ready to test! Simply run your project and navigate to https://localhost:58108/sk. Voilà! You’ll see a beautiful, AI-powered chat interface waiting for your input. 🎉
Conclusion
And that’s it! You’ve successfully integrated the DevExpress Chat component with the Semantic Kernel for AI-powered chat completions. Now, you can take your user interaction to the next level with intelligent, context-aware responses. The possibilities are endless with this integration—whether you’re building a customer support chatbot, a productivity assistant, or something entirely new.
Let me know how your integration goes, and feel free to share what cool things you build with this!
You can find the full implementation on GitHub.
by Joche Ojeda | Sep 26, 2024 | A.I, DevExpress, Semantic Kernel
If you’re a Blazor developer looking to integrate AI-powered chat functionality into your applications, the new DevExpress DxAIChat component offers a turnkey solution. It’s designed to make building chat interfaces as easy as possible, with out-of-the-box support for simple chats, virtual assistants, and even Retrieval-Augmented Generation (RAG) scenarios.
The best part? You don’t have to start from scratch—DevExpress provides a range of pre-built examples, making it easy to get started and customize to your needs. Whether you’re aiming for a basic chatbot or a more complex AI assistant, this component has you covered.
To use the examples, you can pick any OpenAI-compatible service, such as Ollama, OpenAI, or Azure OpenAI. The current DevExpress example uses Azure, like this:
using DevExpress.AIIntegration;
...
string azureOpenAIEndpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
string azureOpenAIKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string deploymentName = "YourModelDeploymentName";
...
builder.Services.AddDevExpressBlazor();
builder.Services.AddDevExpressAI((config) => {
    config.RegisterChatClientOpenAIService(
        new AzureOpenAIClient(
            new Uri(azureOpenAIEndpoint),
            new AzureKeyCredential(azureOpenAIKey)
        ), deploymentName);
    // or call the following method to use self-hosted Ollama models
    //config.RegisterChatClientOllamaAIService("http://localhost:11434/api/chat", "llama3.1");
});
I tested with the OpenAI API instead of Azure, so my code looks like this:
string azureOpenAIEndpoint = Environment.GetEnvironmentVariable("AZURE_OPENAI_ENDPOINT");
string azureOpenAIKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY");
string OpenAiKey = Environment.GetEnvironmentVariable("OpenAiTestKey");
builder.Services.AddDevExpressBlazor();
builder.Services.AddDevExpressAI((config) => {
    //var client = new AzureOpenAIClient(
    //    new Uri(azureOpenAIEndpoint),
    //    new AzureKeyCredential(azureOpenAIKey));

    // OpenAI model IDs differ slightly from Azure deployment names, e.g. Azure=gpt4o, OpenAI=gpt-4o
    var client = new OpenAIClient(new System.ClientModel.ApiKeyCredential(OpenAiKey));
    config.RegisterChatClientOpenAIService(client, "gpt-4o");
    config.RegisterOpenAIAssistants(client, "gpt-4o");
});
Notice that the model IDs in Azure and OpenAI are different:
- Azure=gpt4o
- OpenAI=gpt-4o
These are the URLs for the different examples:
- Chat : https://localhost:53340/
- Assistant/RAG: https://localhost:53340/assistant
- Streaming: https://localhost:53340/streaming
I’m really happy that the DevExpress components do all the heavy lifting and boilerplate for us. I have built the same scenarios using Semantic Kernel, and even though there isn’t much code to write, you still face the challenge of developing a responsive UI.
For more information and to see the examples in action, check out the full article.
by Joche Ojeda | Sep 4, 2024 | A.I, Semantic Kernel
In the world of AI and large language models (LLMs), understanding how to manage memory is crucial for creating applications that feel responsive and intelligent. Many developers are turning to Semantic Kernel, a lightweight and open-source development kit, to integrate these capabilities into their applications. For those already familiar with Semantic Kernel, let’s dive into how memory functions within this framework, especially when interacting with LLMs via chat completions.
Chat Completions: The Most Common Interaction with LLMs
When it comes to interacting with LLMs, one of the most intuitive and widely used methods is through chat completions. This allows developers to simulate a conversation between a user and an AI agent, facilitating various use cases like building chatbots, automating business processes, or even generating code.
In Semantic Kernel, chat completions are implemented through models from popular providers like OpenAI, Google, and others. These models enable developers to manage the flow of conversation seamlessly. While using chat completions, one key aspect to keep in mind is how the conversation history is stored and managed.
Temporary Memory: ChatHistory and Kernel String Arguments
Within the Semantic Kernel framework, the memory that a chat completion model uses is managed by the ChatHistory object. This object stores the conversation history temporarily, meaning it captures the back-and-forth between the user and the model during an active session. Alternatively, you can use a string argument passed to the kernel, which contains context information for the conversation. However, like the ChatHistory, this method is also not persistent.
Once the host class is disposed of, all stored context and memory from both the ChatHistory object and the string argument are lost. This transient nature of memory means that these methods are useful only for short-term interactions and are destroyed after the session ends.
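To make the two approaches concrete, here is a minimal sketch. It assumes a kernel and chat completion service configured as in the earlier posts (the kernel and chatService variables, the prompt text, and the argument names are illustrative only):

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Option 1: temporary memory via ChatHistory (lives only for this session)
var history = new ChatHistory();
history.AddSystemMessage("You are a helpful assistant.");
history.AddUserMessage("What is Semantic Kernel?");
var reply = await chatService.GetChatMessageContentAsync(history);
history.AddAssistantMessage(reply.Content ?? string.Empty);

// Option 2: temporary memory via a kernel string argument (context travels with the call)
var arguments = new KernelArguments
{
    ["context"] = "The user prefers short answers.",
    ["input"] = "What is Semantic Kernel?"
};
var result = await kernel.InvokePromptAsync(
    "Considering that {{$context}}, answer: {{$input}}", arguments);

// Neither approach persists anything: once the object holding `history` or `arguments`
// is disposed, the conversation context is gone.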
What’s Next? Exploring Long-Term Memory Options
In this article, we’ve discussed how Semantic Kernel manages short-term memory with ChatHistory and kernel string arguments. However, for more complex applications that require retaining memory over longer periods—think customer support agents or business process automation—temporary memory might not be sufficient. In the next article, we’ll explore the options available for implementing long-term memory within Semantic Kernel, providing insights on how to make your AI applications even more powerful and context-aware.
Stay tuned for the deep dive into long-term memory solutions!
by Joche Ojeda | Jul 3, 2024 | A.I, Blockchain, El Salvador
El Salvador, my birthplace, has recently emerged as a focal point for technological innovation under the leadership of President Nayib Bukele. Born in Suchitoto during the civil war and now living as a digital nomad in Saint Petersburg, Russia, I have witnessed El Salvador’s transformation from a distance and feel compelled to share its story. This article is the first in a series exploring how blockchain technology, financial services, and artificial intelligence (AI) can help a small country like El Salvador grow.
Historical Context and Economic Challenges
El Salvador has faced significant economic challenges over the past few decades, including poverty, gang violence, and a heavy reliance on remittances from abroad. The economy has traditionally been rooted in agriculture, with coffee and sugar being key exports. However, President Bukele, who took office on June 1, 2019, has sought to address these challenges by diversifying the economy and embracing technology as a key driver of growth.
Bukele’s Vision for Economic Transformation
President Bukele’s administration has prioritized technological innovation as a catalyst for economic transformation. His vision is to modernize the country’s infrastructure and position El Salvador as a hub for technological innovation in Latin America. This vision includes the strategic shift from an agriculture-based economy to one focused on technology, financial services, and tourism. The goal is to create a more resilient and diverse economic base that can sustain long-term growth and development.
The Adoption of Bitcoin as Legal Tender
One of the most groundbreaking moves by Bukele’s administration was the introduction of the Bitcoin Law, passed by the Legislative Assembly on June 9, 2021. This law made Bitcoin legal tender alongside the US dollar, which had been the country’s official currency since 2001. The rationale behind this decision was multifaceted:
- Financial Inclusion: With a significant portion of the population lacking access to traditional banking services, Bitcoin offers an alternative means of financial inclusion.
- Reduction in Remittance Costs: Remittances make up a substantial part of El Salvador’s economy. Bitcoin’s adoption aims to reduce the high transaction fees associated with remittance services.
- Economic Innovation: By adopting Bitcoin, El Salvador aims to attract foreign investment and position itself as a leader in cryptocurrency and blockchain technology.
The implementation of Bitcoin involved launching the Chivo Wallet, a state-sponsored digital wallet designed to facilitate Bitcoin transactions. The government also incentivized adoption by offering $30 worth of Bitcoin to citizens who registered for the wallet.
Initial Reactions and Impact
The reaction to the Bitcoin Law was mixed. While some praised the move as innovative and forward-thinking, others raised concerns about the volatility of Bitcoin and its potential impact on the economy. Despite these concerns, the Bukele administration has remained committed to its Bitcoin strategy, continuing to invest in Bitcoin and integrate it into the national economy.
Digital Transformation Initiatives
In addition to Bitcoin adoption, El Salvador has partnered with global tech giants like Google to enhance its digital infrastructure. These partnerships aim to modernize government services, improve healthcare through telemedicine platforms, and revolutionize education by integrating AI-driven tools. For instance, Google’s collaboration with the Salvadoran government includes training government agencies on cloud technologies and developing platforms that allow interoperability between institutions.
Strategic Shift to Technology, Financial Services, and Tourism
President Bukele’s broader economic strategy involves shifting El Salvador’s economic focus from traditional agriculture to more dynamic and sustainable sectors like technology, financial services, and tourism. This shift aims to create high-value jobs, attract foreign investment, and build a more diversified economy.
- Technology: By investing in digital infrastructure and fostering a favorable environment for tech startups, El Salvador aims to become a regional tech hub.
- Financial Services: The adoption of Bitcoin and other fintech innovations is intended to transform the financial landscape, making it more inclusive and efficient.
- Tourism: Enhancing the country’s tourism sector, with initiatives to promote its natural beauty and cultural heritage, is another key pillar of Bukele’s economic strategy.
Conclusion and Future Prospects
El Salvador’s journey towards becoming a technological leader in Latin America is a testament to the transformative power of visionary leadership and innovative policies. Under President Bukele, the country has taken bold steps to embrace technology, from adopting Bitcoin to integrating AI into public services. This series of articles will delve deeper into these initiatives, exploring their impact, challenges, and the future prospects for El Salvador in the global technological landscape.
By understanding El Salvador’s technological revolution, we can gain insights into the potential for other nations to leverage technology for economic and social development. The next article in this series will focus on the detailed implementation of Bitcoin as legal tender, examining the steps taken by the Bukele administration and the outcomes observed so far.
This introductory article sets the stage for a comprehensive exploration of El Salvador’s technological transformation under President Bukele. The subsequent articles will provide in-depth analyses and propose potential AI legislation to ensure the country’s continued leadership in technology within Latin America.
by Joche Ojeda | May 22, 2024 | A.I
A New Era of Computing: AI-Powered Devices Over Form Factor Innovations
In a recent Microsoft event, the spotlight was on a transformative innovation that highlights the power of AI over the constant pursuit of new device form factors. The unveiling of the new Surface computer, equipped with a Neural Processing Unit (NPU), demonstrates that enhancing existing devices with AI capabilities is more impactful than creating entirely new device types.
The Microsoft Event: Revolutionizing with AI
Microsoft showcased the new Surface computer, integrating an NPU that enhances performance by enabling real-time processing of AI algorithms on the device. This approach allows for advanced capabilities like enhanced voice recognition, real-time language translation, and sophisticated image processing, without relying on cloud services.
Why AI Integration Trumps New Form Factors
For years, the tech industry has focused on new device types, from tablets to foldable screens, often addressing problems that didn’t exist. However, the true advancement lies in making existing devices smarter. AI integration offers:
- Enhanced Productivity: Automating repetitive tasks and providing intelligent suggestions, allowing users to focus on more complex and creative work.
- Personalized Experience: Devices learn and adapt to user preferences, offering a highly customized experience.
- Advanced Capabilities: NPUs enable local processing of complex AI models, reducing latency and dependency on the cloud.
- Seamless Integration: AI creates a cohesive and efficient workflow across various applications and services.
Comparing to Humane Pin and Rabbit AI Devices
While devices like the Humane Pin and Rabbit AI offer innovative new form factors, they often rely heavily on cloud connectivity for AI functions. In contrast, the Surface’s NPU allows for faster, more secure local processing. This means tasks are completed quicker and more securely, as data doesn’t need to be sent to the cloud.
Conclusion: Embracing AI-Driven Innovation
Microsoft’s AI-enhanced Surface computer signifies a shift towards intelligent augmentation rather than just physical redesign. By embedding AI within existing devices, we unlock new potentials for efficiency, personalization, and functionality, setting a new standard for future tech innovations. This approach not only makes interactions with technology smarter and more intuitive but also emphasizes the importance of on-device processing power for a faster and more secure user experience.
For more information and to pre-order the new Surface laptops, visit Microsoft’s official store.