Setting Up WSL 2: My Development Environment Scripts

After a problematic Windows update on my Surface prevented me from compiling .NET applications, I spent days trying various fixes without success. Eventually, I had to format the machine and start fresh, which meant setting up everything again: Visual Studio, testing databases, and all the other development tools.

To make future setups easier, I created a collection of WSL 2 scripts that automate the installation of tools I frequently use, such as PostgreSQL and MySQL for testing. While these scripts contain some practices that wouldn’t be recommended for production (like hardcoded passwords), they are designed specifically for testing environments. The passwords they use are already present in the sync framework source code, so there’s no additional security risk.

I’m sharing these scripts not as a perfect solution, but as a starting point for others who need to set up similar testing environments. You can use them as inspiration for your own scripts or modify the default passwords to match your needs.

Note that these are specifically for testing purposes – particularly for working with the sync framework – and the hardcoded credentials should never be used in a production environment.

https://github.com/egarim/MyWslScripts

LDAP Scripts

MyWslScripts/ldap-setup.sh at master · egarim/MyWslScripts

MyWslScripts/add-ldap-user.sh at master · egarim/MyWslScripts

MySQL

MyWslScripts/install_mysql.sh at master · egarim/MyWslScripts

Postgres

MyWslScripts/install_postgres.sh at master · egarim/MyWslScripts

Redis

MyWslScripts/redis-install.sh at master · egarim/MyWslScripts

Let me know if you’d like me to share the actual scripts in a follow-up post!

Understanding System Abstractions for LLM Integration

I’ve been thinking about this topic for a while and have collected numerous notes and ideas about how to present abstractions that allow large language models (LLMs) to interact with various systems – whether that’s your database, operating system, Word documents, or other applications.

Before diving deeper, let’s review some fundamental concepts:

Key Concepts

First, let’s talk about APIs (Application Programming Interfaces). In simple terms, an API is a way to expose methods, functions, and procedures from your application, independent of the programming language being used.

Next is the REST API concept, which is a method of exposing your API using HTTP verbs. As IT professionals, we hear these terms – HTTP, REST, API – almost daily, but we might not fully grasp their core concepts. Let me explain how they relate to software automation using AI.

HTTP (Hypertext Transfer Protocol) is fundamentally a way for two applications to communicate using text. This is its beauty – text serves as the basic layer of understanding between systems, meaning almost any system or programming language can produce a client or server that can interact via HTTP.

REST (Representational State Transfer) is an architectural style built on top of HTTP: one system reads or changes the state of another by applying HTTP verbs (GET, POST, PUT, DELETE) to resources.
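To make these ideas concrete, here is a minimal C# sketch of a client reading and changing state over HTTP. The base address, resource names, and payload are hypothetical placeholders; the point is simply that everything exchanged is text that almost any language can produce or parse.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json; // PostAsJsonAsync lives here (.NET 5+ or the System.Net.Http.Json package)
using System.Threading.Tasks;

class RestClientSketch
{
    static async Task Main()
    {
        // The base address and resources below are placeholders for illustration.
        using var client = new HttpClient { BaseAddress = new Uri("https://example.com/api/") };

        // Read state: GET returns a text (usually JSON) representation of a resource.
        string customerJson = await client.GetStringAsync("customers/1");
        Console.WriteLine(customerJson);

        // Change state: POST sends a text (JSON) payload that creates a new resource.
        HttpResponseMessage response = await client.PostAsJsonAsync("customers", new { Name = "Ada" });
        Console.WriteLine(response.StatusCode);
    }
}
```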

Levels of System Interaction

When implementing LLMs for system automation, we first need to determine our desired level of interaction. Here are several approaches:

  1. Human-like Interaction: An LLM can interact with your operating system using mouse and keyboard inputs, effectively mimicking human behavior.
  2. REST API Integration: Your application can expose its functionality over HTTP, using REST conventions and verbs.
  3. SDK Implementation: You can create a software development kit that describes your application’s functionality and expose it to the LLM (see the sketch after this list).
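To illustrate the SDK approach, here is a hedged sketch of what such a description layer could look like in C#. The interface and method names are hypothetical; the important part is that each operation carries a natural-language description that an LLM (or the framework sitting between it and your code) can use to decide when and how to call it.

```csharp
using System;
using System.ComponentModel;

// Hypothetical SDK surface for an invoicing application.
// The [Description] attributes are the natural-language contract that
// an LLM-facing framework can hand to the model as tool definitions.
public interface IInvoicingSdk
{
    [Description("Creates a draft invoice for the given customer and returns its id.")]
    Guid CreateDraftInvoice(string customerName);

    [Description("Adds a product line with a quantity to an existing invoice.")]
    void AddLineItem(Guid invoiceId, string product, int quantity);

    [Description("Marks the invoice as sent and returns its total amount.")]
    decimal SendInvoice(Guid invoiceId);
}
```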

The connection method will vary depending on your chosen technology. For instance:

  • Microsoft Semantic Kernel allows you to create plugins that interact with your system through a REST API, a database, or an SDK (a plugin sketch follows this list).
  • Microsoft AI extensions require you to decide on your preferred interaction level before implementation.
  • The Model Context Protocol is a newer approach for exposing applications to LLM agents; Anthropic’s Claude is a notable adopter.
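As a concrete example of the Semantic Kernel option, here is a sketch of a plugin class, assuming the Microsoft.SemanticKernel 1.x NuGet package. The business methods are placeholders that would normally call into your database, REST API, or SDK, and the registration code is left as comments because model ids and keys depend on your setup.

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

// Sketch of a Semantic Kernel plugin. The method bodies are placeholders;
// in a real system they would call into your database, REST API, or SDK.
public class InvoicingPlugin
{
    [KernelFunction, Description("Creates a draft invoice for the given customer and returns its id.")]
    public string CreateDraftInvoice(string customerName)
        => $"draft-invoice-for-{customerName}"; // placeholder for a real call into your system

    [KernelFunction, Description("Returns the total amount of an invoice.")]
    public decimal GetInvoiceTotal(string invoiceId)
        => 0m; // placeholder for a database or REST call
}

// Registration (sketch, shown as comments because the model id and key are setup-specific):
// var builder = Kernel.CreateBuilder();
// builder.AddOpenAIChatCompletion("gpt-4o", apiKey);
// builder.Plugins.AddFromType<InvoicingPlugin>();
// Kernel kernel = builder.Build();
```

Once registered, the kernel advertises these functions to the model as callable tools, which is the same idea as the SDK description layer shown earlier, just wired into a specific framework.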

Implementation Considerations

When automating your system, you need to consider:

  1. Available Integration Options: Not all systems provide an SDK or API, which can limit automation possibilities.
  2. Interaction Protocol Choice: You’ll need to choose among a REST API, plain HTTP, or the Model Context Protocol.

This overview should help you understand the various levels of resolution needed to automate your application. What’s your preferred method for integrating LLMs with your applications? I’d love to hear your thoughts and experiences.

Bridging Traditional Development using XAF and AI: Training Sessions in Cairo

I recently had the privilege of conducting a training session in Cairo, Egypt, focusing on modern application development approaches. The session covered two key areas that are transforming how we build business applications: application frameworks and AI integration.

Streamlining Development with Application Frameworks

One of the highlights was demonstrating DevExpress’s eXpressApp Framework (XAF). The students were particularly impressed by how quickly we could build fully-functional Line of Business (LOB) applications. XAF’s approach eliminates much of the repetitive coding typically associated with business application development:

  • Automatic CRUD operations
  • Built-in security system
  • Consistent UI across different platforms
  • Rapid prototyping capabilities
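To give a sense of what “automatic CRUD operations” means in practice, here is a minimal sketch of an XAF business object, assuming an XPO-based XAF solution; the Customer class and its properties are placeholders. Registering the class with [DefaultClassOptions] is enough for XAF to generate list views, detail views, navigation, and editing UI for it.

```csharp
using DevExpress.Persistent.Base;
using DevExpress.Persistent.BaseImpl;
using DevExpress.Xpo;

// With [DefaultClassOptions], XAF generates list and detail views,
// navigation items, and full CRUD editing for this class in the
// UI projects of the solution.
[DefaultClassOptions]
public class Customer : BaseObject
{
    public Customer(Session session) : base(session) { }

    string name;
    public string Name
    {
        get => name;
        set => SetPropertyValue(nameof(Name), ref name, value);
    }

    string email;
    public string Email
    {
        get => email;
        set => SetPropertyValue(nameof(Email), ref email, value);
    }
}
```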

Seamless Integration: XAF Meets Microsoft Semantic Kernel

What made this training unique was demonstrating how XAF’s capabilities extend into AI territory. We built the entire AI interface using XAF itself, showcasing how a traditional LOB framework can seamlessly incorporate advanced AI features. The audience, coming primarily from JavaScript backgrounds with Angular and React experience, was particularly impressed by how this approach simplified the integration of AI into business applications.

During the sessions, we explored practical implementations built with Microsoft Semantic Kernel. The students were fascinated by demonstrations of:

  • Natural language processing for document analysis
  • Automated content generation for business documentation
  • Intelligent decision support systems
  • Context-aware data processing

Student Engagement and Outcomes

The response from the students was overwhelmingly positive. As experienced frontend developers working with Angular and React, they were initially skeptical about a different approach to application development. However, their enthusiasm peaked when they saw how these technologies could solve real business challenges they face daily. The combination of XAF’s rapid development capabilities and Semantic Kernel’s AI features, all integrated into a cohesive development experience, opened their eyes to new possibilities in application development.

Looking Forward

This training session in Cairo demonstrated the growing appetite for modern development approaches in the region. The intersection of efficient application frameworks and AI capabilities is proving to be a powerful combination for next-generation business applications.

And last, but not least, some pictures!