Code Conversations

Ali Heydari Moghaddam

Code Conversations is a podcast for software developers, engineers, and tech enthusiasts of all levels. Hosted by a seasoned developer with nearly 20 years of experience, each episode dives deep into the world of software development, exploring coding techniques, best practices, industry trends, and the stories behind the code. Whether you're a beginner or a pro, tune in to gain valuable insights, hear from industry experts, and join conversations that will help you stay ahead in the fast-evolving tech world.

  1. Practical Generative AI Applications and LLMs

    18 OCT

    Practical Generative AI Applications and LLMs

    Recent advances in generative AI, exemplified by image models like Stable Diffusion and LLMs like ChatGPT, have created significant industry hype. Generative AI creates new media (such as text or images) by analyzing massive datasets and mimicking the patterns found there, a process driven by probabilistic and stochastic modeling. While models like GPT can produce human-like text, they are language prediction models rather than true reasoning systems (AGI), which is why they often "stumble over facts", produce inconsistent results, struggle with basic tasks like multiplication, and hallucinate. Leveraging these tools effectively requires prompt engineering: the "subtle art" of providing clear, specific instructions, setting a system context or persona, and supplying examples to coax a useful result from the model.

    When integrating AI via the stateless Completions API, developers must maintain conversation state themselves by sending the entire history with each request, often summarizing older messages to manage token costs (a minimal sketch of this client-side history handling follows the episode list). More robust applications can use GPT function calling (tools) to let the model invoke external functions and reach live or proprietary data without expensive model retraining. Alternatively, custom data can be queried in natural language by converting facts into high-dimensional vectors called embeddings and comparing them against user queries with cosine similarity, often stored in a database like Postgres with the pgvector extension (see the embedding sketch after the list). Finally, the newer Assistants API simplifies building domain-specific helpers by managing message history and context compaction automatically, and when it references uploaded knowledge files (such as a lease document) it returns specific references or footnotes showing where each answer was found.

    Ref: https://www.youtube.com/watch?v=OxHw_u45h7M&list=PL03Lrmd9CiGey6VY_mGu_N8uI10FrTtXZ&index=18

    18 min
  2. Next Generation Developer Platforms and Architectural Archetypes

    15 OCT

    Next Generation Developer Platforms and Architectural Archetypes

    Enterprise software development is under immense executive pressure: boards and CEOs demand rapid innovation, especially with AI, to increase productivity, cut costs, and gain competitive advantage, a marked shift from earlier executive disinterest in topics like integration modernization. Developers still face frustrations and delays, including the "age-old disconnect" in which their pursuit of new tools clashes with IT and Security's focus on uptime, reliability, and avoiding security breaches, so months can pass between writing working code and realizing business value. The problem is compounded by environment setup that can take weeks, something the speaker suggests "should be illegal" given the cost of developer time.

    Modern tooling addresses this by standardizing environments. GitHub Codespaces provides an ephemeral, standardized development environment (VS Code in the browser connected to backend compute) defined by a devcontainer.json file that specifies the necessary dependencies, cutting mean time to onboard to minutes (an illustrative devcontainer.json follows the episode list). Complementing it, Dev Box offers full, on-demand virtual machines based on team-defined templates; security teams support them readily because they build on existing tooling (Windows 365 and Intune) and allow customized security profiles that exclude productivity apps, a common attack vector. One customer reduced environment setup time for .NET applications from over two weeks to three hours.

    Delivery is accelerated further by codifying deployable architectural archetypes with the Azure Developer CLI (azd): repositories contain source code alongside infrastructure-as-code (Terraform or Bicep) and CI/CD pipelines, so critical elements like Key Vault and Azure Monitor are "baked in" from the start. This approach engages security, architecture, and infrastructure champions early, turning them into innovation accelerators by incorporating their requirements into the blueprint templates and allowing full infrastructure and applications to be deployed with simple commands like azd up. Ultimately, while tools like these, together with GitHub Copilot for Business (which securely boosts performance by 30% to 50%), help "grease the wheels," successful acceleration depends on getting people and process right, including having security champions on the team and deploying applications to secure, compliant landing zones.

    Ref: https://www.youtube.com/watch?v=mTozV_eV4jQ&list=PL03Lrmd9CiGey6VY_mGu_N8uI10FrTtXZ&index=17

    17 min
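
A minimal sketch of the client-side conversation-state handling described in the first episode, assuming the OpenAI Python SDK's chat completions interface. The model name, system persona, and trimming threshold are illustrative placeholders rather than anything prescribed in the episode.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # The system message sets the persona/context discussed in the episode.
    history = [{"role": "system", "content": "You are a concise assistant for developers."}]

    def ask(user_text: str, max_turns: int = 10) -> str:
        history.append({"role": "user", "content": user_text})

        # The endpoint is stateless, so the whole history travels with every call.
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=history,
        )
        answer = response.choices[0].message.content
        history.append({"role": "assistant", "content": answer})

        # Crude token-cost control: once the history outgrows max_turns exchanges,
        # drop the oldest user/assistant pair. The episode suggests summarizing
        # older messages instead; trimming is the simplest stand-in for that idea.
        if len(history) > 2 * max_turns + 1:
            del history[1:3]  # keep the system message at index 0
        return answer

    print(ask("Why does a stateless API forget my previous question?"))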
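
A similarly hedged sketch of the embeddings approach from the same episode: facts are converted to vectors and ranked against a query by cosine similarity. It assumes the OpenAI embeddings endpoint; the model name and example facts are made up, and in a real system the vectors would typically live in Postgres with the pgvector extension, which can perform the distance comparison in the database.

    import numpy as np
    from openai import OpenAI

    client = OpenAI()

    def embed(text: str) -> np.ndarray:
        # One embedding vector per input string; the model name is a placeholder.
        resp = client.embeddings.create(model="text-embedding-3-small", input=text)
        return np.array(resp.data[0].embedding)

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical "knowledge" to search, echoing the lease example from the episode.
    facts = [
        "Rent is due on the first of every month.",
        "The lease term runs from January 2024 to December 2025.",
    ]
    fact_vectors = [embed(f) for f in facts]

    query = "When do I have to pay rent?"
    query_vector = embed(query)

    # Rank the stored facts by similarity to the query and surface the best match.
    scores = [cosine_similarity(query_vector, v) for v in fact_vectors]
    best = int(np.argmax(scores))
    print(f"Most relevant fact (score {scores[best]:.3f}): {facts[best]}")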
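
For the second episode, an illustrative devcontainer.json of the kind GitHub Codespaces reads to build a standardized environment for a hypothetical .NET service. The image tag, feature, and extension IDs are assumptions, not values taken from the episode; devcontainer.json accepts JSONC, so the comment is permitted.

    {
      // Placeholder values throughout; teams would pin their own image, features, and extensions.
      "name": "dotnet-service",
      "image": "mcr.microsoft.com/devcontainers/dotnet:8.0",
      "features": {
        "ghcr.io/devcontainers/features/azure-cli:1": {}
      },
      "postCreateCommand": "dotnet restore",
      "customizations": {
        "vscode": {
          "extensions": ["ms-dotnettools.csdevkit"]
        }
      }
    }

Pairing a file like this with infrastructure-as-code and CI/CD pipelines in the same repository is the deployable architectural archetype idea the episode describes, provisioned and deployed with commands like azd up.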
