442 episodes

Cognilytica's AI Today podcast focuses on relevant information about what's going on today in the world of artificial intelligence. Hosts Kathleen Walch and Ron Schmelzer discuss pressing topics around artificial intelligence with easy-to-digest content, interview guests and experts on the subject, and cut through the hype and noise to identify what is really happening with the adoption and implementation of AI.

AI Today Podcast: Artificial Intelligence Insights, Experts, and Opinion
AI & Data Today

    • Technology

    Prompt Engineering Best Practices: Soft Skills

    Generative AI is one of the most accessible forms of AI currently available. While in the past you might have used AI without knowing it, you can now use Generative AI purposefully in ways that have an immediate and dramatic impact on your daily life. In this episode of AI Today, hosts Kathleen Walch and Ron Schmelzer discuss which soft skills are necessary to get what we want out of Generative AI.








    Why are soft skills important in the age of AI?







    The great thing about Generative AI is that it doesn't require “hard skills”. Hard skills include programming, math, analytics, database and data engineering skills, and anything else that requires specific education and years of training. You don't need to be an expert in math topics such as statistics and probability, calculus, or linear algebra to get value from using Generative AI. You also don't need to be skilled in using different data visualization tools, nor do you need knowledge of different algorithms and modeling techniques.








    Rather, we can use our existing “soft skills” of communication, planning, creativity, and critical thinking to get what we want out of Generative AI. It's no surprise that soft skills are incredibly important for effective prompting, since hard skills aren't used at all when creating prompts for Generative AI systems. In this episode we go into detail on why these soft skills are so important and how they help you get better at prompt engineering.








    Show Notes:

    • Free Intro to CPMAI course
    • CPMAI Certification
    • Subscribe to Cognilytica newsletter on LinkedIn
    • Properly Scoping AI Projects [AI Today Podcast]
    • Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]
    • Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast]
    • Prompt Engineering Best Practices: Hack and Track [AI Today Podcast]

    • 10 min
    AI’s Impact on Project Management: Interview with Saby Waraich [AI Today Podcast]

    AI, and in particular generative AI, is having a profound impact on just about every industry. In this episode of AI Today, hosts Kathleen Walch and Ron Schmelzer interview Saby Waraich to discuss AI's impact on project management. Saby is CIO at Clackamas Community College and is speaking at the PMI Austin, TX Professional Development Day on May 2, 2024.








    What impact will artificial intelligence (AI) have on the field of project management?







    Saby shares with us how he is seeing Generative AI tools change the role of project management. He also discusses why soft skills are so important and which soft skills will become most important for PMs in a generative AI world. As project managers continue to use GenAI and become more comfortable creating and iterating on prompts, the benefits will continue to increase.








    How can a project manager use GenAI to improve power skills?







    PMs already have many soft skills they leverage on a daily basis. Saby shares how PMs can apply skills such as communication, critical thinking, adaptability, and attention to detail to build conversational skills with AI. He shares firsthand examples of prompts he has used and how he is getting creative with his prompts and requests to LLMs. He also discusses why trust is so important. After all, if you don't trust your AI tool, you probably won't want to use it.








    Show Notes:

    • PMI Austin, TX Chapter
    • Free Intro to CPMAI course
    • CPMAI Certification
    • Subscribe to Cognilytica newsletter on LinkedIn
    • Properly Scoping AI Projects [AI Today Podcast]
    • Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]
    • Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast]

    • 18 min
    AI’s Impact on Communication skills: Interview with Patti DeNucci

    Effective communication is an important skill to have, and in the AI era it's more important than ever. In this episode of AI Today, hosts Kathleen Walch and Ron Schmelzer interview Patti DeNucci, an author, speaker, workshop facilitator, and consultant who is keynoting the PMI Austin, TX Professional Development Day on May 2, 2024.








    How does AI impact communication?







    Generative AI tools are having a profound impact on the way people work, write, and communicate. In this episode Patti discusses how GenAI tools are changing the way people communicate, as well as ways AI can improve that communication. Sometimes you can't find the right words for an email or a note to a friend. With the help of GenAI, you can now get the conversation started, or fully written, with just a few sentences in your prompt.








    Patti also touches on how AI is changing the project management profession and role. Project managers can leverage their soft skills of communication, critical thinking, adaptability, and attention to detail to build conversational skills with AI.








    Show Notes:

    • PMI Austin, TX Chapter
    • Free Intro to CPMAI course
    • CPMAI Certification
    • Subscribe to Cognilytica newsletter on LinkedIn
    • Properly Scoping AI Projects [AI Today Podcast]
    • Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]
    • Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast]

    • 13 min
    Prompt Engineering Best Practices: Hack and Track

    Experimenting with, testing, and refining your prompts is essential. The journey to crafting the perfect prompt often involves trying various strategies to discover what works best for your specific needs. A best practice is to constantly experiment, practice, and try new things using an approach called “hack and track”, where you use a spreadsheet or other method to track which prompts work well as you experiment. In this episode of AI Today, hosts Kathleen Walch and Ron Schmelzer discuss hack and track in detail.








    Keeping track of prompts







    It's rare to get the desired response on your first attempt. An iterative process of testing different prompts, analyzing the responses, and then tweaking your approach allows you to gradually hone your technique. Another challenge is that LLMs are constantly evolving: their performance is very much domain- and task-dependent, and it will change over time. A current prompting best practice is to use a spreadsheet or other method to track which prompts work well as you experiment.








    How to set up your Hack and Track Spreadsheet







    Keeping track of which prompts work best for you in which situations, including which LLMs are giving you the best results at the time, can be incredibly helpful for your colleagues as well. There are many LLMs, and at any particular time one LLM may perform better than another in a given situation. Without keeping track of the prompts you've written and tested, it's hard for others to try those prompts themselves.








    When creating a spreadsheet to keep track of prompts, the details matter. Every spreadsheet may be set up a little differently, but you'll want to include some essentials. Criteria you can use when setting up your hack and track sheet include the name of the task or query, the prompt pattern(s) used, the LLM used, the date the prompt was last used, the prompt chaining approach used (if any), and perhaps the person or group that created the prompt.
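
    To make this concrete, below is a minimal sketch of what a hack and track sheet could look like if you maintain it in code rather than a spreadsheet. It is only an illustration: the column names mirror the criteria above, while the file name, helper function, and example values are hypothetical rather than anything prescribed in the episode.

```python
import csv
from datetime import date
from pathlib import Path

# Columns mirroring the hack-and-track criteria discussed above.
FIELDS = [
    "task_or_query",    # name of the task or query
    "prompt_pattern",   # prompt pattern(s) used
    "llm_used",         # which LLM produced the result
    "date_last_used",   # when the prompt was last tried
    "prompt_chaining",  # chaining approach, if any
    "created_by",       # person or group that created the prompt
    "prompt_text",      # the prompt itself
    "notes",            # how well it worked
]

TRACKER = Path("hack_and_track.csv")  # hypothetical file name


def log_prompt(**row: str) -> None:
    """Append one prompt experiment to the tracking sheet."""
    new_file = not TRACKER.exists()
    with TRACKER.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow({field: row.get(field, "") for field in FIELDS})


log_prompt(
    task_or_query="Summarize weekly status reports",
    prompt_pattern="persona + output template",
    llm_used="example-llm-v1",  # placeholder model name
    date_last_used=str(date.today()),
    prompt_chaining="none",
    created_by="PM team",
    prompt_text="You are a project coordinator. Summarize ...",
    notes="Good structure; dates sometimes wrong",
)
```

    A shared spreadsheet works just as well; the point is simply to record each experiment in a consistent, comparable way so colleagues can reuse what works.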








    Kathleen and Ron discuss their own experiences with hack and track in this episode and why learning from others is so critical. Seeing how others write prompts helps you get creative and think of ways to use LLMs you may never have thought of. It also lets you see how others at your organization are writing prompts and the results they are getting.








    Show Notes:

    • Free Intro to CPMAI course
    • CPMAI Certification
    • Subscribe to Cognilytica newsletter on LinkedIn
    • Properly Scoping AI Projects [AI Today Podcast]
    • Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]
    • Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast]

    • 9 min
    Prompt Engineering Best Practices: Using Plugins

    Plugins for Large Language Models (LLMs) are additional tools or extensions that enhance an LLM's capabilities beyond its base functions. In this episode, hosts Kathleen Walch and Ron Schmelzer discuss this topic in greater detail.








    Can I use plugins with ChatGPT?







    Plugins can access external databases, perform specific computations, or interact with other software and APIs to fetch real-time data, execute code, and more. In essence, they significantly expand the utility of LLMs, making them more versatile and effective tools for a wide range of applications. They bridge the gap between the static knowledge of a trained model and the dynamic, ever-changing information and capabilities of the external world. Plugins can be used on many different LLMs.
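
    As a rough illustration of that bridge, the sketch below shows a toy plugin registry: the model itself only produces text, so the host application registers external tools and runs one when the model asks for it, then feeds the result back into the conversation. The names and interface here are invented for illustration; real plugin and tool-calling APIs vary by vendor.

```python
from typing import Callable, Dict

# Illustrative only: a host application keeps a registry of "plugins" the
# LLM is allowed to request. The LLM never runs this code itself; it asks
# for a tool by name and the application executes it.
PLUGINS: Dict[str, Callable[[str], str]] = {}


def plugin(name: str):
    """Register a function as a plugin the model may request."""
    def register(fn: Callable[[str], str]) -> Callable[[str], str]:
        PLUGINS[name] = fn
        return fn
    return register


@plugin("stock_price")
def stock_price(symbol: str) -> str:
    # A real plugin would call a market-data API here, which is exactly
    # the kind of real-time information the base model lacks.
    return f"{symbol}: 123.45 (placeholder quote)"


def handle_model_request(tool_name: str, argument: str) -> str:
    """Run the plugin the model requested; the output would then be
    added back into the model's context for its final answer."""
    if tool_name not in PLUGINS:
        return f"Unknown plugin: {tool_name}"
    return PLUGINS[tool_name](argument)


# Pretend the model responded with a structured request like this one.
print(handle_model_request("stock_price", "ACME"))
```

    The same pattern underlies database lookups, code execution, and the other capabilities mentioned above.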








    Why use plugins?







    People use plugins for a variety of reasons. They provide access to real-time information by pulling up-to-date information from the web or other data sources. They can also perform specialized tasks like solving complex mathematical problems, generating code, or providing translations with expertise that might not be fully developed in the base model. Plugins also enable LLMs to interact with other applications and services, allowing for dynamic content generation, automation of tasks, and enhanced user interactions. They also allow for customization and personalization, as well as improved performance and efficiency. In the episode we discuss all of this in greater detail.








    Show Notes:

    • Free Intro to CPMAI course
    • CPMAI Certification
    • Subscribe to Cognilytica newsletter on LinkedIn
    • Properly Scoping AI Projects [AI Today Podcast]
    • Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]
    • Prompt Engineering Best Practices: Using Custom Instructions [AI Today Podcast]

    • 7 min
    Prompt Engineering Best Practices: Using Custom Instructions

    As folks continue to use LLMs, best practices are emerging to help users get the most out of them. OpenAI's ChatGPT allows users to tailor responses to match their tone and desired output goals through custom instructions. Many have reported that using custom instructions results in much more accurate, precise, consistent, and predictable results. But why would you want to do this and why does it matter? In this episode, hosts Kathleen Walch and Ron Schmelzer discuss why this is a best practice.








    What are custom instructions in ChatGPT?







    In ChatGPT, custom instructions are provided by answering two questions in settings; your answers are then sent along with every prompt:

    • What would you like ChatGPT to know about you to provide better responses?
    • How would you like ChatGPT to respond?

    It's important to note that once created, these instructions apply to all future chat sessions (not previous or existing ones). This allows you to make somewhat permanent settings that don't have to be constantly reset. Custom instructions are short and generally limited to about 1,500 characters, so keep them precise and concise.
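
    For those calling a chat-style API rather than using the ChatGPT interface, a similar effect can be approximated by reusing a fixed system message built from the same two questions. The sketch below is only illustrative, not how ChatGPT implements custom instructions internally; the instruction text and the character check are assumptions for demonstration.

```python
# Illustrative sketch: approximate ChatGPT-style custom instructions when
# calling a chat API directly by prepending a reusable system message.
CHAR_LIMIT = 1500  # mirrors the roughly 1,500-character guidance per answer

about_me = (
    "I manage software projects for a mid-size company. "
    "I am not a programmer and prefer plain-language explanations."
)
response_style = (
    "Answer in short bullet points, state any assumptions you make, "
    "and keep a professional but friendly tone."
)

for label, text in (("about me", about_me), ("response style", response_style)):
    if len(text) > CHAR_LIMIT:
        raise ValueError(f"Custom instruction '{label}' exceeds {CHAR_LIMIT} characters")

# Because these answers ride along with every prompt, they only need to be
# written once instead of being repeated in each new chat.
messages = [
    {
        "role": "system",
        "content": f"About the user: {about_me}\nHow to respond: {response_style}",
    },
    {"role": "user", "content": "Draft a status update for this week's sprint."},
]

# `messages` would then be passed to whichever chat completion API you use.
print(messages[0]["content"])
```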








    Show Notes:

    • Free Intro to CPMAI course
    • CPMAI Certification
    • Subscribe to Cognilytica newsletter on LinkedIn
    • Properly Scoping AI Projects [AI Today Podcast]
    • Prompt Engineering Best Practices: What is Prompt Chaining? [AI Today Podcast]
    • AI Today Podcast: AI Glossary Series – OpenAI, GPT, DALL-E, Stable Diffusion
    • AI Today Podcast: AI Glossary Series – Tokenization and Vectorization

    • 15 min
