This course is a practical, systematic guide to working with large language models for engineers, architects, and technical leaders who want not just to use LLMs but to understand how they work and how to build solutions for their own projects.
You will learn the internal architecture of transformers and the key properties and limitations of modern language models, how to configure them correctly via API, and how to handle typical problems, in particular hallucinations, context-window limits, and unstable response quality.
Special attention is paid to the transition from Prompt Engineering to Context Engineering: how to build systems in which the model works reliably within given rules, knowledge, and tools. You will learn how to use an LLM as a backend component, build RAG solutions, connect external tools, work with structured responses, and optimize token usage in real production scenarios.
The course combines theory with practical demos so that you can immediately apply the approaches in your own products, services, or internal platforms.
After the course, you will be able to evaluate models for suitability for specific tasks and make technical decisions based on metrics rather than intuition.
Who is this course for?
Format:
Recommended set of tools and subscriptions:
Language of the event and presentations: Ukrainian
— Co-founder and CEO of the Ukrainian IT company DevRain
— Co-founder and CTO of DonorUA, an intelligent blood donor recruitment system
— Candidate of Technical Sciences (PhD) in Information Technology
— Microsoft Regional Director; Microsoft Most Valuable Professional in Artificial Intelligence
— Microsoft Certified: Azure Data Scientist Associate
Access to the course recording for a year
Presentations
Instructor's e-book "Large Language Models, Prompt Engineering + Agents"
Participant certificate (upon passing the test)
10% discount on participation in Fwdays conferences
Installment payment plans available from Monobank and PrivatBank