Demystifying the technology behind Generative AI
Generative AI is a rapidly growing field, but it is often shrouded in mystery and jargon that make it difficult for non-technical professionals to understand. This conference talk aims to demystify generative AI and introduce, at a high level, how models like ChatGPT or Stable Diffusion are trained. We will start with an overview of the deep learning architecture used by generative AI models, including attention mechanisms, transformers, and RLHF (Reinforcement Learning from Human Feedback). Using Hugging Face open-source libraries, we will also explain how these models are trained on large datasets and how they can be fine-tuned for specific tasks. By the end of this talk, business and IT professionals will have a better understanding of how generative AI works and how it can be applied in industries such as marketing and customer service. They will also have a high-level understanding of the underlying models, enabling them to make more informed decisions about using generative AI in their businesses.
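To give a flavor of what the talk covers, here is a minimal sketch of running a pre-trained generative model with the Hugging Face `transformers` library. The specific model name (`sshleifer/tiny-gpt2`) is an assumption chosen only because it is tiny and downloads quickly; it is an illustration, not a model discussed in the talk.

```python
# Minimal sketch: text generation with a pre-trained model via the
# Hugging Face "transformers" library. The model name below is an
# assumption picked for speed, not a model endorsed by the talk.
from transformers import pipeline

# pipeline() bundles tokenization, model inference, and decoding
# behind a single call.
generator = pipeline("text-generation", model="sshleifer/tiny-gpt2")

# Generate a short continuation of a prompt.
result = generator("Generative AI is", max_new_tokens=10)
print(result[0]["generated_text"])
```

The same library exposes `Trainer` and dataset utilities for the fine-tuning workflow mentioned above, so the step from "use a pre-trained model" to "adapt it to your task" stays within one toolchain.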
Julien Simon is currently Chief Evangelist at Hugging Face. He recently spent six years at Amazon Web Services, where he was the Global Technical Evangelist for AI & Machine Learning. Prior to joining AWS, Julien served for 10 years as CTO/VP Engineering at large-scale startups, where he led large software and operations teams in charge of thousands of servers worldwide.