Presented at DEF CON 32 (2024), Aug. 10, 2024, 9 a.m. (240 minutes).
We live in a time of unexpected transformation. Machines can hold conversations, compose prose and poetry, and generate very convincing deepfakes. The field of AI where this all happens – deep learning – has a long history, starting with one simple building block: the neural network.
In this workshop, we will tour through the evolution of neural networks and discover that much of their evolution occurred in the world of low-level programming. Using C, C++ and a bit of assembly language, we will learn the fundamentals behind neural networks in their various forms, and build a foundation of knowledge that will allow us to understand how we arrived at large language models, the current state of the art. Most importantly, we will discover how far we can stretch everyday hardware to run deep learning models that solve interesting problems.
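As a taste of the kind of fundamentals the workshop walks through, here is a minimal sketch in C of the simplest possible case: a single artificial neuron that computes a weighted sum of its inputs plus a bias, then passes the result through a sigmoid activation. This example is illustrative only and is not taken from the workshop materials; the function names and values are assumptions.

/* One artificial neuron in plain C: weighted sum + bias, then sigmoid. */
#include <stdio.h>
#include <math.h>

/* Sigmoid squashes the weighted sum into the range (0, 1). */
static double sigmoid(double x)
{
    return 1.0 / (1.0 + exp(-x));
}

/* Compute the neuron's output for n inputs. */
static double neuron(const double *inputs, const double *weights,
                     double bias, size_t n)
{
    double sum = bias;
    for (size_t i = 0; i < n; i++) {
        sum += inputs[i] * weights[i];
    }
    return sigmoid(sum);
}

int main(void)
{
    /* Example values chosen arbitrarily for demonstration. */
    double inputs[]  = { 0.5, 0.8, 0.1 };
    double weights[] = { 0.4, -0.6, 0.9 };
    double bias = 0.1;

    printf("neuron output: %f\n", neuron(inputs, weights, bias, 3));
    return 0;
}

Compiled with something like "cc neuron.c -lm", this prints a single activation value; stacking many such neurons into layers, and adding a way to adjust the weights from data, is what turns this building block into the networks the workshop explores.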
Presenters:
- eigentourist
Eigentourist is a programmer who learned the craft in the early 1980s. He began formal education in computer science when the height of software engineering discipline meant avoiding the use of GOTO statements. Over the course of his career, he has created code of beautiful simplicity and elegance, and of horrific complexity and unpredictability. Sometimes, it's hard to tell which was which. Today, he works on systems integration and engineering in the healthcare industry.