The biases run deep: from early in our school careers, we're taught that "smart people" go into math, science, and tech. There's an unspoken hierarchy many of us have drilled into our heads, with particle physics at the top of the academic food chain, engineering lower down but still higher than that weird squishy stuff in biology, and the even squishier stuff in sociology below that. "Smart people" tackle the "hard" problems, and the hard problems involve a lot of math, "hard" science, and empirical evidence.

Well listen, J. Random Hacker: if you're so goddamn smart, why haven't you built a tool that makes it easy for people to encrypt their email yet? Why is adoption still the major barrier to secure communications? Why haven't the tools you've built evened out the digital divide? Is the hard problem infrastructure scaling or the Traveling Salesman problem, or is the really hard problem dealing with the people you could never get to understand what you're doing?

This talk will be an exhortation for hackers to overcome the traditional biases many of us hold in favor of technical projects and against human-factors work. It's a call for more people to think about usability in open source software, particularly in the privacy and security tools we care so much about. Gus will tease apart the deep-seated socialization we have about what work "smart" people do, what "good" science looks like, and why studies of human social interaction must be judged by different criteria than the "hard" sciences in order to be effective.