Code reviews improve software by putting human-written code in front of human eyes. Reviews help developers learn design patterns and best coding practices, while also catching bugs that automated testing may miss. However, it is important to highlight that toxic behaviors during code reviews can be more counterproductive than having no code reviews at all, because these behaviors stifle the qualities developers need most: creativity and innovation.
Some examples of toxic code review behaviors include:
● Passing one’s programming opinions off as fact (“This should be a lambda function instead because I like lambda functions”)
● Asking judgmental questions (“Why didn’t you JUST do ___?”)
● Making demands without allowing a discussion (“Use ___ instead of what you did”)
● Sarcasm (“Did you even test this code before you checked it in?”)
● Using emojis instead of words to point out problems in code, which can be easily misinterpreted
● Using code reviews as an opportunity to show off how clever one is
This toxicity can make team members feel uncomfortable, gaslit, silenced, and bullied. It creates an unsupportive environment and discourages the risk-taking and innovation that our industry could never survive without.
Let’s look at several mechanisms that people can use to help themselves and others unlearn toxic behaviors by refusing to normalize the toxicity. I will speak about how people can drive change from whatever position they may be in: individual contributor, manager, or HR. Ending this toxic culture helps make tech an environment where developers are allowed to learn, grow, and make mistakes.
By the age of 22, Sneha had two degrees in Computer Science, four SWE internships, and seven job offers for various tech roles including Software Engineer, Cognitive Software Engineer, Decision Scientist, and Information Risk Manager. Most of the companies that offered her a full-time job had previously rejected her for an internship (just a few months before!). In this talk, Sneha will share some of the experiences that slowly helped her gain more confidence and get over her fear of technical interviews.
"Unicorn" is a trite umbrella term often used to label people who aren't traditionally seen in tech. This alienates and isolates both people who are currently in tech and those who wish to enter it. In this talk I'll delve into some issues with companies that adopt diversity only at a surface level and how they can better tweak their initiatives, and I'll talk about my experiences navigating the industry as a so-called "unicorn". I will also touch upon the fear of being wanted solely for being a unicorn, and how diversity for diversity's sake is actually causing a bottleneck effect in the pipeline.
We often talk about impostor syndrome - feeling like a fraud, undeserving of professional success. As an openly trans woman, I have seen parallels in my experiences and feelings when “passing” in various social contexts before and after transition, for example when attending events intended for women. These feelings cause a form of self-erasure: they undermine my identity and suppress my ability to share thoughts and opinions that I do not feel qualified to have. I'll talk about my experiences, some of the ways I think about this problem, and various approaches I’ve used for dealing with it.
As computer science becomes increasingly powerful, the ethical and social issues it raises become ever more pressing and complex. This talk discusses the issues I have encountered in my day-to-day work as a computer scientist, the need to educate young computer scientists on the broader implications of their work, and the need for diversity within the computer science workforce.
I joined the Node.js project with the intention of improving culture in 2015 along with several other individuals, and I eventually became one of the top leaders of the project. Two years later I resigned in protest over the rest of leadership's unwillingness to enforce our own CoC (code of conduct) against a problematic leader. In this talk, I will tell the story of social justice work in the project, the barriers we faced, how we overcame some of them, and the cost many people paid trying to make things better. Specifically, I will detail how and why the rest of leadership failed to understand the issues, and how that translated to their implicit refusal to protect those that needed protection the most.
Building or scaling products for emerging markets usually involves just translating your app into the appropriate language, right? No! In this talk, we will examine the different things that folks building products for users outside of Western markets should look at, including understanding what devices people use, making sure illustrations are inclusive, and more. We will also focus on product inclusion: how do you make sure that the product feedback you use to build your products comes from a user base that is fully representative of the audience you are building for? Using examples from the presenter's work, this talk can serve as a blueprint of considerations for teams looking to build products for people beyond Silicon Valley and for the rest of the world.
I founded a dating app called Thurst and launched a beta in 2017. I shut that same beta down a few months later because of overwhelming threats and harassment directed at its small but dedicated user base of 15,000. I realized that many social platforms and dating apps lacked an adequate, let alone radical, protocol for keeping users, particularly marginalized users, safe. I've been developing ways to address harassment and abuse, and to name and detect patterns of violence on platforms. Most platforms, apps, and online spaces are inherently hierarchical and therefore inherently stratified in terms of privilege, access, safety, and usability. My talk will cover a few practices, paradigms, and tools my team and I use to better understand how to reduce harassment and violence.
The majority of low-income families access the internet through their smartphones. Mobile developers have the opportunity to reach marginalized communities through their smartphones and create tools that benefit these communities. I will talk about how developers (especially underrepresented developers) have created mobile tools for problems within our communities. I will discuss my app, We Read Too, as well as other mobile apps like Tala. My talk will also be a call to action for mobile developers to reach out and work with communities that could use mobile tools to make their lives better or provide them access to resources.
This presentation is about facing fear, beating the odds, and embracing failure, and about how failure can strengthen, rather than weaken, communities. In the developing world, building communities can be difficult: not everyone has access to technology, attrition rates are high, and potential members are often afraid they won't fit in. But this can't be a reason to quit, and sometimes it takes multiple tries to get a community right. Building from personal experience, this presentation talks about 12 years of community-building, of triumphs and failures that make our communities stronger and that make the people who lead them more capable of doing the critical work of building them. We readily embrace success, but failure is something we can certainly embrace too, and this presentation will make the strong case for it.
When I was a PM working at a shopping loyalty startup, I was afraid to talk to my co-workers about the very real social issues that our industry was creating. I was even afraid to tell my co-workers that I was trying to mitigate some of these issues by running a social impact program for young professionals outside of work. I didn't want anyone to know that I was leaving work early to volunteer. Now that I organize young tech professionals for social good full-time, so many people tell me that they are hesitant to speak up, too.
When I did speak up, I found that people were actually very receptive. My hypothesis is that many tech workers DO care and DO want to get more familiar with issues like diversity, inclusion, and social justice, but they don't know where to start. How might we, as diverse tech people, bring these important conversations to our friends and co-workers? And how might we build allies so that we are not harmed in our efforts to educate others?