Neha Narkhede

Helping companies make sense of all the data.

The business world is drowning in data, but Neha Narkhede is teaching companies to swim. As an engineer at LinkedIn, Narkhede helped invent an open-source software platform called Apache Kafka to quickly process the site’s torrent of incoming data from things like user clicks and profile updates. Sensing a big opportunity, she co-founded Confluent, a startup that builds Apache Kafka tools for companies, in 2014. She’s been the driving force behind the platform’s wide adoption—Goldman Sachs uses it to help deliver information to traders in real time, Netflix to collect data for its video recommendations, and Uber to analyze data for its surge-pricing system. Confluent’s products allow companies to use the platform to, for example, sync information across multiple data centers and monitor activity through a central console.

“We view our technology as a central nervous system for companies that aggregates data and makes sense of it within milliseconds, at scale,” she says. “We think virtually every company would benefit from that and we plan to bring it to them.”
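For a concrete sense of what that “central nervous system” looks like to a developer, here is a minimal sketch of Kafka’s publish-and-subscribe model using Confluent’s Python client; the broker address, topic name, event payload, and consumer group below are illustrative assumptions, not details drawn from Confluent or this article.

```python
# Rough sketch of producing and consuming a stream of click events with
# Apache Kafka via the confluent-kafka client. The broker, topic, and
# group names are hypothetical placeholders.
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"   # assumed local Kafka broker
TOPIC = "user-clicks"       # hypothetical topic for click events

# Publish one click event to the topic.
producer = Producer({"bootstrap.servers": BROKER})
producer.produce(TOPIC, key="user-123",
                 value='{"page": "/profile", "ts": 1700000000}')
producer.flush()  # block until delivery is confirmed

# Read events back as a member of a (hypothetical) analytics consumer group.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "analytics-service",
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])

msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
consumer.close()
```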

Adrienne Felt

Leading the push for a more secure Internet.

The next time you open up Google’s Chrome Web browser, take a look at the little icon that appears at the left end of the URL bar whenever you’re on a secure website. It’s a lock, and if it’s green it signals that the website you’re on is encrypting data as it flows between you and the site. But not everyone knows what it is or what it represents, and that’s where Adrienne Felt comes in.

As a software engineer for Chrome, Felt has taken on the task of making the Internet more secure and helping users of the world’s most popular browser make smart, informed choices about their safety and privacy while online. This includes heading a years-long push to convince the world’s websites, which traditionally used the unencrypted HTTP to send data from one point to another, to switch to the secure version, HTTPS.

Why is it tricky to come up with online security measures that work for all kinds of people?

Part of it is that security measures generally stop people from doing things. The way we keep you safe is by telling you no. But this has very real costs. You can scare people … you can keep people from using the Internet at all. On the other hand, if you don’t do anything you put people and their data at very real risk. So you have to figure out how to strike just the right balance. And with multiple billions of users it’s very difficult to find a balance that makes everyone happy.

One way you are trying to make people safer while they’re online is by encouraging websites to use HTTPS. What makes this a complicated process?

Think about a site like the Washington Post. When you go to the Washington Post’s home page, there’s going to be 100 different [assets from various websites] that are loaded. All of those have to support HTTPS before the Washington Post itself can do it. Sites need to make sure there’s no revenue hit, they need to make sure there’s no [search] ranking hit, they need to make sure there’s no performance hit. And then they can switch. All these things can be done. Sites are transitioning very successfully at scale now. But it is work.
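To make the point about subresources concrete, here is a rough sketch of how a site owner might audit a page for assets still referenced over unencrypted HTTP before flipping the switch; the target URL and the script itself are illustrative assumptions, not a tool described by Felt.

```python
# Rough sketch: list subresources on a page that are still referenced over
# plain HTTP, since each of them needs an HTTPS version before the page
# itself can switch without mixed-content problems. The URL is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

class InsecureAssetFinder(HTMLParser):
    """Collects src/href attribute values that start with http://."""
    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append((tag, value))

page_url = "https://example.com/"  # hypothetical page to audit
html = urlopen(page_url).read().decode("utf-8", errors="replace")

finder = InsecureAssetFinder()
finder.feed(html)

for tag, url in finder.insecure:
    print(f"<{tag}> still loads over HTTP: {url}")
```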

Now that many of the biggest websites have made the switch from HTTP to HTTPS, what are you focusing on?

The long tail is a big problem. There are lots and lots of sites that are out there. Some that are barely maintained, some that are run by your dentist, your hairdresser, a teacher at a local elementary school, and I don’t see them rushing to add support for HTTPS. The question is now, “Okay, we’ve hit all the really popular sites, we’re starting to get to the medium sites—what do we do for the rest of the Internet?” I don’t want to get in a state where oh, great, you’re secure if you go to a big company but not if you go to a small, independent site. Because I still want people to feel like they can go everywhere on the Web. 

Anca Dragan

Ensuring that robots and humans work and play well together.

Anca Dragan, an assistant professor of electrical engineering and computer science at UC Berkeley, is working to distill complicated or vague human behavior into simple mathematical models that robots can understand. She says many conflicts that arise when humans and robots try to work together come from a lack of transparency about each other’s intentions. Teaching a robot to understand how it might influence a person’s behavior could solve that. One pressing application for this work is in helping self-driving cars and human-driven cars to anticipate each other’s next moves.

Abdigani Diriye

A computer scientist who founded Somalia’s first incubator and startup accelerator.

“Like many Somalis, I ended up fleeing my homeland because of the civil war, back in the late 1980s. At age five I moved to the U.K. because I had family there and was able to get asylum. I grew up in a fairly nice part of London and went on to get a PhD in computer science at University College London.

“At university I started becoming more aware of the world and realized I was quite fortunate to be where I am, to have had all the opportunities that I did. So, in 2012, I helped start an organization called Innovate Ventures to train and support Somali techies. The first program we ran was a two-week coding camp in Somalia for about 15 people. Though the impact was small at the time, for those individuals it meant something, and it was my first time going back to the continent; I hadn’t visited in more than two decades.

“I started to think how Innovate Ventures could have a much bigger impact. In 2015, we teamed up with two nonprofits that were running employment training for Somali youths, found some promising startups, and put them through a series of sessions on marketing, accounting, and product design. Five startups came out of that five-month incubator, and we awarded one winner around $2,500 in seed money to help kick-start its business.

“The next year saw us partner with Oxfam, VC4Africa [an online venture-capital community focused on Africa], and Telesom [the largest telco in Somaliland], and we ran a 10-week accelerator for startups. We were hoping to get 40 to 50 applicants, but we ended up getting around 180. We chose 12 startups for a two-week bootcamp and 10 to participate in the full 10-week training and mentoring program. The top four received a total of $15,000 in funding.

“This year, the accelerator will be 12 weeks long, and we’ve received almost 400 applicants. There are some large Somali companies that are interested in investing in startups and we want to bring them on board to help catalyze the startup scene. We also hope to persuade the Somali diaspora, including some of my colleagues at IBM, to donate their skills and invest in the local technology scene.

“Countries like Kenya and Rwanda have initiatives to become technology and innovation hubs in Africa. Somaliland and Somalia face fundamental challenges in health care, education, and agriculture, but innovation, technology, and startups have the potential to fast-track the country’s development. I think we’ve started to take steps in that direction with the programs we’ve been running, and we’re slowly changing the impression people have when they view Somalia and Somaliland.”

Tracy Chou

Bringing tech’s dismal diversity numbers out into the open.

Silicon Valley loves data. But until recently, there was one subject where tech companies showed little interest in the numbers: the diversity of their workforces. It’s not that the statistics were downplayed—the numbers didn’t even exist.

Today most big tech companies have issued public reports on diversity, and there’s an independent, crowdsourced data repository on GitHub that collects information on tech workforces. And this has happened in no small part because Tracy Chou, a Pinterest software engineer at the time, wrote a post on Medium in the fall of 2013 called simply “Where are the numbers?”

Chou wrote the post after returning from a conference where she heard Facebook COO Sheryl Sandberg say the number of women in tech was dropping. “I didn’t think she was wrong,” Chou says. “But I also thought: ‘How does she know? There are no numbers.’ I knew there was this problem.”

Chou’s Medium post quickly went viral. And soon the numbers began to flow—first via Twitter, and then via that GitHub repository, which Chou set up. Within a few weeks, Chou had data on more than 50 companies (the repository now has numbers for hundreds), and by the summer of 2014, a host of the Valley’s most powerful companies had released demographic reports on their workforces. The numbers were dismal—in general, somewhere between 10 and 20 percent of workers in technology positions were women, and one study found that 45 percent of Silicon Valley companies didn’t have a single female executive. But at least the data now existed.

As this was happening, Chou continued her coding work at Pinterest, but she also found herself in demand as a speaker and panelist. Last spring, she teamed up with a group of seven other women—including venture capitalist Ellen Pao and Slack engineer Erica Joy Baker—to form Project Include, an organization designed to help CEOs implement diversity and inclusion strategies at their companies.

Chou isn’t, and doesn’t want to be, a professional activist. “It’s fulfilling to work on this issue, and I can have an impact here,” she says. “But I see it as a complement to my main work, which is building things and making products.” Nonetheless, she’s become a voice of authority on tech’s diversity problem because she’s unusually good at articulating the connections between the personal experience of women in the Valley and the systemic sexism they face, while also identifying how a lack of diversity hurts companies themselves. For instance, there is clearly a pipeline problem when it comes to gender and technology—not enough young women take classes in science, technology, engineering, and math or graduate with STEM degrees. But it’s also true, as Chou argues, that the pipeline problem can’t explain the high rate of attrition for women in tech, or the lack of women in senior positions. In other words, the pipeline for women gets even more narrow once you’re inside a company.

Sometimes that’s because of extraordinarily retrograde, garden-variety sexism, exemplified by the recent problems at Uber or the men who regularly told Chou, “You’re too pretty to be a coder.” It’s also because at many companies there’s an implicit (and sometimes explicit) assumption that women are less naturally adept at coding, and less willing to work hard.

Chou, for example, went to Stanford for an undergrad degree in electrical engineering and got a master’s there in computer science, and had internships at Facebook and Google. Yet at her first job she regularly dealt with casually dismissive sexism, making her question whether she belonged in the industry. “I loved coding,” she says. “But I just felt something was off. I felt out of place, and I had serious questions about whether I was going to stay in tech. And I really thought the problem was me.”

A large body of research shows that making organizations and teams more diverse also improves their performance. Diversity makes teams less likely to succumb to groupthink and helps companies reach untapped markets. “Products tend to be built to solve the problems of the people building them,” Chou says. “And that’s not a bad thing, necessarily. But it means that in the Valley lots of energy and attention goes into solving the problems of young urban men with lots of disposable income, and that much less attention goes to solving the problems of women, older people, children, and so on.”

Despite the evidence, plenty of companies still need convincing. “There’s lots of diversity theater and lip service paid to the concept,” Chou says. “And maybe we’ve helped weed out some of the most egregious actors. But there’s a long way to go.”

Greg Brockman

Trying to make sure that AI benefits humanity.

Human-like artificial intelligence is still a long way off, but Greg Brockman believes the time to start thinking about its safety is now. That’s why, after helping to build the online-payments firm Stripe, he cofounded OpenAI along with Elon Musk and others. The nonprofit research group focuses on making sure AI continues to benefit humanity even as it increases in sophistication. Brockman plays many roles at the firm, from recruiting to helping researchers test new learning algorithms. In the long term, he says, a general AI system will need something akin to a sense of shame to prevent it from misbehaving. “It’s going to be the most important technology that humans ever create,” he says, “so getting that right seems pretty important.”

Viktor Adalsteinsson

Working to improve cancer diagnosis and treatment.

In his lab at the Broad Institute in Cambridge, Massachusetts, Viktor Adalsteinsson has put an automated system in place that scans blood samples for traces of tumor DNA, a so-called liquid biopsy. Collecting genetic information on advanced cancers might lead to clues about what drives the disease in later stages and what drugs to give patients. Adalsteinsson, whose mother succumbed to breast cancer while he was earning his PhD, is now looking to improve treatment as part of several projects, including one that sends blood collection tubes to women fighting breast cancer across America. “The doctors and patients cross their fingers and there’s a lot of watching and waiting,” says Adalsteinsson. “Now we can closely monitor patients’ responses to therapy and see what’s causing treatments to fail.”