...

Artificial Intelligence may be the future of information technology, but it has some way to go when it comes to workforce diversity. Laura Montoya talks about her efforts to improve access to the industry for minorities.


In some ways, machine learning algorithms are like little children: learning from whatever input they can get. Responsible parenting and education are necessary to ensure they grow up to be good algorithms when they leave their development environment to live on production servers.

In 2015, Google learned this lesson the hard way when Google Photos labeled African Americans as "gorillas". Other examples of racial bias in algorithms include the facial recognition software used for passport applications in New Zealand and the blink detection in Nikon cameras. Both systems struggled to tell whether a person's eyes were open or closed, depending on the person's ethnicity.

All these cases suggest that machine learning still has a long way to go. After all, humans do not accidentally confuse people with gorillas and usually have no trouble telling whether a person's eyes are open, regardless of their ethnicity.


Biased datasets lead to biased algorithms

Laura Montoya, founder of the education platform Accel.ai and the minority advocacy group LatinX in AI Coalition, says there's another way of looking at the issue: what an AI algorithm knows depends on the quality of its training data. Algorithms producing these racially biased results simply haven't been trained on enough images of people from different ethnic groups.

According to Laura, one reason for this imbalance is the lack of diversity in the workforce of tech companies in the AI industry. "Not having enough [diverse] representation in the room to provide oversight, that's a big issue," she says. "Ensuring [there's] at least one person that could look at a dataset and say 'this dataset is not representative of my culture or community'" could have been enough to avoid training a racially biased algorithm, she believes.
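What might such an oversight check look like in practice? As a purely illustrative sketch (the dataset, the "group" metadata field, and the 10% threshold below are hypothetical assumptions, not part of any real pipeline Laura describes), a simple Python audit could count how often each demographic group appears in the training labels and flag the ones that are barely represented:

```python
from collections import Counter

# Hypothetical training metadata: one record per image, tagged with a
# (self-reported or annotator-assigned) demographic group. Real datasets
# rarely ship with such labels, which is part of the problem.
training_metadata = [
    {"image": "img_0001.jpg", "group": "white"},
    {"image": "img_0002.jpg", "group": "white"},
    {"image": "img_0003.jpg", "group": "white"},
    {"image": "img_0004.jpg", "group": "black"},
    {"image": "img_0005.jpg", "group": "latino"},
    # ... in reality, thousands more records
]

def audit_representation(records, min_share=0.10):
    """Print each group's share of the dataset and flag underrepresented ones."""
    counts = Counter(record["group"] for record in records)
    total = sum(counts.values())
    for group, count in counts.most_common():
        share = count / total
        flag = "  <-- underrepresented" if share < min_share else ""
        print(f"{group:>8}: {count:4d} images ({share:5.1%}){flag}")

audit_representation(training_metadata)
```

A check like this is no substitute for having diverse reviewers in the room, but it can make an imbalance visible before a model is ever trained on the data.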

The workforce of large tech companies, especially in engineering roles, is still dominated by white men. To counter this, Laura founded Accel.ai, a non-profit organization focused on helping members of minority groups enter the AI industry, as well as LatinX in AI, a group looking to highlight and advance Latino professionals in the field.

Laura came to Silicon Valley in 2015 with a background in biology and the physical sciences. She quickly became captivated by computer science and software engineering. "Once you're here [in Silicon Valley], it's hard to get away from tech. Everyone talks about it and everyone's involved in it," she says. She became a member of the non-profit organization Women Who Code, for which she now serves as a director.

Recognizing many similarities between artificial intelligence and her university studies in biology, she made the field her main focus. "It felt like a natural progression to go from a background in biology and physical sciences to explore AI once I really started getting into computer science."


Strengthening the Latino community in AI

Laura is working to counter the diversity issues of the tech industry in Silicon Valley. With the LatinX in AI coalition, she develops datasets that include more samples from Latino individuals. "We want to build datasets that are more inclusive and more representative of society as a whole," she says. The group also maintains a database of people with a Latino background who are available to speak at conferences and company events.

"Oftentimes, we hear from large tech companies that there aren't enough people from [Latino and other minority] backgrounds that are educated or experienced enough to do the work," Laura tells me. "Well, that's not true. There are PhDs, [people] that have been doing research, but they're not recognized as often as others in the industry." With LatinX in AI, she tries to fight this sentiment and connect companies with skilled Latino individuals.

Still, Laura agrees that education can be an issue for members of minority groups. People of color, as well as women, are still underrepresented among engineers in Silicon Valley. One way to improve this situation, Laura thinks, is to offer more alternative education programs where individuals can learn tech skills without having to go through a traditional university degree.


Alternative education simplifies access to knowledge

Her non-profit Accel.ai offers technical classes on artificial intelligence and machine learning, as well as so-called "mindset training". In these courses, students learn how to successfully tackle difficult problems. "We help people to get past their imposter syndrome, we help them understand how to have a growth mindset, how to learn as individuals," Laura explains. "If you're not from a technical background, you might struggle with [that]."

She mentions mathematics as a common hurdle: "I think that's where most people end up having fears about their abilities, when getting into math." Accel.ai tries to help students overcome these fears by teaching not only the subject itself but also the methodology needed to solve such problems.

Another group that Accel.ai caters to is people who have nothing to do with AI yet: "We also focus on individuals [in] blue collar jobs that are going to be taken up by automation," Laura says. She mentions trucking as an example, but also retail, basic administration, and marketing. "No matter where you look, there is room for automation. I think if you're not in the tech industry, you're not really aware of it."


Helping the victims of automation

Laura emphasizes that Accel.ai doesn't try to train everybody to become an engineer. Instead, they provide counsel and focus on the individual abilities of their students to prepare them for a future career change. But change is not solely the responsibility of the affected workers. Laura is confident that tech companies can also do their part to soften the impact of AI-driven automation.

She mentions Starsky Robotics as an example of how socially conscious automation could work. The company develops autonomous trucks, but instead of taking the driver completely out of the equation, its drivers work from an office building and control the trucks remotely, similar to how military pilots control drones today. On the highway, the trucks drive without human guidance; human drivers take over when a truck has to navigate the narrow premises of a warehouse or handle difficult road situations, such as construction zones.

This redefinition of the truck driver job eliminates many disadvantages of the work: instead of spending weeks on the road and suffering from sleep deprivation, drivers work regular hours and can go home to their families every night.


Automation can improve working conditions

"People [can] transition from a job that was very labor-intensive. This is a healthier transition for them," Laura says. She admits that this can only be a short-term solution, though. "In the long term, the goal is to get people educated in a way that really advances their career."

A few years ago, much of the knowledge needed to start a new career was only available in university programs. Today, online classes and learning resources cover the same ground. Many are free, and even paid courses cost far less than a university degree. "If you want to transition careers, you can do that. It takes a lot less time than it used to, and it's no longer a barrier that you have to pay 40k dollars a year to go to school," Laura says.

Education plays a major part in all of Laura Montoya's projects, from helping companies diversify their workforce to providing learning opportunities to minority groups. And while Laura's focus is always on the humans in the AI industry, her efforts may also lead to more tolerant algorithms. After all, to raise well-behaved algorithm children, you first need to educate their parents.