In this panel discussion, we heard how VCs are putting their efforts into supporting AI for Good by investing in startups whose overall goal is positive social impact. Our panelists were Andy Byrnes, Investment Director at Comet Labs; Wes Selke, Co-founder & Managing Director at Better Ventures; and Li Sun, Partner at Foundation Capital. Our moderator was Shaloo Garg, Managing Director at Microsoft for Startups.
Here's what we learned:
Shaloo: When we talk about investment, the number of people involved in AI for Good isn't that high. How would you define AI for Good? The metric might not be in numbers, but the return has a ripple effect on the community.
Li: There are two types of organisations. As VCs we look at startups that work for Good because they tend to last the longest. People always think about education when they think about social impact, but there's also the labour shortage. For example, if we want to help and empower low-wage workers to find jobs, we can use AI there; in that case we're turning a profit, it's still AI for Good, and it's boosting the economy. That's only one example of AI for Good, but we can extend it to others and combine for-profit and for-good.
Wes: The debate between 'good' and profit has been going on for a while. We're looking for opportunities where we can get a higher return by investing in Good ventures; it doesn't mean that profits should be lower just because it's for social impact. People want to work in areas that have meaning and purpose, so there's a lot of talent we can attract to mission-driven areas. There are different kinds of business models, but just because something is impactful doesn't mean it's lower return. AI is a neutral technology, but it can be pointed at impactful areas like health and education.
Andy: I agree and disagree. Founders are really good at solving problems. We must remember that 99.9999% of companies aren't venture backed. Using AI to solve problems is great, and it's good to draw a line between venture investing and impact investing. There are different pools of money that are good at doing different things for different reasons.
Shaloo: AI for Good manifests differently in different industries. Where do you see the maximum growth in alignment with GDP? In your view, which sectors offer high GDP growth?
Andy: I see education as the sector where AI can have the most impact. Personalisation of education is extremely impactful and valuable for entire societies, and it can diffuse to reach areas all across the globe.
Wes: I’m personally most excited about synthetic biology. It builds on the power of massive computation, and there are some really exciting things that can be done with life sciences and technology, like green chemicals and synthetically producing alternatives to toxic chemicals to help the environment.
Li: Personally, when I think of GDP I think of employment. I’m excited about robotics that work alongside humans, freeing people up to focus on their tasks while robots take on others, which also raises the quality of employment.
Shaloo: As investors, all of us meet with lots of startups and hear their visions. What are some of the common challenges they face?
Andy: It’s easy for founders to attract talent in the Good space. It’s important to think carefully about strategy rather than just the big picture of global challenges - you need the methodology and scalability before you get the cash! A lot of startups take the non-profit route and then fall into the non-profit category, which isn't necessarily a bad thing, but we want to keep them profitable.
Wes: What’s the problem, how are you solving it, and how are you uniquely able to solve it? There needs to be an urgent customer need. There are a lot of companies out there that look really similar, so you need a ‘secret sauce’ and need to think about how you’re 10x better than everyone else. We’re investing in the team, so we need a team of at least two people that’s willing to grow: a business-savvy person and a technical person.
Shaloo: As you’re meeting with founders, what kind of coaching do you give them?
Wes: We help them stay on track. They know more about the market than we do (hopefully!), so our job is to help them think about key milestones. We help them stay focused by asking lots of questions, contextualising their problem, and thinking about key resources and key hires.
Li: Since this is an AI conversation, I want to look at these types of startups specifically. When you dig too deep into AI it's easy to forget about the problem you’re solving. Sometimes you might not even need AI! Even if it is the right tool, you need to ask: is AI actually going to improve the product? If it’s only going to make a 5% difference or something negligible, you won’t shift your customers.
Andy: There’s a difference between ‘good to have’ and ‘oh my gosh I need that’ - keep searching until someone needs it, because that’s what will push you from doing okay to huge success.
Shaloo: How do you see responsible AI playing out in this field? As you’re meeting startups, how do you communicate its importance?
Li: Most AI technologies aren’t yet good enough for us to worry about this, but we need to put boundaries in place now. AI can go wild, so we need to set boundaries from the beginning. If a robot is going to assist the elderly, we need to work out which tasks it can do and make sure humans are the ones who make the final decisions, so that it doesn’t get out of hand.
Wes: It starts with awareness. People weren’t aware of what Facebook was doing until we found out recently; it was feeding us inciting content and very few people knew. It’s all out in the open now, but we need awareness from the early stages, a public debate about it, and companies that are aware of it too.
Li: As investors we’re not always aware of it either! It’s a question for us too: how can we stay aware of it as a board?
Andy: I worry about both sides working in isolation, so communication is so important. I don’t know a single founder in AI who wants to be bad! People want to be able to make autonomous decisions that aren’t 100% aligned with business decisions; we want them to be human decisions. If a tool like a weight-loss AI makes irresponsible decisions, it could be really dangerous!
Shaloo: At our level we need to be open about it. Any questions?
Attendee Question: Going back to responsible AI, do you feel comfortable investing in companies that don’t follow these guidelines?
Wes: There aren’t any guidelines!
Li: In future companies I invest in, I will make sure there’s a clear clause. You have to think proactively, look a few years ahead, and consider the impact.
Attendee Question: How patient is your capital?
Andy: Very! When we invest in ML hardware, it takes years just to get your first product, and even then it might not work. So we have to be patient, and we definitely expect it to be a process.
Li: Absolutely, some of our companies take 7-8 years, so we have a high tolerance for long timelines.
Wes: Of course, these things take time. Most funds run for 10 years, and most aren’t fully liquidated even after that.
This discussion took place on the Deep Dive track at the Applied AI Summit, Deep Reinforcement Learning Summit and AI for Good Summit in San Francisco. Register for post-presentation video access here.