Microsoft President Brad Smith On The Dark Side Of AI Search Engines: ‘We’re Going To Need Laws’

Microsoft President Brad Smith said during an interview this weekend that the capabilities of artificial intelligence chatbots will need to be regulated by lawmakers before potentially irresponsible companies release products that could wreak havoc on society, such as by teaching people how to make bombs.

Smith made the remarks during an interview with CBS News’s “60 Minutes.” He also championed the positives that he claims will come from the public having access to AI technologies.

Correspondent Lesley Stahl asked Smith what could be done to prevent a company like China's Baidu from releasing a product that could be seriously misused for nefarious purposes.

“I think we’re going to need governments, we’re gonna need rules, we’re gonna need laws,” he said. “Because that’s the only way to avoid a race to the bottom. I think it’s inevitable.” 

TRANSCRIPT:

Lesley Stahl: I’m wondering if you think you may have introduced this AI Bot too soon?

Brad Smith: I don’t think we’ve introduced it too soon. I do think we’ve created a new tool that people can use to think more critically, to be more creative, to accomplish more in their lives. And like all tools it will be used in ways that we don’t intend.

Lesley Stahl: Why do you think the benefits outweigh the risks which, at this moment, a lot of people would look at and say, “Wait a minute. Those risks are too big”?

Brad Smith: Because I think– first of all, I think the benefits are so great. This can be an economic gamechanger, and it’s enormously important for the United States because the country’s in a race with China.

Smith also mentioned possible improvements in productivity.

Brad Smith: It can automate routine. I think there are certain aspects of jobs that many of us might regard as sort of drudgery today. Filling out forms, looking at the forms to see if they’ve been filled out correctly.

Lesley Stahl: So what jobs will it displace, do you know?

Brad Smith: I think, at this stage, it’s hard to know.

In the past, inaccuracies and biases have led tech companies to take down AI systems; even Microsoft did so in 2016. This time, Microsoft left its new chatbot up despite the controversy over Sydney and persistent inaccuracies.

Remember that fun fact about penguins? Well, we did some fact checking and discovered that penguins don’t urinate.

Lesley Stahl: The inaccuracies are just constant. I just keep finding that it’s wrong a lot.

Brad Smith: It has been the case that with each passing day and week we’re able to improve the accuracy of the results, you know, reduce– you know, whether it’s hateful comments or inaccurate statements, or other things that we just don’t want this to be used to do.

Lesley Stahl: What happens when other companies, other than Microsoft, smaller outfits, a Chinese company, Baidu. Maybe they won’t be responsible. What prevents that?

Brad Smith: I think we’re going to need governments, we’re gonna need rules, we’re gonna need laws. Because that’s the only way to avoid a race to the bottom.

Lesley Stahl: Are you proposing regulations?

Brad Smith: I think it’s inevitable-

Lesley Stahl: Wow.

Lesley Stahl: Other industries have regulatory bodies, you know, like the FAA for airlines and FDA for the pharmaceutical companies. Would you accept an FAA for technology? Would you support it?

Brad Smith: I think I probably would. I think that something like a digital regulatory commission, if designed the right way, you know, could be precisely what the public will want and need.
