In a recent panel discussion in Washington, D.C., Microsoft President Brad Smith urged lawmakers and companies alike to keep pace with the rapid development of artificial intelligence (AI) and help establish regulations and risk management strategies. Smith's call to action comes as no surprise amid growing concerns about AI's potential harms, such as privacy threats, job losses from automation, and the spread of "deepfake" content.
While the responsibility for mitigating the risks of AI development has thus far fallen predominantly on governments, Smith emphasized that corporations must also step up to keep AI development from spiraling out of control. Among the proposals Smith shared were "safety brakes" for AI systems that control critical infrastructure and a broader legal and regulatory framework for AI.
It is worth noting that even though Microsoft is deeply invested in AI itself, including specialized chips to power OpenAI's viral chatbot, ChatGPT, Smith assured the audience that the tech giant is committed to implementing AI safeguards. The Microsoft executive backed OpenAI co-founder and CEO Sam Altman's idea of licensing AI companies and suggested that only licensed AI data centers should undertake "high risk" AI services and development.
Increasing awareness of and concern about AI have led many industry leaders to call for stricter oversight and even temporary halts to AI development. The Future of Life Institute, for instance, published an open letter in March calling for a "pause" in AI development; it garnered more than 31,000 signatures, including those of major tech figures such as Elon Musk and Apple co-founder Steve Wozniak.
While AI clearly holds great potential, its rapid advancement is also a double-edged sword. As challenges and debates surrounding its development continue, it remains crucial for companies, governments, and regulatory bodies to work together to ensure the risks of AI are assessed and managed.
Source: Cointelegraph