Caute_cautim
Community Champion

Risks of Generative AI - HBR

Hi All

 

A very interesting piece, something to think about at the corporate board level:

 

https://hbr.org/2023/08/generative-ai-nxiety

 

Ask yourselves:

 

  1. What ethical, reputational, regulatory, and legal risks does generative AI share with non-generative AI?
  2. What ethical, reputational, regulatory, and legal risks are particular to, or exacerbated by, generative AI compared to non-generative AI?

 

Regards

 

Caute_Cautim

2 Replies
Paigereed
Viewer

Hi,

 

The article "Generative AI-nxiety" explores board-level concerns about generative AI. Both generative and non-generative AI pose ethical, reputational, regulatory, and legal risks related to bias, privacy, accountability, and unintended consequences.

Best regards,
livetheorangelife
annianni
Viewer

I will never disagree that generative AI carries a huge potential for misuse by people with bad intentions, but at the same time we have to appreciate how much generative AI is doing for us lately: helping kids with schoolwork and research, assisting content creators with high-quality material, and making life easier for millions through the high degree of automation it offers. Something that beneficial is sure to be abused in the wrong hands, hence the immediate need for rules and regulations around AI-generated content.

I myself use a voice-cloning AI to create professional voiceovers for my video content, using my own voice as a reference. That saves me a great deal of expense on mic setups, editing software, and dubbing-artist fees, since the application lets me dub my voice in multiple languages.

Keeping in mind the huge benefits we are reaping from AI, regulations must be introduced as soon as possible so that people can use AI without fear.