OP-ED: AI Could Be Great, But It Has to Be Good First

Without Ethical Standards, AI Poses Risks to the Creative Industry

As co-founder of the Future of Publishing Mastermind, people ask me about artificial intelligence (AI) a lot. Since I’ve been on several panels warning about the dangers of AI in creative industries, people think that I’m a Luddite. 

However, that couldn’t be further from the truth. I literally co-founded a conference all about how to use technology and emerging strategies to grow your author business. 

I kind of love AI. 

Retail recommendation algorithms, after all, are AI, and they fuel the publishing industry.

AI helps us navigate the internet, and it’s responsible for advances I couldn’t imagine ten years ago. Right now, somewhere, engineers are coding an AI that will revolutionize everything again. Eventually that AI will be advanced enough to code itself to build something humanity can’t even imagine. 

AI will likely play a major role in solving climate change, because nothing is better at analyzing large data sets than artificial intelligence. Countless lives have already been saved by AI in the medical field. 

That said, I do think the way that technology has been presented to and integrated into creative fields is both dangerous and, frankly, dumb. 

In other fields that work with AI, there are strict ethical standards about how data is gathered and processed. Whole jobs are devoted to making sure people maintain those high standards, not only in the procurement and analysis stages but also in the implementation stage. 

Even with those ethical standards, AI makes a ton of mistakes. Numerous studies show AI can be biased at best and downright racist at worst, not to mention how often it is flat-out wrong.

That doesn’t even address the unethical way these companies gather and process information: scraping artists’ and other creatives’ work from the internet to train their AI models without the creators’ knowledge or permission.

Every technology has unintended consequences, but we have the ability to do better than our forebears when adopting something that could radically, and overtly negatively, change the way we do business. 

Bottom line: right now, AI is hurting a lot of creatives, and I don’t think it’s OK to step on the backs of creative people because it makes life marginally easier. 

I don’t have an intrinsic problem with AI writing or art. I personally think it looks god-awful, but then I’ll bet you think a lot of the things I like are terrible, too, and that’s okay. 

In fact, that’s art. 

My problems stem from the way this data is collected, tested, and distributed. I know there are ethical ways to implement AI that can help industries get to a place they never could alone. I see it every day, all over the world. 

I just think the way these companies are going about it, and then gleefully flaunting how they can’t wait to put us out of a job, is profoundly macabre and evil. 

AI is supposed to support and further industry, not destroy it.