It can feel as if AI is inescapable: even the most common tools used by remote workers, and supported by IT staff, now seem to come with unsolicited functionality powered by systems such as ChatGPT.

Updates over the past couple of years mean that even Microsoft Office has been rebranded as Microsoft 365 Copilot, promising improved productivity through applications powered by generative artificial intelligence and large language models.

Whilst there is a temptation to try the systems, especially since they seem to be free to use (for now), there are some causes for concern.

One of these is the financial motivation behind the push to aggressively onboard people onto AI tools; according to MarketWatch, the AI bubble is four times the size of the subprime mortgage bubble that led to the 2008 financial crisis and 17 times the size of the dot-com bubble that wiped out many early technology stocks.

However, beyond concerns about long-term financial viability and the effect a bursting bubble could have on businesses well beyond their IT provisions, there are plenty of reasons to think twice about relying too heavily on AI tools, even those that claim to boost productivity.

Here are some of the reasons why.


AI Is Increasingly Unreliable

A considerable issue since the early days of ChatGPT has been its propensity to hallucinate. The reason for this is that large language models do not actually process or verify information; they produce what appears, according to their training data, to be the right answer.

If an LLM is trained on a large amount of highly accurate data, as in many medical implementations, its output can be accurate, but everything it produces should still be carefully scrutinised.

At that point, it can often be more work to fix output generated by an AI than it would have been to create the work yourself.


It Can Create Legal Liabilities

Many LLM-based AI tools are trained on copyrighted material, which has left companies such as OpenAI facing class-action lawsuits.

This means that work generated with AI may reproduce other people's copyrighted material, and passing it off as your own could lead to serious legal repercussions.
