As technology evolves, it becomes harder to tell ‘real’ AI from marketing
Jan 10, 2025
In his college courses at Stanford University, Jehangir Amjad poses a curious question to his students: Was the 1969 moon landing a product of artificial intelligence?
It might sound like a work of science fiction, or time travel, he said, but understanding the history of AI answers the question for them.
“I would actually argue, yes, a lot of the algorithms that were part of what put us on the moon are precursors to a lot of what we are seeing today as well,” said Amjad, a Bay Area technology executive and a computer science lecturer at Stanford. “It’s essentially precursors to the same kind of similar sort of ‘next, next, next generation’ algorithms.”
Amjad poses the question to his students to underline how hard it is to actually define “artificial intelligence.” This has become even more difficult as the technology explodes in sophistication and public awareness.
“The beauty and the dilemma is, ‘what is AI?’ is actually very hard to define,” Amjad said.
That broad definition of "artificial intelligence," and the public's equally loose understanding of it, can make it difficult for both consumers and the tech industry to parse what is "real" AI and what is simply marketed as such.
Swapnil Shinde, the Los Altos, California-based CEO and cofounder of AI bookkeeping software company Zeni, has seen this firsthand through his investment firm, Twin Ventures. Over the last two years, Shinde has noticed a huge uptick in companies seeking funding that describe themselves as "AI-powered" or "AI-driven." The AI market is saturated, and some of these "AI companies" in fact use the technology in only a small part of their product, he said.
“It’s very easy to figure out after a few conversations if the startup is just building a wrap around ChatGPT and calling that a product,” Shinde said. “And if that’s the case, they are not going to survive for long, because it’s not really deep tech. It isn’t solving a very deep, painful problem that was driven by humans for a long period of time.”
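The "wrap around ChatGPT" pattern Shinde describes can be sketched in a few lines. The names and prompt below are invented for illustration, and the model call is stubbed out rather than hitting a real API; the point is how little the wrapper itself contributes.

```python
# A hypothetical "thin wrapper" product: the apparent intelligence
# lives entirely in the prompt template and the hosted model behind it.

PROMPT_TEMPLATE = (
    "You are an expert bookkeeper. Categorize this expense "
    "and suggest a ledger entry: {text}"
)

def call_language_model(prompt: str) -> str:
    """Stand-in for an API call to a hosted model such as ChatGPT.
    A real wrapper would send `prompt` over HTTP and return the reply."""
    return f"[model reply to: {prompt[:40]}...]"

def categorize_expense(text: str) -> str:
    # The entire "product": fill in a template and forward it to the model.
    return call_language_model(PROMPT_TEMPLATE.format(text=text))

print(categorize_expense("Taxi ride to client meeting, $34.50"))
```

Everything difficult here — understanding the expense, producing the ledger entry — happens inside someone else's model, which is why Shinde argues such products are "not really deep tech."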
The rush to build AI
Since early 2023, Theresa Fesinstine said, she has observed a race in the corporate world to introduce AI technologies in order to stay competitive and relevant. That is when she launched her AI education company, peoplepower.ai, through which she leads workshops, teaches organizations how AI is built and consults with them on which tools might fit their needs.
At a time when everyone wants to claim the most cutting-edge tools, some basic education about AI can help both companies and their employees navigate the technology landscape, the Norwalk, Connecticut-based founder said.
We should doubt wherever we start seeing claims of originality coming from AI because originality is a very human trait.
–
Jehangir Amjad, tech executive and Stanford lecturer
In an effort to look more innovative, companies may tout basic automations or rule-based alerts as exciting new AI tools, Fesinstine said. While these tools do use some foundational technologies of AI, the companies could be overstating the tools' abilities, she said, especially when they throw around the popular buzzword "generative AI," which uses complex deep learning models to learn from data, adapt and generate new content.
The pressure on companies to keep up with the latest and greatest may also lead some organizations to buy new AI software tools even if they don't have a strategy to implement them or train their employees in how best to use them.
“It’s predatory, I would say,” Fesinstine said. “For companies, especially those that are feeling unsure of what AI is going to look like, what it should be, people have a fear of being left behind.”
Some technologists argue that ambiguity around what is or isn’t AI allows for all kinds of tech products to be sold as such. Predictive analytics, for example, which uses data to forecast future outcomes, may be “borderline” AI, said Ed Watal, the Reston, Virginia-based founder of IT and AI strategy consultancy firm Intellibus.
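A worked miniature shows why predictive analytics sits on that borderline: at its simplest it is a trend line fit to past data and extended forward, which is classical statistics rather than anything that "learns." The sales figures below are made up for illustration.

```python
# Predictive analytics at its most basic: fit a straight line to
# historical data, then extrapolate to "predict" the next value.

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

months = [1, 2, 3, 4, 5]
sales = [10.0, 12.0, 13.5, 15.0, 17.0]  # invented past data
a, b = fit_line(months, sales)
forecast = a * 6 + b  # forecast for month 6
print(round(forecast, 2))  # → 18.6
```

Whether rebranding this kind of curve-fitting as "AI" is honest marketing is exactly the ambiguity Watal is pointing to.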
True AI systems use algorithms to sort, analyze and review data and to make informed decisions about what to do with it, based on what humans prompt them to do. The "learning" in these systems is how AI gets smarter: neural networks incorporate feedback and past results to get better at completing tasks over time.
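That feedback loop can be shown at its smallest scale with a one-neuron perceptron, a decades-old precursor of modern neural networks. After each wrong answer it nudges its weights in the direction of the mistake, so its predictions improve as it sees more examples. The training data here (the logical OR function) is chosen purely for illustration.

```python
# A minimal sketch of "getting better with feedback": a perceptron
# adjusts its weights after every mistake it makes.

def train_perceptron(samples, passes=10, lr=0.1):
    """samples: list of ((x1, x2), label) pairs with label 0 or 1."""
    w1 = w2 = b = 0.0
    for _ in range(passes):
        for (x1, x2), label in samples:
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = label - pred   # the feedback signal
            w1 += lr * err * x1  # adjust weights using the mistake
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

# Learn the logical OR function from repeated feedback.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w1, w2, b = train_perceptron(data)
predict = lambda x1, x2: 1 if w1 * x1 + w2 * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in data])  # → [0, 1, 1, 1]
```

Modern deep learning systems apply the same idea — adjust internal weights based on error feedback — at vastly larger scale.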
“But the purists, the purists, will argue that AI is only machine learning and deep learning,” he said.
“AI washing”
Though there seems to be an AI-powered company promising to do pretty much any task for you, technologists warn that today's "real" AI has its limitations. Watal said the industry has seen some "AI washing," or over-promising and over-marketing what AI can actually do.
A company that promises that its AI tool can build a website from the ground up could be an example, he said. While you could get ChatGPT or another AI algorithm to generate the code, it can’t create a fully functioning website, he said.
“You wouldn’t be able to do things which require, let’s say, something as simple as sending an email, because sending an email requires a [simple mail transfer protocol] server,” Watal said. “Yeah, you could ask this AI tool to also write the code for a mail server, but you’d still have to host it and run it somewhere. So it’s not as simple as, oh, you click a button and you have an entire app.”
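Watal's email example can be made concrete. A model can generate perfectly valid email-sending code, but that code is inert without a running SMTP server to hand the message to. The host and addresses below are placeholders, and the actual send is commented out because it requires live infrastructure.

```python
# Generated code vs. running infrastructure: building the message
# works anywhere; delivering it requires a real SMTP server.

import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "app@example.com"
msg["To"] = "user@example.com"
msg["Subject"] = "Welcome!"
msg.set_content("Thanks for signing up.")

# The lines below fail unless an SMTP server is actually running and
# reachable at this address -- infrastructure no chatbot provides.
# with smtplib.SMTP("smtp.example.com", 587) as server:
#     server.send_message(msg)

print(msg["Subject"])  # → Welcome!
```

The gap between "the code is written" and "the system is deployed and running" is precisely what Watal says AI washing papers over.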
Amjad, who is also the head of AI Platform at generative AI company Ikigai, said companies sometimes over-promise and over-market the ability of AI to perform original, creative tasks.
While artificial intelligence tools are great at pattern recognition, data sorting and generating ideas based on existing content, humans remain the source of original, creative tasks and output, he said.
“People would argue that in the public imagination, AI is creating a lot of things, but really it’s regurgitating. It’s not creating, right?” Amjad said. “And we should doubt wherever we start seeing claims of originality coming from AI because originality is a very human trait.”
It’s definitely not the first time that a new technology has captured the public’s attention and led to a marketing frenzy, Watal said. About a decade ago, the concept of “Web3,” or a decentralized internet that relies on blockchain technology, quickly grew in popularity, he said.
Blockchain technology operates as a sort of public ledger, where transactions and records are kept in a publicly accessible forum. It is the basis of many cryptocurrencies, and while it has become more mainstream in recent years, it hasn't taken over the internet as was predicted about a decade ago.
“The cloud” is another example of a technology marketing makeover, Watal said. The concept of remote servers storing information separately from your hardware goes back decades, but after Amazon introduced its Elastic Compute Cloud (EC2) in 2006, every technology company competed to stake its claim to the cloud.
Only time will tell if we are overusing or underusing the term artificial intelligence, Amjad said.
“I think it’s very clear that both the hype and the promise, and the promise of applications is actually pretty real,” Amjad said. “But that doesn’t mean that we may not be, in certain quarters, overdoing it.”
Amjad suspects the interest in AI will only continue to rise, but he feels Ikigai’s technology is one that will prove itself amid the hype cycle.
“Yes, it’s come and captured the public imagination. And I’m absolutely thrilled about that part, but it’s something that builds upon a very long tradition of these things,” Amjad said. “And I wish that would help temper some of the expectations … the hype cycle has actually existed in AI, at least a couple of times, in the last, maybe, 50 years itself.”
The post As technology evolves, it becomes harder to tell ‘real’ AI from marketing appeared first on The Lexington Times.