True or False: AI models trained on public data are always free from copyright issues.
False
AI Output: “CIBC’s digital banking platform launched in 2018, making it the first bank to do so.”
AI hallucination (CIBC was not the first bank to launch a digital banking platform; other banks had digital offerings earlier)
AI Output: “CIBC operates in both Canada and the Caribbean.”
Real
When reviewing AI-generated financial summaries, what is a key step to ensure accuracy?
Cross-check figures and statements with official financial reports or trusted data sources
AI can sometimes make up information. What is this called?
Hallucination
AI Output: “The inflation rate in Canada was 2.5% in 2022, according to a 2023 report from the Bank of Canada.”
AI hallucination (the actual inflation rate for 2022 was significantly higher; always verify figures against official statistics)
AI Output: “CIBC was the first bank in North America to introduce online banking in 1996.”
AI hallucination (CIBC was an early adopter, but not the first in North America)
What should you do if an AI output cites a source you cannot find online?
Treat the information as unverified and seek confirmation from primary or reputable sources
True or False: AI can sometimes reinforce existing biases found in its training data.
True
AI Output: “CIBC’s CEO, Victor Dodig, was appointed in 2010 and has led the bank since then.”
AI hallucination (Victor Dodig became CEO in 2014, not 2010)
AI Output: “CIBC’s mobile app supports biometric authentication.”
Real
Name the most likely indicator that an AI-generated answer may be outdated.
References to past events as if they were current
Name one reason why AI might give an incorrect answer.
Outdated data, ambiguous question, or lack of context
AI Output: “All AI models are explainable and transparent by design.”
AI hallucination (many AI models, especially deep learning models, are considered ‘black boxes’ and lack transparency)
AI Output: “CIBC merged with TD Bank in 2005 to form the largest bank in Canada.”
AI hallucination (CIBC and TD Bank have never merged)
AI summarizes a news article but omits key context that changes the meaning. What risk does this illustrate?
Misrepresentation due to lack of context
AI Output: “CIBC’s main headquarters is in Vancouver, where most of its executive team is based.”
AI hallucination (CIBC’s headquarters is in Toronto, not Vancouver)
AI Output: “CIBC’s AI-powered chatbot can provide investment advice regulated by the Canadian Securities Administrators.”
AI hallucination (chatbots can provide information, but not regulated investment advice)