
One in five Americans is making 2025 the year they save more money, and there is a growing trend of asking generative AI technology for advice on how to achieve personal finance goals. But is using ChatGPT or Gemini to improve your personal balance sheet the right move? Experts say it’s complicated.
When used as glorified search engines, large language models excel at helping you find information fast, and they can also be a good source of general advice on how to set up a budget or improve your credit score. However, getting accurate answers on specific, sensitive financial questions is where the concerns start, says Andrew Lo, director of the Laboratory for Financial Engineering at the MIT Sloan School of Management.
“It is quite dangerous to seek advice [from AI] of pretty much any sort, whether it's legal, financial, or medical,” he says. “All three of those areas have very large dangers if they're not done well.”
Many AI platforms lack domain-specific expertise, trustworthiness, and regulatory knowledge, especially when it comes to providing sensitive financial advice. They might even lead individuals to make unwise investments or financial decisions, Lo warns.
Despite these worries, many Americans are already turning to AI chatbots for help managing their finances, and among the 47% who reported doing so or considering it, 96% reported a positive experience, according to an Experian study from October 2024.
How to use AI in financial planning, with guardrails
Platforms like Perplexity and ChatGPT can help people who are looking for advice on saving and budgeting, investment planning, and credit score improvement. According to Christina Roman, consumer education and advocacy manager at Experian, the technology is a great starting point for consumers who otherwise might not be able to afford professional financial advice.
“I don't think that this is going to make people become reliant on AI for these types of services, but I think it's a great tool that can help them to begin to navigate their financial lives and begin to really understand complex topics like investing and whatnot,” Roman says.
While each prompting experience will differ, an individual can provide relatively simple details about their financial situation, and generative AI can produce a fairly elaborate plan.
Here is an example of a well-crafted AI prompt for personal finance advice:
"I need help managing my money. I make $50,000 a year.
I have $10,000 in credit card debt on one credit card and $2,500 debt on another credit card
My rent each month is $750
My car payment each month is $450
I have $150 in other utility expenses
I only have $250 set aside in my emergency fund
Can you help me get on track?”
Here are six separate sections ChatGPT provided in response to this prompt:
"1. Budgeting with a 50/30/20 Rule (Customized for You), including a monthly income estimate.
2. Spending Breakdown
3. Debt Repayment Strategy: Snowball or Avalanche
4. Emergency Fund Goal
5. Budget Adjustments
6. Automate Payments and Savings and Example Action Plan for Next Month”
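To make the arithmetic behind the first and third items of that plan concrete, here is a minimal, illustrative sketch of how the 50/30/20 split and the two debt-repayment orderings work out for the numbers in the example prompt. The card interest rates are hypothetical, since the prompt doesn't include them, and the figures are for illustration only, not financial advice.

```python
# Illustrative sketch only, not financial advice. Dollar amounts come from the
# example prompt above; the card APRs are hypothetical because the prompt omits them.

annual_income = 50_000
monthly_income = annual_income / 12          # roughly $4,167 gross per month

# 50/30/20 rule: 50% needs, 30% wants, 20% savings and debt repayment
needs = monthly_income * 0.50                # ~$2,083
wants = monthly_income * 0.30                # ~$1,250
savings_and_debt = monthly_income * 0.20     # ~$833

fixed_expenses = 750 + 450 + 150             # rent + car payment + utilities = $1,350
print(f"Needs budget:  ${needs:,.0f} (fixed expenses take ${fixed_expenses:,})")
print(f"Wants budget:  ${wants:,.0f}")
print(f"Savings/debt:  ${savings_and_debt:,.0f}")

# Debt repayment ordering: snowball pays the smallest balance first,
# avalanche pays the highest interest rate first.
cards = [
    {"name": "Card A", "balance": 10_000, "apr": 0.26},  # hypothetical APR
    {"name": "Card B", "balance": 2_500,  "apr": 0.21},  # hypothetical APR
]
snowball = sorted(cards, key=lambda c: c["balance"])
avalanche = sorted(cards, key=lambda c: c["apr"], reverse=True)
print("Snowball order: ", [c["name"] for c in snowball])
print("Avalanche order:", [c["name"] for c in avalanche])
```

In ChatGPT's actual response these figures appear as prose and tables rather than code, but the underlying arithmetic is the same, which is one reason it is worth double-checking the math yourself.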
Asking the AI follow-up questions and adding more details about your financial situation and goals is good practice. It helps the platform understand your unique circumstances and tailor its suggestions accordingly.
However, Roman advises people to treat the output with caution. AI platforms hallucinate, which means the advice they offer may not be grounded in best practices, or in sound personal finance principles at all.
Furthermore, Roman says to keep the details you share with any generative AI platform generic, since you may not know how your information will be stored or used to train the AI model itself.
How AI can get financial planning wrong
Generative AI platforms are evolving by the minute, and there's no question that the ChatGPT of 2025 is more accurate and detailed than the version available just a year or two ago. But that's exactly what worries some financial experts: hallucinations can hide in plain sight, and individuals without financial expertise or experience may not spot them.
In a recent research paper on using generative AI for financial advice, Lo cites an example in which ChatGPT 3.5 fabricated the author names of a paper it used to back up its responses. A made-up citation may not seem like a serious offense, but when statements involve financial risk, hallucinations could ruin someone's finances.
AI platforms also may not disclose as much source or background information as you might want or need. For example, when asked for investment advice, ChatGPT may recommend investing in companies like Microsoft. While human financial advisors might do the same, everyday users may not realize that Microsoft has invested more than $13 billion in OpenAI, the company behind ChatGPT. The chatbot does not flag the potential conflict of interest unless a user points it out.
[Screenshots: examples in which ChatGPT acknowledged its responses were “suboptimal”]
Working with a human financial advisor allows for more conversation-based financial planning. Details about one's financial situation and goals can be discussed in depth to create a personalized plan that accounts for the risks tied to each potential money move. Generative AI will hand out financial advice based on just a few details, and individuals who act on that advice without considering their full financial picture could make costly mistakes.
The future of AI in financial services
As LLMs become more advanced, one can imagine a future where generative AI is much more integrated into the financial advising ecosystem.
Michael Donnelly, interim managing director of corporate growth at the CFP Board, says financial professionals may be better positioned than most to make the case that technology can't replace human advice. He had a similar conversation a decade ago, during the rise of robo-advisors.
Financial advisors who learn to embrace AI as a tool for tasks like internal practice management will excel, Donnelly says. Using AI can free up time that advisors can devote to strengthening personal relationships, a hallmark of the financial planning profession.
“Advisors that scoff [at] technology or are reluctant to engage and implement and embrace technology, those are the advisors that are going to be impacted and potentially replaced by AI,” he says.
For consumers, AI won’t eliminate the need to work with a human financial planner, Donnelly says. But for those without a dedicated financial advisor, Lo has concerns.
“We don't have any guardrails yet in terms of how large language models are able to provide advice to consumers,” says Lo. “And I think that on the regulatory front, we do need to have more careful guardrails, but on the research front, it really opens up a whole new set of vistas for us to explore.”
Lo compares the situation to investing: consumers broadly have access to lower-risk mutual funds and money market accounts, but much stricter regulations govern who can deal in riskier private equity or hedge fund investments. AI, by and large, has no such guardrails.
Lo suggests a three-pronged approach to making AI’s intersection with finance safer:
- Investor education: Help investors understand that hallucinations can happen and that any responses should be checked carefully.
- Built-in guardrails: Create LLMs that can properly detect abuse and misuse.
- Regulations: Just as certain classes of financial products are limited to certain audiences, access to AI-generated financial advice should be, too.
“There are winners and losers every time you introduce new technologies, and the early adopters are the ones that may end up making a lot of the mistakes in using the technology, but they're also the ones that are much more facile and are more likely to innovate with the technology, as opposed to pushing against it,” Lo says.