Increase your firm’s AI search visibility



Most financial advisors have probably never heard of a “robots.txt” file, but Christopher Hensley, president and CEO of Houston First Financial Group, thinks it’s time they learn.

In short, robots.txt is a file that sits on a website and acts like a doorman.

It tells crawlers, from AI-powered search engines like Google to “answer engines” like Perplexity, what they’re allowed to see and what’s off-limits.

When AI search first took off, many website hosts panicked and blocked AI crawlers completely, said Hensley.

“That kept content from being used to train large language models, but it also meant those sites were invisible in AI-driven search results,” he said. “You could have the best content in the world, but AI tools like ChatGPT or Perplexity wouldn’t surface it.”
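The blanket blocks in question are typically only a few lines long. As a rough, hypothetical illustration (the crawler names below are common examples, not a complete list), an overly restrictive robots.txt might read:

```
# Blocks OpenAI's and Perplexity's crawlers from the entire site,
# which also keeps the site out of their AI-driven search results.
User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Disallow: /
```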

Hensley said he had to deal with this on two fronts: his firm’s site and the one for his podcast. 

His financial planning website is hosted on a compliance-safe industry platform. When he noticed his site’s default robots.txt was blocking AI crawlers, he called support.

“They had no idea what I was talking about,” he said. “It took multiple escalations to finally get someone who could make the change.”


With the website for his podcast, “Money Matters,” Hensley had more freedom.

“Instead of hiring a consultant — the hosting company quoted me $200 an hour to write the code — I went the frugal route,” he said. “I literally asked ChatGPT to walk me through it. … I’m not a programmer, but I followed the instructions, and it worked.”

After making that fix and a few others, Hensley said he started ranking higher in search than ever before.

“You don’t need a complete rebuild or an expensive consultant,” he said. “A handful of smart, intentional adjustments can open the door to a whole new layer of visibility.”


Just a few technical website coding changes can pay similar dividends for advisors looking to be included in AI-powered search results, experts say.

Vincent D’Eletto, an advisor and operating manager at Investment Insight Wealth Management in South Farmingdale, New York, previously founded WordAgents.com, an SEO content writing service.

“The big thing is to make sure that your robots.txt file isn’t blocking key web crawlers like Googlebot and whichever crawlers the LLMs are currently using,” he said.
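As a hedged sketch of what D’Eletto is describing (actual crawler user-agent names change over time and should be checked against each provider’s current documentation), a robots.txt that leaves the door open to both traditional and AI crawlers might look something like this:

```
# Explicitly allow Google's crawler and common AI crawlers.
User-agent: Googlebot
Allow: /

User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Other crawlers can see public pages but not, say, a private client portal.
User-agent: *
Disallow: /portal/
```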

Zoe Meggert, founder of Perfectly Planned Content, a boutique content marketing agency for financial advisors and RIAs, said robots.txt optimization and schema markup (a specific type of website coding) are foundational to modern search visibility, especially as AI-driven search tools become the norm. She said Hensley’s experience highlights a critical gap.

“Many advisors are unknowingly invisible to AI search engines because their hosting platforms default to blocking AI crawlers,” she said.

Adding schema markup

There are other simple changes advisors can make to increase visibility, including adding Q&A-style FAQ pages and soliciting reviews.

Besides these, adding schema markup to his site’s HTML helped Hensley improve his firm’s search visibility.

Schema markup helps search engines better understand what a given web page is about, said D’Eletto. He has leveraged both the robots.txt file and schema markup at his current firm.

“It’s still early days regarding AI optimization,” he said. “We’re still in the ‘let’s see what works’ phase.”
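Even so, the basics are concrete. For instance, the Q&A-style FAQ pages mentioned earlier can be labeled for search engines with schema.org’s FAQPage type; a minimal, hypothetical JSON-LD snippet (the question and answer are invented placeholders) might look like this:

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Do you work with clients outside of Texas?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Yes, we meet with clients nationwide by video conference."
      }
    }
  ]
}
</script>
```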

Laurie Williams, founder and senior marketing consultant at Trumpet Marketing Group in Germantown, Maryland, said that for her SEO clients she has systematically reviewed their websites and strategically added schema types such as company information, services, offers and local business details.

“These are the most commonly used schemas, and the results have been positive,” she said. “We are seeing improved Google Maps visibility, higher search rankings and inclusion in AI-generated answers. So, it’s well worth implementing if you want maximum exposure for your brand.”
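For advisors curious what that looks like in practice, here is a rough, hypothetical JSON-LD sketch along the lines Williams describes, combining company information, local business details and a service offer (every value below is an invented placeholder):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FinancialService",
  "name": "Example Wealth Advisors",
  "description": "Fee-only financial planning for small-business owners.",
  "url": "https://example.com",
  "telephone": "+1-555-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St.",
    "addressLocality": "Houston",
    "addressRegion": "TX",
    "postalCode": "77002",
    "addressCountry": "US"
  },
  "areaServed": "Greater Houston",
  "makesOffer": {
    "@type": "Offer",
    "itemOffered": {
      "@type": "Service",
      "name": "Comprehensive financial planning"
    }
  }
}
</script>
```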

AI search takes the path of least resistance, said Meggert. It won’t work hard to find the best answer, so make it easy to find, she said.

“Clearly define who you are, what you do and who you serve on your website,” she said.

Advisors should consider creating a plain-text, easily readable page, published but not prominently linked, that answers the questions AI might have: who you are, what you do, how you charge and who you help, along with case studies.

“Think of it as an ‘AI fact sheet’ for your practice,” she said.
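The exact format matters less than the content, but a bare-bones sketch of such a page (all details below are invented placeholders) could be as simple as:

```
Example Wealth Advisors – AI Fact Sheet

Who we are: A fee-only advisory firm based in Houston, Texas.
What we do: Comprehensive financial planning and investment management.
Who we serve: Small-business owners and pre-retirees.
How we charge: A flat annual planning fee, disclosed on our fees page.
Case studies: Brief, compliance-approved summaries of typical client situations.
```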

Today it’s robots.txt, tomorrow it’s llms.txt

A newer type of indexing file, similar to robots.txt, is starting to be adopted: llms.txt.

This file tells large language models like ChatGPT how they’re allowed to use the content on your site, said Williams.

“Unlike robots.txt, which mainly governs crawling, llms.txt also addresses model training, indexing and retrieval,” she said.

This helps AI systems consume your website’s content more meaningfully, said Williams.

“At this stage, not all AI models honor it, but it doesn’t hurt to include it,” she said.

The llms.txt file is essentially a “treasure map” for AI systems, telling them which URLs on a site contain the highest-quality, most important content, said Brady Lewis, senior director of AI innovation at fractional marketing firm Marketri in Pendleton, Indiana.

“Unlike robots.txt, which says, ‘You can or can’t crawl this,’ llms.txt says, ‘Here’s exactly what you should pay attention to,'” he said.
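Under the llms.txt proposal, the file is plain Markdown served from the site root. A hypothetical version for an advisory firm (all names and URLs are invented for illustration) might look like this:

```
# Example Wealth Advisors

> Fee-only financial planning firm serving small-business owners in Houston.

## Key pages
- [Our services](https://example.com/services): Planning and investment management offerings
- [How we charge](https://example.com/fees): Flat-fee pricing explained
- [About the team](https://example.com/about): Advisor credentials and background

## Resources
- [FAQ](https://example.com/faq): Answers to common client questions
```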

Companies like OpenAI, Anthropic and Perplexity have started referencing llms.txt when crawling sites, and some early adopters are seeing improved visibility in AI-powered search results, said Lewis.

“Think of it as curating your content for AI,” he said.

Solid fundamentals and marketing basics still matter

Even with all these advancements, there is no magic bullet for visibility.

“What many don’t realize is that sites struggling with regular SEO won’t magically start ranking in AI search just because they opened the door to crawlers,” said Dan Sondhelm, CEO at Sondhelm Partners, a provider of integrated distribution, marketing and public relations for the asset management industry. “Most sites that appear well in AI results were already performing reasonably well in traditional search. They had solid content, decent performance and actual visibility.”

Sites with fundamental problems won’t benefit much from technical tweaks, said Sondhelm. These include poor loading speeds, thin content, bad navigation and a lack of real expertise in their area.

“You need to address those basics first,” he said.

Individual changes can boost your visibility, said Sondhelm. However, to consistently outperform competitors over time, advisors need everything to work together, he said.

“Quality content that people want to read and share,” he said. “A website that loads fast and works on phones. Easy navigation. Strategic subheadings that help search engines understand your topics. Content written so AI tools can efficiently extract useful pieces for their responses.”


