Business Aviation Takes Measured Approach To AI Adoption

AI applications graphic

Business aviation companies are using genAI for some operations.

Credit: MauriceNorbert/Alamy Stock Photo

It is easy to understand why businesses worldwide are keen to use generative artificial intelligence (genAI) tools to increase efficiency and streamline workflows. The business aviation sector is no different, but several challenges must be overcome before companies can realize the benefits these tools promise.

Even when the systems are not used in safety-critical applications, there are still significant considerations that need to be addressed, particularly when a key element of the service is a guarantee of privacy and confidentiality for customers. Deciding which parts of a business aviation service are suitable for automation can be tricky.

“I think a lot of it has to do with having young leaders,” says Kyle Patel. The 33-year-old founder and president of Florida-based charter brokerage Bitlux has become a leading advocate for the use of a range of new digital technologies in business aviation. He says around one-third of his firm’s bookings are made using cryptocurrencies, and Bitlux has deployed a range of genAI tools across its operations.

Part of why the company has been able to do this, Patel argues, is that it is a relatively young business that is not hidebound by existing practices and traditional modes of working.

“There’s a lot more infrastructure in a 20 or 30-year-old company,” he says, “but AI has made it to where implementation of it into internal systems and processes is pretty easy.” He also thinks companies need “the right business leaders who are young enough to have been at the forefront of this entire thing. A lot of leaders in aviation are [older]. No offense to old people,” but growing up with technology enables a different mindset, and he thinks for older generations, “it’s just not something that’s there. It’s not seeded into what you do.”

Task Prioritization

Bitlux’s use of genAI is extensive. Inspired in part by Apple’s Siri voice-activated assistant, the company positions its services as akin to a virtual personal assistant. Patel uses genAI in a secretarial role, both personally and across the business.

“We deploy and employ AI in several different areas, from mailbox summaries to daily to-do tasks that we have that are lined up through our client services email,” he says. “All of our customers use a client services email just like a regular charter [operator] would [have]. We have the deployment where it will read and understand the emails that come in, and prioritize tasks that the team needs to perform based off of several different variables.”

During his interview with BCA, conducted using Google’s Meet videoconferencing system, Patel uses a commercially available genAI tool called Read that demonstrates some of this capability. As well as supplying a (mostly accurate) transcript of the meeting shortly afterward, Read also generates a summary of the meeting and a series of action points, itemizing the elements of the conversation where one or more participants appeared to agree to carry out a particular action or provide follow-ups. This kind of functionality is something Patel uses to help manage his business and grow its capability, using a tool separate from the ones deployed across the organization.

“I have my own separate AI system for myself that acts pretty much as my personal assistant,” he says. “I can totally brain-dump whatever I’m thinking into this thing.” On the business side, Bitlux frequently updates “buttons and routes that you can go down through our online portal … to keep track of everything,” he says. Updates occur in near real-time. “If we publish changes, all I have to do is essentially create a Google Meet with my [human] assistant and my AI assistant, and it’ll go through and take the information that I’m speaking, understand what I’m doing on my computer and will update the areas of the standard operating procedures that we have.”

Security Concerns

Rather than being a consumer of third-party genAI systems, Swedish digital aviation document firm Web Manuals chose to develop its own system to help customers navigate their document libraries. The system, called Amelia, was launched in May 2024 and has been undergoing refinement and ongoing development since.

Rather than acting in the kind of assistant role Bitlux has adopted, Amelia sticks to the same basics as most of the well-known and publicly available genAI tools. It is “a search engine that is AI-powered,” says the firm’s chief technology officer, Richard Sandstrom. This has simplified development by ensuring the tool is designed to perform a narrowly defined set of tasks.

As a result, Web Manuals has been able to make significant upgrades to Amelia during its first year in service with customers, including the ability to search documents in both Spanish and English. Sandstrom says the company is making progress toward offering search functionality even when the device being used to consult the manuals is offline. Moreover, the search function is just the first of an envisaged range of genAI tools for customers.

Nevertheless, the concerns both Web Manuals and its customers have over data security and integrity were not just at the heart of the development of the Amelia search; they remain a predominant focus of all ongoing work. “We spent more time on the security compliance than we did on the actual development,” Sandstrom says.

This focus on security is an inevitable consequence of the way genAI tools work. Each is based on a large language model (LLM), which is “trained” by giving the software access to vast amounts of representative data from which it “learns” what kind of answers a user would expect to a given question. In the case of publicly available LLMs like OpenAI’s ChatGPT or Microsoft’s Copilot, this “training” is carried out, as far as the developers have been willing to acknowledge, by pointing the software at publicly available webpages and analyzing the information published on them. When these tools are deployed within a business, the additional “training” required usually involves the software accessing data held internally on the company’s own servers and devices.

For a company wishing to deploy a genAI tool in a business-aviation environment, the concerns will therefore include the reliability of the data used to train the underlying public model, and whether confidential internal data could be exposed publicly once it is ingested into the company-specific model. A genAI tool that has, perhaps, “learned” about aviation from the public internet may occasionally generate incorrect or incomplete answers, with potentially catastrophic consequences. Even if a business does not allow a company-specific deployment to feed information back into a public LLM, there is still a risk of confidential data leaking if, for example, an employee were to use ChatGPT to generate a piece of marketing material that, unbeknownst to them, included sensitive internal information to which the LLM had access.
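One way deployments mitigate this ingestion risk is to screen documents before they ever reach a company-specific model. The sketch below is a minimal, hypothetical Python filter; the patterns and the `safe_to_ingest` helper are illustrative assumptions, not any vendor’s actual safeguard, and a production system would rely on a proper data-loss-prevention classifier rather than a few regular expressions.

```python
import re

# Hypothetical sensitivity markers for illustration only; a real
# deployment would use a dedicated data-loss-prevention classifier.
SENSITIVE_PATTERNS = [
    re.compile(r"\bCONFIDENTIAL\b", re.IGNORECASE),
    re.compile(r"\b\d{4}[- ]?\d{4}[- ]?\d{4}[- ]?\d{4}\b"),  # card-like numbers
    re.compile(r"\btail\s*number\b", re.IGNORECASE),         # client aircraft details
]

def safe_to_ingest(document: str) -> bool:
    """Return True only if no sensitive marker appears in the document."""
    return not any(p.search(document) for p in SENSITIVE_PATTERNS)

# Only documents that pass the screen are added to the model's corpus.
corpus = [d for d in ["Public route guide.", "CONFIDENTIAL client manifest."]
          if safe_to_ingest(d)]
```

The same gate can be applied at query time to outgoing answers, so that material excluded from training also cannot surface in generated text.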

“We have a lot of different safeguards and systems in place for these types of things,” Sandstrom says. “There is something called ‘bleed,’ where data can leak between users or, in the worst case, customers. It is very important for us that this doesn’t happen. If you train models, preventing things from bleeding between users is very difficult. But for us, it’s the highest priority. This is why we have internal compliance and legal teams and things like that, even though we are quite a small company.”

Partly to facilitate offline operation, Web Manuals uses real-time data mining of customer data sets rather than traditional LLM “training.” Sandstrom was involved in Amelia’s development from the inception of the idea within Web Manuals, and he says one of his biggest surprises when studying the emergent genAI realm was how little of a concern these issues apparently have been to the major public LLM developers.

“As the lead on R&D and CTO, the interesting thing for me was how little [appeared to be] done on that part in the general case,” he says. “Most AI systems ingest everything and summarize it, basically—or read Wikipedia and do generative AI on that. That approach doesn’t work for us. We cannot allow things to leak between customers, or even between users, because there might be sensitive information that a user doesn’t have access to. So this is something we spent a lot of time on.”
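Sandstrom’s point about leakage between customers and users can be sketched in code: if every search is filtered by the requesting user’s customer and role before any text is returned, results cannot “bleed” across tenants. The example below is a minimal, hypothetical Python illustration; the `Doc` structure, `search` function and sample library are assumptions made for this sketch, not Web Manuals’ actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Doc:
    text: str
    customer: str        # which customer (tenant) owns this document
    allowed_roles: set   # roles permitted to see this document

# Hypothetical in-memory document library for illustration.
LIBRARY = [
    Doc("De-icing procedure rev 4", "operator-a", {"pilot", "maintenance"}),
    Doc("Crew salary schedule", "operator-a", {"finance"}),
    Doc("De-icing procedure rev 2", "operator-b", {"pilot"}),
]

def search(query: str, customer: str, role: str) -> list:
    """Search only documents the requesting user is entitled to see,
    so results can never cross customer or role boundaries."""
    return [d.text for d in LIBRARY
            if d.customer == customer
            and role in d.allowed_roles
            and query.lower() in d.text.lower()]
```

Because the access check runs before matching, a pilot at one operator never sees another operator’s revisions, and a user without finance access never sees salary data, whatever they type.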

Hallucinate Or Elucidate?

Another area that developers in safety-critical sectors need to prioritize is avoiding the tendency exhibited occasionally by some public LLMs to “hallucinate,” or give responses to queries that read like reliable answers but include apparently bona fide information that has no basis in fact. When this happens during a general-interest query, it is annoying. If it were to happen when a charter broker was offering a quote, it could cost the company lost business. If it were to occur when an aircraft maintainer was trying to find the correct method for completing a repair, the ultimate consequence could be the deaths of everyone on board the aircraft.
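One common way to suppress hallucination is to ground every answer in retrieved source text and decline when nothing matches, rather than letting a model generate freely. The following is a minimal, hypothetical Python sketch; the naive term-overlap matching, the `grounded_answer` helper and the sample manual snippets are illustrative assumptions, not any of these companies’ actual systems.

```python
def grounded_answer(question: str, passages: list) -> str:
    """Return verbatim source text only when a retrieved passage shares
    terms with the question; otherwise decline instead of guessing."""
    terms = {w for w in question.lower().split() if len(w) > 3}
    hits = [p for p in passages if terms & set(p.lower().split())]
    if not hits:
        return "No supported answer found in the manuals."
    return hits[0]  # quote the source text rather than generate freely

# Hypothetical maintenance-manual snippets for illustration.
manual_passages = [
    "Torque the fastener to 25 Nm per AMM 32-41.",
    "Cabin lighting checklist.",
]
```

Because the function either quotes a real passage or refuses, a wrong torque value cannot be invented; the trade-off is that some answerable questions are declined, which is the cost of precision over prediction.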

Flight-management software firm FL3XX and cost-management platform provider MySky have partnered on two initiatives, integrating the former’s flight-specific tools with the latter’s Spend and Quote systems, which, respectively, analyze historic aircraft expenditure and provide predictive budgeting for future flights. Both firms are working to automate and streamline internal operations, and both therefore have been deploying genAI to provide their respective services. Avoiding “hallucination” is of vital importance here, too.

“FL3XX employs purpose-built AI that enhances automation and operational efficiency without generating speculative content,” says CEO and co-founder Paolo Sommariva. “For example, it extracts data from customer emails to produce fast, accurate charter quotes. Our AI is grounded in the operator’s own workflow and we’re always transparent about what the technology is doing and why. Of course, data protection and privacy play a major role in how we deploy AI,” he says. FL3XX does not use personal or protected data to train AI because “experience shows there’s no need to. This internal, user-generated data forms the basis of our AI-driven insights, offering unparalleled visibility without breaching customer privacy. All data usage is consent-based and aligned with the operator’s own platform use,” Sommariva says.

“The data we work with is owned or permissioned by our clients—operators, management companies and owners,” says MySky’s co-founder, Chris Marich.

This, he says, gives the company “access to highly detailed cost, procurement and trip-level operational data” which “enables us to provide precise and confidential financial analytics without breaching privacy standards.”

Just as importantly, if not more so, Marich says his firm is “focused on precision over prediction,” which means that its technology “doesn’t generate speculative answers. It processes verified cost and operational data to support informed decision-making.”

This perhaps implies that limits remain on the amount of enhancement that can be gained from adopting genAI tools in a safety-critical industry. If “hallucinations” are not acceptable, then the systems will have to be configured in a way that prevents them from happening. This may mean they respond slightly more slowly, or that their outputs may require slightly more time to interpret and implement.

There will still be benefits, but when safety has to be guaranteed, there will, necessarily, be limits on how numerous those benefits may be—and how quickly they can be realized.

“We think of our AI more as a financial co-pilot, designed to enhance accuracy in budgeting, quoting and spend analysis, not to replace human judgement,” Marich says.

“While public tools may cause concern, our clients appreciate that MySky’s AI is rooted in real-world data with clear, auditable outputs, building confidence rather than eroding it.”

Angus Batey

Angus Batey has been contributing to various titles within the Aviation Week Network since 2009, reporting on topics ranging from defense and space to business aviation, advanced air mobility and cybersecurity.