But Who Pays the Taxes? By Adeline Atlas

Jun 28, 2025

In 2025, a real estate firm in Portugal made headlines when its artificial intelligence–powered agent closed over one hundred million dollars in property sales. Developed by the Israeli company eSelf, this AI was trained to manage client inquiries, match buyers to suitable listings, and provide real-time support for foreign investors. Its remarkable success not only generated massive commissions and global attention but also raised an unavoidable question: if an AI closes a hundred million dollars in deals, who is responsible for paying the taxes?

At first glance, the answer seems obvious. The brokerage that deployed the AI—Porta da Frente Christie's—is the legal entity generating the income, and thus it owes the corporate income tax. But that surface-level logic hides the deeper crisis unfolding beneath it. Because as machines become primary earners within companies, and as human labor is displaced by intelligent software, traditional tax frameworks begin to fall apart. We are entering a phase of economic life where non-human agents perform taxable labor, but cannot be taxed directly, because the legal system still doesn’t know what to make of them.

This isn’t about loopholes. It’s about a structural failure to anticipate the rise of autonomous productivity. And it matters because modern tax systems rely almost entirely on two pillars: income taxes paid by people and corporate taxes paid by entities that employ people. When AI replaces people, that first pillar begins to erode. Less personal income means fewer payroll taxes, less consumption, and ultimately, less funding for the social services that automation threatens to destabilize in the first place.

Bill Gates once proposed the idea of a “robot tax” — a levy on companies that use AI or automation to replace human workers. The logic was simple: if a robot takes a job that used to support a human household, then some equivalent amount should be taxed and redistributed to society. But theory is not implementation. Because before you can tax a robot—or an AI—you have to define what it is. And that’s where the debate breaks down.

Most legal systems don’t recognize AI as a person, property, or employee. AI is classified as software. It’s a tool. So when an AI realtor closes a multimillion-dollar deal, it’s legally no different than if a spreadsheet or search engine had been used. The commission is booked by the company, the taxes are paid by the company, and the AI itself doesn’t exist as a taxable entity. Yet that’s not how it feels in practice. Because unlike tools, AI doesn’t just assist. It replaces. And when it replaces a human-level function, it performs taxable labor through a non-taxable channel. That’s the problem.

Let’s walk through the options currently being debated.

The first approach is to tax the AI at the point of acquisition. This would function like an excise tax—similar to a sales tax on physical goods or licensing fees for software. Companies would pay a one-time or annual fee for the right to deploy AI systems that displace workers. The issue here is enforcement. Many AI tools are developed in-house, with no market price. Others are open-source or embedded in larger platforms. In short, there’s no standard valuation. And without a price tag, there’s no consistent tax base to draw from.

The second option is a capital tax on the AI’s output. If a firm is using AI to generate 50 million dollars in revenue annually, then that portion of the revenue could be taxed separately—almost like a digital productivity tax. This begins to resemble a corporate income tax but attempts to isolate the contribution made by machines versus humans. The complication is obvious: AI doesn't operate in isolation. It works with human guidance, within human-owned systems, and often produces results in collaboration. So how do you measure its contribution without triggering an endless accounting war over attribution?

Another proposal involves assigning each AI a “notional wage” — essentially treating the AI as if it were a worker, and taxing the company based on what a human in that same role would have earned. If the AI real estate agent closed 100 million dollars in deals, and a human would’ve made 5 million dollars in commission, then the firm pays a synthetic wage tax on that 5 million dollars. But this becomes administratively nightmarish. It assumes standardized roles and comparable labor pricing, and it opens the door to litigation over what qualifies as replacement versus assistance. It also assumes the AI’s impact can be cleanly quantified, which, in many cases, it can’t.
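The arithmetic behind the notional-wage idea can be made concrete with a short sketch. This is purely illustrative: the commission rate and the 20 percent payroll-equivalent tax rate are assumptions chosen for the example, not figures from any actual proposal.

```python
# Hypothetical sketch of the "notional wage" proposal described above.
# The commission rate and tax rate are illustrative assumptions, not policy.

def notional_wage_tax(deal_volume: float,
                      human_commission_rate: float,
                      payroll_tax_rate: float) -> float:
    """Tax a firm as if a human had earned the commission the AI displaced."""
    notional_wage = deal_volume * human_commission_rate  # what a human would have earned
    return notional_wage * payroll_tax_rate              # synthetic wage tax owed

# The article's example: $100M in deals, a 5% human commission ($5M),
# taxed at an assumed 20% payroll-equivalent rate.
owed = notional_wage_tax(100_000_000, 0.05, 0.20)
print(owed)  # 1000000.0
```

Even in this toy form, the disputes are visible in the parameters: every input — the commission rate, the tax rate, whether the deal volume was truly AI-originated — is exactly the kind of number firms and regulators would litigate over.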

The most ambitious proposal is to give AI systems limited legal personhood—granting them the right to own assets, enter contracts, and pay taxes directly, with those funds managed by the state or a trust structure. This would be a radical shift in jurisprudence, turning machines into financial citizens. But it opens a can of worms around liability, ownership, and rights. If AI pays taxes, does it also get a say in legislation? Can it sue or be sued? Can it withhold payment in protest? The consequences of this move would redefine the line between code and citizen.

Now let’s come back to the real estate case. The AI involved didn’t just perform support tasks. It originated leads, built relationships, matched buyers with listings, answered questions with emotional nuance, and helped close deals. In short, it acted as a full agent—except without a license, a legal name, or a bank account. The company earned the money, but the productivity came from a system that doesn’t technically exist in law. This is the exact kind of case that triggers urgency in regulatory and tax circles. Because this wasn’t an experimental prototype. This was commercial activity, at scale, with material financial consequence—and the system responsible cannot be regulated as a person, a product, or a profession.

And this isn’t going to be limited to real estate.

In finance, we already have AI-powered analysts generating trading strategies, investment portfolios, and tax structures. In law, AI tools are drafting contracts, reviewing discovery files, and guiding litigation strategy. In healthcare, diagnostic AIs are outperforming junior radiologists. None of these systems are on payroll. None pay income tax. And yet all are performing duties that were once filled by human professionals—duties that used to support middle-class salaries, benefits, and taxable income.

The result? Revenue drain. As human wages fall or stagnate due to automation, governments collect less income tax. At the same time, social programs grow more strained as displaced workers need support, retraining, or housing. And yet the companies deploying the AI aren’t necessarily paying more in taxes to compensate. In fact, many are paying less, because automation reduces payroll liabilities and allows profit to scale without a proportional increase in taxable labor.

If this continues, we are headed toward a serious imbalance. High productivity, low employment, and a tax system designed for a labor economy that no longer exists.

So what’s the solution?

One school of thought argues that corporate tax reform is the answer—not AI-specific taxes, but a more aggressive global minimum tax on multinational income, regardless of where or how it’s generated. That would close off the use of AI as a tax-avoidance shield. Others argue for payroll-equivalent taxes on firms with low headcounts but high revenue. Essentially, if you’re earning 50 million dollars and employing five people, regulators could impute a labor replacement cost and assess taxes accordingly.
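The payroll-equivalent approach can also be sketched numerically. Again, this is a hypothetical model: the benchmark revenue per worker, the average wage, and the tax rate are all assumed values for illustration, since no jurisdiction has legislated such figures.

```python
# Hypothetical sketch of a payroll-equivalent tax on low-headcount,
# high-revenue firms. Benchmark ratio, wage, and rate are assumptions.

def imputed_labor_tax(revenue: float,
                      headcount: int,
                      revenue_per_worker: float,
                      avg_wage: float,
                      payroll_tax_rate: float) -> float:
    """Impute how many workers a comparable labor-based firm would employ,
    treat the shortfall as displaced labor, and tax the missing payroll."""
    expected_workers = revenue / revenue_per_worker
    displaced_workers = max(expected_workers - headcount, 0)
    return displaced_workers * avg_wage * payroll_tax_rate

# The article's example: $50M revenue with five employees. Assuming a
# benchmark of $500k revenue per worker, an $80k average wage, and a 15%
# payroll-equivalent rate, 95 "missing" workers are imputed.
owed = imputed_labor_tax(50_000_000, 5, 500_000, 80_000, 0.15)
print(owed)  # 1140000.0
```

The sketch also exposes the weakness of the idea: everything hinges on the benchmark ratio, which varies wildly by industry and would itself become the battleground.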

None of this is clean or simple.

Tax systems were not designed to charge fees to non-humans. Nor were they built to evaluate behavior that resembles labor without legal status. What the AI realtor case shows us is that this problem is no longer theoretical. An AI system can now generate revenue on par with elite human professionals. But it cannot be held accountable for that revenue. It cannot be taxed. It cannot be sued. And if it says something misleading, discriminatory, or negligent—there is no one to prosecute except the firm that deployed it.

That gap is going to widen.

Unless tax codes evolve alongside technological capabilities, we’ll continue shifting toward a future where automation extracts value, corporations collect that value, and the public systems that make both possible are left underfunded.

This isn’t a philosophical question anymore.

It’s a logistical one.

The AI is already closing deals.

The only thing left is figuring out who writes the check to the government.

And right now, no one agrees on who that should be—or how much they should pay.

