Howard A. Rubin is an American technology economist, executive advisor, and former senior analyst at Gartner, known for his work on IT economics, benchmarking, technology strategy, and digital transformation. Over several decades, he has advised large enterprises and government organizations on how technology investments drive business performance.
We sat down with Howard to discuss how the global economy is becoming a technology economy, one that requires new economic measures different from those in use today.
You’ve said the global economy is now being driven by technology. What do you mean by that?
Historically, different eras have had different economic engines. In the 1800s, there were railroads and the Industrial Revolution. In the early 1900s, it was automobiles and aviation. In the 1970s, the dominant force was energy and oil.
Today, the driver is technology.
The problem is that the economic measures we rely on—things like GDP and the Consumer Price Index—were created around the time of World War I. They were designed to measure inflation and industrial output in a manufacturing economy.
They were never built to measure the digital economy. Yet technology is now moving markets, shaping business models, and driving economic growth.
You’ve been developing the concept of “technology economics.” What does that mean?
Technology economics is about understanding the outcomes created by technology investments.
Most organizations treat technology primarily as an accounting issue—tracking budgets, depreciation, or IT spending ratios. That’s necessary for financial reporting, but it doesn’t tell you whether technology is improving the business.
Technology economics asks different questions:
How is technology improving operational efficiency?
How is it strengthening customer relationships?
How is it accelerating innovation?
When you look at the numbers globally, the technology economy itself is roughly eight to nine trillion dollars. The only economies larger than that are the United States and China. That effectively makes technology the third-largest economy in the world.
Yet we still don’t have a clear way to measure how it works.
Many companies track technology spending as a percentage of revenue. Why isn’t that enough?
Because the number alone doesn’t tell you anything meaningful.
A company could have higher technology spending than its peers because it’s innovating aggressively and investing for the future. Another company could have high spending simply because its infrastructure is inefficient.
Looking at IT spending as a percentage of revenue doesn’t tell you which of those situations is true.
What matters is whether technology investments are producing real outcomes. Companies compete through operational efficiency, customer intimacy, faster innovation, and stronger reputation. Technology should be measured against those outcomes.
If you bring in technology without targeting the outcome you expect, nothing happens.
Why do so many technology initiatives fail to deliver value?
Because organizations often adopt technology as a fashion statement.
One company moves to the cloud, and suddenly, everyone else believes they need to move to the cloud as well. The same thing is happening now with artificial intelligence.
Companies say, “We need AI.” But they haven’t defined what business problem they’re trying to solve or what outcomes they expect.
When technology is adopted without a clear objective, it rarely improves margins or productivity. It simply becomes another cost.
Artificial intelligence is dominating the technology conversation right now. How do you see it?
AI is a very powerful technology, but we’re seeing a failure of expectations. People expect immediate transformation. In reality, the technology is still evolving.
Some organizations have invested billions of dollars in AI and are only beginning to see results. In many cases, the financial impact is still relatively small.
Another issue is that AI still requires significant human oversight. These systems can produce impressive results quickly, but they also make mistakes. You need human expertise to interpret the results and verify them.
Right now, the best model is AI combined with human judgment.
Some people argue that AI will dramatically reduce work. Do you agree?
Not necessarily.
Research is beginning to show that AI often intensifies work rather than reducing it. When people use these tools, they gain access to more information and more analysis, which means they have more decisions to make.
Employees tend to work faster and take on more tasks. In many cases, they also work longer hours.
What is the right way for companies to approach AI investment today?
The key is being very targeted.
Instead of broad experimentation, companies should focus on specific problems where AI can produce measurable improvements. They should track performance carefully and adjust their strategy as they learn.
Organizations also need to decide whether they want to be first movers or fast followers. Some companies are comfortable taking the risk of leading innovation. Others prefer to wait, observe what works, and adopt technologies once they mature.
Both strategies can work, but they require discipline.
You’ve also been talking about something called “IT inflation.” What is that?
IT inflation refers to the rising cost of technology.
Software prices are increasing. Hardware costs are rising due to supply chains and tariffs. Building data centers is more expensive than ever.
At the moment, technology costs are rising at roughly seven percent annually. But many companies are only increasing their technology budgets by about three percent.
That gap creates pressure. Eventually, organizations have to either increase their budgets or dramatically improve efficiency.
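The pressure compounds year over year. A minimal sketch, using purely illustrative figures (a hypothetical $100M budget, with the ~7% cost growth and ~3% budget growth mentioned above), shows how quickly the shortfall widens:

```python
# Illustrative sketch of the "IT inflation" gap: hypothetical numbers,
# assuming technology costs rise ~7% per year while budgets grow ~3%.

def funding_gap(budget, cost, budget_growth=0.03, cost_growth=0.07, years=5):
    """Project the yearly shortfall between rising technology costs
    and a more slowly growing technology budget."""
    rows = []
    for year in range(1, years + 1):
        budget *= 1 + budget_growth
        cost *= 1 + cost_growth
        rows.append((year, round(budget, 1), round(cost, 1), round(cost - budget, 1)))
    return rows

# Start both at $100M: after five years the same workload costs ~$140M
# while the budget has only reached ~$116M, a gap of roughly $24M.
for year, budget, cost, gap in funding_gap(100.0, 100.0):
    print(f"year {year}: budget {budget}, cost {cost}, gap {gap}")
```

At those growth rates, a budget that starts fully funded is roughly 17% short within five years, which is why the only exits are bigger budgets or dramatic efficiency gains.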
Some analysts are warning about a potential AI bubble. Do you see that risk?
The issue isn’t necessarily AI itself. The issue is the scale of infrastructure investment.
Right now, companies are spending enormous amounts of money building data centers and other infrastructure to support AI systems.
If the revenue generated by those systems doesn’t keep pace with the investment, you could end up with a significant imbalance.
Technology infrastructure also becomes obsolete quickly. If trillions of dollars are invested in systems that need to be replaced in a few years, the financial consequences could be substantial.
You’ve used the phrase “economic agility.” What does that mean for organizations?
Economic agility is the ability of a company to adjust its cost structure quickly when conditions change.
Many companies are not very agile because large portions of their costs are locked into long-term contracts with suppliers. When revenue declines, the only thing they can change quickly is staffing.
Ideally, organizations should design contracts and supply chains that allow costs to scale with the business. Employees should be the last resource you lose, not the first.
What ultimately needs to change in how we think about the technology economy?
We need new ways to measure it.
Technology is now central to how businesses operate and how economies grow. Yet many of the indicators we rely on were designed for a completely different era.
Technology economics is an attempt to understand the entire system—investments, infrastructure, outcomes, workforce skills, and economic impact.
If technology is going to remain one of the dominant forces shaping the global economy, we need better ways to understand how it actually creates value.