Strategic Tech and Market Signals

The Market Trusts Buildable AI, But Still Waits for AI That Customers Will Pay For

May 14th, 2026

Executive Summary: Investment markets are applying two different standards of evidence to AI. The market has been willing to believe in AI infrastructure because GPUs, data centers, AI servers, optical communications, liquid cooling, power equipment, and supply-chain orders can be built, measured, and reflected in financial results. But when the discussion shifts to SaaS and AI applications, the market asks for clearer proof of commercialization, including enterprise willingness to pay, user habits, workflow change, …

Google and Anthropic’s Competition: Two Different Paths in the AI Era

April 24th, 2026

Executive Summary: Google and Anthropic are not the most obvious rivals, but that is precisely why the comparison is worth paying attention to. Google represents a full platform path, with models, cloud infrastructure, enterprise tools, and global scale, and seeks to absorb AI into its existing platforms and enterprise systems. Anthropic represents a more focused path, seeking to build a long-term position across multiple platforms through model capabilities, enterprise trust, and clear positioning. …

Why Analyzing AI Forced Me to Reclaim the Skills of a Financial Analyst

April 16th, 2026

AI has forced me to rethink not only how I read companies, but also how I read markets. What began as industry analysis gradually led me back to skills I once used as a financial analyst. This essay is a reflection on why that happened. I never expected that one day I would write an essay like this. For me, industry analysis has always had a certain kind of …

Why New AI Demand Still Often Flows to the NVIDIA Ecosystem

April 14th, 2026

Executive Summary: The AI compute market is becoming increasingly diverse. Large cloud providers continue to push forward with in-house ASIC and XPU development, and the number of alternatives to NVIDIA keeps growing. In theory, new AI demand should become more evenly distributed across different architectures, rather than continuing to concentrate in the NVIDIA ecosystem. But when several recent signals are viewed together, the key question may not simply be who has compute. It may be …

Could AI’s Next Growth Phase Be Faster Than Expected?

April 1st, 2026

Executive Summary: A recent remark by Groq founder Jonathan Ross raises an important question. If models begin to improve the quality of their own learning signals, then the AI growth logic we have become familiar with may no longer follow the same path of diminishing returns. This article does not ask whether Ross’s claim should be accepted at face value. It asks whether the idea behind it is already supported by a set of meaningful …

The Linear Narrative Around AI Memory Demand May Be Starting to Show Small Cracks

March 26th, 2026

Executive Summary: In current discussions around AI infrastructure, the market broadly assumes that memory demand will continue rising steadily as models scale, inference workloads expand, and HBM and DRAM remain under supply pressure. This narrative is grounded in real conditions, which is also why it appears especially durable. But once the focus shifts from demand itself to system design, the picture becomes less straightforward. As memory supply, cost, and capacity allocation increasingly become real constraints, …

After the Groq Move, NVIDIA’s Moat May Be Deeper Than It Appears

March 20th, 2026

Executive Summary: At first glance, NVIDIA’s move to incorporate the Groq-based NVIDIA Groq 3 LPX into the Vera Rubin platform may look like a new approach to inference workload allocation. But the real focus of this article is not the technical detail itself. It is whether this move suggests that NVIDIA’s moat may be deeper than it previously appeared. The argument here is that NVIDIA’s competitive strength may not rest only on chip performance, …

The Expansion Logic of AI Infrastructure Is Changing

March 19th, 2026

Executive Summary: Several recent signals that appear unrelated at first glance may in fact point to a shift in how decisions around AI infrastructure are being made. Adjustments to the expansion pace of the Abilene data center by OpenAI and Oracle, together with Meta’s description of its in-house AI chip roadmap for MTIA, suggest that companies are facing the same underlying question. As model development, chip generations, and infrastructure construction cycles become increasingly out of …

A Second Path Beyond the GPU? Architectural Thinking Behind NVIDIA’s Licensing Agreement with Groq

March 5th, 2026

Executive Summary: NVIDIA’s licensing agreement with Groq is worth watching not only because the technology itself is so extreme, but because it may signal that AI compute architecture is being reconsidered. Even after GPUs have become the dominant platform for AI training and inference, NVIDIA is still willing to engage seriously with an execution model that runs almost counter to the mainstream path. That suggests the demands of the inference era may be making determinism important …

CPU as an AI Pillar, Is Arm Approaching a Structural Inflection?

February 27th, 2026

Note (March 2026): I wrote this piece before Arm officially unveiled its own data center CPU. That does not make the original argument irrelevant, but it does change the context in an important way. I am keeping the article largely as it is because the framework still helps explain what to watch. What has changed is that some of the questions discussed here are no longer purely hypothetical. They can now be read …
