
  • Decisions Under Uncertainty: The Venture Investor’s Framework

Decision-making remains one of a manager’s most critical and difficult responsibilities. Every strategic choice reflects a judgment about the future, yet many leaders still view it as more art than science. As business environments grow more complex and data-rich, the challenge lies not in access to information but in the lack of structure to interpret it. This article explores how Decision Analysis provides that structure, connecting its theoretical foundations to practical applications in strategy and venture investing.

Figure 1: Decision Analysis Process Flowchart (Arsham, 1994)

I. The Concept: What Decision Analysis Is

According to the Corporate Finance Institute (CFI), Decision Analysis (DA) provides a systematic framework for evaluating alternatives when outcomes are uncertain. Originating in economics and engineering, DA helps decision-makers navigate complexity through three interconnected steps:

Problem Framing: defining the decision, clarifying objectives, and identifying constraints.

Model Construction: translating qualitative options into quantitative or probabilistic models that capture relevant uncertainties.

Evaluation and Choice: assessing each alternative by its expected value or utility to determine the most favourable path forward.

In contrast to instinctive or purely experience-based approaches, Decision Analysis makes the reasoning behind choices explicit. It allows managers to visualize possible futures, compare trade-offs, and document assumptions.

II. The Foundations: How Probability Guides Rational Choice

At the heart of Decision Analysis lies the concept of probability. While no amount of analysis can remove uncertainty, probability provides a rational structure for comparing potential outcomes. As noted by H. Arsham (1994) in Tools for Decision Analysis, probability serves as a disciplined substitute for perfect information: it enables decision-makers to express incomplete knowledge in quantitative form and to reason about risk systematically rather than intuitively.

Several key principles underpin this reasoning:

Expected Utility: Good decisions balance outcome and likelihood. In practice, this may mean choosing an option with a steady, reliable payoff over one with a large but unlikely return.

Trade-off Logic: Every choice involves balancing risk and reward. Sound judgment ensures that the possible gains are worth the risks taken.

Value of Information: New information matters only if it changes how we see the odds or affects which option we choose.

This encourages leaders to ask not “What will happen?” but “Given what we know, what is most likely to happen, and what are we willing to risk?”

III. The Process: How Data Becomes Judgment

Rational decision-making depends not only on statistical tools but also on how organizations transform information into actionable knowledge. According to Arsham (1994), this process follows a progressive hierarchy that links empirical observation to managerial insight:

Stage       | Description                                    | Decision Purpose
Data        | Raw, unprocessed observations.                 | Establish a factual baseline.
Information | Data contextualized for a specific question.   | Filter relevance from noise.
Fact        | Verified information supported by evidence.    | Anchor reasoning in reliability.
Knowledge   | Integrated understanding of causal patterns.   | Guide forecasting and interpretation.
Decision    | Application of knowledge to select an action.  | Translate insight into commitment.

IV. The Behavioral Dimension: How Human Factors Influence Decisions

Even the most rigorous models depend on human judgment, and people, by nature, are inconsistent. As Arsham (1994) noted, decision-makers often face psychological traps that can quietly distort their reasoning.
These distortions usually come from two main sources: bias and noise. Addressing similar issues, Kahneman, Lovallo, and Sibony (2011) suggested a set of simple but structured habits they called decision hygiene:

Premortems: imagine a decision has failed, then work backward to uncover what might have caused it.

Independent Evaluations: ask individuals to form and record their opinions before group discussion to preserve diverse perspectives.

Comparative Audits: review similar past decisions to spot unexplained differences or weak spots in reasoning.

Importantly, emotions are not just sources of bias; they also express what we value. They help define what feels acceptable or fair. The best decision-making combines analytical clarity with emotional awareness, ensuring outcomes that are both reasonable and human.

V. The Application: How Decision Analysis Operates in VCs

Venture capital, at its core, is a living experiment in decision-making. It operates in an environment where uncertainty is constant and information is never complete. Every investment requires judgment under ambiguity, making structure not a constraint but a necessity.

Clint Korver (2012), co-founder and managing director of Ulu Ventures, brought this structure into the heart of venture investing. Drawing on his Stanford Ph.D. in Decision Analysis, he developed a framework that treats each deal not as a binary yes-or-no bet but as a probabilistic model: a map of uncertainty that can be reasoned through.

In Korver’s framework, every opportunity is dissected across four critical dimensions:

Market Feasibility: Is there real, scalable demand?

Product Execution: Can the team deliver with speed and quality?

Team Adaptability: How well can they adjust as new information emerges?

Financing Continuity: Can capital be sustained through key milestones?
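A probability-weighted evaluation of this kind can be sketched in a few lines. The scenario probabilities and exit multiples below are invented for illustration; Ulu Ventures’ actual models are far richer, but the expected-value arithmetic is the same:

```python
# Illustrative probability-weighted multiple on investment (PWMOI).
# Each scenario carries an explicit probability and an exit multiple.
# All numbers are made up for illustration.
scenarios = {
    "low":  {"prob": 0.25, "multiple": 0.0},   # write-off
    "base": {"prob": 0.55, "multiple": 3.0},   # solid but unspectacular exit
    "high": {"prob": 0.20, "multiple": 20.0},  # outlier outcome
}

def pwmoi(scens: dict) -> float:
    """Probability-weighted multiple: the expected value of the exit multiple."""
    total_prob = sum(s["prob"] for s in scens.values())
    assert abs(total_prob - 1.0) < 1e-9, "scenario probabilities must sum to 1"
    return sum(s["prob"] * s["multiple"] for s in scens.values())

print(pwmoi(scenarios))  # 0.25*0 + 0.55*3 + 0.20*20 = 5.65
```

The point of the exercise is not the 5.65 itself but that every input is written down: change a probability and the disagreement becomes visible and debatable.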
Each factor is modelled under low, base, and high scenarios with explicit probabilities, generating a Probability-Weighted Multiple on Investment (PWMOI). The goal isn’t numerical precision; it’s clarity. By quantifying assumptions, teams make their reasoning visible, comparable, and ultimately improvable. Over time, this approach shifts the nature of decision-making itself. Instead of chasing perfect forecasts, investors and founders build systems that learn.

VI. Turning Decisions into Knowledge

As the venture investing example shows, the value of Decision Analysis extends beyond a single application. It is not only a tool for evaluating investments but a broader framework for improving how organizations learn from decisions over time. Each outcome contributes to a growing base of evidence that refines future judgment.

Institutionalizing this process typically involves three reinforcing mechanisms:

Documentation: recording the rationale, probabilities, and key assumptions behind major decisions.

Feedback Loops: comparing projected outcomes with actual results to identify where reasoning diverged from reality.

Iteration: refining models and heuristics as new data and experience emerge.

Together, these mechanisms create what can be described as a decision memory: a collective intelligence that strengthens organizational judgment over time. In complex, uncertain sectors such as venture capital, this accumulated learning becomes a strategic advantage, turning experience into foresight and improving decision quality across the system.

References:

Arsham, H. (1994, February 25). Tools for decision analysis. https://home.ubalt.edu/ntsbarsh/business-stat/opre/partIX.htm

Team CFI. (2024, August 12). Decision Analysis (DA). Corporate Finance Institute. https://corporatefinanceinstitute.com/resources/data-science/decision-analysis-da/

Kahneman, D., Lovallo, D., & Sibony, O. (2011). Before you make that big decision... Harvard Business Review, 89(6), 50–60. https://www.researchgate.net/publication/51453002_Before_you_make_that_big_decision

Korver, C. (2012). Applying decision analysis to venture investing. Kauffman Fellows Journal, 3. https://www.kauffmanfellows.org/journal/applying-decision-analysis-to-venture-investing

  • Beyond Models: The Power of Context

Over the past year, the center of gravity in AI has steadily evolved. As a16z observed in “Context Is King” (2025), the next wave of defensibility in AI will not be defined by who builds the largest or fastest models, but by how intelligently these systems are applied within high-context human environments: the places where trust, workflow design, and domain understanding matter just as much as technical performance. As foundation models become more accessible, the true opportunity now lies in how effectively companies translate capability into real-world value. Success depends not only on what an AI system can do, but on how naturally it fits into the rhythm of existing processes: how well it supports decision-making, safeguards data integrity, and augments human judgment.

Today, we’ll explore why context has become the new moat, how it’s redefining what makes an AI company defensible, how it’s changing the profile of founders leading this generation of startups, and why those who build with context at the core will ultimately outlast those who only build with code.

1. The Founder Inversion

In previous software waves, startups were usually founded by domain experts. A doctor built software to digitize patient workflows. A logistics manager turned operational pain points into SaaS for fleet tracking. Domain insight came first; engineering came later.

AI has flipped that model. The new generation of founders starts from technical depth, not industry experience. They’re engineers, researchers, or data scientists who understand how to prompt, fine-tune, and orchestrate large language models. Their expertise lies in the toolset, not the domain.

This inversion has unlocked speed. Technical founders can prototype in days and iterate in public. But speed introduces a new risk: when you start with technology instead of context, it’s easy to build something impressive that never quite fits the workflow.

2. From Differentiation to Defensibility

AI makes differentiation easy, and defensibility hard. With open-source models and public APIs, the cost of building has collapsed. But so has the cost of copying. A dozen teams can now ship nearly identical features within weeks.

That’s why the most resilient companies are shifting their focus from speed to stickiness. They know that long-term advantage doesn’t come from better prompts or faster releases; it comes from contextual depth.

Defensibility still rests on three timeless principles:

Owning the workflow end-to-end.

Embedding deeply into customer systems.

Earning trust through accuracy and reliability.

And in AI, each of those depends on context.

Harvey: Embedding Legal Reasoning into AI Systems

In 2022, Gabe Pereyra, a former DeepMind researcher, and Winston Weinberg, a litigation associate at O’Melveny & Myers, founded Harvey, an AI copilot for legal professionals. Pereyra brought technical mastery in reinforcement learning and reasoning systems; Weinberg contributed a practitioner’s sense of how lawyers argue, document, and defend their decisions.

Harvey’s core system builds on large language models fine-tuned for legal reasoning. It layers those LLMs with structured retrieval pipelines that pull from precedent databases, templates, and firm-specific repositories, grounding every output in verifiable sources. Instead of generating text freely, the system constrains its reasoning through citation and document linking, ensuring interpretability, auditability, and compliance with strict confidentiality standards.

That design mirrors the discipline of the legal profession itself. Harvey doesn’t simply “write like a lawyer”; it reasons like one, weighing precision over speed and justification over novelty. It integrates directly into firms’ document management systems, aligning with internal processes and hierarchies of review.
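The retrieval-grounded pattern described here can be sketched abstractly. This is emphatically not Harvey’s implementation (its code and retrieval stack are not public); it is a toy illustration of the general idea that outputs are constrained to retrieved sources and must carry citations, with all names and the keyword scorer invented for the example:

```python
# Toy sketch of retrieval-constrained output with mandatory citations.
# Hypothetical names throughout; the keyword scorer stands in for a real
# retrieval pipeline, and string concatenation stands in for an LLM call.
from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str

def retrieve(query: str, corpus: list, k: int = 2) -> list:
    """Rank documents by naive keyword overlap with the query."""
    terms = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda d: -len(terms & set(d.text.lower().split())),
    )
    return scored[:k]

def grounded_answer(query: str, corpus: list) -> dict:
    """Answer only from retrieved sources; cite every document used."""
    sources = retrieve(query, corpus)
    if not sources:
        # Refuse rather than generate freely: the grounding constraint.
        return {"answer": None, "citations": []}
    return {
        "answer": " ".join(d.text for d in sources),  # placeholder for generation
        "citations": [d.doc_id for d in sources],
    }
```

The design choice worth noticing is the refusal branch: when nothing can be cited, the system returns nothing, which is what makes outputs auditable rather than merely fluent.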
This fidelity to real-world legal practice, not just technical performance, helped Harvey win clients such as Allen & Overy and PwC Legal, and earn investment from OpenAI’s Startup Fund. Harvey’s defensibility lies in trust by design: its architecture encodes the same logic of evidence and accountability that governs the legal field.

Runway: Turning Generative AI into Production Infrastructure

At NYU’s Interactive Telecommunications Program, Cristóbal Valenzuela, Anastasis Germanidis, and Alejandro Matamala began experimenting with machine learning as a new medium for creativity. They weren’t filmmakers; they were engineers and artists asking a simple question: could AI become a creative partner rather than just a tool?

Their answer became Runway, launched in 2018 as an open playground for generative models in image and video creation. Early adoption came from digital artists, but professionals in film and design quickly exposed its limits: inconsistent frame quality, lack of version control, and weak integration with existing production software.

Runway evolved fast. It rebuilt its core on a proprietary multimodal engine that combines diffusion models for text-to-video generation with temporal coherence systems to maintain frame-by-frame consistency. It integrated seamlessly with Adobe Premiere Pro, After Effects, and Unreal Engine, embedding AI capabilities directly inside professional workflows.

That shift from model experimentation to production-grade infrastructure redefined Runway’s competitive edge. Its architecture now optimizes for real-world constraints: color fidelity, export stability, and latency. The company’s innovation wasn’t merely algorithmic; it was operational. Runway bridged the gap between cutting-edge generative models and the exacting standards of commercial production, and in doing so, became part of the creative stack itself.
Adept: Teaching Machines to Understand Human Workflows

Founded in 2022 by David Luan and Niki Parmar, both alumni of OpenAI, Google Brain, and DeepMind, Adept set out to answer a different question: can AI learn to use software the way people do?

Rather than training domain-specific systems, Adept builds transformer-based agents that interact with existing applications (Salesforce, Google Sheets, Chrome) through their actual interfaces. By combining text input with UI structure, cursor trajectories, and clickstream data, Adept’s models learn to perform tasks end-to-end, creating what the company calls a universal action model.

These agents don’t predict text; they predict actions in context. They understand menus, shortcuts, and the logic behind user corrections. Over time, this forms a “workflow intelligence layer”: a behavioral dataset that maps how real people navigate digital work.

Adept’s defensibility doesn’t come from model scale but from data exclusivity. While most foundation models are trained on static text, Adept’s systems learn from proprietary, high-resolution records of human task behavior, the kind of contextual data that cannot be scraped or replicated.

In effect, Adept isn’t building software to replace humans; it’s training AI to use the tools humans already rely on. Its moat comes from this unique alignment between model learning and human intent, a form of context that compounds over time.

3. Context Is King: Context Turns Capability into Usefulness

AI models are powerful at generating answers, but they’re still poor at understanding situations. A model can summarize a document or analyze data, but it doesn’t know the cultural, legal, or operational context in which those actions take place. In the real world, users don’t want creative output; they want reliable outcomes that respect industry standards, maintain data integrity, and follow decision-making rules. Context provides that missing bridge.
It allows AI systems to operate within the “logic” of a specific domain, whether that means legal reasoning, financial compliance, manufacturing quality control, or clinical workflows. Without context, AI remains impressive but impractical. With context, it becomes a trusted assistant: a system that augments human judgment rather than complicating it.

Context Builds Trust and Adoption

Trust is the foundation of any sustainable AI product. Users don’t trust AI because it’s intelligent; they trust it because it behaves consistently within their world, using familiar language, adhering to policy, and respecting boundaries.

That’s why contextual grounding (relying on verified data sources, domain-specific logic, and workflow integration) is so powerful. When an AI behaves predictably and fits naturally into existing processes, users stop treating it as an experiment and start depending on it as infrastructure.

This is exactly how companies like Harvey, Runway, and Adept have scaled. Their advantage isn’t just technical; it’s relational. They’ve earned permission to operate in high-stakes environments (law firms, production studios, and enterprises) where accuracy, continuity, and compliance are not optional. Trust, once earned, becomes the strongest form of retention.

Context Creates a Data Flywheel

Every time a user interacts with an AI system that’s deeply embedded in their workflow, it generates valuable behavioral data: how people phrase requests, make corrections, and handle exceptions. That feedback compounds over time:

Better context produces better outputs.

Better outputs drive higher usage.

Higher usage creates richer data for fine-tuning.

This contextual flywheel becomes a self-reinforcing loop: a proprietary data asset that no competitor can replicate. It’s the foundation of defensibility in the era of open models. Companies that invest early in domain integration create moats that grow stronger with every user interaction.
They may be using the same underlying LLMs as everyone else, but their data, trust, and workflow depth are uniquely their own.

4. Implications for Startups

This shift reshapes how AI companies should think about product strategy and long-term defensibility.

Move from product demos to workflow depth. The goal isn’t to show impressive output; it’s to solve real operational pain points inside the customer’s system of record.

Prioritize embedding over expansion. The most resilient startups dominate one vertical before expanding to others. Context travels horizontally only after it’s mastered vertically.

Build data ownership through usage, not scraping. Proprietary value comes from how customers use your product, not from what you scrape from the internet.

Treat trust as an asset. Every accurate, explainable, and compliant output compounds your credibility, and credibility compounds retention.

For founders, this means pairing technical mastery with deep customer intimacy. For investors, it means evaluating startups not just on model innovation, but on their ability to embed AI into the hard edges of real business systems, where contracts are signed, decisions are made, and accountability lives.

References:

Haber, D. (2025, August 18). Context is king. Andreessen Horowitz. https://a16z.com/context-is-king/

Martin, I. (2025, October 29). Legal AI startup Harvey raises $150 million at $8 billion valuation. Forbes. https://www.forbes.com/sites/iainmartin/2025/10/29/legal-ai-startup-harvey-raises-150-million-at-8-billion-valuation/

Vyshyvaniuk, K. (2025, August 13). The inspiring story: Cristóbal Valenzuela, CEO at Runway. KITRUM. https://kitrum.com/blog/the-inspiring-story-cristobal-valenzuela-ceo-at-runway/

Wiggers, K. (2022, April 26). Adept aims to build AI that can automate any software process. TechCrunch. https://techcrunch.com/2022/04/26/2304039/

  • The Architecture of AI: Mapping the Layers That Shape the Industry 

AI is often viewed through the lens of applications (chatbots, image generators, recommendation systems), but its true structure lies deeper. Beneath the surface is a layered architecture of hardware, infrastructure, models, and capital flows that determines where value and power concentrate.

This article explores that architecture: how compute, manufacturing, and supply constraints shape the pace and direction of AI’s growth. It examines the Layered Architecture of AI, the Hardware Foundation, the role of Fabs and Foundries, and the Strategic Implications for those building within this evolving ecosystem. This ecosystem is still forming, setting the foundation for an eventual embedding of AI into every commercial and consumer context. The process may unfold over the next two decades, as inference becomes as ubiquitous as electricity: invisible, cheap, and everywhere.

Source: The Business Engineer (2025)

The Layered Architecture of AI

To understand who holds leverage in AI, one must look across its layered stack, from hardware foundations to the applications built on top. Each layer, from infrastructure to models and user-facing products, plays a distinct role in shaping performance, scalability, and value capture across the ecosystem. The modern AI stack is organized into interconnected layers, each representing a distinct source of value creation and competitive advantage:

Hardware Layer: The foundation of AI performance. Chips such as GPUs, TPUs, NPUs, and ASICs enable large-scale computation for training and inference.

Cloud & Compute Infrastructure: Scalable environments from hyperscalers like AWS, Google Cloud, and Azure that provide the compute backbone for model training.
Model Layer: Foundation and domain-specific models, from general-purpose LLMs (GPT, Claude, Gemini) to fine-tuned vertical models, that define performance boundaries.

Vertical & Consumer Applications: Industry-specific AI solutions (finance, healthcare, manufacturing) and consumer products that bring AI directly into daily workflows.

AI-Integrated Hardware: Devices such as smart glasses, wearables, and edge systems that embed intelligence locally, closing the loop between physical and digital interaction.

Source: The Business Engineer (2025)

Each layer compounds upon the one below it. Competitive advantage often emerges at the intersections: where infrastructure meets application, or where proprietary data enables model differentiation. For most companies, operating in one or two layers offers focus and defensibility. Only a handful, such as Google, OpenAI, Microsoft, Amazon, and Meta, attempt multi-layer integration to build end-to-end moats.

Hardware: The Foundation of the Stack

The hardware layer defines the ceiling for AI performance. It integrates four critical subsystems (compute, memory, interconnect, and workload optimization) that together determine the efficiency, scalability, and economics of AI at scale. Below is a focused look at the primary subsystems and how they coalesce.

Source: The Business Engineer (2025)

Compute Units

Each compute unit is tuned for a particular balance of throughput, latency, and efficiency. Compute lies at the core of AI workloads:

GPUs & TPUs: General-purpose GPUs are well-suited for large-scale training. TPUs and tensor-optimized ASICs tailor compute paths for deep learning primitives (e.g. matrix multiplications).

NPUs & AI Accelerators: These are optimized for inference, especially at the edge. NPUs often operate on quantized representations with lower power draw.
Hybrid / Chiplet Designs: Modern architectures mix compute types (CPU + GPU + NPU) on a single package, connected through local interconnects.

Memory & Data Access

Efficient memory hierarchy and data orchestration unlock the full potential of compute units. Feeding these compute engines demands a sophisticated memory pipeline:

High-Bandwidth Memory (HBM): Close-coupled DRAM stacks that offer wide data paths and low latency, essential for heavy workloads.

On-Chip Caches / SRAM: Fast local storage that stages data and alleviates pressure on external memory.

Near-Memory / In-Memory Processing: Emerging designs embed compute near memory banks to reduce movement cost and energy.

Memory Fabrics: In large systems, memory may be shared or disaggregated across units, forming a fabric of data access.

Interconnect & Fabric

Interconnect design balances bandwidth, latency, coherence, and scalability. Compute and memory must be wired together with minimal friction:

Network-on-Chip (NoC): On-chip routing layer that connects tiles, caches, and accelerators.

High-Speed Links / Protocols: Between chips, protocols such as NVLink, CXL, or proprietary fabrics enable high-bandwidth, low-latency communication.

Cluster Fabric / Pod-Scale Networks: At rack or cluster scale, cross-node interconnects become critical to coordinate distributed compute.

Unified Trade-offs & Use Cases

Good designs achieve hardware–software co-optimization: compiler, scheduler, data layout, and compute architecture must align to deliver consistent gains. When integrated, these subsystems support two core modes:

Training Workloads: Maximize parallelism and throughput; memory bandwidth and compute are the priority, while occasional inefficiencies in latency or power are tolerable.

Inference Workloads: Demand tight latency bounds, energy efficiency, and predictable performance, often near the data source.
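The balance between compute throughput and memory bandwidth described above is often reasoned about with a simple roofline-style check: comparing a kernel’s arithmetic intensity (FLOPs per byte moved) against the machine’s compute-to-bandwidth ratio. The accelerator figures below are rough illustrative numbers, not any specific product’s spec sheet:

```python
# Roofline-style check: is a kernel compute-bound or memory-bound?
# Illustrative accelerator figures (assumed, not a real spec sheet).
peak_flops = 300e12       # 300 TFLOP/s of peak compute
mem_bandwidth = 2.0e12    # 2 TB/s of HBM bandwidth

# FLOPs per byte a kernel must sustain to saturate the compute units.
machine_balance = peak_flops / mem_bandwidth  # = 150 FLOPs/byte

def arithmetic_intensity(flops: float, bytes_moved: float) -> float:
    """FLOPs performed per byte of memory traffic."""
    return flops / bytes_moved

# Large fp16 matrix multiply: ~2*n^3 FLOPs over ~3 matrices of n^2
# 2-byte elements (assuming each matrix moves once).
n = 4096
matmul_ai = arithmetic_intensity(2 * n**3, 3 * n * n * 2)  # ~1365

# Elementwise fp32 add: 1 FLOP per 12 bytes (two loads, one store).
elementwise_ai = arithmetic_intensity(1, 12)  # ~0.083

print(matmul_ai > machine_balance)       # compute-bound kernel
print(elementwise_ai > machine_balance)  # memory-bound kernel
```

This is why training (dominated by large matrix multiplies) prizes raw throughput, while inference and elementwise-heavy workloads live or die by memory bandwidth and latency.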
Fabs and Foundries

While most discourse around AI focuses on chips, models, and cloud, the manufacturing backbone (semiconductor fabs and foundries) is poised to become the silent pivot of the entire value chain. We examine the scale of global investment, the strategic importance of fabrication in AI competitiveness, and the structural challenges that will shape how nations and companies build this critical infrastructure.

The Scale of Investment & the Global Surge

According to McKinsey (2025), the industry is targeting $1 trillion+ in fab investments through 2030 to support next-generation semiconductor capacity. Strategic commitments are already underway: GlobalFoundries (2025) announced a $16 billion U.S. investment to expand its chip manufacturing footprint. National and regional policies, for instance the U.S. CHIPS Act, are fueling incentives to onshore advanced-node fabs, reduce supply-chain risk, and assert sovereign control over critical infrastructure.

Why Fabs Matter in AI Strategy

Bottleneck Leverage: As AI chips demand tighter tolerances, advanced lithography, and novel packaging (e.g. 2.5D, 3D stacking, chiplets), having fabs that push manufacturing boundaries unlocks a deep moat.

Sovereignty and Security: Controlling the fabrication layer reduces dependencies on geopolitical chokepoints, export controls, or supply disruption.

Co-innovation & Differentiation: When a company designs chips and simultaneously co-owns or controls fabrication, it can exploit process-level optimizations unavailable to external clients.

Integration across Edge-to-Cloud: Edge AI will drive demand for specialized nodes, custom packaging, and hybrid architectures. Designers deeply aligned with manufacturers will capture the premium in performance and power efficiency.
Challenges & Strategic Trade-offs

Massive CapEx & Long Time Horizons: Building a leading-edge fab costs multiple billions and often takes 5–10 years to reach stable production, risking obsolescence if technology shifts midstream.

Talent, Yield, and Complexity: Advanced nodes require deep expertise, near-flawless yields, tight process control, and years of iteration.

Energy & Infrastructure Demand: Fabs demand massive power, water, clean-room infrastructure, and cooling systems. The expansion will often run parallel to energy and utility upgrades.

Geopolitics & Trade Risk: Fabs straddle trade policies, subsidy regimes, cross-border supply contracts, and national security constraints.

Implications: Competing Along the AI Value Chain

For founders, understanding the AI value chain isn’t just academic; it’s strategic. Each layer, from silicon to software, defines a different source of leverage. Knowing where to play, and how deep to integrate, determines whether a company builds a durable moat or becomes dependent on others’ infrastructure. We will outline three dimensions of competitive advantage: how to position within the stack, how to manage ecosystem interdependence, and how capital and timing shape long-term outcomes.

Strategic Positioning

Strategic focus determines durability within the AI stack:

Pick your depth, not just your niche. In the AI stack, focus is power. Competing across too many layers dilutes capital and talent. Excelling in one or two, and mastering the interfaces between them, yields more defensibility than chasing vertical integration.

Defensibility moves downward. As models and APIs commoditize, differentiation migrates toward data, infrastructure control, and real-world deployment. Owning the bottleneck, whether compute access, proprietary datasets, or on-device presence, shapes long-term advantage.

Ecosystem Dependence

Interconnectedness across the stack makes collaboration essential. The stack is interdependent.
A startup’s success in one layer often hinges on alignment with players above and below: chip suppliers, cloud providers, or application distributors. Building partnerships early can offset dependence and reduce scaling risk.

Hardware awareness is now a founder skill. Even software-native teams must understand compute economics. Access, latency, and cost will increasingly shape product feasibility and gross margins.

Capital and Timing

Capital allocation and market timing shape competitive outcomes.

Follow the capital flow. The next decade will see trillions funneled into fabs, energy, and compute, reshaping cost curves and regional dynamics. Startups that anticipate where capacity will expand (and where it won’t) can position themselves ahead of bottlenecks.

Timing the layer shift. As AI infrastructure matures, opportunities will cascade upward: first in compute efficiency, then model differentiation, then applied intelligence. Founders who time their entry at the right inflection point, when a lower layer stabilizes, can ride the next wave of abstraction.

The Takeaway

The AI economy rewards those who understand the stack as a system, not a buzzword. Whether you build models, applications, or tools, your defensibility depends on how you connect to, or control, the layers beneath you. Over the next decade, the real question isn’t “What’s your AI feature?” It’s “Where in the value chain do you create non-replaceable value, and who controls your bottleneck?”

References:

Cuofano, G. (2025, March 17). The AI value chain. The Business Engineer. https://businessengineer.ai/p/the-ai-value-chain

GlobalFoundries. (2025, June 4). GlobalFoundries announces $16B U.S. investment to reshore essential chip manufacturing and accelerate AI growth. https://gf.com/gf-press-release/globalfoundries-announces-16b-u-s-investment-to-reshore-essential-chip-manufacturing-and-accelerate-ai-growth/

Pilling, D., & Steele, M. (2025, August 6). Unleashing AI’s next wave of infrastructure growth. Sands Capital. https://www.sandscapital.com/unleashing-ais-next-wave-of-infrastructure-growth/

  • Battery energy storage systems (BESS): from global role to emerging market opportunities 

The big picture: why storage matters

As renewable energy grows, variability becomes the defining challenge for power systems. Solar and wind depend on weather and time of day, often creating mismatches between supply and demand. The result is grid instability, curtailment, and reliance on fossil peakers (Yoo & Ha, 2023). Battery energy storage systems (BESS) are designed to close this gap. They absorb surplus electricity and release it when needed, delivering:

smoothing of renewable intermittency

ancillary services such as frequency control and black start

deferred transmission and distribution upgrades

improved power quality and reduced curtailment

By enabling grids to integrate higher shares of renewables, BESS play a central role in decarbonization strategies worldwide.

Global market outlook

Among storage technologies (pumped hydro, compressed air, thermal, supercapacitors), BESS dominate because they combine modularity, scalability, and rapid response with few siting constraints. This explains why BESS capacity is growing faster than any other form of storage. According to McKinsey analysis on battery energy storage systems (2023), the global market is projected to expand sharply over the decade, with front-of-the-meter (FTM) utility-scale projects already accounting for most new deployments. By 2030, utility-scale BESS installations could reach 450–620 GWh annually, representing up to 90% of the market.
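The absorb-surplus, release-at-peak role described above is also the basis of wholesale arbitrage revenue. A toy one-cycle calculation makes the mechanics concrete (the daily price curve, battery size, and round-trip efficiency below are all invented for illustration):

```python
# Toy one-day wholesale arbitrage: charge at the cheapest price block,
# discharge at the most expensive one. All figures are illustrative.
prices = [30, 25, 20, 18, 22, 35, 60, 80, 75, 55, 40, 32]  # $/MWh over a day

def arbitrage_revenue(day_prices, energy_mwh=100, round_trip_eff=0.9, cycles=1):
    """Net revenue from buying the cheapest block and selling the dearest,
    discounting the energy sold by round-trip losses."""
    buy = min(day_prices)
    sell = max(day_prices)
    return energy_mwh * (sell * round_trip_eff - buy) * cycles

# 100 MWh * (80 * 0.9 - 18) = $5,400 for one cycle on this price curve
revenue = arbitrage_revenue(prices)
```

Real dispatch stacks this spread-capture with ancillary services and capacity payments, which is why the article speaks of revenue stacking rather than arbitrage alone.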
Opportunities by segment

Three major segments illustrate where BESS is creating value (Source: McKinsey):

Utility-scale (FTM)
- Customers: utilities, grid operators, renewable developers
- Growth: fastest, with ~29% CAGR expected this decade
- Revenue models: revenue stacking through ancillary services, wholesale arbitrage, and capacity markets
- Success drivers: scale, cost, project execution, software for grid optimization

Commercial and industrial (C&I)
- Forecast growth: ~13% CAGR, 52–70 GWh annual additions by 2030
- Subsegments:
  - EV charging infrastructure: batteries at charging stations to avoid costly grid upgrades
  - Critical infrastructure: data centers, hospitals, telecom towers replacing diesel and lead-acid backup
  - Buildings and factories: peak shaving, self-consumption, backup, grid services
  - Harsh environments: mining, oil & gas, construction sites shifting away from gensets

Residential
- Forecast: ~20 GWh by 2030, the smallest but innovation-rich segment
- Value drivers: bundling with rooftop PV, home EV charging, microgrids
- Consumer adoption shaped by: price, safety, ease of installation

How BESS projects generate value

The BESS value chain starts with manufacturers of storage components, including battery cells and packs, as well as the inverters, housing, and other essential balance-of-system components. By McKinsey's estimate, providers in this part of the chain will receive roughly half of the BESS market profit pool. (Source: McKinsey)

According to McKinsey's article on the revenue potential of energy storage technologies (2025), the profitability of BESS is shaped as much by local market context as by technology. Revenue streams vary depending on power price spreads, storage-specific incentives, and the balance of renewable and storage build-out.
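The dependence of BESS profitability on price spreads can be illustrated with simple wholesale-arbitrage arithmetic. The sketch below is a minimal toy model with assumed figures; the capacity, efficiency, prices, and cycle count are hypothetical illustrations, not McKinsey data:

```python
def arbitrage_margin(energy_mwh: float, round_trip_eff: float,
                     charge_price: float, discharge_price: float) -> float:
    """Gross margin for one charge/discharge cycle of a battery.

    Buys `energy_mwh` at `charge_price` ($/MWh) and sells the energy
    that survives round-trip losses at `discharge_price` ($/MWh).
    """
    cost = energy_mwh * charge_price
    revenue = energy_mwh * round_trip_eff * discharge_price
    return revenue - cost

# Hypothetical 100 MWh system, 88% round-trip efficiency,
# charging at $30/MWh off-peak and discharging at $120/MWh on-peak.
per_cycle = arbitrage_margin(100, 0.88, 30, 120)  # $7,560 per cycle
annual = per_cycle * 330                          # assume ~330 cycles/year
print(f"per cycle: ${per_cycle:,.0f}, annual: ${annual:,.0f}")
```

Even this toy model shows why price spreads and round-trip efficiency dominate project economics; real dispatch optimization layers ancillary-service and capacity payments on top of arbitrage, which is the revenue stacking described above.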
In the UK, frequency services underpin early projects; in Italy, capacity auctions have created bankable business models; in Germany, storage value lies in deferring costly grid upgrades. Simulations in southern Europe further highlight this variability, showing that identical assets can yield very different returns under different system evolution scenarios. For investors and operators, advanced market modeling is therefore essential, not just to optimize dispatch strategies, but also to align project design and duration with local conditions.

Performance also depends on choices around duration, system sizing, and bidding strategy. Top-quartile assets in the same market can outperform averages by 50–60%, highlighting the importance of advanced trading and controls.

Emerging Markets: A Critical Frontier

While most installed BESS capacity today is concentrated in developed economies, the next growth frontier lies in emerging markets. These countries face a unique combination of pressures: electricity demand is growing quickly, renewable energy is expanding at unprecedented rates, and grid infrastructure is often fragile or underdeveloped. This makes the case for storage even stronger than in mature systems.

However, the pathway to commercialization is far from straightforward. A few structural challenges persist:

- High technical costs: Without domestic manufacturing, equipment is often imported, making projects expensive and vulnerable to currency and logistics risks.
- Lack of benefit evaluation frameworks: Many energy markets still lack transparent mechanisms to quantify and compensate the system-level benefits of storage, from frequency regulation to deferred grid investments.
- Incomplete storage regulations: Policies for renewables are common, but few governments have codified how storage participates in wholesale markets, ancillary services, or capacity mechanisms.
- Dependence on imports: Minimal local production capacity forces developers to rely on global supply chains, which can deter foreign investment and delay deployment.

These barriers create a paradox. On one hand, the market potential is enormous: emerging countries could leapfrog directly into renewables-plus-storage systems rather than building out fossil-based grids. On the other, without clear policies, stable revenue models, and local capacity-building, projects remain difficult to finance at scale.

Vietnam as a Case Study

According to research by Yoo and Ha (2023), Vietnam illustrates both the promise and the challenges of this frontier. The country has emerged as a regional leader in renewable energy, with variable renewable energy (VRE) generation reaching 28.2 TWh in 2021 (11.5% of total power output), the highest share in Southeast Asia. Supportive policies have helped Vietnam achieve rapid renewable deployment, positioning it as one of the most dynamic clean energy markets in the region.

Yet the development of storage has not kept pace. Despite its leadership in VRE growth, Vietnam ranks only third among five Southeast Asian peers in installed BESS capacity. This imbalance creates growing risks of grid congestion, curtailment, and instability, undermining the full value of the country's renewable build-out.

Unlocking Vietnam's BESS market will require a deliberate policy and market design push:

- Codified rules for grid participation, enabling storage to provide ancillary services and be compensated for them.
- Capacity auctions or contracts-for-difference (CfDs) to stabilize revenue streams and reduce investor risk.
- Integration of storage into grid planning, ensuring that storage is considered alongside transmission and distribution investments.
- Selective localization: not full-scale manufacturing, but assembly, EMS software, and lifecycle services that can strengthen the domestic ecosystem.
Implications: A Recipe for Success

Given these uncertainties, what does it take to succeed? Four principles stand out:

- Expand along the value chain. In a young industry, integrators can move into packaging, and battery makers into integration and services. Software will become key as value shifts from hardware to control and optimization platforms, making digital capabilities critical for margins.
- Build resilient supply chains. Cells, inverters, and control systems depend on complex global networks exposed to material shocks and regulation. Winners will diversify sourcing, localize production, and partner with strong EPC firms to deliver reliably at scale, especially for utility projects.
- Focus on what customers value. System design should follow segment priorities (duration, safety, PV compatibility, and ease of interconnection) rather than pure performance. In a price-driven market, differentiation lies in matching technical features with customer needs.
- Scale decisively. As larger players consolidate, smaller firms must act boldly, via partnerships, IP commercialization, or fast scaling, to stay relevant.

There is no universal BESS model. Success depends on adapting to local revenue opportunities while building strengths in supply, technology, and execution. Early movers who combine these will turn storage from a necessity into a sustainable business.

Battery energy storage has shifted from niche to essential in enabling renewable grids. Value now spans utility, C&I, and residential segments, with business models built on stacked revenues and market context. In emerging markets like Vietnam, policy clarity and market design matter as much as costs. Those investing in scalable, chemistry-flexible systems and strong software will lead the next wave of growth.

Reference list:

McKinsey & Company. (2025). Evaluating the revenue potential of energy storage technologies.
https://www.mckinsey.com/industries/electric-power-and-natural-gas/our-insights/evaluating-the-revenue-potential-of-energy-storage-technologies

McKinsey & Company. (2023). Enabling renewable energy with battery energy storage systems. https://www.mckinsey.com/industries/automotive-and-assembly/our-insights/enabling-renewable-energy-with-battery-energy-storage-systems

Yoo, Y., & Ha, Y. (2023). Market attractiveness analysis of battery energy storage systems in Indonesia, Malaysia, the Philippines, Thailand, and Vietnam. Renewable and Sustainable Energy Reviews, 191, 114095. https://doi.org/10.1016/j.rser.2023.114095

  • Vietnam’s Trade Finance Gets a $60M Push: What IFC’s MSB Deal Means for Startups and Investors

In a region where venture capital funding is under scrutiny and startup growth demands sharper fundamentals, the International Finance Corporation (IFC)'s recent $60 million trade finance proposal for Maritime Commercial Joint Stock Bank (MSB) is a timely signal for fintech and B2B startup builders. While the headlines point to traditional banking, the implications for Vietnam's startup and SME ecosystems are far-reaching, especially for platforms enabling trade, logistics, embedded finance, or inventory solutions.

What's the Deal?

IFC, a member of the World Bank Group, is evaluating a $60 million trade finance facility for MSB under its Global Trade Finance Program (GTFP). The one-year facility is designed to:

- Expand MSB's trade finance operations
- Increase liquidity support for importers and exporters
- Improve access to credit for underserved SMEs
- Strengthen MSB's connectivity with global correspondent banks

This is not IFC's first touchpoint with MSB. In February 2025, it also signed an advisory agreement with the bank to develop a Sustainable Finance Framework, and the facility is part of a broader investment trend that includes green bonds, digital infrastructure (e.g., VETC toll collection), and retail finance support in Vietnam.

Why This Matters for the VC Ecosystem

Liquidity Is the Lifeblood of SME Trade

Vietnam's economy is heavily SME-driven: SMEs account for over 90% of all registered enterprises, yet many face high barriers to accessing capital, especially for cross-border transactions or inventory-heavy business models. When IFC injects credit liquidity at this scale into a commercial bank like MSB, it opens up credit lines for thousands of small manufacturers, exporters, and supply chain operators. These are the same businesses that rely on:

- B2B marketplaces
- Inventory management SaaS
- Trade documentation tools
- Cross-border payments and logistics platforms

Founders building tools for these users will benefit from a rising tide.
Strategic Entry Point for Embedded Fintech Models

Startups enabling invoice factoring, supplier financing, or procurement-as-a-service can partner with banks like MSB to ride this liquidity wave. The more banks need to serve SMEs efficiently, the more they look for tech partners who can help originate, underwrite, and monitor trade transactions digitally. Think of models like:

- An API layer for trade finance access
- SME ERP tools with financing embedded
- Logistics platforms that bundle customs clearance, tracking, and capital

A Strong Signal from a Tier-One Institutional Capital Provider

When the IFC deploys capital through trade guarantees, it reduces perceived risk for global correspondent banks and encourages deeper capital flows into Vietnam, a country increasingly positioned as a China+1 supply chain destination. As IFC also explores green bonds, toll road infrastructure, and consumer finance (e.g., HD Saison), it's clear that Vietnam's financial infrastructure is being primed for scaled digital expansion, creating tailwinds for private capital and startups to follow.

The Bigger Picture

MSB, with $12.2B in assets and more than 260 branches and transaction offices, is one of the few major banks with a mixed domestic-foreign ownership base (70% local, 30% international). This positions it as a forward-leaning partner for fintech collaborations. Pair this with the rise of:

- Cross-border e-commerce from Vietnam
- Nearshoring of manufacturing
- Growth in retail financing and digital payments

… and you have a multi-layered signal that credit liquidity, financial infrastructure, and trade ecosystem modernization are converging.

Key Takeaways for Startups & VCs

For fintech founders: This is your cue to build smart layers on top of trade finance flows; MSB and others need digital enablers.

For logistics & B2B SaaS startups: Liquidity expansion means your SME users will have more transaction volume, and more budget.
For VCs: This is an early indicator of Vietnam's next growth frontier: not consumer apps, but B2B trade, capital, and infrastructure tech. As capital markets cool in many sectors, smart founders will look where liquidity is flowing, and right now, trade finance in Vietnam is one of those places.

Reference list:

Mosqueda, M. W., Jr. (2025, April 16). IFC mulls $60m trade finance facility for Vietnam's Maritime Bank. DealStreetAsia. https://www.dealstreetasia.com/stories/ifc-vietnam-maritime-bank-438560

  • OpenAI’s Big Move in AI Coding: Cursor, Windsurf, and What It Means for Founders

Popular AI Apps

OpenAI, the company behind ChatGPT, is making strategic bets on the future of AI-driven software development. But before it started negotiating a $3 billion acquisition of AI coding startup Windsurf, it considered another option: Cursor.

Cursor, an AI-powered coding assistant, exploded in popularity last year. Built by the San Francisco-based startup Anysphere, Cursor integrates Anthropic's advanced Claude 3.5 Sonnet model with Microsoft's widely used open-source editor, Visual Studio Code. The result? A seamless, developer-friendly experience that quickly outpaced competitors, including, notably, Microsoft's GitHub Copilot.

OpenAI first reached out to Anysphere in 2023. No deal emerged. This year, amid Cursor's accelerating user growth (more than 1 million daily active developers as of March), OpenAI reached out again. Still, conversations stalled, according to CNBC sources.

Now, Anysphere is independently aiming high. Bloomberg recently reported the company is raising funding at a valuation approaching $10 billion, remarkable growth for a startup founded just last year. The backing comes from heavyweight investors like Andreessen Horowitz, Benchmark, Thrive Capital, and even OpenAI's own Startup Fund.

Meanwhile, OpenAI's strategic focus has shifted to Windsurf. The potential $3 billion acquisition would mark its largest-ever purchase. At the same time, CEO Sam Altman announced OpenAI's latest coding-optimized models, o3 and o4-mini, alongside a new tool, Codex CLI, designed to make AI-assisted coding even more powerful.

The stakes are high. Tech giants globally are investing billions into data centers packed with Nvidia GPUs, powering massive large language models (LLMs). These models are reshaping industries far beyond software development, including customer service, sales, and even law. But software is where innovation is fastest, so much so that companies now worry about developers quietly using AI assistance to ace job interviews.
The industry shift was perfectly captured by OpenAI co-founder Andrej Karpathy, who earlier this year coined the term "vibe coding" to describe developers guiding AI models through natural instructions rather than traditional code-writing. Karpathy notably highlighted Cursor's use of Anthropic's Claude model, not OpenAI's own tools: a subtle but telling endorsement.

Today, Cursor is part of a broader wave that includes rapidly growing platforms like Bolt, Replit, and Vercel, capturing mindshare among developers looking for frictionless AI integration into their workflows. OpenAI clearly senses the shift. The company has reportedly spoken to over 20 startups in the AI coding space, seeking to position itself at the core of the new developer stack.

The era of building software alone is fading. A new, collaborative AI-human model of software creation is emerging, fast. And the winners won't just provide better technology; they'll shape how millions of developers bring ideas to life.

What Should Founders Watch Here?

Anchor Your AI in High-Frequency Workflows

Cursor's breakout success came from inserting AI directly into daily coding routines. OpenAI's Codex CLI follows the same playbook, bringing AI into command-line workflows. Founders should identify and embed their tools into developers' most repeatable tasks, be it CI/CD pipelines, code reviews, or release scripts. When your AI becomes part of the "always-on" workflow, it transcends novelty and becomes indispensable.

Blend Grassroots Traction with Strategic Partnerships

Cursor's million-user milestone didn't happen in isolation; it rode developer word-of-mouth into enterprise pilots. OpenAI's dual approach, first courting Cursor, then negotiating with Windsurf, underscores that scale will come from both bottom-up adoption and top-down alliances.
Founders should cultivate vibrant user communities while also engaging platform leaders early, so you can transition from GitHub stars to boardroom discussions without losing momentum.

Cultivate Developer Mindshare Before It Consolidates

Andrej Karpathy's "vibe coding" mention of Cursor was more than a meme; it was a tacit endorsement that drove even more adoption. In today's fragmenting AI landscape, establishing your tool as the go-to, "everyone's talking about it" solution creates a network effect that's hard to replicate. Founders need to seed organic advocacy, through workshops, open forums, or referral incentives, before larger players swoop in to consolidate market share.

Prepare for an Accelerating M&A Wave

OpenAI's willingness to explore multi-billion-dollar deals is a clear harbinger of rapid consolidation. If you're building an AI coding platform, assume that major players will be eyeing your domain soon. Architect your company for strategic alignment: expose clean APIs, document extensibility points, and maintain transparent roadmaps. When acquisition talks come, you'll be positioned not as a bolt-on feature but as a core enabler of the bigger ecosystem.

In less than a year, OpenAI's dance between Cursor and Windsurf has outlined the contours of the next developer platform. Founders who heed these signals, embedding AI where it matters most, balancing organic growth with partnership plays, locking in developer mindshare, and readying for consolidation, will be the ones to define how software gets built in the AI era.

Reference list:

Novet, J. (2025, April 17). OpenAI looked at buying Cursor creator before turning to AI coding rival Windsurf. CNBC. https://share.google/1X89YuChN6wXuzAWf

  • OpenAI Agrees to Acquire Startup Windsurf in $3B Deal

One day you're debugging code, building product, and shipping features like any other startup. The next, you're on the phone with OpenAI. That's the surreal kind of week Windsurf, formerly Codeium, just had.

According to Bloomberg, OpenAI has made waves in the AI industry by agreeing to acquire Windsurf, an innovative AI-assisted coding tool formerly known as Codeium, for about $3 billion. This deal, if finalized, will be the largest acquisition to date for the ChatGPT creator, signaling an aggressive move in an increasingly competitive space. Although both OpenAI and Windsurf have yet to officially comment on the acquisition, sources familiar with the ongoing discussions confirmed the deal's progress, speaking anonymously due to its private nature.

This significant acquisition follows OpenAI's recent fundraising milestone, a massive $40 billion round led by SoftBank Group Corp. that pushed OpenAI's valuation to an impressive $300 billion. Notably, OpenAI recently shifted its strategy away from restructuring into a conventional for-profit entity, following intense public scrutiny and feedback.

But why Windsurf? Why now? OpenAI is likely making this strategic acquisition to quickly strengthen its foothold in a fiercely competitive market. Competitors like Google's Gemini and China's DeepSeek are increasingly placing pricing pressure on foundational AI models, while rivals such as Anthropic and Google have recently launched models outperforming OpenAI's offerings in coding benchmarks. Instead of building a coding assistant from scratch, OpenAI can instantly tap into Windsurf's established developer community and mature product, rapidly expanding its business in developer-focused AI tools. Bloomberg News had previously highlighted talks between OpenAI and Windsurf, underscoring the seriousness of their strategic interest.
Windsurf, legally known as Exafunction Inc., had been actively engaging with top-tier venture investors such as Kleiner Perkins and General Catalyst, aiming to secure funding at around a $3 billion valuation. Notably, just last year, Windsurf secured funding at a valuation of $1.25 billion in a round led by General Catalyst, reflecting rapid growth and escalating market interest.

Reference list:

Roof, K., & Metz, R. (2025, May 6). OpenAI reaches agreement to buy Windsurf for $3 billion. Bloomberg. https://share.google/n6UzHBaoAtrgCxEnj

  • Anysphere Raises $900 Million at $9.9 Billion Valuation

A Defining Moment in Developer Tools and AI Infrastructure

In one of the most extraordinary funding announcements of the year, Anysphere Inc., the company behind the AI-powered developer tool Cursor, has raised $900 million at a $9.9 billion valuation. Just 14 months after launching, the company has surpassed $500 million in annualized revenue and earned daily usage from more than one million developers. Adoption now spans more than half of the Fortune 500. The round, led by Thrive Capital with participation from Andreessen Horowitz, Accel, and DST Global, is not just a funding milestone. It is a signal of what the next era of software startups could look like.

Cursor may appear to be just another AI assistant. But beneath the surface lies a story of extraordinary precision across product strategy, technical architecture, go-to-market motion, and timing. To call Anysphere the fastest-growing software startup in history is not hyperbole. It is a case study in how compound growth emerges when the fundamentals are right and the moment is seized.

Start with the User, Then Scale with the Enterprise

Anysphere did not chase the enterprise directly. It began with a single, elegant proposition: a powerful, AI-enhanced code editor priced at $20 per month. The simplicity of the offer and the product's immediate value led to rapid word-of-mouth growth among individual developers. Crucially, this bottom-up adoption model allowed Anysphere to grow organically across organizations without incurring high customer acquisition costs or relying on outbound sales teams in the early stages. As usage scaled inside large enterprises, IT teams and procurement functions followed. Sales did not lead the growth; it followed it.
Why this matters:

- Bottom-up adoption validates product-market fit more reliably than top-down pilots
- High net retention and usage-led expansion drive more efficient lifetime value
- Product-led growth creates momentum before sales complexity enters the picture

This go-to-market motion is increasingly essential in software, especially in technical categories like devtools. The strongest distribution channel is a product that solves real problems and spreads naturally among users.

Own the Infrastructure to Own Your Future

Where many startups layer on top of existing models or third-party APIs, Anysphere took a different route. Early in its lifecycle, it made the strategic decision to build and train its own foundational model infrastructure. This technical independence gave the company several distinct advantages:

- Control over margins by eliminating costly dependencies
- Flexibility to iterate faster on model updates and user feedback
- Differentiation in performance, privacy, and feature set

Infrastructure ownership is not cheap, but it is transformative. In a space where most startups are building wrappers around OpenAI or Anthropic, Anysphere has built defensible value by going deeper into the stack. It now benefits from roadmap autonomy, pricing power, and resilience against vendor changes or cost fluctuations. For founders building in AI, the message is clear: platform risk is real. Owning your technical foundation may slow you down in the short term, but it compounds advantages long term, especially in competitive, fast-moving markets.

Solve Daily Problems, Not Just Novelty Gaps

Cursor succeeded not because it wowed users with flashy features, but because it seamlessly embedded into daily developer workflows. It made coding faster, autocomplete smarter, and debugging more intuitive, all without requiring developers to change their habits or rewire their tools.
This daily utility is what drives real software retention:

- Low-friction integration means faster onboarding and stickier behavior
- Embedded tools create higher usage frequency and stronger network effects
- Routine impact, not novelty, drives expansion and cross-team adoption

The product became valuable because it was consistently useful. In contrast, many AI startups chase impressive demos that don't translate into durable usage. Anysphere understood that tools are kept not for what they promise, but for how quietly they improve daily work.

Speed as a Strategic Weapon

Anysphere's rise was not just about what it built; it was about how fast it moved. While others in the space debated product strategy, go-to-market approaches, and infrastructure choices, Anysphere launched, iterated, and scaled. In consolidating markets like developer tools and AI platforms, execution speed is not a luxury; it is an imperative. The early leader in a space that is structurally winner-take-most can capture a disproportionate share of mind, market, and multiple.

In just over a year, Anysphere leapfrogged from product launch to enterprise ubiquity. That window will not reopen for competitors. The company's momentum now creates defensibility that goes beyond technology; it becomes behavioral and contractual. For founders, this underlines a vital principle: moving decisively is not just about being first; it is about removing the option for others to follow.

Strategy in Sequence: The Deeper Lesson

What makes Anysphere's story most compelling is not the speed or size of its growth, but the sequencing. Every layer of its growth was intentional:

1. First, build a tool individual users love
2. Then, scale into organizations via usage, not sales
3. Reinforce growth by owning infrastructure
4. Deepen retention through workflow integration
5. Lock in market position through velocity

Each step supported the next. Nothing was rushed, but nothing was left on the table.
It was a masterclass in sequencing decisions to create compounding advantage.

Final Thoughts: The New Blueprint for AI Startups

Anysphere's rise is not a fluke of hype. It is the result of quiet discipline, thoughtful strategy, and deeply intentional choices. In a market filled with AI noise, Anysphere built something that delivered signal at scale. For founders building in AI, developer tools, or enterprise software, this is more than inspiration. It is instruction.

Reference list:

Temkin, M. (2025, June 5). Cursor's Anysphere nabs $9.9B valuation, soars past $500M ARR. TechCrunch. https://share.google/BcqqJwryB2KodQWbT

  • F88 is preparing to list on the Unlisted Public Company Market (UPCoM) within 30 days

According to DealStreetAsia, Vietnamese consumer finance platform F88 is preparing to list on the Unlisted Public Company Market (UPCoM) within 30 days, after receiving approval from the State Securities Commission. While the move is technically a step below Vietnam's main stock exchanges, it marks a strategic shift in how late-stage startups are approaching liquidity and long-term capital planning.

F88 originally targeted a full IPO on the Ho Chi Minh City Stock Exchange (HOSE) by 2024 but has now extended that timeline to 2027. Rather than being seen as a delay, the UPCoM listing is increasingly viewed as a calculated on-ramp, a transitional stage that allows the company to build public market readiness while maintaining growth momentum.

The company's financial turnaround supports this phased approach. After reporting a net loss of VND 545 billion in 2023, F88 rebounded with a VND 351 billion (~$13.5 million) profit in 2024 and posted strong Q1 2025 figures: a 25% increase in disbursement, 21.5% revenue growth, and a 204% surge in pre-tax profit.

To fund further growth, F88 plans to raise VND 700 billion (~$27 million) through bond issuance in 2025 and is seeking a strategic investor in 2026. Backed by notable investors like Mekong Capital, Vietnam Oman Investment, and Granite Oak, and debt providers such as Lending Ark Asia and Lendable, F88's capital strategy exemplifies the layered, modular financing approach now taking hold in Vietnam.

In Vietnam's venture ecosystem, once defined by fast fundraising rounds and a "growth-at-all-costs" mentality, F88's approach marks a clear break from the past. The company's journey shows that the road to becoming a public company is no longer just about raising capital quickly or scaling aggressively. Instead, it now demands real financial discipline: sustained profitability, margin control, and operational transparency. But financial health alone isn't enough.
F88 is also layering its capital sources, using a mix of venture equity, private debt, bonds, and future strategic investment, illustrating how modern startups must diversify funding beyond traditional VC pathways. This multi-tiered financing approach gives companies more flexibility while signaling maturity to the market.

Perhaps most importantly, F88's UPCoM listing highlights how liquidity itself is evolving. No longer a binary jump from private to IPO, exits are becoming staged processes, where startups use intermediate steps like bond issuance or secondary exchanges to build public trust, test investor appetite, and de-risk their eventual market debut.

For both founders and investors, the message is clear: building a company today means architecting not just a product but a comprehensive capital strategy. An exit is no longer a one-time event; it's a carefully designed and sequenced journey that starts early.

Reference list:

Vietnam Investment Review. (2025, May 7). F88 becomes public company after being listed on UPCoM. Vietnam Investment Review - VIR. https://share.google/rlzTvLUsEiRaH4sKx

  • Can AI Learn from Books Without Breaking the Law?

Why the Anthropic Decision Is a Turning Point for AI Training Data

As generative AI continues to disrupt industries, one question has loomed large: can AI legally train on copyrighted data? A recent U.S. court decision involving Anthropic, the maker of the Claude AI model, has delivered the first major legal precedent. In short: training AI models on copyrighted content may be allowed under fair use, but storing pirated copies is not. This ruling reshapes the landscape for AI startups, enterprise LLM builders, and venture capitalists evaluating AI companies.

What Happened: Claude AI, Copyright Law, and Fair Use

Anthropic trained its Claude model using millions of copyrighted books. The court ruled that:

- AI model training can qualify as fair use in certain contexts
- Storing pirated or unlicensed copyrighted works is illegal

This ruling is significant because it carves out a potential legal path for AI development under U.S. copyright law while drawing a strict line around data acquisition and storage compliance. For startups building LLMs or other generative models, this precedent highlights the legal risks of using unverified datasets.

AI Startups: Data Compliance Is No Longer Optional

Many AI startups use scraped datasets containing copyrighted works, such as books, lyrics, articles, or multimedia, often assuming that "public" equals "permissible." But this ruling makes clear: how you obtain and store training data matters.

Key takeaways for startups:

- Verify all training data sources
- Document data collection and licensing practices
- Avoid storing or redistributing copyrighted content without rights
- Anticipate legal discovery on data sourcing in future fundraising or M&A

What Investors Need to Know: New Due Diligence for AI

For venture capitalists, this ruling introduces a new layer of AI investment diligence. Legal exposure related to AI training data could materially impact a company's valuation or risk profile. Investors should now ask:

- Where did the training data originate?
Was any copyrighted content used without proper rights? Has the startup built a defensible compliance framework? Are there legal audits or internal documentation of data use? Backing companies with opaque or risky data pipelines could bring reputational and financial downside. Investors who prioritize lawful AI development will future-proof their portfolios. The Future of Generative AI and Copyright Law This ruling doesn’t end the debate—it ignites it. Other lawsuits involving OpenAI, Meta, and Google are still pending. But the Anthropic case sets a tone: courts are willing to recognize fair use in AI training, while penalizing illegal data practices. What’s next: Growth in AI data licensing platforms More transparent, auditable training pipelines Case-by-case legal guidance on fair use boundaries Cross-border data governance frameworks for global AI markets Build Responsibly: Legal, Ethical, and Scalable AI At VinVentures, we champion founders who build responsibly, where innovation meets intention. In AI, that means complying with copyright law, respecting data ownership, and preparing for a world of regulatory clarity. Whether you're developing a foundation model, fine-tuning an industry-specific LLM, or investing in AI infrastructure, data legality is now a strategic differentiator. References list: Capoot, A. (2025, June 24). Judge rules Anthropic did not violate authors’ copyrights with AI book training . CNBC. https://www.cnbc.com/2025/06/24/ai-training-books-anthropic.html

  • First Principles Thinking: The Mindset Behind Game-Changing Ideas

    Why “Best Practices” Aren’t Enough in Startup Innovation

In the fast-paced world of entrepreneurship, founders are often encouraged to follow best practices and proven strategies. But the startups that truly change industries rarely follow inherited playbooks. Instead, they challenge assumptions, rebuild from scratch, and discover new opportunities others overlooked. This approach, First Principles Thinking, is the foundation behind some of the most successful and disruptive companies in tech. It empowers startup founders to innovate at a deeper level by rethinking everything from customer behavior to pricing models and operational structures.

What Is First Principles Thinking?

First Principles Thinking is a problem-solving framework that breaks down complex challenges into their most basic, undeniable truths. Instead of reasoning by analogy (doing what others have done), this method starts with core facts and rebuilds upward from there. Popularized by Elon Musk and rooted in physics and philosophy, this approach asks:

• What do we know to be scientifically or logically true?
• What assumptions are we carrying without realizing it?
• If we stripped this down to zero, what solution would emerge?

For startups, this method encourages original thinking and differentiated strategy in markets that are often saturated with imitation.

Startup Examples That Prove the Power of First Principles

Some of the most iconic companies today applied First Principles Thinking at their inception. Their founders reimagined markets by questioning assumptions others accepted as facts:

• SpaceX rebuilt the economics of spaceflight by asking why rockets had to be single-use. From physics, they concluded reusability was viable if designed correctly.
• Airbnb challenged the global hospitality industry by asking: what if travelers stayed in people’s homes instead of hotels? They created a new supply model with zero infrastructure overhead.
• Impossible Foods deconstructed meat to the molecular level and rebuilt it without animals, starting with the chemical properties that make meat taste like meat.

These companies didn’t just improve existing models. They reinvented their industries from first principles.

Why First Principles Thinking Matters More Than Ever in 2025

With rapid technological shifts, AI acceleration, and changing global dynamics, many traditional business models are becoming obsolete. For founders building in uncertain times, incremental improvement is no longer enough. Breakthroughs come from foundational clarity. In 2025, First Principles Thinking is especially relevant because:

• Venture capital has become more selective, prioritizing novel insights over safe bets
• Generative AI has commoditized superficial innovation, creating a need for deeper differentiation
• Global markets demand locally grounded, agile business models that can adapt to volatility

Startups that think from the ground up, not from history, will outperform in the long run.

How Founders Can Apply First Principles Thinking Today

Founders can begin applying this mindset by revisiting core areas of their business and asking fundamental questions:

• Product development: Are we building what users truly need, or what competitors already offer?
• Monetization: Are our pricing structures based on customer value, or just industry norms?
• Go-to-market strategy: Have we tailored our GTM to user behavior, or are we copying common SaaS tactics?
• Cost structure: Could we redesign operations for efficiency rather than inheriting legacy models?

Teams should develop a culture of curiosity: regularly challenge assumptions, ask “why” multiple times, and experiment with rebuilding core systems from first principles.

Implications for Investors and the Broader Ecosystem

For venture capital investors, founders who use First Principles Thinking signal higher potential. These entrepreneurs typically demonstrate deeper market insight, stronger conviction, and more resilient strategies. Their solutions are not just incrementally better; they are categorically different. At VinVentures, we prioritize founders who:

• Understand the core dynamics of the problems they solve
• Have original views that deviate from market consensus
• Are willing to question “how things are done” in pursuit of superior solutions

In a saturated funding environment, the ability to think independently and build from zero is a defining competitive advantage.

Final Thoughts: Build From Truth, Not Tradition

First Principles Thinking is more than a mental model; it is a strategic advantage. As markets shift and technologies evolve, founders who adopt this approach will find new categories to lead, new products to create, and new ways to solve timeless problems. If you’re building with deep conviction and rethinking the fundamentals of your market, VinVentures wants to hear from you. We back founders who challenge assumptions and create bold, scalable solutions.

References list:

Clear, J. (2020, February 3). First principles: Elon Musk on the power of thinking for yourself. James Clear. https://jamesclear.com/first-principles

  • How Scale AI and Alexandr Wang Quietly Redefined the Future of Artificial Intelligence Infrastructure

    From Inspiration to Industry Influence

In 2016, Alexandr Wang walked out of a movie theater after watching The Social Network. His reaction was not about emulation but ignition. He didn’t want to copy Facebook. He wanted to build something enduring. That spark became Scale AI. Nearly a decade later, the company he founded is no longer a behind-the-scenes player. It is a defining force in one of the most strategic sectors in artificial intelligence: data infrastructure. Meta’s $14.3 billion acquisition of a 49 percent stake in Scale AI is not only a vote of confidence in Wang. It is a strategic pivot for Meta and a clear signal of where long-term value in the AI economy is being built.

Understanding Scale AI’s Strategic Role

Scale AI began with a clear but underestimated thesis: machine learning models are only as effective as the quality of the data they are trained on. As companies raced to build larger models, Scale focused on building smarter pipelines. Its core strength lies in delivering labeled, structured, and compliant data at enterprise scale. From powering autonomous vehicles to supporting government intelligence and defense initiatives, Scale has established itself as a critical node in the artificial intelligence supply chain. Its offerings include large language model training data, synthetic data generation, and tools for human-in-the-loop review. Unlike many startups chasing the spotlight, Scale positioned itself quietly within the infrastructure layer, serving the foundational needs of clients who demand precision, reliability, and scale. That positioning is now being rewarded with one of the largest strategic investments in the space.

Why Meta’s Investment Signals a Strategic Shift

Meta’s decision to acquire nearly half of Scale AI is not merely financial. It reflects a broader strategic calculus. As generative AI reshapes the internet, owning high-quality data infrastructure is becoming a competitive necessity. Meta recognizes that scaling large models efficiently requires more than GPU clusters. It requires trust, repeatability, and data precision. This investment marks a new phase in the platform wars, where access to trusted data becomes as valuable as model performance. With this move, Meta gains an in-house advantage for building safer, enterprise-grade AI systems at a time when public scrutiny and regulatory demands around AI ethics and provenance are rapidly increasing.

Lessons for Founders: Value Is Built Below the Surface

Alexandr Wang’s journey is a masterclass in strategic clarity. Rather than building flashy front-end tools, Scale focused on what the market would eventually depend on. The infrastructure layer may not earn headlines as often, but it earns contracts, compounding influence, and strategic leverage. Founders should pay attention to four clear lessons from this trajectory:

• Infrastructure creates enduring value because it becomes essential, not optional
• Trust and security are monetizable, especially in regulated and enterprise contexts
• Focused execution in unsexy markets often leads to defensible positions
• Long-term success requires building where the strategic bottlenecks are forming, not where attention is temporarily gathered

Scale AI has followed the principles that underpin the most successful companies in history: solve a fundamental problem, own the critical layer, and stay ahead of where the market is going.

Implications for Investors and the AI Ecosystem

For investors, this moment affirms that the next wave of artificial intelligence returns will not come solely from consumer applications or chat interfaces. They will come from the platforms, protocols, and infrastructure that make generative systems reliable and trustworthy at scale. The Scale AI and Meta deal offers three key insights:

• Strategic value in AI is shifting toward infrastructure that ensures data quality, compliance, and trust
• Long-term enterprise AI adoption will depend on verifiable, well-structured data pipelines
• Investors should deepen diligence into how startups source, structure, and govern their training data

This is a reminder that as AI becomes mainstream, the real leverage lies in the foundational components that are difficult to replicate and essential to performance.

Closing Perspective: The New Edge in AI Innovation

Scale AI’s trajectory reflects a broader pattern emerging in the artificial intelligence ecosystem. The highest-value companies are not necessarily the most visible; they are the most indispensable. While many chase virality or short-term engagement, builders like Wang and his team chose the hard path of building trust, tools, and infrastructure that future innovation depends on. At VinVentures, we are constantly seeking founders with the discipline, foresight, and originality to create those systems. If you are building a company that rethinks core assumptions, supports next-generation AI, or builds the foundations others will stand on, we want to hear from you.

References list:

Scale AI. (n.d.). Founder Alexandr Wang joins Meta to work on AI efforts. https://scale.com/blog/scale-ai-announces-next-phase-of-company-evolution
