Winning Against Giants with 70 People: Lessons from an AI Image Generation Startup on the Structure of ‘Winning Small’
Trillions of Yen vs. 70 People: Why the Smaller Side Can Win
Google invests 5 trillion yen annually in AI. Microsoft is spending a similar amount. Meta also invests several trillion yen.
Meanwhile, a startup with just 70 people is fighting on the front lines of AI image generation: Black Forest Labs. This company was founded by the team that developed the core technology behind Stable Diffusion.
Why can a team of 70 win against giants? The structure of their success offers insights for small and medium-sized enterprises (SMEs) battling large corporations.
Not Competing with “General Purpose” but Striking with “Specialization”
The giants’ strategy is “general-purpose AI.” They aim to create models that can do anything, as large as possible. The estimated training cost for GPT-4 is over 1 billion dollars (approximately 150 billion yen). Gemini is expected to be similar or even more expensive.
Directly competing with this is tantamount to suicide. Black Forest Labs does not take that route.
What they are doing is “specialization.” They focus on creating models specifically for image generation, particularly for professional creative work. While general-purpose models can produce “80-point images of anything,” specialized models can generate “98-point images in a specific style.”
This difference is crucial. Advertising designers, video creators, and game developers do not use 80-point outputs; they require 98-point quality. And it is the specialized models, not the general-purpose ones, that can deliver that level of quality.
“Being able to do anything” is synonymous with “not excelling at anything.”
This mirrors the structure of SME management. While large corporations dominate the market with their comprehensive capabilities, SMEs can win through a specialization strategy that asserts, “We will not lose in this particular field.”
Not Letting “Being Small” Be a Handicap Through Technology
“I understand that specialization is key, but doesn’t developing AI models require enormous computational resources?”
That was true—until last year.
Now, the technology known as Mixture-of-Experts (MoE) is changing the game.
The mechanism of MoE is simple. Instead of running one massive network end to end, it activates only a small subset of sub-models, called “experts,” for each input. Imagine having 100 experts on staff, but only the 5 most relevant ones respond to any given question.
This allows for a large number of parameters in the model while keeping the actual computational load low. As a result, high-performance models can be operated with fewer computational resources.
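The routing idea above can be sketched in a few lines of Python. This is a toy illustration only: the gate here is a fixed linear scorer and the experts are simple scaling functions, all invented for the example; real MoE layers use neural sub-networks trained end to end.

```python
import math
import random

def moe_forward(x, gate_w, experts, k=2):
    """Route input x to the top-k experts picked by a simple linear gate.

    x       : list of floats (the input vector)
    gate_w  : one weight vector per expert; dot(x, w) is that expert's score
    experts : list of callables; only the k highest-scoring ones run,
              so compute grows with k, not with the total expert count
    """
    scores = [sum(xi * wi for xi, wi in zip(x, w)) for w in gate_w]
    top_k = sorted(range(len(experts)), key=lambda i: scores[i])[-k:]
    exp_s = [math.exp(scores[i]) for i in top_k]
    total = sum(exp_s)
    mix = [e / total for e in exp_s]          # softmax over the selected k only
    # Weighted sum of the activated experts' outputs
    out = [0.0] * len(x)
    for g, i in zip(mix, top_k):
        for j, v in enumerate(experts[i](x)):
            out[j] += g * v
    return out, top_k

# Toy demo: 8 "experts," each just scales the input; only 2 ever run
random.seed(0)
d, n = 4, 8
gate_w = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
experts = [lambda x, s=s: [s * v for v in x] for s in range(1, n + 1)]

out, used = moe_forward([1.0, -0.5, 0.3, 0.8], gate_w, experts, k=2)
print(f"experts activated: {sorted(used)} out of {n}")
```

The key point is visible in the last line: however many experts exist, only `k` of them consume compute per input.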
Advances in quantization are pushing this further. One recent framework, MoBiE, evaluates the importance of each expert in an MoE model and assigns fewer bits to the less important ones. The reported result is roughly double the inference speed at nearly the same accuracy.
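The underlying idea of importance-aware mixed precision can be shown with a toy allocator. Note that this is a generic sketch of the concept, not MoBiE’s actual scoring rule or bit-widths; the scores, ratios, and bit choices below are invented for illustration.

```python
def allocate_bits(importance, high=8, low=2, keep_ratio=0.25):
    """Toy importance-aware bit allocation:
    the top `keep_ratio` of experts (by importance score) keep `high` bits,
    everyone else is quantized down to `low` bits.
    Illustration only -- real frameworks use finer-grained policies.
    """
    n_high = max(1, int(len(importance) * keep_ratio))
    ranked = sorted(range(len(importance)),
                    key=lambda i: importance[i], reverse=True)
    bits = [low] * len(importance)
    for i in ranked[:n_high]:
        bits[i] = high
    return bits

# 8 experts with made-up importance scores; only 2 stay at full 8 bits
scores = [0.9, 0.1, 0.4, 0.05, 0.7, 0.2, 0.3, 0.15]
print(allocate_bits(scores))  # → [8, 2, 2, 2, 8, 2, 2, 2]
```

The average here is 3.5 bits per weight instead of 8, which is where the memory and speed savings come from.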
Another technique, BTC-LLM, compresses model weights toward binary values of ±1. Its research reports that a 13-billion-parameter model can be squeezed to an effective 0.8 bits per weight while retaining practical performance.
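The basic move behind weight binarization can be shown in a few lines: keep only the sign of each weight, plus one shared scale factor. This is a minimal sketch in the spirit of classic binary-weight methods, not the actual BTC-LLM algorithm, which uses additional compression to get below 1 bit.

```python
def binarize(weights):
    """Reduce a list of float weights to one scale factor plus signs.

    alpha = mean absolute value (the shared scale);
    each weight is then stored as just +1 or -1 (1 bit instead of 32).
    Generic sketch only -- not the BTC-LLM procedure itself.
    """
    alpha = sum(abs(w) for w in weights) / len(weights)
    signs = [1 if w >= 0 else -1 for w in weights]
    return alpha, signs

def reconstruct(alpha, signs):
    """Approximate the original weights from the 1-bit representation."""
    return [alpha * s for s in signs]

w = [0.42, -0.17, 0.08, -0.55, 0.31]
alpha, signs = binarize(w)
print(signs)                      # each weight now costs 1 bit
print(reconstruct(alpha, signs))  # coarse approximation of w
```

Accuracy survives such aggressive compression better than intuition suggests because large models are heavily over-parameterized; the research cited above is about making that trade-off precise.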
What this means is that “models can be built that can compete without a massive GPU cluster.”
Let’s look at the numbers.
- Training a GPT-4 class model: estimated at over 1 billion dollars (roughly 150 billion yen), tens of thousands of GPUs
- Specialized model + MoE + quantization: Several million to tens of millions of yen, dozens of GPUs
The cost difference is over 100 times. This disparity allows a startup of 70 people to stand on the battlefield.
Three Conditions for “Winning Small”
From the case of Black Forest Labs, we can identify the conditions under which small teams can triumph over larger opponents.
Condition 1: Narrowing the Field of Competition
Do not compete in a general manner. Choose a field where you can claim, “We are the best for this application.” For Black Forest Labs, it is “professional image generation.” For SMEs, it could be “this region, this industry, this challenge.”
Condition 2: Changing Cost Structures through Technology
Utilize technologies like MoE and quantization to achieve high results with fewer resources. The same applies to AI utilization in SMEs. If a monthly AI tool costing 20,000 yen can automate 40 hours of work, that equates to an hourly wage of 500 yen—cheaper than hiring part-time help.
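The arithmetic in that comparison is worth making explicit. A back-of-the-envelope check (the part-time wage is an assumed figure for comparison, not from the text):

```python
# Figures from the text
monthly_tool_cost = 20_000   # yen per month for the AI tool
hours_automated = 40         # hours of manual work it replaces per month

effective_hourly = monthly_tool_cost / hours_automated
print(effective_hourly)      # → 500.0 yen per hour of work

# Assumed local part-time hourly wage, for comparison only
part_time_hourly = 1_100
monthly_savings = hours_automated * part_time_hourly - monthly_tool_cost
print(monthly_savings)       # → 24000 yen saved per month vs. hiring
```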
Condition 3: Leveraging Speed in Decision-Making
An organization of 70 can move faster than one of 10,000. When new technology emerges, they can adopt it immediately. They can respond to market changes the following week. This speed is something large corporations cannot replicate. SMEs can do the same; if the CEO says, “Let’s do it,” they can start the next day.
Implications for SMEs—Using the Structure of AI Rather than Creating AI
Did you think, “That doesn’t apply to us because we’re not an AI model development company”?
That’s not the case. What you should learn here is not how to create AI but the “structure of winning small.”
Specialize. Lower costs through technology. Move quickly. These three principles can be applied to all businesses, not just AI development.
For example, consider a local construction company that automates estimate creation with AI. Major homebuilders run nationwide, standardized estimation systems, but those systems do not account for local material prices or labor rates. If the local firm builds an AI tool that can credibly claim, “We have the highest estimation accuracy in this region,” it is following the same structure as a specialized model.
Investing 100,000 yen per month in a general-purpose tool is less effective than developing a prompt template tailored to your specific business. This too is “specialization.”
The Giants’ Weakness is Their Size Itself
Both Google and Microsoft are slow-moving due to their size. It takes them six months to release a new model. Their internal approval processes can take three months. Specializing in a single market is often not permitted by shareholders.
The 70-person startup faces none of these constraints. SMEs do not either.
The era when being large equated to strength was one where costs were high. Now, with AI dramatically lowering costs, being large can actually become a hindrance.
Being small is not a weakness; it is a condition for changing the way of fighting.
What is your company’s “area of specialization”? How can you incorporate AI into that? Answering these questions is the first step to winning against the giants.