
Beyond Tech Giants: How China and Singapore Are Cracking the AI Efficiency Code

by Genjoy
September 3, 2025
in Lifestyle
Reading Time: 6 mins read

Honestly, the AI world never ceases to amaze me. While everyone has been obsessing over who has the biggest models, two completely different approaches have quietly emerged that might just change everything.

China’s DeepSeek v3.1 and Singapore’s Hierarchical Reasoning Models are proving that sometimes, smart engineering trumps sheer scale.

DeepSeek v3.1: The Startup That Made Silicon Valley Sweat

Let me paint you a picture. DeepSeek is barely two years old, founded in July 2023, yet they just dropped a model that is giving OpenAI executives sleepless nights. DeepSeek V3.1, released under the MIT License on August 21, 2025, is a massive 685-billion-parameter model that somehow costs a fraction of what the big players charge.

(Image generated by AI)

Here is where it gets interesting. A coding run costs about one dollar with V3.1, while competitors charge close to seventy dollars for a similar task.

That is not a typo. We are talking about a 70x price difference for comparable performance.

The secret sauce? Pure engineering brilliance. The model uses a Mixture-of-Experts (MoE) architecture that activates only 37 billion parameters per token, which helps keep inference costs low despite its immense total size.

Think of it like having a massive toolbox but only picking the exact tools you need for each job, rather than dragging the whole thing around.
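To make the toolbox analogy concrete, here is a minimal, hypothetical sketch of top-k expert routing, the core idea behind MoE inference savings. The expert count, dimensions, and router are toy values, not DeepSeek's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

n_experts, d = 8, 16          # toy scale; V3.1 routes across far more experts
experts = [rng.normal(size=(d, d)) for _ in range(n_experts)]
router = rng.normal(size=(d, n_experts))

def moe_forward(x, top_k=2):
    """Route a token vector to its top_k experts and mix their outputs."""
    scores = x @ router                        # affinity of this token to each expert
    top = np.argsort(scores)[-top_k:]          # keep only the best-matching experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                   # softmax over the chosen experts
    # Only top_k of the n_experts matrices are ever multiplied -> cheap inference
    return sum(w * (experts[i] @ x) for w, i in zip(weights, top))

token = rng.normal(size=d)
out = moe_forward(token)
```

The cost saving falls out of the routing step: with top_k=2 of 8 experts, only a quarter of the expert weights touch any given token, which is the same principle that lets V3.1 activate 37 billion of its 685 billion parameters.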

It also has a hybrid design that integrates reasoning and non-reasoning functions into a single model.

Previous models had to choose between quick responses or deep thinking. V3.1 can switch between modes seamlessly. It is like having both autopilot and manual control in the same system.

The timing was not accidental either: the release arrived just weeks after OpenAI’s GPT-5 and Anthropic’s Claude 4.1 launches.

While most major firms keep their frontier models proprietary, DeepSeek released their latest breakthrough as open source under the MIT License.

Singapore’s HRM: When Brain Science Meets AI

Now, here’s where things get really fascinating.

While DeepSeek was disrupting with scale and cost, Singapore researchers took a completely different path inspired by how our brains work.

Scientists at Sapient, an AI company in Singapore, developed the Hierarchical Reasoning Model (HRM) which achieves reasoning speeds up to 100 times faster while requiring only about 1,000 training examples, compared to the billions typically needed by large language models.

Let that sink in for a moment.

The brilliance is in the architecture. HRM features two coupled recurrent modules: a high-level (H) module for abstract, deliberate reasoning, and a low-level (L) module for fast, detailed computations.

It is mimicking how your brain works when you are solving a puzzle.

The high-level part does the strategic thinking while the low-level part handles the nitty-gritty details.
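A toy sketch of that two-speed loop, assuming the structure the paper describes (a fast low-level recurrence nested inside a slow high-level one); the weights, sizes, and update rules here are illustrative, not HRM's actual equations:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
W_h, W_l, W_hl, W_lh = (rng.normal(scale=0.3, size=(d, d)) for _ in range(4))

def hrm_cycle(x, h, l, inner_steps=4):
    """One outer cycle: the fast L module iterates several times under a
    fixed 'plan' from the slow H module, then H updates once from L's result."""
    for _ in range(inner_steps):
        l = np.tanh(W_l @ l + W_hl @ h + x)   # low-level: fast, detailed work
    h = np.tanh(W_h @ h + W_lh @ l)           # high-level: one deliberate update
    return h, l

x = rng.normal(size=d)
h = l = np.zeros(d)
for _ in range(3):                            # a few hierarchical cycles
    h, l = hrm_cycle(x, h, l)
```

The key design choice is the timescale separation: L converges on the details many times for every single strategic update H makes, which is what lets a 27-million-parameter model iterate deeply without unrolling a huge chain of thought.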

(Image generated by AI)

Without pre-training or CoT (Chain-of-Thought) supervision, HRM learns to solve problems that are intractable for even the most advanced LLMs.

It achieves near-perfect accuracy on complex Sudoku puzzles and optimal pathfinding in 30×30 mazes, tasks on which state-of-the-art CoT methods fail completely (0% accuracy) in these hard, small-data settings.

Think about that. While massive models with billions of parameters fail completely, this relatively tiny system with just 27 million parameters nails it every time.

That is not just impressive, that is revolutionary.

The Brain Connection That Changes Everything

What is particularly fascinating is how the Singapore team built HRM inspired by brain principles and discovered that the trained model’s organization closely matched actual biological measurements.

The low-level module uses a narrow range of thinking patterns (like focusing on specific details), while the high-level module spreads its thinking across many more dimensions (like considering abstract concepts and connections).

This mirrors exactly what neuroscientists observe in mouse brains, where higher-order areas use more dimensions for processing complex tasks.

The high-to-low ratio in HRM (2.97) almost perfectly matches that measured in mouse cortex (2.95).
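One standard way to quantify "how many dimensions" a set of neural activations occupies is the participation ratio of the covariance eigenvalues, and a ratio like the 2.97 quoted above can be read as a comparison of two such measures. A small illustrative sketch, using synthetic stand-in activations rather than real HRM or cortex data:

```python
import numpy as np

def participation_ratio(acts):
    """Effective dimensionality: (sum of covariance eigenvalues)^2
    divided by the sum of their squares. Equals k if exactly k
    directions carry equal variance and the rest carry none."""
    cov = np.cov(acts, rowvar=False)
    eig = np.linalg.eigvalsh(cov)
    return eig.sum() ** 2 / (eig ** 2).sum()

rng = np.random.default_rng(0)
# Toy stand-ins: "high-level" activity spreads over more strong directions
low  = rng.normal(size=(500, 32)) @ np.diag([1.0] * 4  + [0.05] * 28)
high = rng.normal(size=(500, 32)) @ np.diag([1.0] * 12 + [0.05] * 20)

ratio = participation_ratio(high) / participation_ratio(low)
```

Here the synthetic "high" activations occupy roughly three times as many effective dimensions as the "low" ones, the same kind of high-to-low comparison behind the HRM and mouse-cortex numbers.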

It is like the AI independently discovered and refined a fundamental principle of biological intelligence.

Why This Matters More Than You Think

Both approaches challenge the prevailing wisdom in completely different ways. DeepSeek proves you do not need Silicon Valley budgets to build world-class AI.

The company says it trained V3, a predecessor of R1, for US$5.6 million, whereas reports suggest training GPT-4 may have cost around US$100 million in 2023.

Meanwhile, Singapore’s HRM shows that being smart about architecture can beat brute force approaches.

When everyone else is racing to build bigger models, they are building smarter ones.

What excites me most is how these complement each other. DeepSeek’s approach democratizes access to powerful AI through open-source availability and dramatic cost reductions.

Singapore’s HRM points toward a future where AI systems are fundamentally more efficient and require vastly less data to learn.

The Bigger Picture

I think we are witnessing the beginning of a major shift. The old playbook of “bigger models, bigger budgets, bigger everything” is getting challenged by approaches that prioritize efficiency and cleverness over raw computational power.

For developers and researchers outside the handful of mega-corporations, this is huge news. DeepSeek’s open-source model and Singapore’s efficiency breakthroughs suggest that innovation does not have to be locked behind corporate walls and massive budgets.

The next few years are going to be absolutely fascinating to watch. When small teams can achieve breakthrough results through smart engineering rather than massive scale, it opens possibilities we are only beginning to imagine.

Sources

DeepSeek v3.1 Sources:

  1. Bloomberg – “China’s DeepSeek Releases V3.1, Boosting AI Model’s Capabilities”
  2. Wikipedia – “DeepSeek” (comprehensive company and model information)
  3. DeepSeek Official Website – https://www.deepseek.com/
  4. TechTalks – “What we know so far about DeepSeek-V3.1, the new Chinese open-weight language model”
  5. Fortune – “China’s DeepSeek just dropped a new GPT-5 rival—optimized for Chinese chips, priced to undercut OpenAI”
  6. CNBC – “DeepSeek hints latest model will be compatible with China’s ‘next generation’ homegrown AI chips”
  7. South China Morning Post – “Tech war: DeepSeek’s V3.1 model emerges as ‘key pillar’ for China’s chip self-sufficiency”

Singapore HRM Sources:

  1. arXiv Research Paper – “Hierarchical Reasoning Model” (https://arxiv.org/html/2506.21734v1) – Primary technical source
  2. Live Science – “Scientists just developed a new AI modeled on the human brain — it’s outperforming LLMs like ChatGPT at reasoning tasks” (https://www.livescience.com/technology/artificial-intelligence/scientists-just-developed-an-ai-modeled-on-the-human-brain-and-its-outperforming-llms-like-chatgpt-at-reasoning-tasks)




© 2025 Wake Up, Singapore
