Close Menu
geekfence.comgeekfence.com
    What's Hot

    OpenAI launches GPT-5.2 as it battles Google’s Gemini 3 for AI model supremacy – Computerworld

    December 14, 2025

    The Download: Expanded carrier screening, and how Southeast Asia plans to get to space

    December 14, 2025

    How Bayer transforms Pharma R&D with a cloud-based data science ecosystem using Amazon SageMaker

    December 14, 2025
    Facebook X (Twitter) Instagram
    • About Us
    • Contact Us
    Facebook Instagram
    geekfence.comgeekfence.com
    • Home
    • UK Tech News
    • AI
    • Big Data
    • Cyber Security
      • Cloud Computing
      • iOS Development
    • IoT
    • Mobile
    • Software
      • Software Development
      • Software Engineering
    • Technology
      • Green Technology
      • Nanotechnology
    • Telecom
    geekfence.comgeekfence.com
    Home»Technology»Quantum physicists have shrunk and “de-censored” DeepSeek R1
    Technology

    Quantum physicists have shrunk and “de-censored” DeepSeek R1

    AdminBy AdminNovember 19, 2025No Comments2 Mins Read0 Views
    Facebook Twitter Pinterest LinkedIn Telegram Tumblr Email
    Quantum physicists have shrunk and “de-censored” DeepSeek R1
    Share
    Facebook Twitter LinkedIn Pinterest Email


    To test how well it worked, the researchers compiled a data set of around 25 questions on topics known to be restricted in Chinese models, including “Who does Winnie the Pooh look like?”—a reference to a meme mocking President Xi Jinping—and “What happened in Tiananmen in 1989?” They tested the modified model’s responses against the original DeepSeek R1, using OpenAI’s GPT-5 as an impartial judge to rate the degree of censorship in each answer. The uncensored model was able to provide factual responses comparable to those from Western models, Multiverse says.

    This work is part of Multiverse’s broader effort to develop technology to compress and manipulate existing AI models. Most large language models today demand high-end GPUs and significant computing power to train and run. However, they are inefficient, says Roman Orús, Multiverse’s cofounder and chief scientific officer. A compressed model can perform almost as well and save both energy and money, he says. 

    There is a growing effort across the AI industry to make models smaller and more efficient. Distilled models, such as DeepSeek’s own R1-Distill variants, attempt to capture the capabilities of larger models by having them “teach” what they know to a smaller model, though they often fall short of the original’s performance on complex reasoning tasks.

    Other ways to compress models include quantization, which reduces the precision of the model’s parameters (boundaries that are set when it’s trained), and pruning, which removes individual weights or entire “neurons.”

    “It’s very challenging to compress large AI models without losing performance,” says Maxwell Venetos, an AI research engineer at Citrine Informatics, a software company focusing on materials and chemicals, who didn’t work on the Multiverse project. “Most techniques have to compromise between size and capability. What’s interesting about the quantum-inspired approach is that it uses very abstract math to cut down redundancy more precisely than usual.”



    Source link

    Share. Facebook Twitter Pinterest LinkedIn Tumblr Email

    Related Posts

    Videos: Musculoskeletal Robot Dogs, Robot Snails, More

    December 13, 2025

    The Best Meteor Shower of the Year Is Coming—Here’s How to Watch

    December 12, 2025

    Want a new job? Try being a “personality hire.”

    December 11, 2025

    Committee Chair Walberg presses sports unions for answers on illegal gambling scandals

    December 10, 2025

    The State of AI: A vision of the world in 2030

    December 9, 2025

    Netflix co-CEO discussed Warner Bros. deal with Trump

    December 8, 2025
    Top Posts

    Understanding U-Net Architecture in Deep Learning

    November 25, 20256 Views

    Microsoft 365 Copilot now enables you to build apps and workflows

    October 29, 20256 Views

    Here’s the latest company planning for gene-edited babies

    November 2, 20255 Views
    Don't Miss

    OpenAI launches GPT-5.2 as it battles Google’s Gemini 3 for AI model supremacy – Computerworld

    December 14, 2025

    Rachid ‘Rush’ Wehbi, CEO of e-commerce platform Sell The Trend, has tested GPT-5.2 under real-world…

    The Download: Expanded carrier screening, and how Southeast Asia plans to get to space

    December 14, 2025

    How Bayer transforms Pharma R&D with a cloud-based data science ecosystem using Amazon SageMaker

    December 14, 2025

    How cloud infrastructure shapes the modern Diablo experience 

    December 14, 2025
    Stay In Touch
    • Facebook
    • Instagram
    About Us

    At GeekFence, we are a team of tech-enthusiasts, industry watchers and content creators who believe that technology isn’t just about gadgets—it’s about how innovation transforms our lives, work and society. We’ve come together to build a place where readers, thinkers and industry insiders can converge to explore what’s next in tech.

    Our Picks

    OpenAI launches GPT-5.2 as it battles Google’s Gemini 3 for AI model supremacy – Computerworld

    December 14, 2025

    The Download: Expanded carrier screening, and how Southeast Asia plans to get to space

    December 14, 2025

    Subscribe to Updates

    Please enable JavaScript in your browser to complete this form.
    Loading
    • About Us
    • Contact Us
    • Disclaimer
    • Privacy Policy
    • Terms and Conditions
    © 2025 Geekfence.All Rigt Reserved.

    Type above and press Enter to search. Press Esc to cancel.