All posts tagged: computing

Spanish ‘soonicorn’ Multiverse Computing releases free compressed AI model

Large language models have a problem: they are large. Multiverse Computing, a Spanish startup, is addressing this issue with compressed models that aim to close the gap between what frontier models can do and what companies can actually afford to deploy. The secret sauce is CompactifAI, a compression technology inspired by quantum computing that the Basque company has applied to models released by OpenAI. As of today, developers can access a newer version of Multiverse’s HyperNova 60B model for free on Hugging Face. The company also plans to open-source more compressed models in 2026 to support a wider range of use cases. According to Multiverse, its models are smaller, but nearly as potent and accurate. At 32GB, HyperNova 60B is roughly half the size of the model it derives from — OpenAI’s gpt-oss-120B — while boasting lower memory usage and lower latency. The updated version, called HyperNova 60B 2602, now also better supports tool calling and agentic coding, where inference costs can be high. One of the competitors Multiverse claims to have beaten with HyperNova …
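For developers who want to experiment, a model published on Hugging Face can usually be loaded with the standard transformers library. The sketch below is illustrative only: the repository id is a placeholder, not a confirmed path, and the exact loading options depend on how Multiverse packages the checkpoint.

```python
# Minimal sketch of loading a chat model from Hugging Face.
# The repo id is hypothetical; check Multiverse Computing's Hugging
# Face page for the real HyperNova 60B repository and requirements.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "multiverse-computing/hypernova-60b"  # placeholder, not a real path

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # spread layers across available GPUs/CPU
)

messages = [{"role": "user", "content": "Summarise what CompactifAI does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

A 32GB checkpoint is small enough to fit on a single high-memory GPU or to run quantised on workstation hardware, which is the practical point of the compression.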

High-performance computing market set to reach $91bn by 2030

High-performance computing has evolved into a cornerstone of the modern digital economy. Driven by the AI race, the need for real-time big data analytics, and the widespread transition to cloud-based supercomputing, high-performance computing solutions are now essential for competitive advantage in sectors ranging from life sciences to financial modelling. A new report reveals that the HPC market is projected to reach $91.38bn by 2030, growing at a steady CAGR of 10.4% over the forecast period from 2026 to 2030. This signals significant market growth, as the market was valued at $55.88bn in 2025.

Key drivers for the huge uptake of high-performance computing

The report identified several key drivers of the high-performance computing boom – complex technologies and discoveries that rely on HPC to run as easily, quickly, and cost-effectively as possible.

AI and machine learning

Training large language models and complex AI frameworks requires massive parallel processing power. HPC provides the necessary infrastructure for these high-density compute tasks, making it indispensable for the next generation of artificial intelligence.

Cloud computing

The “democratisation” of supercomputing is …
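The report’s two headline figures are consistent with each other, as a quick back-of-the-envelope check shows: growing the 2025 base at the stated compound annual rate for five years lands almost exactly on the 2030 projection.

```python
# Sanity check on the report's figures: $55.88bn in 2025 growing to
# $91.38bn by 2030 implies a compound annual growth rate (CAGR) of
# roughly 10.3%, matching the ~10.4% the report cites.
base_2025 = 55.88    # market value in $bn, 2025
target_2030 = 91.38  # projected value in $bn, 2030
years = 5            # forecast period 2026-2030

implied_cagr = (target_2030 / base_2025) ** (1 / years) - 1
print(f"Implied CAGR: {implied_cagr:.2%}")  # ~10.34%

projected = base_2025 * (1 + 0.104) ** years
print(f"$55.88bn at 10.4% for 5 years: ${projected:.2f}bn")  # ~$91.6bn
```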

Composability for powerful edge computing

The CAPE project develops a computer architecture for efficient Edge-Cloud computing. Locally deployed, powerful edge-cloud infrastructure is needed to support AI-driven environments with networks of autonomous devices, maintaining their context and their individual and shared states. This allows the devices to work together in federation towards individual and shared goals. To scale sustainably, this infrastructure must be provided as a service and built on composable hardware foundations. Compute, memory, storage, and accelerators must dynamically self-configure to maximise utilisation, minimise waste, and adapt capacity and heterogeneity to local needs, delivering cloud-class performance through decentralised, fine-grained deployment.

Fig. 1: CAPE vision of composable edge micro data centres within the Edge–Cloud Continuum.

The EU-funded CAPE (European Open Compute Architecture for Powerful Edge) project addresses this gap by establishing edge micro data centres as a new, composable building block of the Edge–Cloud Continuum. CAPE combines open hardware, open-source software, and open standards to enable flexible, efficient, and sovereign edge computing across Europe.

Edge hardware platforms: Composable by design

CAPE rethinks edge servers as pools of dynamically composable resources rather than …
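The idea of composable resources can be made concrete with a toy model. The sketch below is not CAPE’s actual software stack – the project’s interfaces are not described in this excerpt – but it illustrates the principle: workloads request capacity, and the node assembles it from shared pools rather than from fixed per-server allocations.

```python
# Toy model of composability: a micro data centre draws CPU cores,
# memory, and accelerators from one shared pool on demand, instead of
# binding them statically to servers. Purely illustrative, not CAPE's
# real interface.
from dataclasses import dataclass


@dataclass
class ResourcePool:
    cpu_cores: int
    memory_gb: int
    accelerators: int

    def allocate(self, cores: int, mem: int, accel: int) -> bool:
        """Grant a slice of the pool if capacity remains, else refuse."""
        if (cores <= self.cpu_cores and mem <= self.memory_gb
                and accel <= self.accelerators):
            self.cpu_cores -= cores
            self.memory_gb -= mem
            self.accelerators -= accel
            return True
        return False

    def release(self, cores: int, mem: int, accel: int) -> None:
        """Return a slice to the pool when a workload finishes."""
        self.cpu_cores += cores
        self.memory_gb += mem
        self.accelerators += accel


# An edge micro data centre with one disaggregated pool of hardware.
pool = ResourcePool(cpu_cores=128, memory_gb=512, accelerators=8)

# Two workloads compose very different "machines" from the same pool.
print(pool.allocate(cores=96, mem=256, accel=8))  # True: AI inference job
print(pool.allocate(cores=64, mem=128, accel=0))  # False: cores exhausted
pool.release(cores=96, mem=256, accel=8)
print(pool.allocate(cores=64, mem=128, accel=0))  # True once freed
```

The utilisation win is that capacity refused to one workload is immediately available to the next, rather than sitting idle inside an over-provisioned server.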

Nvidia’s Deal With Meta Signals a New Era in Computing Power

Ask anyone what Nvidia makes, and they’re likely to first say “GPUs.” For decades, the chipmaker has been defined by advanced parallel computing, and the emergence of generative AI and the resulting surge in demand for GPUs has been a boon for the company. But Nvidia’s recent moves signal that it’s looking to lock in more customers at the less compute-intensive end of the AI market—customers who don’t necessarily need the beefiest, most powerful GPUs to train AI models, but instead are looking for the most efficient ways to run agentic AI software. Nvidia recently spent billions to license technology from a chip startup focused on low-latency AI computing, and also started selling standalone CPUs as part of its latest superchip system. And yesterday, Nvidia and Meta announced that the social media giant had agreed to buy billions of dollars’ worth of Nvidia chips to provide computing power for its massive infrastructure projects—with Nvidia’s CPUs as part of the deal. The multi-year deal is an expansion of a cozy ongoing partnership between …

EPI’s path to innovative high-performance computing

The European Processor Initiative is working to achieve digital sovereignty in high-performance computing by developing European-made processors. Almost ten years ago, the European Commission decided it needed a special Framework Partnership Agreement (FPA) that would tackle the idea of a European-made processor – if not produced in Europe, then certainly designed and conceived in Europe. Even before creating a special Joint Undertaking that would synchronise and plot the way for European HPC efforts – the EuroHPC Joint Undertaking – the European Commission included such a plan in the Horizon 2020 programme, the most expansive research and innovation funding programme on the continent.

The European Processor Initiative

A group of industry and research participants formed a consortium, called the European Processor Initiative (EPI), to apply to the topic under the LEIT: ICT work programme. EPI then won the FPA and started off in 2018 under the Specific Grant Agreement 1 (SGA1). The specific challenge of the topic was supporting the creation of a world-class European High-Performance Computing and Big Data ecosystem built on two exascale …

Computing the properties of dense quantum plasmas

Researchers from Kiel University, Helmholtz-Zentrum Dresden-Rossendorf and Rostock University are developing quantum simulations that help to predict the properties of warm dense matter and inertial fusion plasmas. Fusion is the process of fusing two light atomic nuclei into a heavier one, producing the tremendous amounts of energy that power the heat and radiation of stars such as our Sun. For decades, scientists have tried to achieve fusion in laboratories on Earth using various concepts. In December 2022, for the first time, net energy gain was achieved in a fusion experiment at the National Ignition Facility (NIF) in the US,¹ raising enormous hopes of meeting the dramatic increase in humanity’s energy consumption expected in the coming decades. Fusion would be a secure and clean energy source, complementing solar and wind energy, and the ambitious goal of a fusion power plant has triggered huge government investment as well as private capital allocation in many countries over recent years.

Left: Prof Michael Bonitz, Christian-Albrechts-Universität zu Kiel; center: Dr Tobias Dornheim, Helmholtz-Zentrum Dresden-Rossendorf (photo: © HZDR / A. …
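For readers unfamiliar with the term, “net energy gain” has a simple definition: the target gain Q compares the fusion energy released to the driver energy delivered, and Q > 1 means the reaction produced more energy than was put in. Using the widely reported figures from the December 2022 NIF shot (about 2.05 MJ of laser energy in, about 3.15 MJ of fusion energy out):

```latex
% Target gain: fusion energy out over driver energy in; Q > 1 is net gain.
Q \;=\; \frac{E_{\text{fusion}}}{E_{\text{driver}}}
  \;\approx\; \frac{3.15\,\text{MJ}}{2.05\,\text{MJ}}
  \;\approx\; 1.5 \;>\; 1
```

Note that this counts only the laser energy reaching the target, not the far larger electrical energy needed to run the facility, which is why a power plant remains an ambitious goal.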

Super Nintendo by Keza MacDonald review – a joyful celebration of the gaming giant | Computing and the net books

What is the highest-grossing entertainment franchise of all time? You might be tempted to think of Star Wars, or perhaps the Marvel Cinematic Universe. Maybe even Harry Potter? But no: it’s Pokémon – the others don’t come close. The Japanese “pocket monsters”, which star in video games, TV series and tradable playing cards, have made an estimated $115bn since 1996. Is this a sign of the lamentable infantilisation of postmodern society? Not a bit of it, argues Keza MacDonald, the Guardian’s video games editor, in her winsomely enthusiastic biography of Nintendo, the company that had become an eponym for electronic entertainment long before anyone had heard the words “PlayStation” or “Xbox”. Yes, Pokémon is mostly a children’s pursuit, but a sophisticated one: “Like Harry Potter, the Famous Five and Narnia,” she observes, “it offers a powerful fantasy of self-determination, set in a world almost totally free of adult supervision.” And in its complicated scoring system, “it got millions of kids voluntarily doing a kind of algebra”. Meanwhile, a lot of adults participated in the 2016 summer …

Moore’s law: the famous rule of computing has reached the end of the road, so what comes next?

For half a century, computing advanced in a reassuring, predictable way. Transistors – the devices that switch electrical signals on a computer chip – became smaller. Smaller transistors meant more of them fit on each chip, so chips became faster and more capable, and society quietly assimilated the gains almost without noticing. As a result, we saw scientific simulations improving, weather forecasts becoming more accurate, graphics growing more realistic and, later, machine learning systems being developed and flourishing. It looked as if computing power itself obeyed a natural law. This phenomenon became known as Moore’s Law, after the businessman and scientist Gordon Moore, and it summarised the empirical observation that the number of transistors on a chip approximately doubled every couple of years. The same scaling also shrinks devices, so it drives miniaturisation. That sense of certainty and predictability has now gone – not because innovation has stopped, but because the physical assumptions that once underpinned it no longer hold. So what replaces the old model of automatic speed increases? …
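To make the observation concrete, doubling every two years compounds dramatically. The starting count below is a hypothetical example for illustration, not a figure from the article:

```python
# Illustration of Moore's Law as stated above: transistor counts
# approximately double every two years, N(t) = N0 * 2**(t / 2).
# The starting count is a made-up example, not historical data.
n0 = 2_000  # transistors on a hypothetical early chip
for years in range(0, 51, 10):
    n = n0 * 2 ** (years / 2)
    print(f"after {years:2d} years: ~{n:,.0f} transistors")
# After 50 years: 2000 * 2**25 ≈ 67 billion transistors, the right
# order of magnitude for today's largest chips.
```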

How the Industrial Revolution invented modern computing

Before computers existed, people performed massive calculations by hand, where error, repetition, and standardization shaped the outcome. We tracked comets, mapped nations, and solved problems of scale. That legacy of manual calculation shapes how we live today, from our modern algorithms to the predictive models they feed. Dr. David Alan Grier explains the unexpected link between the Industrial Revolution and artificial intelligence.

DAVID ALAN GRIER: I’m David Alan Grier. I am currently a writer and author on issues of technology and industry and things of that sort. In the past, I have been a computer programmer, a professor, a software engineer, president of the IEEE Computer Society. I am the author of the book “When Computers Were Human” and also the book “Crowdsourcing for Dummies,” among others. [typing]

Chapter 1 – Computers and the Industrial Revolution

Why is computing part of the Industrial Revolution? The Industrial Revolution is about systematizing production. And it’s about producing goods …

How to finally get a grasp on quantum computing

IBM’s Quantum System Two on display at a data centre in Germany

Quantum computing seems to pop up in the news pretty often these days. You’ve probably seen quantum chips gracing your feeds and their odd, steampunk-ish cooling systems in the pages of magazines and newspapers. Politicians and business leaders are peppering their announcements with the word “quantum” more frequently, too. If you’re feeling a little confused about it all, it’s a good year for a New Year’s resolution to finally figure out what quantum computing is all about. This is an ambitious goal, and the timing certainly makes sense. The quantum computing industry has seen many scientific achievements this past year, and the field is now worth more than $1 billion – a figure projected to double within the next two years. But wherever there is money and growing interest, there is also bound to be lots of hype. Plenty of questions remain about when or how a quantum computer may be able to outdo conventional computers. Mathematicians and theorists may be able to …