The right hardware setup is just as important as code when mining big data.
The federal government wants to boost Australia’s productivity levels – as a matter of national priority. It’s impossible to have that conversation without also talking about innovation.
We can be proud of (and perhaps a little surprised by) some of the Australian innovations that have changed the world – such as the refrigerator, the electric drill, and more recently, the CPAP machine and the technology underpinning Google Maps.
Australia is continuing to drive advancements in machine learning, cybersecurity and green technologies. Innovation isn’t confined to the headquarters of big tech companies and university laboratories.
Small and medium enterprises – those with fewer than 200 employees – are a powerhouse of economic growth in Australia. Collectively, they contribute 56% of Australia’s gross domestic product (GDP) and employ 67% of the workforce.
Our own Reserve Bank has recognised they also have a huge role to play in driving innovation. However, they still face many barriers to accessing funding and investment, which can hamper their ability to do so.
The Federal Government is focussed on improving productivity. In this five-part series, we’ve asked leading experts what that means for the economy, what’s holding us back and their best ideas for reform.
We all know the saying “it takes money to make money”. Those starting or scaling a business have to invest in the present to generate cash in the future. This could involve buying equipment, renting space, or even investing in needed skills and knowledge.
A small, brand new startup might initially rely on debt (such as personal loans or credit cards) and investments from family and friends (sometimes called “love money”).
Having exhausted these sources, it may still need more funds to grow. Bank loans for businesses are common, quick and easy. But these require regular interest payments, which could slow growth.
Alternatively, a business may look for investors to take ownership stakes.
This investment can take the form of “private equity”, where ownership stakes are sold through private arrangement to investors. These can range from individual “angel investors” through to huge venture capital and private equity firms managing billions in investments.
It can also take the form of “public equity”, where shares are offered and are then able to be bought and sold by anyone on a public stock exchange such as the Australian Securities Exchange (ASX).
Unfortunately, small and medium-sized companies face hurdles to accessing both kinds.
Research examining the gap in small-scale private equity has found 46% of small and medium-sized firms in Australia would welcome an equity investment – even though they said they could obtain debt financing elsewhere.
They preferred private equity because they also wanted to learn from experienced investors who could help them grow their companies. However, very few small and medium-sized enterprises were able to meet private equity’s investment criteria.
When interviewed, many chief executives and chairs of small private equity firms said their lack of interest in small and medium-sized enterprises came down to cost and difficulty of verifying information about the health and prospects of a business.
To make it easier for investors to compare investments, all public companies are required to disclose their financial information using International Financial Reporting Standards.
In contrast, small private companies can use a simplified set of rules and do not have to share their statements of profit and loss with the general public.
Is it possible to list on a stock exchange instead? An initial public offering (IPO) would enable the company to raise funds by selling shares to the public.
Unfortunately, the process of issuing shares on a stock exchange is time-consuming and costly. It requires a team of advisors (accountants, lawyers, and bankers) and filing fees are high.
There are also ongoing costs and obligations associated with being a publicly traded company, including detailed financial reporting.
Last week, the regulator, the Australian Securities and Investments Commission (ASIC), announced new measures to encourage more listings by streamlining the IPO process.
Despite this, many small companies do not meet the listing requirements for the ASX.
These include meeting a profits and assets test and having at least 300 investors (not including family) each holding at least A$2,000 worth of shares.
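For illustration only, here is a minimal sketch of that shareholder-spread arithmetic; the field names are hypothetical, and the profits and assets tests are not modelled:

```python
# Hypothetical sketch of the shareholder-spread requirement described
# above: at least 300 non-family investors, each holding at least
# A$2,000 of shares. Field names are invented for illustration.
def meets_spread_test(holders: list[dict]) -> bool:
    qualifying = [
        h for h in holders
        if not h["is_family"] and h["holding_aud"] >= 2_000
    ]
    return len(qualifying) >= 300

# 299 qualifying outside investors falls one short of the threshold.
holders = [{"is_family": False, "holding_aud": 2_500}] * 299
print(meets_spread_test(holders))  # False
```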
There is one less well-known alternative – the smaller National Stock Exchange of Australia (NSX), which focuses on early-stage companies. Ideally, this should have been a great alternative for small companies, but it has had limited success. The NSX is now set to be acquired by a Canadian market operator.
Our previous research has highlighted that small and medium-sized businesses should try to make themselves more attractive to private equity companies. This could include improving their financial reporting and using a reputable major auditor.
At their end, private equity companies should cast a wider net and invest a little more time in screening and selecting high-quality smaller companies. That could pay off – if it means they avoid missing out on “the next Google Maps”.
There are other opportunities we could explore. Australia’s pool of superannuation funds, for example, has grown so large that funds are running out of places to invest.
That’s led to some radical proposals. Ben Thompson, chief executive of Employment Hero, last year proposed big superannuation funds be forced to invest 1% of their cash into start-ups.
Less radically, regulators could reassess disclosure guidelines for financial providers that may be leading funds to prefer more established investments with proven track records.
There is an ongoing debate about whether the Australian Prudential Regulation Authority (APRA), which regulates banks and superannuation, is too cautious. Some believe APRA’s focus on risk management hurts innovation and may result in super funds avoiding startups (which generally have a higher likelihood of failure).
In response, APRA has pointed out the global financial crisis reminded us to be cautious, to ensure financial stability and protect consumers.
The author would like to acknowledge her former doctoral student, the late Dr Bruce Dwyer, who made significant contributions to research discussed in this article. Bruce passed away in a tragic accident earlier this year.
Colette Southam does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
Data monitoring helps prevent costly errors and keeps big data systems accurate, reliable, and ready for real-time decisions.
In the face of US President Donald Trump’s wavering commitments and Russian President Vladimir Putin’s inscrutable ambitions, the talk in European capitals is all about rearmament.
To that end, the European Commission has put forward an €800 billion spending scheme designed to “quickly and significantly increase expenditures in defence capabilities”, in the words of Commission President Ursula von der Leyen.
But funding is only the first of many challenges involved when pursuing military innovation. Ramping up capabilities “quickly and significantly” will prove difficult for a sector that must keep pace with rapid technological change.
Of course, defence firms don’t have to do it alone: they can select from a wide variety of potential collaborators, ranging from small and medium-sized enterprises (SMEs) to agile start-ups. Innovative partnerships, however, require trust and a willingness to share vital information, qualities that appear incompatible with the need for military secrecy.
That is why rearming Europe requires a new approach to secrecy.
A paper I co-authored with Jonathan Langlois of HEC and Romaric Servajean-Hilst of KEDGE Business School examines the strategies used by one leading defence firm (which we, for our own secrecy-related reasons, renamed “Globaldef”) to balance open innovation with information security. The 43 professionals we interviewed – including R&D managers, start-up CEOs and innovation managers – were not consciously working from a common playbook. However, their nuanced and dynamic approaches could serve as a cohesive role model for Europe’s defence sector as it races to adapt to a changing world.
Our research took place between 2018 and 2020. At the time, defence firms were looking to open innovation to compensate for the withdrawal of key support, amid a marked decrease in government spending on military R&D across OECD countries. Even though the current situation involves more funding, the need for external innovation remains: it speeds up access to knowledge.
When collaborating to innovate, firms face what open innovation scholars have termed “the paradox of openness”, wherein the value to be gained by collaborating must be weighed against the possible costs of information sharing. In the defence sector – unlike, say, in consumer products – being too liberal with information could not only lead to business losses but to grave security risks for entire nations, and even prosecution for the executives involved.
Although secrecy was a constant concern, Globaldef’s managers often found themselves in what one of our interviewees called a “blurred zone” where some material could be interpreted as secret, but sharing it was not strictly off-limits. In cases like these, opting for the standard mode in the defence industry – erring on the side of caution and remaining tight-lipped – would make open innovation impossible.
Studying transcripts of more than 40 interviews along with a rich pool of complementary data (emails, PowerPoint presentations, crowdsourcing activity, etc.), we discerned that players at Globaldef had developed fine-grained practices for maintaining and modulating secrecy, even while actively collaborating with civilian companies.
Our research identifies these practices as either cognitive or relational. Cognitive practices acted as strategic screens, masking the most sensitive aspects of Globaldef’s knowledge without throttling information flow to the point of preventing collaboration.
Depending on the type of project, cognitive practices might consist of one or more of the following:
Encryption: relabelling knowledge components to hide their nature and purpose (a toy sketch of this practice follows the list).
Obfuscation: selectively blurring project specifics to preserve secrecy while recruiting partners.
Simplification: blurring project parameters to test the suitability of a partner without revealing true constraints.
Transposition: transferring the context of a problem from a military to a civilian one.
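As a toy illustration of the relabelling idea, here is a minimal sketch; every component name and codename below is invented, not drawn from the study:

```python
# Toy sketch of the "encryption" practice above: sensitive component
# names are swapped for neutral codenames before a brief leaves the
# firm. All names and mappings here are invented for illustration.
CODENAMES = {
    "radar array": "ASSET-A",
    "guidance unit": "ASSET-B",
}

def relabel(text: str, mapping: dict[str, str]) -> str:
    for real, alias in mapping.items():
        text = text.replace(real, alias)
    return text

brief = "Reduce vibration in the radar array mounting."
print(relabel(brief, CODENAMES))
# -> "Reduce vibration in the ASSET-A mounting."
```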
Relational practices involved reframing the partnership itself, by selectively controlling the width of the aperture through which external parties could view Globaldef’s aims and project characteristics. These practices might include redirecting the focus of a collaboration away from core technologies, or introducing confidentiality agreements to expand information-sharing within the partnership while prohibiting communication to third parties.
Using both cognitive and relational practices enabled Globaldef to skirt the pitfalls of its paradox. For example, in the early stages of open innovation, when the firm was scouting and testing potential partners, managers could widen the aperture (relational) while imposing strict limits on knowledge-sharing (cognitive). They could thereby freely engage with the crowd without violating Globaldef’s internal rules regarding secrecy.
As partnerships ripened and trust grew, Globaldef could gradually lift cognitive protections, giving partners access to more detailed and specific data. This could be counterbalanced by a tightening on the relational side, eg requiring paperwork and protocols designed to plug potential leaks.
As we retraced the firm’s careful steps through six real-life open innovation partnerships, we saw that the key to this approach was in knowing when to transition from one mode to the other. Each project had its own rhythm.
For one crowdsourcing project, the shift from low to high cognitive depth, and high to low relational width, was quite sudden, occurring as soon as the partnership was formalised. This was due to the fact that Globaldef’s partner needed accurate details and project parameters in order to solve the problem in question. Therefore, near-total openness and concomitant confidentiality had to be established at the outset.
In another case, Globaldef retained the cognitive blinders throughout the early phase of a partnership with a start-up. To test the start-up’s technological capacities, the firm presented its partner with a cognitively reframed problem. Only after the partner passed its initial trial was collaboration initiated on a fully transparent footing, driven by the need for the start-up to obtain defence clearance prior to co-developing technology with Globaldef.
Since we completed and published our research, much has changed geopolitically. But the high-stakes paradox of openness is still a pressing issue inside Europe’s defence firms. Managers and executives are no doubt grappling with the evident necessity for open innovation on the one hand and secrecy on the other.
Our research suggests that, like Globaldef, other actors in Europe’s defence sector can deftly navigate this paradox. Doing so, however, will require employing a more subtle, flexible and dynamic definition of secrecy rather than the absolutist, static one that normally prevails in the industry. The defence sector’s conception of secrecy must also progress from a primarily legal to a largely strategic framework.
The authors do not work for, consult for, own shares in, or receive funding from any organisation that would benefit from this article, and have declared no affiliations other than their research organisations.
Data helps marketers reach the right audience and spend ad budgets more wisely on LinkedIn.
Data helps speech-language pathologists make more precise, informed, and effective decisions for their clients.
Building material made from recycled plastic waste. Rene Notenbomer/Shutterstock
The construction industry accounts for approximately 37% of global CO₂ emissions. Traditional materials like cement, steel, and bricks contribute over 70% of its footprint, with cement production making up an especially large share.
To confront this problem, researchers are developing all manner of innovative construction materials and mechanisms, ranging from walls that produce solar energy to self-repairing bacteria-based concrete.
These smart materials, seemingly the stuff of science fiction, are fast becoming a reality, and a raft of European Union (EU) initiatives aim to turn cutting-edge construction materials into real, sustainable, affordable solutions. The private sector is also playing its part – over the past two decades companies such as Dyson Holdings and Monodraught have filed more than 40 patents for advanced materials aimed at enhancing buildings’ thermal performance, durability and environmental impact.
However, any new material has to clear a lot of safety, security and environmental hurdles. This means that getting them out of the lab and into the real world can present a serious challenge.
The development process begins with identifying a technical or environmental issue, such as improving insulation or reducing energy use. A functional prototype is then created and tested under controlled conditions to assess its physical and chemical properties. This includes evaluating compressive strength, water absorption, fire resistance, thermal conductivity and acoustic insulation.
If the prototype shows promise, it then progresses to a pilot production phase, where larger quantities are manufactured to test stability, consistency, and scalability. At the same time, comprehensive technical documentation is prepared.
In the EU, approval is a lengthy and tightly regulated process. Construction materials have to comply with the Construction Products Regulation (EU No 305/2011). This involves obtaining CE (European conformity) marking, submitting a Declaration of Performance (DoP), and adhering to harmonised European standards (EN) established by the European Committee for Standardisation (CEN). These standards ensure products meet criteria related to structural safety, thermal efficiency, moisture resistance and fire behaviour.
Additionally, a Life Cycle Assessment is conducted to evaluate the environmental impact of the material, from the extraction of its component raw materials through to its eventual disposal or recycling. This assessment is crucial for aligning with European policies, and for obtaining green building certifications such as BREEAM and LEED.
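As a rough sketch of the accounting such an assessment involves, consider a cradle-to-grave emissions tally; the stage names and emission factors below are invented placeholders, not values from any standard or real product:

```python
# Hedged sketch of a cradle-to-grave tally: sum the emissions attributed
# to each life-cycle stage of a material. All figures are invented
# placeholders, not drawn from any real assessment.
stages_kg_co2e_per_tonne = {
    "raw material extraction": 120.0,
    "manufacturing": 310.0,
    "transport": 45.0,
    "use phase": 10.0,
    "end of life": -60.0,  # negative: credit for recycling
}

total = sum(stages_kg_co2e_per_tonne.values())
print(f"Embodied emissions: {total:.0f} kg CO2e per tonne")  # 425
```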
Once technical approvals are complete, strategies for production, packaging, distribution and marketing are developed. Performance simulations and digital representations of the material (known as Building Information Modelling or BIM objects) are also created to ensure seamless integration into architectural designs using specialised commercial software.
Read more:
Buildings inspired by worms and grasshoppers: the future of biomimicry in construction
This complex process means that many innovative ideas in construction never reach the market. Developers need to follow strict safety, performance, and environmental rules, which often involve costly testing and certifications. At the same time, many research teams face challenges like limited funding or industry contacts, and they may not fully understand the legal requirements. Without the right support, even the best ideas can stay stuck as prototypes.
To address these challenges, the European Union has launched several initiatives to push innovations from the initial research phase to market adoption:
Horizon Europe funds research and development in sustainability and energy efficiency.
New European Bauhaus promotes inclusive, green, and accessible urban spaces.
The LIFE Programme supports environmental and climate action projects.
The Green Deal and Renovation Wave aim to decarbonise buildings and promote a circular economy.
Read more:
‘Urban form’ and the housing crisis: Can streets and buildings make a neighbourhood more affordable?
Bridging the gap between prototypes and market-ready construction materials requires comprehensive support. Exploit4InnoMat is a European platform offering a Single-Entry Point for entrepreneurs, SMEs and research centres aiming to scale smart, sustainable materials.
The platform provides services encompassing the entire development cycle:
Technical validation and certification: Access to EU-approved pilot line facilities for testing according to European standards, including the aforementioned CE marking and Declaration of Performance (DoP).
Specialised scientific advice: Support in material characterisation, property optimisation, and scaling strategies.
Simulation and digital modelling: Tools that predict how materials behave in terms of heat, strength, and environmental impact within digital models of real buildings. These tools help create models that can be directly inserted into BIM platforms (a minimal thermal sketch follows this list).
Legal and intellectual property support: Assistance with patent registration and regulatory compliance.
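To give a flavour of the thermal side of those simulations, here is a minimal sketch of a standard steady-state U-value (thermal transmittance) calculation for a layered wall; the layer figures are illustrative, not from any certified product:

```python
# Minimal U-value sketch: U = 1 / (R_si + sum(d_i / k_i) + R_se), where
# each layer contributes thermal resistance d/k (thickness over
# conductivity). Surface resistances use common wall values; the layer
# figures are illustrative only.
R_SI, R_SE = 0.13, 0.04  # indoor/outdoor surface resistances, m²K/W

layers = [            # (thickness in m, conductivity in W/mK)
    (0.015, 0.80),    # render
    (0.100, 0.040),   # insulation board
    (0.200, 0.50),    # masonry
]

r_total = R_SI + sum(d / k for d, k in layers) + R_SE
print(f"U = {1.0 / r_total:.2f} W/m²K")  # lower U = better insulation
```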
Through this comprehensive approach, Exploit4InnoMat has already brought several new materials to market. These innovations not only enhance energy efficiency, but also minimise environmental impact and extend the lifespan of buildings. Some prominent examples include:
Ceramic panels with phase change materials (PCM), which store and release heat, maintaining stable indoor temperatures and reducing the need for heating and cooling.
Nanotechnology coatings offering antibacterial and reflective properties. These coatings are ideal for hospitals and schools, particularly in hot climates where hygiene and energy efficiency are paramount.
Recycled cement panels made from industrial waste, which reduce the use of virgin raw materials and lower emissions in production.
Optimised Ceramic Elements, such as bricks and tiles improved with additives, recycled materials, and nanotechnology to boost insulation, porosity, and sustainability.
Schemes like Exploit4InnoMat play a crucial role by integrating all development phases into a single platform. From laboratory testing and environmental validation through to market entry, they assist developers in accelerating their innovations in the knowledge that they stand a solid chance of actually being used in construction.
Materials that previously stalled at the prototype stage now have a much clearer pathway to real-world application. This streamlined process ensures that scientific advancements reach our built environment more rapidly, contributing to the creation of greener, more efficient cities prepared for future challenges.
Andrés Jonathan Guízar Dena participates as a researcher in the Exploit4InnoMat project, funded by the European Union. Within the project, he provides advisory and product characterisation services for digital modelling, including BIM environments and energy simulation.
A lack of strategy and research funding – by both the current and previous governments – has been well documented, most comprehensively in the first report by the Science System Advisory Group (SSAG), released late last year.
If there is one word that sums up the current state of New Zealand’s research sector, it is scarcity. As the report summarises:
We have an underfunded system by any international comparison. This parsimony has led to harmful inter-institutional competition in a manner that is both wastefully expensive in terms of process and scarce researcher time, and is known to inhibit the most intellectually innovative ideas coming forward, and of course it is these that can drive a productive innovation economy.
The government expects research to contribute to economic growth, but policy and action undermine the sector’s capacity to do so.
The latest example is last week’s cancellation of the 2026 grant application round of the NZ$55 million Endeavour Fund “as we transition to the science, innovation and technology system of the future”. Interrupting New Zealand’s largest contestable source of science funding limits opportunities for researchers looking for support for new and emerging ideas.
Changes to the Marsden Fund, set up 30 years ago to support fundamental research, removed all funding for social science and the humanities and shifted focus to applied research. This is despite fundamental research in all fields underpinning innovation and the international ranking of our universities.
New Zealand has an opportunity to change its economy based on the potential of emerging sectors such as artificial intelligence, cleantech and quantum technologies. Other countries, including Australia and the United Kingdom, already consider quantum technologies a priority and fund them accordingly.
But when it comes to strategy, the composition of the boards of new Public Research Organisations, set up as part of the government’s science sector reform, is skewed towards business experience. Where there is scientific expertise, it tends to be in established industries. The governance of the proposed new entity to focus on emerging and advanced technologies is yet to be announced.
Scientists have been calling for a science investment target of 2% of GDP for a long time. It was once – roughly a decade ago – the average expenditure within the OECD; this has since increased to 2.7% of GDP, while New Zealand’s investment remains at 1.5%.
The SSAG report repeatedly refers to the lack of funding, and it would be the obvious thing to see addressed in this year’s budget. But expectations have already been lowered by the government’s insistence there will be no new money.
The report’s second high-level theme is the engagement of government with scientific strategy. Government announcements to date seem focused on attracting international investment through changes to tax settings and regulation. I would argue this is a matter of focusing on the wrapping rather than the present: the system itself needs to be attractive to investors.
Creating a thriving research sector is also a matter of scale. International cooperation is one way for New Zealand to access efficiencies of scale. And work on building international partnerships is one area of positive intent. But we need to look at our connectivity nationally as well, and use investment to build this further.
Countries with greater GDPs than New Zealand’s invest much more in research as a proportion of GDP. It means the size of these other countries’ scientific ecosystems – if measured by total expenditure – is three to four times New Zealand’s on a per capita basis.
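A rough worked example of that per-capita arithmetic, using hypothetical placeholder figures rather than official statistics:

```python
# Illustrative arithmetic only: all figures are hypothetical placeholders.
nz_gdp_pc, nz_rnd_share = 50_000, 0.015      # GDP per capita, 1.5% on R&D
peer_gdp_pc, peer_rnd_share = 85_000, 0.027  # richer peer spending 2.7%

nz_rnd_pc = nz_gdp_pc * nz_rnd_share         # 750 per person
peer_rnd_pc = peer_gdp_pc * peer_rnd_share   # 2295 per person
print(f"Peer spends {peer_rnd_pc / nz_rnd_pc:.1f}x more per person")  # ~3.1x
```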
Per-capita scale matters because it tells us how easy it is for researchers to find someone else with the right skillset or necessary equipment. It tells us how likely it is for a student to find an expert in New Zealand to teach them, rather than needing to go overseas.
And it tells us how quickly start-up companies in emerging technologies will be able to find the skilled employees they need. A thriving university system that attracts young people to develop the research skills needed by advanced technology companies is a key part of this challenge.
The government’s science sector reform aims to increase its contribution to economic growth. But research contributes to economic growth when scientists can really “lean in” with confidence to commercialising and translating their science.
That can’t happen if budgets don’t fund the critical mass, connectivity and resources to stimulate the transition to a thriving science system.
Nicola Gaston receives funding from the Tertiary Education Commission as the Director of the MacDiarmid Institute for Advanced Materials and Nanotechnology. She also receives funding from the Marsden Fund. All research funding goes to the University of Auckland to pay the costs of the research she is employed to do.
On a glorious afternoon recently, I had the good fortune to attend a specially themed Education and Skills Garden Party hosted at Buckingham Palace in London to celebrate the contributions of educators in the United Kingdom and beyond.
As a Canadian citizen living and working in education in the United Kingdom, I was invited to attend by the High Commission of Canada in London.
The occasion provided a relaxing yet exciting opportunity to reflect on my involvement embedding sustainability into education related to innovation and intellectual property (IP) rights law.
King Charles has been a lifelong supporter of sustainability education, a subject only recently added to school curricula. For me, the Royal garden and lake beautifully highlighted concerns with sustainability.
The King’s Royal garden at the Palace is an oasis in the city of London, alive with foliage and wildlife that guests may stroll around and explore. According to the event leaflet: “A survey of the Garden by the London Natural History Society revealed a wealth of flora and fauna, some quite rare species.”
Garden parties are a special way for members of the Royal Family to speak to a broad range of people, all of whom have made a positive impact on their community. Today these events are a way to recognize and reward public service.
A network of sponsors is used to invite guests, including lord-lieutenants, societies and associations, government departments and local government, as well as representatives of various churches and other faiths.
Charles first marked the issue of pollution in 1970 when he was a 21-year-old student. The King continues to champion his lifelong passion regarding the importance of the health of the environment and living sustainably.
Since 2004, I have been an innovation, intellectual property rights and business law educator. My research group contributed to a publication called The Guide to The Sustainable Development Goals (SDGs), developed to explore the connections between the United Nations’ 17 SDGs, sustainable development and IP.
Intellectual property is of concern because we need to envision and build a common future with innovation and creativity. How sustainability challenges are overcome depends on the commercialization of new green technology catalysts.
However, this process is complex. Choosing between solar versus wind, or hydro, geothermal or tidal energy technologies involves making difficult choices. IP rights, such as patents, provide practical scientific information about new green technologies. This information helps society to prioritize public, private and alternative financing to support climate change mitigation and adaptation.
Canadian firms have patented numerous climate change mitigation technologies.
For example, the Toronto-based WhalePower has significantly advanced fluid dynamics and has filed Canadian, European Union, United States, Chinese and Indian patents to protect its new technology. Their award-winning invention, inspired by the bumpy flippers of humpback whales, results in more efficient and reliable wind turbine blades.
Read more:
Here’s why UK tides are soon going to play a much bigger part in powering your home
This “tubercle” technology, named for a rounded point of a bone, also has applications for hydroelectric turbines and for revolutionizing fan design. These blades, featuring tubercles (bumps) on the leading edge, reduce aerodynamic drag and improve performance. WhalePower also generates revenue by licensing its patented technology to other companies to use in wind turbines.
Patents encourage knowledge sharing, because the way the invention works must be disclosed, rather than kept secret.
For example, new tidal energy inventors can read WhalePower’s patents and be inspired to further advance the technology with additional incremental innovations.
A granted patent is published for free online and digitally tagged using globally recognized classification codes to facilitate easy searching by scientists, investors and financiers. The data collected on the patent register is also used to design new climate innovation research studies and inform policy-making.
In this manner, IP often stimulates investment by providing the legal rights needed to justify longer-term investment in a changing landscape of innovation.
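To make the classification-code searching concrete, here is an illustrative sketch; the records are invented, and CPC class Y02E (which groups climate-mitigation energy technologies) is used only as an example prefix:

```python
# Illustrative sketch: filter a local set of patent records by
# classification-code prefix. The records are invented; CPC class Y02E
# groups climate-change-mitigation technologies related to energy.
records = [
    {"title": "Turbine blade with leading-edge tubercles", "cpc": ["Y02E 10/70"]},
    {"title": "Solar cell encapsulation method", "cpc": ["Y02E 10/50"]},
    {"title": "Database sharding scheme", "cpc": ["G06F 16/27"]},
]

def by_cpc_prefix(recs, prefix):
    return [r for r in recs if any(c.startswith(prefix) for c in r["cpc"])]

for r in by_cpc_prefix(records, "Y02E 10"):
    print(r["title"])  # prints the two energy-related titles
```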
Long-term investment into green technology is a form of environmental stewardship that I discuss in more detail in my article “Companies and UN 2030 Sustainable Development Goal 9 Industry, Innovation and Infrastructure.” IP rights support firms like WhalePower by enabling knowledge-sharing tools that can bring the sustainable development goals closer to fruition.
The significant role of IP rights in promoting sustainability gained a higher profile when the United Kingdom’s Chartered Institute of Patent Attorneys (CIPA) became an Official Nominator for the annual Earthshot Prize launched by Prince William’s Royal Foundation in 2020.
CIPA helps to identify and nominate solutions for the environmental challenges that the prize aims to address. One nominated solution, Colorifix, which uses DNA sequencing and nature’s own colours to create sustainable dyes that reduce the use of water and harmful chemicals in the fashion industry, was a runner-up in the 2023 edition.
Read more:
Can marketing classes teach sustainability? 4 key insights
CIPA provides crucial IP rights checks to finalists, ensuring that their innovations have no outstanding IP issues. This partnership is an example of how the Royal Family works together with CIPA to use the power of IP to help solve sustainability challenges.
As the King stated when he was Prince of Wales in 2017: “Mine is not a new commitment, but perhaps you will allow me to restate my determination to join you in continuing to do whatever I can, for as long as I can, to maintain not only the health and vitality of the ocean and all that depends upon it, but also the viability of that greatest and most unique of living organisms — nature herself.”
Janice Denoncourt is affiliated with the British Association for Canadian Studies (BACS).
Oak Ridge National Laboratory's Frontier supercomputer is one of the world's fastest. Oak Ridge Leadership Computing Facility, CC BY
High-performance computing, or HPC for short, might sound like something only scientists use in secret labs, but it’s actually one of the most important technologies in the world today. From predicting the weather to finding new medicines and even training artificial intelligence, high-performance computing systems help solve problems that are too hard or too big for regular computers.
This technology has helped make huge discoveries in science and engineering over the past 40 years. But now, high-performance computing is at a turning point, and the choices the government, researchers and the technology industry make today could affect the future of innovation, national security and global leadership.
High-performance computing systems are basically superpowerful computers made up of thousands or even millions of processors working together at the same time. They also use advanced memory and storage systems to move and save huge amounts of data quickly.
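The core idea, splitting one large job across many workers and combining the partial results, can be sketched even on a laptop; real systems coordinate thousands of nodes with tools such as MPI, while this toy version just uses local processes:

```python
from multiprocessing import Pool

# Toy version of the parallel idea above: four workers each sum the
# squares in one slice of a big range, and the slices are combined.
def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    chunks = [(i * 250_000, (i + 1) * 250_000) for i in range(4)]
    with Pool(processes=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # matches the single-process answer, computed in parallel
```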
With all this power, high-performance computing systems can run extremely detailed simulations and calculations. For example, they can simulate how a new drug interacts with the human body, or how a hurricane might move across the ocean. They’re also used in fields such as automotive design, energy production and space exploration.
Lately, high-performance computing has become even more important because of artificial intelligence. AI models, especially the ones used for things such as voice recognition and self-driving cars, require enormous amounts of computing power to train. High-performance computing systems are well suited for this job. As a result, AI and high-performance computing are now working closely together, pushing each other forward.
I’m a computer scientist with a long career working in high-performance computing. I’ve observed that high-performance computing systems are under more pressure than ever, with higher demands on the systems for speed, data and energy. At the same time, I see that high-performance computing faces some serious technical problems.
One big challenge for high-performance computing is the gap between how fast processors are and how well memory systems can keep up with the processors’ output. Imagine having a superfast car but being stuck in traffic – it doesn’t help to have speed if the road can’t handle it. In the same way, high-performance computing processors often have to wait around because memory systems can’t send data quickly enough. This makes the whole system less efficient.
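A back-of-the-envelope way to see this imbalance is to compare a kernel’s arithmetic intensity (operations per byte moved) against the machine’s limits; the peak numbers below are hypothetical, chosen only to illustrate the gap:

```python
# Roofline-style estimate with hypothetical machine numbers: a kernel is
# memory-bound when bandwidth, not peak compute, caps its performance.
PEAK_COMPUTE = 50e12    # 50 TFLOP/s
PEAK_BANDWIDTH = 2e12   # 2 TB/s

# y[i] = a*x[i] + y[i] does 2 flops per 24 bytes moved in double
# precision (read x, read y, write y; 8 bytes each).
intensity = 2 / 24  # flops per byte

attainable = min(PEAK_COMPUTE, intensity * PEAK_BANDWIDTH)
print(f"{attainable / 1e12:.2f} of {PEAK_COMPUTE / 1e12:.0f} TFLOP/s")
# -> 0.17 of 50 TFLOP/s: the processor idles while waiting on memory
```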
Another problem is energy use. Today’s supercomputers use a huge amount of electricity, sometimes as much as a small town. That’s expensive and not very good for the environment. In the past, as computer parts got smaller, they also used less power. But that trend, called Dennard scaling, stopped in the mid-2000s. Now, making computers more powerful usually means they use more energy too. To fix this, researchers are looking for new ways to design both the hardware and the software of high-performance computing systems.
There’s also a problem with the kinds of chips being made. The chip industry is mainly focused on AI, which works fine with lower-precision math like 16-bit or 8-bit numbers. But many scientific applications still need 64-bit precision to be accurate. The greater the bit count, the more digits to the right of the decimal point a chip can process, hence the greater precision. If chip companies stop making the parts that scientists need, then it could become harder to do important research.
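A quick sketch of why precision matters: accumulating a small value in 16-bit floating point silently stalls once the running sum dwarfs each increment, while 64-bit arithmetic stays accurate. This is a standard IEEE rounding effect, shown here with NumPy:

```python
import numpy as np

# Summing 1e-4 ten thousand times. In half precision the running sum
# stalls: once it reaches 0.25, each 1e-4 increment is less than half a
# unit in the last place and rounds away to nothing.
x16, acc16 = np.float16(1e-4), np.float16(0.0)
for _ in range(10_000):
    acc16 = np.float16(acc16 + x16)

acc64 = np.float64(1e-4) * 10_000
print(acc16, acc64)  # roughly 0.25 vs 1.0: half precision lost the sum
```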
Recent reports discuss how trends in semiconductor manufacturing and commercial priorities may diverge from the needs of the scientific computing community, and how a lack of tailored hardware could hinder progress in research.
One solution might be to build custom chips for high-performance computing, but that’s expensive and complicated. Still, researchers are exploring new designs, including chiplets – small chips that can be combined like Lego bricks – to make high-precision processors more affordable.
Globally, many countries are investing heavily in high-performance computing. Europe has the EuroHPC program, which is building supercomputers in places such as Finland and Italy. Their goal is to reduce dependence on foreign technology and take the lead in areas such as climate modeling and personalized medicine. Japan built the Fugaku supercomputer, which supports both academic research and industrial work. China has also made major advances, using homegrown technology to build some of the world’s fastest computers. All of these countries’ governments understand that high-performance computing is key to their national security, economic strength and scientific leadership.
The United States, which has been a leader in high-performance computing for decades, recently completed the Department of Energy’s Exascale Computing Project. This project created computers that can perform a billion billion operations per second. That’s an incredible achievement. But even with that success, the U.S. still doesn’t have a clear, long-term plan for what comes next. Other countries are moving quickly, and without a national strategy, the U.S. risks falling behind.
I believe that a U.S. national strategy should include funding new machines and training for people to use them. It would also include partnerships with universities, national labs and private companies. Most importantly, the plan would focus not just on hardware but also on the software and algorithms that make high-performance computing useful.
One exciting area for the future is quantum computing. This is a completely new way of doing computation based on the laws of physics at the atomic level. Quantum computers could someday solve problems that are impossible for regular computers. But they are still in the early stages and are likely to complement rather than replace traditional high-performance computing systems. That’s why it’s important to keep investing in both kinds of computing.
The good news is that some steps have already been taken. The CHIPS and Science Act, passed in 2022, provides funding to expand chip manufacturing in the U.S. It also created an office to help turn scientific research into real-world products. The task force Vision for American Science and Technology, launched on Feb. 25, 2025, and led by American Association for the Advancement of Science CEO Sudip Parikh, aims to marshal nonprofits, academia and industry to help guide the government’s decisions. Private companies are also spending billions of dollars on data centers and AI infrastructure.
All of these are positive signs, but they don’t fully solve the problem of how to support high-performance computing in the long run. That will take more than short-term funding and infrastructure investments.
High-performance computing is more than just fast computers. It’s the foundation of scientific discovery, economic growth and national security. With other countries pushing forward, the U.S. is under pressure to come up with a clear, coordinated plan. That means investing in new hardware, developing smarter software, training a skilled workforce and building partnerships between government, industry and academia. If the U.S. does that, the country can make sure high-performance computing continues to power innovation for decades to come.
Jack Dongarra receives funding from the NSF and the DOE.