Feed aggregator

How Should the Linux Kernel Handle AI-Generated Contributions?

Linux.Slashdot.org - Mon, 11/17/2025 - 07:34
Linux kernel maintainers "are grappling with how to integrate AI-generated contributions without compromising the project's integrity," reports WebProNews: The latest push comes from a proposal by Sasha Levin, a prominent kernel developer at NVIDIA, who has outlined guidelines for tool-generated submissions. Posted to the kernel mailing list, these guidelines aim to standardize how AI-assisted patches are handled. According to Phoronix, the v3 iteration of the proposal [posted by Intel engineer Dave Hansen] emphasizes transparency and accountability, requiring developers to disclose AI involvement in their contributions. This move reflects broader industry concerns about the quality and copyright implications of machine-generated code. Linus Torvalds, the creator of Linux, has weighed in on the debate, advocating for treating AI tools no differently than traditional coding aids. As reported by heise online, Torvalds sees no need for special copyright treatment for AI contributions, stating that they should be viewed as extensions of the developer's work. This perspective aligns with the kernel's pragmatic approach to innovation. The proposal, initially put forward by Levin in July 2025, includes a 'Co-developed-by' tag for AI-assisted patches, ensuring credit and traceability. OSTechNix details how tools like GitHub Copilot and Claude are specifically addressed, with configurations to guide their use in kernel development... ZDNET warns that without official policy, AI could 'creep' into the kernel and cause chaos... The New Stack provides insight into how AI is already assisting kernel maintainers with mundane tasks. According to The New Stack, large language models (LLMs) are being used like 'novice interns' for drudgery work, freeing up experienced developers for complex problems... The Linux kernel's approach could set precedents for other open-source projects. With AI integration accelerating, projects like those in the Linux Foundation are watching closely... Recent kernel releases, such as 6.17.7, include performance improvements that indirectly support AI applications, as noted in Linux Compatible.

Read more of this story at Slashdot.

Categories: Linux

Bitcoin Erases Year's Gain as Crypto Bear Market Deepens

Slashdot.org - Mon, 11/17/2025 - 03:35
655"Just a little more than a month after reaching an all-time high, Bitcoin has erased the more than 30% gain registered since the start of the year..." reports Bloomberg: The dominant cryptocurrency fell below US$93,714 on Sunday, pushing the price beneath the closing level reached at the end of last year, when financial markets were rallying following President Donald Trump's election victory. Bitcoin soared to a record US$126,251 on Oct 6, only to begin tumbling four days later after unexpected comments on tariffs by Trump sent markets into a tailspin worldwide. "The general market is risk-off," said Matthew Hougan, the San Francisco-based chief investment officer for Bitwise Asset Management. "Crypto was the canary in the coal mine for that, it was the first to flinch." Over the past month, many of the biggest buyers — from exchange-traded fund allocators to corporate treasuries — have quietly stepped back, depriving the market of the flow-driven support that helped propel the token to records earlier this year. For much of the year, institutions were the backbone of Bitcoin's legitimacy and its price. ETFs as a cohort took in more than US$25 billion, according to Bloomberg data, pushing assets as high as roughly US$169 billion. Their steady allocation flows helped reframe the asset as a portfolio diversifier — a hedge against inflation, monetary debasement and political disarray. But that narrative — always tenuous — is fraying afresh, leaving the market exposed to something quieter but no less destabilising: disengagement. "The selloff is a confluence of profit-taking by LTHs, institutional outflows, macro uncertainty, and leveraged longs getting wiped out," said Jake Kennis, senior research analyst at Nansen. "What is clear is that the market has temporarily chosen a downward direction after a long period of consolidation/ranging..." Boom and bust cycles have been a constant since Bitcoin burst into the mainstream consciousness with a more than 13,000% surge in 2017, only to be followed by a plunge of almost 75% the following year... Bitcoin has whipsawed investors through the year, dropping to as low as US$74,400 in April as Trump unveiled his tariffs, before rebounding to record highs ahead of the latest retreat... The market downturn has been even tougher on smaller, less liquid tokens that traders often gravitate toward because of their higher volatility and typical outperformance during rallies. A MarketVector index tracking the bottom half of the largest 100 digital assets is down around 60% this year.

Read more of this story at Slashdot.

Google is committing $2.25 million to support AI-ready data in Africa.

GoogleBlog - Mon, 11/17/2025 - 03:30
New funding from Google will help launch a regional Data Commons for Africa.
Categories: Technology

More Tech Moguls Want to Build Data Centers in Outer Space

Slashdot.org - Mon, 11/17/2025 - 00:50
"To be clear, the current economics of space-based data centers don't make sense," writes the Wall Street Journal. "But they could in the future, perhaps as soon as a decade or so from now, according to an analysis by Phil Metzger, a research professor at the University of Central Florida and formerly of the National Aeronautics and Space Administration." "Space enthusiasts (comme moi) have long sought a business case to enable human migration beyond our home world," he posted on X amid the new hype. "I think AI servers in space is the first real business case that will lead to many more...." The argument essentially boils down to the belief that AI's needs are eventually going to grow so great that we need to move to outer space. There the sun's power can be more efficiently harvested. In space, the sun's rays can be direct and constant for solar panels to collect — no clouds, no rainstorms, no nighttime. Demands for cooling could also be cut because of the vacuum of space. Plus, there aren't those pesky regulations that executives like to complain about, slowing construction of new power plants to meet the data-center needs. In space, no one can hear the Nimbys scream. "We will be able to beat the cost of terrestrial data centers in space in the next couple of decades," Bezos said at a tech conference last month. "Space will end up being one of the places that keeps making Earth better." It's still early days. At Alphabet, Google's plans sound almost conservative. The search-engine company in recent days announced Project Suncatcher, which it describes as a moonshot project to scale machine learning in space. It plans to launch two prototype satellites by early 2027 to test its hardware in orbit. "Like any moonshot, it's going to require us to solve a lot of complex engineering challenges," Pichai posted on social media. Nvidia, too, has announced a partnership with startup Starcloud to work on space-based data centers. Not to be outdone, Elon Musk has been painting his own updated vision for the heavens... in recent weeks he has been talking more about how he can use his spaceships to deploy new versions of his solar-powered Starlink satellites equipped with high-speed lasers to build out in-space data centers. On Friday, Musk further reiterated how those AI satellites would be able to generate 100 gigawatts of annual solar power — or, what he said, would be roughly a quarter of what the U.S. consumes on average in a year. "We have a plan mapped out to do it," he told investor Ron Baron during an event. "It gets crazy." Previously, he has suggested he was four to five years away from that ability. He's also touted even wilder ideas, saying on X that 100 terawatts a year "is possible from a lunar base producing solar-powered AI satellites locally and accelerating them to escape velocity with a mass driver." Simply put, he's suggesting a moon base will crank out satellites and throw them into orbit with a catapult. And those satellites' solar panels would generate 100,000 gigawatts a year. "I think we'll see intelligence continue to scale all the way up to where...most of the power of the sun is harnessed for compute," Musk told a tech conference in September.

Read more of this story at Slashdot.
