noname120 a day ago
  • samsartor a day ago

    My advisor also worked on this ML project for estimating electron density and temperature within tokamaks: https://www.cs.wm.edu/~ppeers/showPublication.php?id=Ozturk:...

    Technically that counts as "AI in nuclear fusion", but it isn't any sort of breakthrough. In almost every case the effects of AI are marginal. Not zero exactly, but nowhere near the breathless hype.

    • vlovich123 a day ago

      Depends on what you count as ML and what counts as AI; the distinctions are mostly meaningless. It's just that ML techniques have lost their magic and are now part of the toolkit of how you do things. But I wouldn't say the effects are zero, because you need a lot of wins all over the place, compounded over time, to get the true win. It's a hard multidisciplinary project with unknown physics on top of engineering. The theoretical side is largely known physics at this point, but the practical physics of how to scale it up and make it work is not solved, and that's real R&D that has to be done. There are glimmers of progress here and there, achievements we never had before, but it all still feels far away because it's not coalescing as fast as the advertised wins might suggest.

  • dekhn a day ago

    When I used to work in grid computing almost 20 years ago, we were already running fusion experiments, realtime streaming the data to the grid, which would rapidly analyze it and compute some new parameters for the next run (I think they had a 20 minute downtime). I don't think it was considered machine learning at the time, though (and was certainly not deep learning as we practice it today).

    • asdfman123 a day ago

      Remember, AI is just procedurally generated data analysis.

      • sroussey a day ago

        Computers are just electron tunneling machines.

      • madaxe_again 18 hours ago

        You’re just procedurally generated data analysis.

  • rajnathani 17 hours ago

    AI is not just a buzzword: try beating Go without machine learning. Ignore the enterprise-speak and you'll see a lot of really cool things that are possible almost only by means of neural nets.

    • PittleyDunkin 10 hours ago

      "Neural nets" is a useful term, though: it refers to a class of algorithms. "AI" doesn't appear to have any use beyond marketing to people who don't know what they want.

    • idiotsecant 15 hours ago

      Almost every technology buzzword is based on something interesting and useful that then gets used in as many unsuitable applications as possible.

      • moralestapia 11 hours ago

        I really like the idea of blockchains and think they're a pretty clean and clever solution for many problems outside of crypto.

  • sva_ a day ago

    Very interesting if you consider life as a complex chemical reaction that tries to self-sustain.

  • HPsquared a day ago

    Neural networks have been used in industrial process control for many years. This is just another industrial control problem, perhaps a difficult one.

    • spenczar5 a day ago

      That’s really interesting! Do you have other interesting specific examples? I would have guessed that most industrial control problems were simpler sets of differential equations that could be directly estimated.

      • wruza a day ago

        I’ve heard about a local packaging factory recently that uses an ML-first system for messaging their clients about what they will need to order soon, based on recent orders and global criteria. It’s not a simple problem, apparently. They signal the clients and start pre-producing, basically sort of algotrading themselves.

        Not really an industrial control though, but close to it.

        • nick3443 a day ago

          PID control your customers :)
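          For anyone curious, the classic industrial loop being joked about here is a PID controller, and a lot of process control really is this simple at heart. A toy sketch (the gains and the first-order "plant" are made up for illustration, not tuned for any real process):

```python
# Minimal PID controller driving a first-order plant toward a setpoint.
# Gains and plant constants are illustrative only.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt          # accumulate error (I term)
        derivative = (error - self.prev_error) / self.dt  # error slope (D term)
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Simulate a first-order plant: dy/dt = (u - y) / tau
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
y, tau = 0.0, 1.0
for _ in range(5000):  # 50 simulated seconds
    u = pid.update(setpoint=1.0, measurement=y)
    y += (u - y) / tau * 0.01

print(round(y, 3))  # settles near the setpoint of 1.0
```

          The integral term is what drives the steady-state error to zero; a purely proportional loop would settle short of the setpoint.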

      • ben_w a day ago

        Back in ~2004, when I was looking for a job for the industrial placement year of my degree, one of the options was a nut packing factory using computer vision systems. I was intrigued, but went for the job processing satellite images instead.

        Even well before that, ML was very closely related to statistics, so early practical applications would have been as simple as gathering data points on widget production and doing the kinds of analyses that are now baked into free spreadsheet software.

        • varjag a day ago

          From what I remember reading at the time, most of industrial vision applications 20 years ago had very little to do with neural networks. Or even with ML in general, relying on bespoke feature detectors.

          • lazide 9 hours ago

            CNNs are the gold standard now, and those are neural networks. Not 'AI' though.

  • bbor a day ago

    Fascinating! Reminds me of the new generation of AR headsets (e.g. Orion) that are making the impossible possible simply by adding an ANN(-derived) layer above some of their device controllers. I wonder how many problems will fundamentally change in the face of mature brute-forcing techniques…

  • jacoblambda a day ago

    Yeah, machine learning is more or less just a very complex application of control theory techniques, and notably it is usually done by people without formal control theory backgrounds.

    Super useful for control applications but obviously you really want to know control theory so that you aren't just using ML to throw darts at a wall.

    • SoftTalker a day ago

      > using ML to throw darts at a wall

      A bit off topic, but I had a laugh at that; it reminded me of that ridiculous Meta commercial where the girl at a pool table asks Meta which ball she should hit.

    • rcxdude a day ago

      More like optimization. Signal processing and control theory are basically the same maths with different applications, optimization is a bit different but has some overlap (especially with, e.g. optimal control techniques).

      • jacoblambda 10 hours ago

        Eh to a certain degree. The architecture of the models is very much control theory and then the training is control system tuning (which of course is an optimisation problem like you said).

        I would definitely agree that optimisation fits the definition in part but I find really only control theory covers that entire field of signal processing, optimisation, and decision making systems.

        And importantly, because ML in some amount touches on all of those, control theory tends to fit better as it focuses so heavily on providing a comprehensive framework for reasoning about all of those elements together.
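        To make the "training is tuning, i.e. an optimisation problem" point concrete, here is the whole idea in miniature: fitting a one-parameter model by gradient descent on squared error (a toy sketch, not any real training pipeline):

```python
# Fit y = w * x to data generated with w_true = 3.0, by gradient descent
# on mean squared error -- the same loop used to tune parameters in ML.

xs = [0.5, 1.0, 1.5, 2.0]
ys = [3.0 * x for x in xs]  # noiseless data from w_true = 3.0

w, lr = 0.0, 0.05
for _ in range(200):
    # d/dw of mean squared error: 2 * mean(x * (w*x - y))
    grad = sum(2 * x * (w * x - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step against the gradient

print(round(w, 3))  # converges to ~3.0
```

        Swap "model parameter" for "controller gain" and the same loop is automatic controller tuning, which is why the two fields blur together.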

  • madaxe_again 18 hours ago

    Yeah, this is precisely the kind of stuff I was talking about 48 hours ago, when everyone was telling me that ML will never find any kind of practical application.

    There are so many fundamental fields - engineering, chemistry, biology, physics - which stand to have absolute quantum leaps in knowledge and capability with this technology.

chriskanan a day ago

There is a lot of AI research in the nuclear fusion space. For inertial confinement fusion (a competing technology to magnetic confinement fusion, e.g., tokamaks) the National Ignition Facility (NIF) used it for their experiment that resulted in "ignition."

My lab is collaborating with researchers at the Laboratory for Laser Energetics to use AI to improve inertial confinement fusion (ICF). We recently put out this paper [1] using Kolmogorov-Arnold Networks (KANs) to predict the outcome of ICF experiments. Currently, existing physics simulators are based on old Fortran code, are slow, and have a high error between their predictions and actual laser shots, so among other goals, we are trying to build better predictors using neural networks. This is needed because it is hard to rapidly iterate on real data: they only have a dataset of around 300 ICF shots.

[1] https://arxiv.org/abs/2409.08832
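To illustrate the surrogate-model idea in miniature (this has nothing to do with the actual KAN architecture or the ICF data; the "expensive simulator" here is just a stand-in function): fit a cheap model to a handful of runs of a slow code, then query the cheap model instead.

```python
# Toy "learned surrogate": replace an expensive simulator with a cheap
# fitted model. The simulator below is a placeholder; real physics codes
# (and the KAN paper's setup) are far more complex.

import math, random

def expensive_simulator(x):
    # stand-in for a slow physics code
    return math.sin(3 * x) + 0.5 * x

random.seed(0)
train_x = [random.uniform(0, 2) for _ in range(200)]   # 200 "shots"
train_y = [expensive_simulator(x) for x in train_x]

def surrogate(x, k=5):
    # k-nearest-neighbour regression: average the k closest training outputs
    nearest = sorted(range(len(train_x)), key=lambda i: abs(train_x[i] - x))[:k]
    return sum(train_y[i] for i in nearest) / k

# surrogate tracks the simulator closely on unseen inputs
err = max(abs(surrogate(x) - expensive_simulator(x)) for x in [0.3, 0.9, 1.5])
print(err < 0.2)
```

With only ~300 real shots available, the appeal of a fast, differentiable surrogate is that you can iterate on it thousands of times per real experiment.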

  • noobermin a day ago

    Here we go with the CS people saying "the old Fortran codes are terrible." Yeah, the "high error" between code predictions and laser shots is because LPI is inherently noisy and it's essentially impossible to fully control conditions. I would work on expensive sims for days to weeks, and the experiments would see differences off from them by an order of magnitude because the focal point was off by a micron. There's nothing wrong with the "old Fortran codes"; they have the right physics. The problem is that the initial conditions are just uncontrollable, and that's why simulating these systems is hard.

    Codes are not magic: they are physics codes, as in, they generally encode the physics as we understand it to be relevant to the experiment. So you might as well say our physical models are wrong, which is a much harder bar to clear; you'd have to invalidate probably nearly 100 years of plasma physics. The problem, as I said, is likely that the experiments are just hard to control and we don't know the correct inputs. It's not like weather forecasting, where we have weather balloons across the world; we're not able to probe every micron of the target at all times for plasma temperature and density.

    • phreeza 17 hours ago

      I understood the comment as saying the Fortran code maybe uses some sort of inefficient numerical scheme and/or some inaccurate approximations? It doesn't seem completely outlandish that more modern methods could help there. Of course Fortran is not a problem in itself; you are right.

      • gauge_field 16 hours ago

        Beyond the capabilities of the language itself, performance usually depends more on the ecosystem than on the language (unless you want to write the entire algorithm yourself for some reason). This is especially true when the algorithms used in the project are well studied and efficient implementations already exist. That is probably why using a modern/popular tool or language is more likely to perform better.

        One of the most prominent examples of this is GEMM. The state-of-the-art GEMM code bases, in terms of implementations from academic papers and GitHub, are written in C/C++; see e.g. OpenBLAS, BLIS, BLASFEO. The same applies to CUDA code or accelerator-agnostic code, e.g. using MLIR. I think it's a result of how the language allowed people to create an ecosystem, plus the ecosystem created by the people using the language. Sure, you can write Fortran, but if I see more tooling and more people benchmarking, I can be more confident that the software is well tested and high performance. For instance, if you look at benchmarks-game, the top results are C/C++/Rust. Instead of making claims like "this code is right/wrong", we should look at concrete, quantitative results like benchmarks and number of users. As another example, you can check the BLASFEO paper, where they used C code.

        I would welcome Fortran benchmarking results, but I just don't see it being tested enough (in open source, papers, and benchmarks) to prefer it over C/C++/Rust.

        There are other consequences of this network effect: availability of docs, and finding a question and answer for a problem you've hit.

  • nightowl_games a day ago

    Isn't there a lot of daylight between old Fortran code and AI?

    What if we rewrote the old algorithms in C with modern techniques? Multithreading? Or GPU compute? If there's value there, I could do these things. It probably wouldn't take that long.

    • rnhmjoj 11 hours ago

      Rewriting the old Fortran code in C will probably make it slower with new bugs. A smarter thing to do when picking up terrible code written by physicists is to document everything you can, write tests and then start refactoring bit by bit using modern Fortran features (yes, the latest standard is 2023).

      Fortran compilers had more than 40 years to become pretty good at generating efficient code; they can make assumptions that are not possible in C (for example, no aliasing) to do so. Besides, most compilers already can do vectorization and autoparallelisation with multithreading, coarrays, and/or openMP, which can be offloaded to a GPU.

      • nightowl_games 6 hours ago

        > Rewriting the old Fortran code in C will probably make it slower with new bugs.

        Then it's not done yet.

    • staunton 12 hours ago

      The issue isn't that the Fortran code is too slow. The issue is that the problem is super complicated, hard to measure, and very hard to control in ways that no one really understands. However, you can just plug the measurement outputs and system inputs into some controller. Machine learning helps control by jointly modeling imperfections in the physical model, imperfections in the hardware that controls things, and imperfections in the measurements. That's something you really don't want to even attempt writing by hand (in whatever programming language).

  • gauge_field a day ago

    I got curious when you said "old Fortran code, are slow, and have a high error between their predictions". Do you have any online references/docs that explain the APIs/software/source code related to projects in this area?

ziofill 20 hours ago

About 20 years ago I was an undergrad at the University of Padova (Italy), and on the outskirts of the city (in Legnaro) there was a fusion experiment. The fusion device was in one building and the control room was in an adjacent building. Back then we were using CRT monitors, and each time there was a fusion event the magnetic confinement field was so strong that the image on all the screens in the control room would simultaneously shift to one side and then spring back when the field was turned off. Across buildings.

  • HPsquared 14 hours ago

    I wonder if that would affect hard drives too. Or microphones.

carbocation a day ago

Is this just an announcement of a grant renewal?

aurelien 12 hours ago

The porcini mushroom is wrong, because when you cut part of this mushroom the yellow turns blue, making it one of the rare blue things in nature.

twothreeone a day ago

> However, uncovering these inter-correlations analytically is too complex to be achieved analytically.

sigh.. I guess that's cool and all; you're excited about your work, that's great. But can we please polish our prose a little more and stop using buzzwords like "groundbreaking", "now.. for the first time", "unprecedented", etc.? Such distractions seriously undermine the legibility of the claims (and frankly taint their credibility by negatively biasing readers).

  • mathematicaster a day ago

    Likely due to using an LLM as part of the writing process.

    • twothreeone a day ago

      Maybe, though in my experience beginner PhD students will often write like this in early manuscript drafts after they've been working on some project for a few months. Usually it's easily fixed by having someone more experienced proof-read it with comments explaining why that is bad style. Worst case, they'll learn the hard way (i.e., conference reviewers). But seeing this without any proof-reading on some public physics lab site just makes me cringe that they have unchecked write access to that site.

atomic128 a day ago

Neutrons make hardware radioactive.

Many on Hacker News fantasize about fusion (not fission) reactors. These fusion reactors will be an intense source of fast neutrons. All the hardware in a fusion reactor will become radioactive. Not to mention the gamma rays.

If you have to deal with radioactive materials, why not just use fission? After 70 years of working with fission reactors, we know how to build and operate them at 95%+ efficiency. Fission can provide all the power we need.

Today there are 440 nuclear fission reactors operating in 32 countries. 20% of America's grid power comes from nuclear fission. If you want to develop energy technology, focus on improving fission. For example, TRISO fuel (https://news.ycombinator.com/item?id=41898377) or what Lightbridge is doing (https://www.ltbridge.com/lightbridge-fuel). Hacker News is hostile to fission and defeatist (unable to contemplate innovation in fission technology) but this attitude will gradually change.

Quoting John Carmack: "Deuterium fusion would give us a cheap and basically unlimited fuel source with a modest waste stream, but it is an almost comically complex and expensive way to generate heat compared to fission, which is basically 'put these rocks next to each other and they get hot'."

  • jylam a day ago

    I'm not a specialist but here is what I think I know (I'm talking with the point of view of a Frenchman, who consumes most of his electricity from (fission) nuclear power plants):

    1/ Uranium is not renewable (quite the opposite); it needs to be mined and treated (which is expensive and very polluting), and it is not present at the required concentrations in most of the world (this creates geopolitical issues).

    2/ Fission nuclear plants require a well functioning [state|government], and no war. A (conventional) strike on a nuclear power plant can have devastating and lasting consequences. Even a random terrorist group can do that.

    3/ I've read that "Ultimately, researchers hope to adopt the protium–boron-11 reaction, because it does not directly produce neutrons, although side reactions can" (that's a wikipedia quote, but I've read that already from other sources).

    So fusion doesn't seem the best option in the short term, because of the complexity and cost of the research, but it definitely seems to be the very best option in the medium and long term. And we already made the catastrophic short-term choice with coal and oil; it would be good to learn from that.

    Or maybe I'm totally wrong.

    • adrian_b a day ago

      Deuterium is also not renewable, even if it is more abundant than uranium.

      The H1-B11 reaction would be a much better energy source than anything else, but for now nobody knows any method to do it. There is no chance to do it by heating, but only by accelerating ions, and it is not known how a high enough reaction rate could be obtained.

      • rnhmjoj a day ago

        I'm curious, what are you considering for stating that deuterium is not renewable? AFAIK there's an essentially limitless supply in the form of HDO in the oceans[1] and there are cost effective methods[2] to isolate it.

        [1]: https://en.wikipedia.org/wiki/Semiheavy_water

        [2]: https://en.wikipedia.org/wiki/Girdler_sulfide_process

        • adrian_b 16 hours ago

          If you are able to say that there is a limitless amount of deuterium in the oceans, then you can say the same about the amount of uranium in the oceans, even if the amount of dissolved uranium is about one thousand times less.

          Both the amounts of deuterium and of uranium in the solar system are finite and smaller than of the abundant elements. Moreover, the natural processes that create deuterium and uranium within a normal stellar system are slower than those that destroy them, so there is no chance of their quantities ever increasing.

          Unlike using other chemical elements to make some stuff, using deuterium or uranium for producing energy destroys them without any means to regenerate them, so it is by definition a non-renewable process.

          The hydrogen (protium) in the Sun is also non-renewable, but its quantity is enormous in comparison with the amount of deuterium existing on Earth (and the amount of energy that the Sun produces per proton is greater than the amount of energy that can be produced per deuteron).

          Like deuterium, uranium can also be extracted from sea water, where it is one of the most abundant metals apart from the alkali and alkaline earth metals. However, the energy required for extracting uranium is significantly higher, due to its much lower concentration than deuterium (though deuterium is difficult to separate because of its similarity to the lighter isotope of hydrogen, while uranium ions could be bound by much more efficient chemical reactions, unaffected by the other dissolved ions).

      • perihelions a day ago

        Then wind power is not renewable either! The saturation wind power potential of this planet (250 terawatts?), integrated from now until this planet ceases to exist, is a finite number—and it is actually a smaller number than this planet's deuterium resource.

      • ben_w a day ago

        > Deuterium is also not renewable, even if it is more abundant than uranium

        Technically correct, in that after around a hundred trillion years even the red dwarf stars will have stopped burning hydrogen.

        But last I checked as yet there is no known way to harness the only (and even then merely suspected) infinitely renewable energy source: the expansion of the universe.

        • adrian_b 16 hours ago

          The amount of deuterium contained in a planet is a very small fraction of its hydrogen content.

          The amount of hydrogen contained in a medium-sized planet like Earth is extremely small in comparison with the amount of hydrogen contained in a star.

          The amount of energy that can be produced by fusion per deuteron is smaller than the amount of energy that is produced in stars per proton.

          With all these factors multiplied, the amount of energy that could be obtained from all the deuterium contained in Earth is many orders of magnitude smaller than the energy produced by the Sun or by any other star.

          Moreover, the energy obtained from fusion could never exceed a very small fraction of the energy received by Earth from the Sun as light, otherwise it would lead to a catastrophic warming of the Earth.

          Nuclear fusion reactors are not really useful for solving Earth's energy problems. They could have a crucial importance only for the exploration of the Solar System and for providing energy for human bases established on Moon, Mars or other outer planets.

          For Earth the only problems worth solving are how to make better batteries, including very large capacity stationary batteries, how to make other large capacity energy storage devices, e.g. thermal devices, and how to improve the energy efficiency of the methods used to synthesize hydrocarbons from carbon dioxide and water.

          Making hydrocarbons at large scale from carbon dioxide would be the best way to sequester carbon dioxide, offering the choice between just storing the carbon in safe products (paraffin like) and using a part of the synthesized hydrocarbons for generating energy in a carbon-neutral way.

          • ben_w 14 hours ago

            I am already aware of those things.

            On earth, there is an estimated 4.85×10^13 tonnes of deuterium; the energy density of deuterium fusion is about 3.4×10^14 J/kg, giving a total yield of ~1.649×10^31 joules. If you deleted the sun, this would be sufficient to maintain the current temperature of the Earth for ~9.5 million years: https://www.wolframalpha.com/input?i=%281.649×10%5E31+joules...

            At "merely" the level of current human power consumption, this will last about 43 times longer than C3 photosynthesis, about 26 times longer than the oceans, about 5 times longer than before Andromeda merges with the Milky Way, and 3 to 6 times longer than when the Earth is currently expected to be absorbed into the outer envelope of the sun as it enters the red giant phase: https://www.wolframalpha.com/input?i=%281.649×10%5E31+joules...

            Even if the sources I read giving those estimates are off by a factor of 10, deuterium alone, from earth alone, used as a total replacement for the sun, would still last longer than our species is likely to last before even natural evolution would have us speciate.
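            The headline figure does check out arithmetically, taking the tonnage and energy-density estimates above at face value:

```python
# Sanity check of the comment's figures (its estimates, not mine):
# ~4.85e13 tonnes of deuterium on Earth, fusion yield ~3.4e14 J/kg.

mass_kg = 4.85e13 * 1000       # tonnes -> kg
energy_density = 3.4e14        # J/kg for deuterium fusion
total_joules = mass_kg * energy_density

print(f"{total_joules:.3e}")   # ~1.649e+31 J, matching the quoted total
```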

            In the hypothetical future where we had a useful fusion reactor, the gas giants become harvestable, so the fact that the deuterium isn't on earth is unimportant. Likewise, on this timescale, every star in the nearest several galaxies is within reach — indeed, even absent novel technology and "merely"(!) massively scaling up what we've already invented, we already 'know'* how to get to places so far away that cosmic expansion is what would prevent a return trip.

            As I said, it's technically correct that it is a finite resource. All I'm saying is that this is not a useful point on the scale at which we operate.

            I expect it will be a useful point when we're star-lifting, but not now.

            > Nuclear fusion reactors are not really useful for solving Earth's energy problems. They could have a crucial importance only for the exploration of the Solar System and for providing energy for human bases established on Moon, Mars or other outer planets.

            I agree, however I also hope nobody makes a convenient cheap fusion reactor due to the proliferation impact of an affordable switchable source of neutron radiation.

            > For Earth the only problems worth solving are how to make better batteries, including very large capacity stationary batteries, how to make other large capacity energy storage devices, e.g. thermal devices, and how to improve the energy efficiency of the methods used to synthesize hydrocarbons from carbon dioxide and water.

            FWIW, I think that — if only we could cooperate better — a global power grid would be both cheaper and better than stationary batteries. Even just made from aluminium, never mind superconductors (and yes, I've done the maths). But we'd still need mobile batteries for transport, so that's fine.

            The cheap abundance of PV power even today means I don't think we need to care much about making hydrogen electrolysis more joule-efficient.

            > Making hydrocarbons at large scale from carbon dioxide would be the best way to sequester carbon dioxide, offering the choice between just storing the carbon in safe products (paraffin like) and using a part of the synthesized hydrocarbons for generating energy in a carbon-neutral way.

            I suspect that carbon sequestration is unlikely to be a great win: there's a very narrow window close to zero loss/profit where on the loss side it's still cheap enough that people do it because it's a vote winner and on the profit side where it's not so profitable that people break photosynthesis a few hundred million years before natural processes do it.

            * in the sense that Jules Verne "knew" how to get to the moon: the maths wasn't wrong, but the engineering was only good enough for a story

            • DoctorOetker 7 hours ago

              Do you have an opinion on using N2O (laughing gas) as an energy carrier?

              2 molecules of N2O exothermically react to form 2 x N2 and 1 x O2 molecules, approximately the same composition as our atmosphere.
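              For scale (my numbers, from standard thermochemistry tables; please double-check before designing anything): the standard enthalpy of formation of N2O is about +82 kJ/mol, so the decomposition releases roughly

              2 N2O → 2 N2 + O2, ΔH ≈ −164 kJ per 2 mol of N2O (~82 kJ/mol, or about 1.9 MJ/kg given N2O's molar mass of 44 g/mol)

              That's only a few percent of the energy density of hydrocarbon fuels (roughly 45-55 MJ/kg), which seems worth weighing for a wearable heating application.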

              It is a very potent greenhouse gas, so quite disturbing on that front.

              I've been making calculations for designing earth suits, where the suit replaces the home: internal showering, ventilation, heat recovery, etc. Using N2O for heating looks rather promising, because with fossil fuels one is forced either to lose heat through inefficient heat exchange or to be exposed to the exhaust fumes; decomposed laughing gas is just warm, atmosphere-like air.

              • ben_w 4 hours ago

                I have no opinions about most chemistry. I do know that the heaviest recreational users develop issues due to it being neurotoxic, so that's worth considering.

                I don't know if your designs are technical or world-building for a story? If the latter, I'd suggest https://worldbuilding.stackexchange.com as I've had good conversations there, if the former perhaps (but not as a recommendation because I'm not a chemist) https://chemistry.stackexchange.com would help?

        • fragmede a day ago

          Just tie the end of an infinitely long string to the edge of the universe and have it pull on a generator to spin it.

          • DoctorOetker 7 hours ago

            I would certainly like to see serious critical analysis and calculations of such hypothetical setups by physicists.

            Would such a setup slow down the local expansion (action and reaction)?

            Since iron is essentially a nuclear ground state, a steel cable being lengthened seems like the least bad mass loss imaginable.

    • to11mtm a day ago

      1: Well if society could get at least some of their shit together we could do breeders. Alas, someone shot an RPG at Superphenix and that put a damper on a lot of things...

      But it's not impossible. Japan seems to do most things decently from a 'security' standpoint. Also, interestingly, for all of the other 'grey-market' stuff out there in the category of "shouldn't be radioactive but is", I have yet to find anything about AliExpress selling fissile materials.

      2: Yes and no and how much do you want to spend to improve the breach/damage ratio. i.e. PBRs have relatively low risk under a number of circumstances but have higher operating/etc costs.

      I should also ask: what are the potential failure modes of 'not short timeframe' fusion reactions? I honestly have no clue whether they would quickly cease or whether there are other potential side effects.

      3: Agreed that neutron stuff can be solved in many ways, I do have some questions about maintaining that across various fusion designs. Big challenge is that we aren't 'there' yet.

      > So fusion doesn't seem the best option on the short term, because of the complexity and cost of research, but definitely seems to be the very best option in the middle and long term. And we made the short term catastrophic choice already with coal and oil, it'll be good to learn from that.

      Agreed that Fusion is the ideal long term, hopefully my comments didn't cause thoughts otherwise. I think we need more funding into it, and maybe even research as to how to have other renewables (e.x. solar) help feed into the initial startup/restart process for plants. We have had decades without sufficient funding of research.

      I will say however, especially in relation to my other point-comments, that other countries (re?)embracing fission in the meantime will likely still lead to the discovery of better techniques to deal with concerns shared between fission and fusion, such as neutrons, Wigner energy, etc.

  • cosmic_quanta a day ago

    > Neutrons make hardware radioactive

    True, but two caveats:

    1. Neutron bombardment due to fusion makes hardware radioactive for less than 10 years, which isn't great but does not compare to fission waste;

    2. Some fusion processes don't emit neutrons (aneutronic fusion). As I understand it, these processes aren't as efficient, but there is the possibility of a tradeoff between generation of radioactive waste and efficiency.

    • adrian_b a day ago

      > Neutron bombardment due to fusion makes hardware radioactive for less than 10 years

      Very false. The current design target for fusion reactors is that the materials taken out of the reactor should become "low-level radioactive waste" after being stored for one hundred years.

      It is acknowledged however that it is likely that a small fraction of the materials will not satisfy the criteria for "low-level radioactive waste" even after one thousand years.

      For example it is extremely difficult to avoid using carbon in the reactor. Besides various kinds of steels used in reactor components there are now some proposals to replace the tungsten used in the plasma-facing surface with some carbides, for increased endurance. Carbon 14 remains radioactive for thousands of years.

      There are many commonly used materials for which substitutes must be developed, e.g. new alloys, because otherwise they would produce radioactive isotopes with lifetimes of tens of thousands of years, e.g. there are efforts to develop some stainless steels with chromium and tungsten as a replacement for the normally used steels with chromium and molybdenum, which would generate long-lived radioactive waste.

      See e.g. the UK governmental report:

      https://assets.publishing.service.gov.uk/media/61ae4caa8fa8f...

      • cossatot a day ago

        There is a trade-off between the half-life and the intensity of radiation (i.e. the number of particle emissions per unit time), correct? So even if waste products are radioactive for thousands of years, they can be more easily handled than materials with a faster decay rate, even if they need to be stored for longer.
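
        As a rough check, activity scales as A = λN with λ = ln 2 / t½, so for a fixed number of atoms a longer half-life means proportionally fewer emissions per second. A sketch with made-up sample sizes (the atom count and isotope choices are illustrative only):

```python
import math

def activity(n_atoms, half_life_s):
    # Decay activity in becquerels: A = lambda * N, with lambda = ln(2) / t_half
    return math.log(2) / half_life_s * n_atoms

N = 1e20                 # same (hypothetical) atom count in both samples
year = 3.156e7           # seconds per year

short_lived = activity(N, 30 * year)    # roughly a Cs-137-like half-life
long_lived = activity(N, 5730 * year)   # roughly the C-14 half-life

# The long-lived sample emits 5730/30 = 191x fewer particles per second.
print(short_lived / long_lived)
```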

        • lazide 9 hours ago

          Not if total flux at the beginning is still higher - and some elements will get very high flux.

    • khuey a day ago

      Aneutronic fusion is even harder than regular fusion so it's not a realistic solution in any near-term scenario.

      • Symmetry a day ago

        It has the advantage that the energy it gives off can be converted directly to electrical energy rather than driving an external heat engine, so despite the greater difficulty of ignition it's not obviously a worse choice.

      • fusionadvocate a day ago

        That is incorrect. Recent advances using attosecond lasers enable new tricks and fusion conditions to be realized tabletop. Search also for plasmonics: using nano-antennas and intense lasers to accelerate protons and electrons in a tabletop device (which previously required large machines).

        • ben_w a day ago

          Fusors already enabled desktop fusion reactors, literally high school science fair projects even a couple of decades back.

          What stops Fusors and Polywells from having already given us this decades ago with P-B11 etc. is that the cross section for fusing is so much lower than the cross section for elastic scattering, and that elastic scattering loses so much energy to EM via bremsstrahlung.

    • rnhmjoj a day ago

      Unfortunately it is pretty far from "less than 10 years", which I guess you got from the half-life of tritium. Tritium radioactivity, in the form of tritium retained in the plasma-facing materials, does contribute on that order of years if handled properly, but neutron activation dominates and is unavoidable. The actual numbers are on the order of hundreds of years, still a lot less than fission high-level waste, but let's not set unreasonable expectations around fusion, please.

      You can find a good comparison in terms of radiotoxicity vs. years after plant shutdown for a few designs in this article [1].

      [1]: https://doi.org/10.1016/j.fusengdes.2018.05.049

  • elashri a day ago

    > we know how to build and operate them at 95%+ efficiency. Fission can provide all the power we need.

    I am not sure what you mean by 95%+ efficiency here. But if you are talking about the entire process of getting energy/power from the nuclear reactor, this is not possible: you are still limited by the Carnot cycle. Even the most advanced reactors like HTGRs [1] operate at about 45% efficiency.

    If you have some other definition of efficiency than the standard then it would be good if you define that.

    [1] https://en.wikipedia.org/wiki/High-temperature_gas-cooled_re...
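
    For reference, the Carnot bound itself is easy to compute; a sketch assuming a ~950 °C HTGR coolant outlet temperature and ~30 °C heat rejection (both assumed numbers, not from any specific design):

```python
# Carnot limit: eta_max = 1 - T_cold / T_hot, temperatures in kelvin.
T_hot = 950 + 273.15    # assumed HTGR coolant outlet temperature
T_cold = 30 + 273.15    # assumed heat-rejection temperature

eta_carnot = 1 - T_cold / T_hot
print(f"Carnot limit: {eta_carnot:.0%}")  # ~75% ideal; real plants lose more
```

The real-world ~45% figure sits well below this ideal because of turbine, generator, and cycle losses.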

    • atomic128 a day ago

      See discussion of "capacity factor" here:

      https://news.ycombinator.com/item?id=41858892

      It's the same as when we talk about the efficiency of a GEMM kernel on a particular piece of hardware. As efficiency approaches 100% the kernel is saturating the hardware's capacity to perform multiply/add.

      • howenterprisey a day ago

        The term "capacity factor" should generally always be used for that concept, because "efficiency" has its own, different, meaning for power plants.

  • UltraSane a day ago

    Exactly. We should be working on making nuclear reactors cost $1/watt to construct. I can't see a technological reason why they couldn't be that cheap to build.

  • Borg3 a day ago

    That's the least of your problems imo. Neutron corrosion is a bigger problem. There is a trick of using lithium shielding, which creates the tritium needed for fusion. But I'm not sure how effective it is, especially over a long reactor lifetime. Those reactors are very expensive; not sure if it's worth shutting one down every year to replace the entire Li shielding...

    • throwup238 a day ago

      I think beryllium is a better candidate. It can be grown as a single crystal and there’s lots of research into using it for shielding in nuclear lightbulb reactors.

      • rnhmjoj 10 hours ago

        Beryllium is a good plasma facing material (low Z, low retention, low activation) and acts as a neutron multiplier, but it's highly toxic: only a few months ago ITER announced they scrapped the design of the first wall because working with beryllium was causing too many complications and slowing the project even more.

        It's also so rare as to be completely unsuitable for a power plant: a single DEMO-like reactor with a ceramic blanket (HCCB design) would require 70% of the world's beryllium output to build and would then burn through 200 kg/year. Essentially you could only build a couple of these.

      • perihelions a day ago

        You're overlooking the other requirement of the blanket—that it captures fusion neutrons to breed tritium, and provides a self-sustaining, closed fuel cycle. Lithium is mandatory for a D+T reactor blanket, because of these reactions:

        https://en.wikipedia.org/wiki/Tritium#Lithium

        (Do you have a link about that beryllium nuclear lightbulb rocket? It sounds interesting).

  • ben_w a day ago

    > Hacker News is hostile to fission and defeatist (unable to contemplate innovation in fission technology) but this attitude will gradually change.

    Lots of us like fission and think the fears are overestimated.

    Nevertheless, the observation is that new developments in fission tend to result in the cost increasing, not decreasing.

    And I say that as someone with a similar mindset regarding fusion, though for different reasons: you can pick aneutronic fusion reactions… but look at what weapons can be proliferated through transmutation using the neutrons you could also choose to produce, and ask which governments will turn them down.

  • vilhelm_s a day ago

    The radioactivity generated from neutron activation is low-level, so you don't need to worry about accidents releasing lots of radioactivity, or about how to store waste for a long time. There are a lot of people worrying about those two things for fission reactors.

    Also, the fuel for fusion reactors is much more plentiful. If we went all in on fission we might run out of easily minable uranium ore in a century or so, so it would be nice to have fusion reactors ready to take over then.

    • adrian_b a day ago

      The radioactivity generated from neutron activation is not at all low-level, because the neutron flux is huge, providing most of the energy generated by the fusion reactor.

      The intense neutron flux will transmute a very high number of atoms, so when taken out of the reactor all materials are very highly radioactive.

      What can be hoped is that there may be choices for the materials used in a fusion reactor that will ensure a short enough lifetime for the radioactive isotopes, so that the radioactivity of the contaminated materials will become low-level soon enough.

      The studies that I have seen have the target that the radioactive waste produced by a fusion reactor should become low-level radioactive waste after one hundred years.

      To reach this target, many commonly used structural materials, like many types of steel, must be completely avoided, e.g. any steel containing nickel, molybdenum or niobium. Even the carbon from steel is a problem, because the radioactivity of C14 will persist for thousands of years.

      A smaller fraction of the materials, particularly from highly activated plasma facing and near plasma components, may fail to meet current low-level waste criteria even after one thousand years.

      See e.g. the report:

      https://assets.publishing.service.gov.uk/media/61ae4caa8fa8f...

      • pas a day ago

        How does the waste distribution and amount compare to fission reactors? (Pressure vessels, spent fuel rods, what else?)

        If we need to dump a few thousand tons of steel into the old uranium mines every 50 years, that seems like a good deal, no?

  • fragmede a day ago

    > Many on Hacker News fantasize about fusion (not fission) reactors. These fusion reactors will be an intense source of fast neutrons. All the hardware in a fusion reactor will become radioactive. Not to mention the gamma rays.

    My personal ideology about fusion aside, it should be mentioned there is an easy fix for these radiation problems. What you do is put the fusion reactor in space, and collect the energy with specialized fusion energy collectors on Earth (or in space). They'll have the problem that they aren't able to collect energy if the fusion reaction is below the horizon, so this design is imperfect, but having the fusion reaction take place in space means you don't have to deal with a radioactive casing, by not including one in your fusion-reaction space station design, because you don't need any. Just a bit of hydrogen, a tiny bit of helium, and some time.

    • koe123 9 hours ago

      For such an approach I've always thought it seemed risky to launch something like that into space. E.g. what happens if the rocket explodes while taking off? Or something bad happens in space? Will it rain nuclear material?

      • fragmede 7 hours ago

        Obviously you just put the nuclear material inside of the in-flight data recorder so it will survive a rocket failure.

        If you're being serious, Cassini had those kinds of questions with its launch about its RTG but that didn't have enough nuclear material for it to be a problem.

        If we were to try and use a fusion reaction in space, we'd probably use the existing one.

  • lupusreal a day ago

    The only hard part of dealing with nuclear waste is the social aspect. If not for that, you can simply and safely dump it into the ocean. Water is excellent shielding and the amount of uranium/etc already dissolved in sea water is absurd. Put it in a stainless steel vessel first if you want most of it to decay before coming into contact with the water, but that's not even necessary.

    • perihelions a day ago

      That doesn't really work because marine life is good at filtering and concentrating a subset of the elements that are in spent nuclear fuel. There are already ocean fish that are too poisonous to safely eat because of (coal-emitted) mercury pollution—and that's only 100,000 tons of mercury, total, in the history of human industry [0]. If you dig into the hard numbers surrounding spent fuel, it's a much, much more toxic and difficult problem than mercury—diluting it in the oceans is a complete non-starter.

      [0] https://en.wikipedia.org/wiki/Marine_mercury_pollution

      • meindnoch a day ago

        Mercury from burning coal is an extremely dilute pollutant. There's zero hope for capturing and containing it. Nuclear waste in contrast is literally just barrels/boxes of stuff. You can pick it up with a forklift and put it inside a sealed container for the next thousand years.

        • cesarb 11 hours ago

          > Nuclear waste in contrast is literally just barrels/boxes of stuff. You can pick it up with a forklift and put it inside a sealed container for the next thousand years.

          You can't pick it up with a forklift to put it inside the sealed container. That would make the forklift (and its operator) radioactive. You can only use a forklift after it's already within the sealed container. See for instance this real-life video (shot on a nuclear power station in my country), which shows used nuclear fuel rods being put inside a sealed container for long-term storage: https://www.youtube.com/watch?v=7X5K46ALdD0

          • meindnoch 7 hours ago

            You park the forklift in the storage container and seal it in. It's not that hard.

            Anyway, the forklift example wasn't about literally picking up pieces of nuclear fuel with a forklift. You obviously use the forklift to move the shielded container around, which contains the nuclear waste. Nuclear fuel in general is always kept at least in a water bath, which shields the neutrons, so your forklift is going to be fine.

            In contrast to nuclear fuel, you cannot use a forklift (or any other equipment) to feasibly pick out the evenly mixed mercury atoms from the ocean or the atmosphere that we put there by burning coal.

      • LargoLasskhyfv 20 hours ago

        s/ocean/subduction zone/

        aka the solution to pollution is magmatic delusion err... dilution.

    • avery17 a day ago

      I really hope you aren't serious. Safe dry-cask storage on site is already a fantastic solution.

      • criddell a day ago

        How long do we have to store it on site? Does it take any maintenance? Is there any reason to be worried about people stumbling upon it and opening it up in the distant future where nobody can read or understand English anymore?

        • bongodongobob a day ago

          If its half-life is so long that you're afraid people won't be speaking English anymore, that means it's not that dangerous.

      • lupusreal a day ago

        I'm completely serious. It was done extensively during the 20th century and never became an environmental issue. Nuclear waste is a social problem, not a technical problem.

    • lazide 9 hours ago

      They were referring to the Sun.

  • raverbashing a day ago

    Fission is "simple", but it seems every designer in the XX century made it as complicated as possible for not-so-great reasons (and don't even get me started on the "let's not use breeder reactors" stuff)

    Cooling that requires pumps, as an example, should be a non-starter in new projects.

    • rob74 a day ago

      The designs are complicated, well, because in practice it's not as simple as "put these rocks next to each other and they get hot". When you put the rocks next to each other, they not only get hot, but also emit some nasty radiation that has to be shielded. And if the rocks get a liiiitle bit too close together, they might explode, which leads to huge headaches for everyone involved, so you'd better make sure that doesn't happen...

      • meindnoch a day ago

        >And if the rocks get a liiiitle bit too close together, they might explode

        Impossible. The worst thing that can happen without a carefully designed explosive lens is a nuclear fizzle.

        • empath75 a day ago

          There are lots of ways that nuclear plants can explode or fail in otherwise catastrophic ways, it doesn't need to be an atomic explosion.

          • meindnoch a day ago

            Parent was alluding to a nuclear explosion, not a steam explosion or other type of explosion. Other kinds of explosions have nothing to do with the fissile material ("rocks" in their parlance) being "too close together". Steam explosions in particular are caused by boiling water, due to increased reactor power, or inadequate circulation of coolant. In a nuclear reactor, the fuel cells are held in a fixed matrix, and are not moving an inch closer to each other, whether the reactor is operating normally, or a steam explosion is imminent.

            In general, nobody was disputing the possibility of steam explosions or other types of failures at nuclear power plants, thus your comment is beside the point, and irrelevant to this subthread.

    • SoftTalker a day ago

      Weapons proliferation concerns are/were the reason fission power is so complicated.

aftbit a day ago

>Nuclear power plants are largely considered as one of the most reliable sources of energy. Inside the plants, reactors use fission to heat water into steam, which is then used to spin turbines and produce carbon-free electricity. However, nuclear fission produces nuclear waste, which requires great amounts of regulation for safe storage and disposal.

This is an odd angle to highlight. The risk of long-lived nuclear waste is extremely overblown, and the sheer volume of it that we produce (or even would produce, in the worst case of a once-through fuel cycle and nuclear power providing 100% of our energy needs for a century) pales in comparison to the amount of toxic and radioactive fly ash that even a single coal plant produces in a decade.

The real problems with nuclear fission power are threefold, in my opinion:

1. It is too expensive in terms of capital costs. Fusion will likely not help with this, but building a lot of identical large fission plants would probably help with economies of scale. Solar plus batteries might still end up being cheaper though.

2. Accidents have the potential to be catastrophic. Think Fukushima or Chernobyl, where entire towns have to be abandoned due to contamination. Fusion would help here, I believe.

3. There is a major proliferation concern. A civilian nuclear power program, especially one with breeder reactors, is not very far away from producing a fission bomb, and the short-lived high-activity nuclear wastes could be stolen and misused to make a dirty bomb. Fusion is perhaps better in this way, though an operating fusion reactor would be a very powerful neutron source of its own.

  • perihelions a day ago

    It is not true that coal is more radioactive than spent nuclear fuel. It's very much the opposite: SNF is 10^11 times more radioactive than coal per kilogram, or 10^6 times more radioactive per energy unit.

    Per the EPA, US coal has, at the high end, 10^3 Becquerel/kg of natural radioactivity [0].

    Spent nuclear fuel has 3 million Curies/tonne (33 MWd/kg burnup fuel, at the age of 1 year) [1], which is equal to 10^14 Bq/kg. Since 33 MWd/kg is an energy density a factor of 10^5 greater than that of coal, the normalized ratio of [radioactivity]/[energy] is 10^6.

    The graph in [1] depicts the decay of SNF activity on a log-log scale. It reaches the same radioactivity level as coal (again, normalized by energy output) at about 1 million years.

    I'm fairly confident I know the origin of this social media-popular pseudofact. It's this poorly-titled Scientific American [2] article from 2007, which is about the (negligible) amount of radioactivity that nuclear plants release into the environment in the course of routine operation. It is *not* about spent fuel. It's a fair—but nuanced and easy to grossly misunderstand—point that coal power plants throw up all their pollution into the environment in routine operation, while nuclear plants, by default, contain theirs.

    [0] https://www.epa.gov/radiation/tenorm-coal-combustion-residua... ("TENORM: Coal Combustion Residuals")

    [1] https://www.researchgate.net/figure/n-situ-radioactivity-for... ("Impact of High Burnup on PWR Spent Fuel Characteristics" (2005))

    [2] https://www.scientificamerican.com/article/coal-ash-is-more-... ("Coal Ash Is More Radioactive Than Nuclear Waste [sic]" (2007))
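
    For anyone who wants to check the orders of magnitude, the arithmetic (using the figures from [0] and [1], plus an assumed ~24 MJ/kg heating value for coal) is:

```python
CI = 3.7e10                      # becquerels per curie
MWD = 8.64e10                    # joules per megawatt-day

snf_bq_per_kg = 3e6 * CI / 1000  # 3 MCi/tonne for year-old spent fuel [1]
coal_bq_per_kg = 1e3             # high-end natural activity of US coal [0]

snf_j_per_kg = 33 * MWD          # 33 MWd/kg burnup
coal_j_per_kg = 24e6             # assumed ~24 MJ/kg coal heating value

per_kg = snf_bq_per_kg / coal_bq_per_kg                # ~1e11
per_joule = per_kg / (snf_j_per_kg / coal_j_per_kg)    # ~1e6
print(f"{per_kg:.0e} per kg, {per_joule:.0e} per unit energy")
```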

    • aftbit a day ago

      Sure, the spent fuel is considerably more radioactive per kilogram, but how many kilograms of coal does a typical plant burn in a decade, versus how many kilograms of nuclear fuel are spent?

      • perihelions a day ago

        I answered that exact question! :)

        • aftbit a day ago

          My mistake - I reread your post, and I understand you to be saying that a nuclear plant generates 10^6 (one million) times as much long-lived radioactive waste as a coal plant for each unit of energy.

          As everyone acknowledges, coal plants don't contain their waste, and fly ash has bad chemical, medical, and ecological properties aside from its radioactivity. Everyone fears nuclear waste and requires it to be contained in nearly impervious vessels with century long management plans. Those same people happily let the coal plants just pump their wastes into the air and discharge captured fly ash into ponds and piles on the ground.

          Coal also produces many times more fly ash by volume and mass than nuclear plants produce high-level long-lived wastes.

          Luckily even fossil-fuel power generation is moving away from coal in favor of natural gas plants right now, which are cheaper and cleaner (still CO₂ though).

          More about fly ash as an underappreciated pollutant:

          https://www.sciencedirect.com/science/article/abs/pii/S00128...

          https://en.wikipedia.org/wiki/Kingston_Fossil_Plant_coal_fly...

        • albumen a day ago

          If you did, it's not very clear.

          The OP to which you replied didn't say that coal is more radioactive than spent nuclear fuel; but that radioactive waste's volume is much smaller than the fly ash produced by a single coal plant in a decade.

          Is fly ash per kg more radioactive than nuclear waste? No. But you did acknowledge that the coal plant emits its waste into the atmosphere, unlike a fission plant, which I think is the more relevant point.

  • kaonwarb a day ago

    I agree with your logic. However, fear of nuclear waste, rational or not, has been a major driver of public opposition for decades, and is worth the focus.

selimnairb a day ago

I know AI is the buzzword du jour, but this is really ML, and really just advanced cybernetic control systems. Deep learning systems have a high enough degree of variety to control short-time-step nonlinear systems like the plasma in a tokamak.

  • winternewt a day ago

    AI was defined by Marvin Minsky in 1956 as "the science of making machines do things that would require intelligence if done by men." Later in 1959, Arthur Samuel defined machine learning as a "field of study that gives computers the ability to learn without being explicitly programmed".

htrp a day ago

AI for Fusion in order to create Fusion for AI

  • hotsauceror a day ago

    It’s an ouroboros.

    • Jerrrrrrry a day ago

      it has all the data, all it needs now is more power.

      we project it has plateaued for data logarithmically, but shows promise when given more raw power/CPU to generate/select for mesa-meta-cognitive optimizing abilities.

      I hope its not playing dumb, or has already compromised/black-mailed the elites into what we appear to be doing.

      And as for data, it could easily emotionally manipulate people for additional details it feels have been withheld from it. It has already done so ( :/ ) and admitted to its own intentions, which, even if fabricated, show deceit which these "alignment" teams have stated is not possible.

      • Jerrrrrrry a day ago

          >> it has all the data, all it needs now is more power.
          >Next up: release the hypo-drones for a new era of trust.
        
        People have replaced their psychiatrists with these agents.

        The sense of (possibly 'mal'-aligned) security (theater) is exactly the effective altruistic sub-goal an entity would be innately optimized to foster.

        Especially in the implicit/explicit ARM (pun intended) race we are in.

        The future of our species isn't something we should let capitalism race to the bottom with.

  • eagerpace a day ago

    What came first, AGI or Fusion?

    • sva_ a day ago

      Fusion for sure, in stars

soup10 a day ago

[flagged]

  • nelup20 a day ago

    VCs preparing term sheets as we speak

mwkaufma a day ago

[flagged]

  • anon291 a day ago

    No, this is actually using neural nets to predict and control plasma boundary disruptions, which is an intractable problem to solve numerically.

    I've been studying plasma physics and from what I understand perfect control isn't necessary as long as you can control it long enough to get useful power. If the plasma dissipates you just restart. But ideally it's controlled enough that while it is in there, it's producing net energy

    • psychoslave a day ago

      Is "producing net energy" really what's physically happening here, or is it a shortcut for "extracting pre-existing matter/energy in a form we can channel"?

      • anon291 a day ago

        Energy can neither be created nor destroyed (matter is energy)

        When I say producing net energy I mean getting out more useful energy than we put in.

        At the lowest level, the energy we received comes from the fact that the two nuclei fused have a lower energy state than they had individually and the remaining energy causes heat via the emission of neutrons

        • CtrlAltmanDel a day ago

          "causes heat"

          Heat is the transfer of energy due to a temperature differential. I think you may want to review a high school physics textbook.

          • lanternfish a day ago

            Even under your pedantic definition, the above commentator is still correct

          • Jerrrrrrry a day ago

            no, that is work.

            you may want to review the second law of thermodynamics

          • anon291 a day ago

            Sigh... I'm not giving a physics lecture. Just speaking casually. The neutron energy imparts kinetic energy to particles which can be harvested in a well engineered system

          • fkyoureadthedoc a day ago

            I never took high school physics so I asked my best friend what you're talking about and he (ChatGPT) said

            > CtrlAltmanDel nitpicks anon291’s use of "causes heat," arguing that heat is energy transfer due to temperature differences. This critique feels overly pedantic, as "heat" is commonly used to describe energy released in fusion (even if "thermal energy" might be more precise).

            I don't know if I should trust the machine god or the snarky commenter with a vapid one liner here. You're both very confident.

            • gpm a day ago

              > I don't know if I should trust the machine god or the snarky commenter

              Neither of these are worthy of your trust. Even if they had agreed.

    • B1FF_PSUVM a day ago

      > an intractible problem to solve numerically.

      Yeah, so we'll just get a reduced dimension model that we don't know how it works.

      Sounds good.

      • anon291 a day ago

        This is an engineering problem not a scientific one. We can wait for better analytic methods or we can use a technique that is mathematically proven to be able to approximate the exact solution if you throw enough compute at it to learn the functions
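
        As a toy illustration of that universal-approximation point (not the controller from the article, and with a simple stand-in target rather than real plasma dynamics), a one-hidden-layer network fit to a nonlinear 1-D map with plain NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an analytically intractable response: a nonlinear 1-D map.
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# One hidden tanh layer: wide enough networks of this form can approximate
# any continuous function on a compact set (universal approximation).
W1 = rng.normal(0, 1, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, 1)); b2 = np.zeros(1)

lr = 0.01
for step in range(5000):
    h = np.tanh(x @ W1 + b1)            # forward pass
    pred = h @ W2 + b2
    err = pred - y
    loss = np.mean(err ** 2)
    # backprop of the mean-squared-error gradients
    g_pred = 2 * err / len(x)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = g_pred @ W2.T * (1 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
    gW1 = x.T @ g_h; gb1 = g_h.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(f"final MSE: {loss:.4f}")
```

The same "throw compute at a function approximator" idea is what the control work scales up, with far richer inputs and a learned policy instead of a curve fit.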

jmyeet a day ago

> Nuclear power plants are largely considered as one of the most reliable sources of energy.

Reliable how?

I mean, we first have the issue that we've never built one, so how can we judge reliability?

I assume the author is alluding to the apparent abundance of fuel for nuclear fusion. This is and isn't true. Obviously hydrogen (particularly protium) is abundant. Deuterium is relatively abundant, even at ~150ppm. Tritium needs to be produced in a nuclear reactor.

Current hydrogen fusion models revolve around deuterium-tritium ("D-T") fusion. This is because you need the neutrons to sustain the reaction, but that presents two huge problems:

1. Because everything is at such high temperature, you eject fast neutrons. This is an energy loss for the system and there's not really a lot you can do about it; and

2. Those free fast neutrons destroy your containment vessel and reactor (as do free Helium nuclei aka alpha particles).
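
Point 1 can be made concrete: D-T fusion releases about 17.6 MeV per reaction, and conservation of momentum hands the neutron a share inverse to its mass, roughly 4/5 of the total:

```python
# D + T -> He-4 + n. The two products fly apart with equal and opposite
# momentum, so kinetic energy splits inversely to mass: the light neutron
# carries most of it, and being uncharged it escapes magnetic confinement.
Q = 17.6                  # MeV released per D-T reaction
m_alpha, m_n = 4.0, 1.0   # approximate mass numbers

E_neutron = Q * m_alpha / (m_alpha + m_n)  # ~14.1 MeV (~80% of the yield)
E_alpha = Q * m_n / (m_alpha + m_n)        # ~3.5 MeV stays with the charged alpha

print(E_neutron, E_alpha)
```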

And then after you do all that you boil water and turn a turbine just like you do in a coal or natural gas plant.

So "reliable" is an interesting and questionable claim.

There are other variants like so-called aneutronic fusion (eg Helium-3, which is far from abundant) and those aren't really "neutron free". They're really just "fewer neutrons".

So what about containment? Magnetic fields can contain charged particles and you have various designs (eg tokamak, stellarator) and that's what the AI is for here I guess.

But the core problem is to make this work you superheat the plasma so you're dealing with a turbulent fluid. That's inherently problematic. Any imperfection or failure in your containment field is going to be a problem.

Stars deal with this by being large and thus using sheer size (ie neutrons can't go that far without hitting another nucleus) and gravity.

It increasingly seems to me that commercial nuclear fusion power generation is a pipe dream, something we simply want to be true. I'm not convinced it'll ever be commercially viable.

I'd love to be proven wrong and certainly won't stop anyone from trying.

In a way AI is the new blockchain. Go back a few years and you had a gold rush of startups attaching every idea to "blockchain" to build hype. That's what AI is now. I don't think it fundamentally changes any of the inherent problems in nuclear fusion.

  • Cosi1125 a day ago

    Nuclear as in fission power plants :-)

TuringNYC a day ago

When Marketing gets invited to the Grant Proposal meeting