Siren's Call is located only a half dozen blocks inside Pacifica itself. She picked it specifically to be close to, but outside of, the area NCPD still patrols.

It's kind of difficult to pin it down on the actual CP2077 map, because the Pacifica district wasn't designed to include Dogtown, so it's a question of where everything actually sits relative to everything else.

I've been meaning to ask. Is the territory controlled by Saint Cog about 12 blocks in area, or 12 blocks by 12 blocks? I'm trying to get a feel for just how big Taylor's territory is.
 
For all that I agree the prospect of nuclear power is pretty great, I'm also wary of people constantly underestimating the engineering challenges associated with big new technologies. Some of the reporting around next gen nuclear and fusion tech reads a lot like techbro-ism to me; the kind of thing that attracts a lot of media attention as "The Future" and then vanishes later. Hyperloop. Cybertruck. Google Glass. Lab Meat (Scop?), Crypto... I'm aware that there's tons of regulation around it slowing things down, but are there actual, "new technology" nuclear reactors using the thorium salt or whatever else is modern, even if they're research scale and not power producing? I can't seem to find anything from my own quick searches, but who knows how Google chooses to display things...
We literally don't need new tech for Thorium reactors. We have had designs for them for literally longer than we have had the uranium designs. It is just that Thorium reactors can't be used to breed weapons-grade fission materials.
 
This might give me an idea: there's tons of other stuff in the ocean that is incredibly valuable… like thousands of trillions of dollars of dissolved gold that are also in sea water.
Forget the water, look up the mineral wealth in seabeds that can be extracted via dredging. And the kinds of minerals.

There's also methane clathrates but that's a "hot" topic to be avoided.
 
For all that I agree the prospect of nuclear power is pretty great, I'm also wary of people constantly underestimating the engineering challenges associated with big new technologies. Some of the reporting around next gen nuclear and fusion tech reads a lot like techbro-ism to me...

I listened to Thunderf00t about the widely talked-about nuclear fusion achieved in 2022 and he (a nuclear scientist) said the way it was done produced an enormous amount of energy for a split second that almost instantly destroyed the equipment, and if you combine that with the press release, it's likely that the focus of that particular lab wasn't power generation but creating a fusion bomb.

Yet it was reported all over the world like a major breakthrough toward achieving clean and massive nuclear power production.
 
We literally don't need new tech for Thorium reactors. We have had designs for them for literally longer than we have had the uranium designs. It is just that Thorium reactors can't be used to breed weapons-grade fission materials.
I thought that to be the case, then I read the Wikipedia article
Article:
Disadvantages
...
* Some MSR designs can be turned into a breeder reactor to produce weapons-grade nuclear material.[12]


Banning those varieties would seem smart... So, you'd need international inspectors, who don't get to be blocked by countries with thorium reactors?

In general, thorium reactors look like a smart idea.

A slightly weird thought, though... 'Two' is a funny number when it comes to varieties of stuff, 'one', OK, 'many', OK, 'two', not so much. We currently know about fission, because there's a useful feature of uranium decay, and fusion, because we see it in the gravity-confined reactors we call 'stars'. Are there a load of other nuclear reactions that might be useful, that we've not recognised/figured-out, yet?

Science fiction is fun, but it's all the future bits that surprise us (LLM AI, most recently), that could really stir things up...
 
I listened to Thunderf00t about the widely talked-about nuclear fusion achieved in 2022 and he (a nuclear scientist) said the way it was
Isn't that the youtuber who tends to make stuff up?
And the actual achievement was completely misquoted in most mainstream media.
It was a research reactor that got a stable ignition, resulting in more energy than they used to power the systems that caused it.
That particular reactor design has issues, though, since it needs to be shut down and essentially repaired between tests.
It will probably only ever be a research reactor design unless materials science for lasers significantly improves.
 
I thought that to be the case, then I read the Wikipedia article
Article:
Disadvantages
...
* Some MSR designs can be turned into a breeder reactor to produce weapons-grade nuclear material.[12]


Banning those varieties would seem smart... So, you'd need international inspectors, who don't get to be blocked by countries with thorium reactors?

In general, thorium reactors look like a smart idea.

A slightly weird thought, though... 'Two' is a funny number when it comes to varieties of stuff, 'one', OK, 'many', OK, 'two', not so much. We currently know about fission, because there's a useful feature of uranium decay, and fusion, because we see it in the gravity-confined reactors we call 'stars'. Are there a load of other nuclear reactions that might be useful, that we've not recognised/figured-out, yet?

Science fiction is fun, but it's all the future bits that surprise us (LLM AI, most recently), that could really stir things up...
We know there are only two, because Fission is "split one atom into multiple" while Fusion is "combine multiple atoms into one". Two is a more understandable value when it is plus or minus.
 
We know there are only two, because Fission is "split one atom into multiple" while Fusion is "combine multiple atoms into one". Two is a more understandable value when it is plus or minus.
Suggest you look at bound neutron tunneling, where you might get energy out without radioactivity... One atom loses a neutron, another gains one - you can't claim that's either 'fission' or 'fusion'... It's an isotopic shift.

Also, nuclear isomer batteries, though gamma rays aren't that convenient to turn into useful energy...

The aim is to get energy out of nuclear rather than chemical processes, or for a battery store/release. Details tend to be a bit more subtle when you look at things closely...
 
This is what I mean about being uncertain of the technology. The only active thorium reactor I can specifically find documentation about, instead of vague designs for, is KAMINI, a 30 kW research reactor first powered on in 1996. Which is a step in the tech but far from power production. And even then it's unclear to me if it's actually breeding thorium to U-233 or just fissioning U-233 for research...
 
Japanese scientists already invented this process using special sponges that are reactive with only uranium; this was back in the early 2000s or even the 90s. It's good to see additional research on it though, because if we have access to the huge amounts of uranium in seawater (3.2 parts per billion, which equates to 4.6x10^9 tons of uranium) we would have enough uranium to power the entire planet for longer than the planet would exist (over 1.5 billion years.)
[...]
This might give me an idea: there's tons of other stuff in the ocean that is incredibly valuable… like thousands of trillions of dollars of dissolved gold that are also in sea water.
For what it's worth, engineered organisms can even be made selective not only to whatever elements are desired, but even to specific isotopes of them (exploiting differences in chemical reaction rates)

That could be a way to (slowly) remove anthropogenic radionuclides from ground- and sea-water... or extract fuel for nuclear reactors, be it that uranium, or deuterium/tritium (which are a pretty renewable resource, AFAIK)
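For anyone who wants to sanity-check the seawater-uranium figure quoted above, the arithmetic is short; the ocean mass and concentration below are the commonly quoted approximations, not numbers from the post itself:
Code:
# Rough sanity check of the dissolved-uranium figure quoted above.
# Assumed inputs (commonly quoted approximations, not from the post):
# total ocean mass ~1.4e21 kg, uranium concentration ~3.3 ppb by mass.
ocean_mass_kg = 1.4e21
uranium_ppb_by_mass = 3.3
uranium_tonnes = ocean_mass_kg * uranium_ppb_by_mass * 1e-9 / 1e3
print(f"Dissolved uranium: {uranium_tonnes:.1e} tonnes")  # ~4.6e9 tonnes, same ballpark as above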


It seems like such a great option, and I think commercial Nuclear was a solid choice 20 years ago, but today the economics just can't compete with battery backed solar basically anywhere south of New York City.
[...] projecting an inflation adjusted 10c/kWh average on the wholesale market even just 20 years from now with falling renewable costs is difficult to justify.
As far as I'm aware, there are two main contributors to the massive fall in solar's LCOE (levelized cost of energy: basically the plant's lifetime cost of ownership divided by its lifetime energy production; a rough version of the formula is sketched just after this list):
  • government subsidies: funding deployment, manufacturing, and development, directly or indirectly (tax credits etc.), or committing to more favorable energy-purchase conditions
  • high production volumes and installed capacity, feeding back into improved manufacturing and exploitation.
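(As flagged above, here is a minimal sketch of the standard discounted LCOE calculation; the plant figures in the example are made-up placeholders, purely to show the shape of the formula.)
Code:
# LCOE = present value of lifetime costs / present value of lifetime energy produced.
def lcoe(costs_per_year, energy_per_year_kwh, discount_rate):
    pv_costs = sum(c / (1 + discount_rate) ** t for t, c in enumerate(costs_per_year))
    pv_energy = sum(e / (1 + discount_rate) ** t for t, e in enumerate(energy_per_year_kwh))
    return pv_costs / pv_energy  # currency units per kWh

# Toy example: all capex in year 0, then flat O&M and output for 25 years.
costs = [1_000_000_000] + [20_000_000] * 25
energy = [0] + [2_000_000_000] * 25  # kWh per year
print(f"{lcoe(costs, energy, 0.07):.3f} per kWh")  # ~0.05 with these made-up numbers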
However, there are costs to solar that aren't decreasing, and either just can't or are unlikely to:
  • land use: there's only so much sunlight falling on a given area of land
    while building solar farms in a desert is fine, most places don't have conveniently-located and otherwise-useless land for massive solar farms
  • energy storage:
    • lithium batteries have limited lifetime, massive ecological and human costs, and high demand for personal electronics and BEVs (even though I think BEVs are a pretty terrible technology compared to FCEVs)
    • chemical batteries not based on lithium are not much better on those aspects
    • chemical batteries (lithium or not) scale very poorly, due to capacity and power being tied (unlike fuel cells etc.)
    • non-chemical energy storage usually has much lower round-trip efficiency.
That's why I don't think it's viable to exclusively use intermittent renewables, and the non-intermittent renewables seem to be even more contextual than the well-known intermittent ones:
  • geothermal power is in principle everywhere, if one digs deep enough, but in practice it's not viable everywhere (due to the boring costs, or to unfavorable geology making construction too difficult)
  • geological hydrogen, if it pans out, is unlikely to be present everywhere... and might still be most useful as a fuel for industrial processes or FCEVs
To be clear, I think it's obvious we do need renewables, but they don't do the same job as nuclear power, in terms of where they're applicable, power density vs. land use, providing highly-stable continuous output vs. an intermittent one, etc. Trying to pretend they do gets us situations like France:
  • a massive buildup of nuclear power in the 80s saw it reach almost 80% of the country's yearly electricity generation;
  • politicians have been systematically curtailing further investments in the nuclear fleet since (at least) the 2000s, so its share of the electrical mix has been steadily decreasing;
  • despite a large buildup in renewables, the use of "natural" gas (for electrical power) has gone up almost 5×

Also, this is ignoring that the biggest issue with fission plants right now isn't the safety design but the companies running them cutting corners.
Absolutely, and for all I'm... not exactly big on centralizing power in the hands of the government, I think the sensible option is:
  • Make nuclear power either a state monopoly, or at least extremely strictly controlled:
    Fukushima showed that putting a profit-first company in charge of operating a nuclear reactor, and trusting them to accurately report what's going on, is a disaster waiting to happen.
  • Have a nuclear safety board that actually audits reactors periodically, in any way they deem appropriate: from inspecting the actual hardware, to testing whether operator training and processes are adequate, to looking into the accounting (like maintenance spending), etc.
  • Use a limited number of plant designs: having the same design plonked down many times doesn't only save on R&D, it also helps identify design issues (in ways that aren't obvious without enough plants of the same design to do statistics on) and remedy them everywhere
    The French nuclear fleet being highly homogeneous is a large part of why it's so reliable despite politicians making pretty deep cuts in funding.


For all that I agree the prospect of nuclear power is pretty great, I'm also wary of people constantly underestimating the engineering challenges associated with big new technologies. [...] I'm aware that there's tons of regulation around it slowing things down, but are there actual, "new technology" nuclear reactors using the thorium salt or whatever else is modern, even if they're research scale and not power producing?
I think at least China, India, and... the Netherlands, are all actively pursuing thorium-cycle MSRs again, which is pretty exciting:
  • thorium is believed to be 3-4× as abundant as uranium within the Earth's crust;
  • IIRC, closed cycles / fuel reprocessing are much simpler than with uranium;
    EDIT: forgot to explain, that's important because only about 1/3rd of the spent fuel that's been produced globally has been reprocessed, while the rest just sits in storage
  • the resulting nuclear waste "only" needs to be stored for 300 years, not on geological timescales;
  • for decades, the main obstacle to practical thorium fuel cycles has been the lack of market incentives... which makes sense, as long as the true costs of nuclear waste management aren't accounted for (notably, the US seems to be pretending its long-lived nuclear waste does not exist, ever since the cancellation of the Yucca Mountain site)

We know there are only two, because Fission is "split one atom into multiple" while Fusion is "combine multiple atoms into one". Two is a more understandable value when it is plus or minus.
There already are multiple known kinds of nuclear reactions though:
  • fusion: two light nuclei are joined into a heavier one
  • fission: heavy nucleus absorbs light particles (usually neutrons) and then splits into 2 or 3 pieces
  • spallation: parts of a nucleus are "torn out" when hit by a sufficiently energetic particle
  • internal transitions between different nuclei configurations
    either inducing gamma-ray emissions, or directly ejecting an electron because fucking quantum mechanics
  • nucleon transfer, including the neutron tunneling @Ace Dreamer mentioned,
  • etc.
 
Might want to add in:
* Molten salt batteries, which I understand scale quite nicely...
* Liquid air energy storage.

Also, I understand research continues on direct solar energy to hydrogen gas (from water), with no intermediary electricity-electrolysis stage.

Considerably more far-out, direct solar energy to methane gas (from water and carbon dioxide). This is particularly attractive because it is fully 'green', and we have all the infrastructure to use 'natural gas'. You could then use more solar-derived energy to go to more complex hydrocarbons, like 'gas'/petrol, I suspect.

Going fully electric, with no chemical energy to fall-back on, is possibly unwise, say when a solar flare (CME) trashes the electrical infrastructure... Might need hand-pumping, though? Bicycle generators? (With no electronics.)

I wonder what would happen to Cyberpunk if there's another Carrington event (like in 1859)...



'Burning' long-life nuclear waste so the dangerous life-span is decades rather than millennia is feasible, though energy-intensive. This is a lot smarter than burying it, or launching it into space. Rarely mentioned, though...
 
Might want to add in:
* Molten salt batteries, which I understand scale quite nicely...
* Liquid air energy storage.
Cyberpunk battery tech is lore-wise consistent with being superconducting capacitor loops. Both because they can provide sufficient power for directed-energy weapons, and because they tend to explode when netrunners get tricksy with hacking cyberware.

They are also quite scalable thanks to the setting's nanite tech.

It's generally a safe assumption that they are orders of magnitude better off than we today are with power storage and transfer, as they also are with power production. They use gobs and gobs and gobs of the stuff.
 
I thought that to be the case, then I read the Wikipedia article
Article:
Disadvantages
...
* Some MSR designs can be turned into a breeder reactor to produce weapons-grade nuclear material.[12]


Banning those varieties would seem smart... So, you'd need international inspectors, who don't get to be blocked by countries with thorium reactors?

In general, thorium reactors look like a smart idea.

One thing that rarely gets talked about with 4th gen reactors is the fact that most of them are high temperature reactors. Existing water cooled reactors only have a core operating temperature of ~400C, and a useful hot coolant output of only ~300C, and that seriously limits the thermodynamic efficiency of the steam turbines used, and makes the turbomachinery needed to actually get economical amounts of electricity out of them the vast majority (80+%) of capex for reactor construction. The turbines needed are highly specialized, football-field-sized machines, and they are still relatively low efficiency.

High temperature reactors have a hot coolant output of anywhere from 550C to 800 or even 900C, depending on the design (and there are dozens of wildly different designs). This means they can have much cheaper and smaller turbomachinery to get the same efficiency and power as an equivalent water cooled reactor, and also don't require a water cooling tower. The waste heat from the turbines will potentially be in the 350-400C range, which means air cooled radiators are sufficient, as radiative cooling efficiency increases exponentially with radiator temperature, which is another massive reduction to capex.
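To put rough numbers on why the hotter coolant output matters so much: the Carnot limit is only a theoretical ceiling (real steam and gas cycles land well below it, roughly 33% for light-water plants and somewhere in the 40-50% range for high-temperature designs), but it shows how much headroom the higher temperatures buy. The 30 C heat-sink temperature below is just an assumed round number:
Code:
# Ideal Carnot ceiling for the coolant output temperatures discussed above.
# Real plants achieve well under these figures; the 30 C sink is an assumption.
def carnot_limit(t_hot_c, t_cold_c=30.0):
    t_hot, t_cold = t_hot_c + 273.15, t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

for t_hot_c in (300, 550, 700, 900):
    print(f"{t_hot_c} C hot side -> Carnot ceiling {carnot_limit(t_hot_c):.0%}")
# 300 C -> ~47%, 550 C -> ~63%, 700 C -> ~69%, 900 C -> ~74%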

That waste heat is also hot enough to be a useful heat source by itself, used for district heating or desalination and water treatment.

Also, you don't need to use the reactor thermal output for electricity alone: the real moneymaker is going to be in industrial process heating. Did you know that oil refineries burn as much as 25% of the oil they refine, just to heat the fractional distillation tower to 400C? Imagine if you just made a cogeneration plant with an oil refinery: it makes electricity and clean water with the turbine, and also makes the refinery 25% more productive with no increase in oil consumption, just by giving the refinery process heat through a molten salt coolant loop.

Same thing with ammonia production via the Haber-Bosch process: it needs 500C heating for an endothermic process, and it usually just burns natural gas that it also steam reforms to produce the hydrogen it needs. Nuclear process heating can also supply hydrogen by cracking water with the sulfur-iodine process, which is more thermodynamically efficient than electrolysis but requires heating. Currently the Haber process accounts for 5% of global carbon emissions; just giving it a reactor cuts that to zero, and is cheaper per unit of ammonia produced than burning natural gas.

There are other emerging technologies like supercritical CO2 turbines and the small modular reactor concept that have the potential to even further massively reduce capex for reactor construction. Even if the levelized cost/kWh of electricity is higher than gas turbines, using the reactors in cogeneration plants providing industrial process heating is such a profitable alternative that electrical production is probably going to be seen as a beneficial side effect by the nuclear industry in a few decades.

Edit: also, certain molten salt reactors being very good weapons breeders is irrelevant if the countries that are using them are already nuclear powers like the US, France, UK, etc.
 
he (a nuclear scientist) said the way it was done produced an enormous amount of energy for a split second that almost instantly destroyed the equipment
I just wrote a rant about this but it's too long so I've spoilered it. Yes, achieving ignition doesn't mean ICF is anywhere close to practical power production, but nonetheless it is a significant milestone. Without it, inertial confinement is just an interesting way to experiment with high density plasma physics. With it, it might actually be worth pursuing for energy. That is what ignition means.
A chemist, actually, not that that matters really because as soon as anything vaguely interesting happens everyone and their dog presents themselves as (somehow) an expert on plasma physics, condensed matter physics, HEDP or whatever the topic du jour is. "Never mind doing years or decades of research on this specific subfield, I've taken a look over things in my spare time over the past few months or days, and that makes me an authority, trust me." The fact is that it's already widely accepted that inertial confinement, despite achieving Q > 1 first (and this is in fact a significant milestone), is expected to be decades further from engineering breakeven or actual practical energy production.

On the matter of "destroying" the equipment? No. Just no. While there are challenges to operating a facility designed for 1.8 MJ at 2.05 MJ instead, and there is damage that has to be repaired, the people working there are not idiots. They know that just because their research results are useful for nuclear weapons maintenance doesn't mean that they have unlimited funding, and they take steps to reduce the damage so that it's sustainable within their budget. I'm sure someone can find some slideshow out there to explain exactly what they've done, probably even unclassified, but a simple fact is that they cannot run tests at the frequency that they do if they destroy the equipment every few tests. I also imagine that a propensity for destroying very expensive laboratory equipment is generally seen as a negative for continuing employment. Professional scientists tend to avoid doing so at their jobs for both of those reasons. Similarly, if they say they plan on achieving Q = 10-20 at the same facility around 2030 (still quite far from QE = 1 for ICF), that means they're hoping to work out a way of handling 60 MJ returned without instantly destroying the equipment before they run those tests. Certainly the media hype does no favours when the earliest demonstration reactors will be mid 2030s even assuming everything goes well, and commercial reactors decades after that, but criticisms based on patently erroneous claims or obvious strawmen (I did track down the video, but couldn't really get past the assumption that a commercial plant would target the same energy per shot) are just as facile.

Ignition is not a significant milestone because we're suddenly and unexpectedly close to practical realisation of IFE; it actually comes about a decade late, since the hope was for it to be achieved when NIF first reached full power in 2012. Ignition is what is required to even seriously consider inertial confinement as a candidate for practical fusion energy. It's when those fusion experiments could stop being just an interesting way to study hot dense plasmas under conditions like those where fusion happens (and maybe occasionally produce spinoff technologies like EUV lithography) and start being something that could actually be pursued as a practical energy option, if one so chooses.

As the National Academy of Sciences put it in their 2013 report:
The appropriate time for the establishment of a national, coordinated, broad-based inertial fusion energy program within DOE is when ignition is achieved.
Emphasis mine
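To put concrete numbers on why Q > 1 at the target is still nowhere near engineering breakeven, using the widely reported figures for that December 2022 shot (the ~300 MJ wall-plug draw in particular is an approximate, reported value, not an official spec):
Code:
# Widely reported figures for the Dec 2022 NIF shot; the wall-plug number is approximate.
laser_to_target_mj = 2.05   # laser energy delivered to the target
fusion_yield_mj = 3.15      # fusion energy released
wall_plug_mj = 300.0        # rough electricity drawn to fire the lasers, as reported

print(f"Target gain: {fusion_yield_mj / laser_to_target_mj:.2f}")    # ~1.5, i.e. ignition
print(f"Facility-level gain: {fusion_yield_mj / wall_plug_mj:.3f}")  # ~0.01, nowhere near breakeven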

It is just that Thorium reactors can't be used to breed weapons-grade fission materials.
Common misconception! U-233 is actually an excellent weapons material, just not sufficiently better than plutonium, which is also excellent, to be worth paying twice for R&D to get essentially the same thing. LRL indicated that if they conquered the technical challenges of the Thorium cycle first, they would have no interest in switching to plutonium either. Thanks to centrifuges though, high enrichment U-235 is probably the clear material of choice for any aspiring weapon states despite the overall less desirable characteristics.

But really, there is no such thing as a "thorium reactor". Light water reactors admittedly might have issues with neutron economy depending on the specific mix, but by and large current generation reactors only need minor adjustments whether you want to run them on some MOX or just natural or enriched uranium. And even if alterations were required, the advantages and disadvantages are still largely going to be inherited from the overall reactor type rather than a matter of the specific fuel cycle. A U-235 MSR will perform fairly similarly to a U-233 one, as will LWRs, HWRs or gas or liquid metal cooled reactors. There are advantages and disadvantages to the various fuel cycles of course, but it just isn't nearly enough to switch in most cases. Abundance is a factor, but for the next century or so (i.e. more than the service life of a new reactor) it's local abundance (resource independence and geopolitics) that drives that decision and not the "4 times more abundant in the crust" people keep quoting. We don't extract minerals we actually want to use from random points in the crust; we do so from ores that have higher concentrations than random rocks because that makes it cheaper, so today it's about as relevant as how much uranium there is in sea water.

air cooled radiators are sufficient, as radiative cooling efficiency increases exponentially with radiator temperature, which is another massive reduction to capex.
Less costly than a dry cooling setup for a lower working temperature maybe, but in no world are you able to decrease capex by switching to a less efficient working fluid.

World Nuclear Association says:
Both types of dry cooling involve greater cost for the cooling set-up and are much less efficient than wet cooling towers

The International Atomic Energy Agency says:
However, dry cooling systems are more costly than comparable wet systems
and
Compared to wet cooling towers, however, dry cooling towers involve higher operating costs, require more electricity, occupy a larger footprint, have lower performance, and entail higher capital costs
and these are both organisations whose job (promoting nuclear power) it would be, if this were an actual way to make nuclear reactors cheaper, to shout "Did You Know We Can Make Cheaper Reactors If We Do This" from the rooftops. Just in case there's anyone imagining those documents might only apply to regular temperature reactors, Yan et al.'s 2014 "Evaluation of GTHTR300A nuclear power plant design with dry cooling" specifically covers a high temperature gas cooled reactor:
As a result, the overnight construction cost of an air-cooled plant of the same 600 MWt power rating as the baseline would increase by 11.1%

Did you know that oil refineries burn as much as 25% of the oil they refine, just to heat the fractional distillation tower to 400C?
Gu et al.'s 2015 "Energy Optimization for a Multistage Crude Oil Distillation Process" seems to be the most commonly cited in literature on a quick look, and they put it at 1-2%. Even with the "as much as" hedge, an order of magnitude is an excessive discrepancy.

Plus, distillation is only like 30 to 40% of the total energy consumption of refining, so really, if it were true, yields would be something like 40%, and adding 25% back to that would make for a 1.62× increase (65/40) instead of 1.33× (100/75), which I imagine would have made people want to run oil over the hot radioactive rocks at current temperatures even if it meant they couldn't do anything else with it at all.
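Spelling that arithmetic out, using the rounded figures from the paragraph above (the 40% distillation share is the top of the cited range):
Code:
# Re-running the arithmetic above, per 100 units of crude entering the refinery.
claimed_distillation_burn = 25.0   # the "25% of the oil" claim being questioned
distillation_share = 0.40          # distillation as a share of total refinery energy use

total_burn = claimed_distillation_burn / distillation_share   # 62.5 units if the claim held
yield_full_accounting = 100.0 - total_burn                    # 37.5, rounded above to "something like 40%"

# Free distillation heat would hand those 25 units back as product:
print((40.0 + 25.0) / 40.0)   # 1.625, the "1.62x increase (65/40)"
print((75.0 + 25.0) / 75.0)   # 1.333, the "1.33x (100/75)" comparison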

Fun fact: plausible grid scenarios in the US that feature high nuclear capacity typically also have higher solar and diurnal storage capacity, much of it of >6h duration. This is because pretty much the only way to make nuclear anywhere near cost competitive seems to be to cripple interconnection, which is devastating for wind power but has less effect on solar, which still ends up being cheaper than trying to run a nuclear power plant like a peaker. Would be fun to run a few more sensitivity cases on that one.
 
There are problems with using Li tech to build batteries in that it eventually breaks down. Nickel-iron tech, while much less efficient, has one great advantage: it can be used for a really long time, centuries. Original nickel-iron cells are still in use, because as long as you feed them uncontaminated water and potassium hydroxide they just keep running. 60 percent efficiency and high weight per unit of stored energy are negatives, but for building a small vehicle that will run to town and back, I can easily envision a vehicle and local energy storage that lasts a lifetime. That was one of the problems with the tech: properly built cells never needed to be replaced. Only a few places build them nowadays; you can buy them from China, but their production in the USA ceased in about 1975, when the Exide battery company, which had purchased the tech in 1972, stopped making them. I.e., in order to have them last a lifetime, nickel shells rather than plastic components would be necessary.
 
The circle begins. The wheel is spinning. Again.

It's time to stock up on popcorn.

It's going to be fun, I guess.
 
Star Trek and Star Wars both exist. They didn't show up in the modern adaptations because of rewrites and licensing issues, but the original TTRPG parodied them with dueling poser gangs.
 
Star Trek and Star Wars both exist. They didn't show up in the modern adaptations because of rewrites and licensing issues, but the original TTRPG parodied them with dueling poser gangs.
Ah, well. I have confidence in Taylor's ability to write something (trans-humanist?) which upsets people. I strongly suspect there's things from Earth Bet (or Earth Aleph), not found in CP Earth, that Taylor knows of that she could... be inspired by. And, I'm not talking about things derived from 'Twig' (or 'Pact') either.
 
Taylor writes a romance wherein there is much hand-holding and there are walks on beaches; everyone is appalled and there is much pearl-clutching. It is so different from the norm that it gains a following. ;-)
 
So one thing I'm wondering about... Is Hiro someone that we know from 2077 in a sidequest or something? Or is he just a random character that Spira liked well enough to keep using and developing?
 