
Phil

@philphi.bsky.social

Explainable AI, Evolutionary AI, PHILosophy. "Philo" φίλος which means "loving" or "friend". D[R S] ≠ 0

80 Followers  |  64 Following  |  384 Posts  |  Joined: 28.11.2024  |  2.1621

Latest posts by philphi.bsky.social on Bluesky

NGD is better than SGD because it respects the underlying geometry. Even without full sPNP verification, the Fisher Info geometry framework can revolutionize variational quantum algorithms using natural gradients, quantum machine learning with geometric priors, and hybrid classical-quantum optimization.

13.08.2025 04:03 — 👍 0    🔁 0    💬 0    📌 0

FS/QFIM NGD enables parameter updates along the quantum curvature. The geometry is the most information-efficient way to move in parameter space, so every update is maximally effective by the laws of quantum mechanics. As compute grows, simulating FS/QFIM becomes tractable for large systems, assuming no noise.

13.08.2025 03:48 — 👍 0    🔁 0    💬 1    📌 0

FS is the intrinsic geometric form of Fisher Information for pure quantum states. Fisher Information is measurement-dependent sensitivity; FS/QFIM is measurement-independent maximal sensitivity, baked into the geometry. The best measurement for distinguishing nearby states is the one that saturates the Quantum Fisher Information.

13.08.2025 03:36 — 👍 0    🔁 0    💬 1    📌 0
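A minimal numerical sketch of the pure-state QFIM claim above. The single-parameter qubit family is my own toy example (not from the posts); it uses the standard formula F = 4(⟨∂ψ|∂ψ⟩ − |⟨ψ|∂ψ⟩|²) with the derivative taken by central differences.

```python
import numpy as np

def qfim_pure(psi_fn, theta, eps=1e-6):
    """Quantum Fisher Information for a pure state |psi(theta)>,
    F = 4 * (<dpsi|dpsi> - |<psi|dpsi>|^2), via central differences."""
    psi = psi_fn(theta)
    dpsi = (psi_fn(theta + eps) - psi_fn(theta - eps)) / (2 * eps)
    overlap = np.vdot(psi, dpsi)
    return 4 * (np.vdot(dpsi, dpsi).real - abs(overlap) ** 2)

# Toy example state: a qubit rotated about the y-axis.
def qubit(theta):
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

F = qfim_pure(qubit, theta=0.7)
print(F)  # analytically F = 1 for this family, independent of theta
```

For this family ⟨ψ|∂ψ⟩ = 0 and ⟨∂ψ|∂ψ⟩ = 1/4, so the QFIM is constant, which is why the check does not depend on theta.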

Quantum NGD is the future. The Fisher Information Metric (FIM) tells you how your data feels. The Fubini–Study (FS) metric tells you how reality bends in Hilbert space. sPNP is the pilot, FS (the pure-state QFIM) is the gold standard, and the FIM is the control translating geometry into actionable steering.

13.08.2025 03:31 — 👍 0    🔁 0    💬 1    📌 0

Mu could become more extreme the more we rely on AI. Or even a whole new level, for example, reports today of Sam Altman's new BCI.

13.08.2025 03:14 — 👍 1    🔁 0    💬 0    📌 0

What's more likely than near-term novelty and originality is that AI will be able to parse through so much data that it finds gaps and connections a single person has not. Novelty will come as AI improves and humans work with AI as a symbiotic tool.

11.08.2025 16:34 — 👍 1    🔁 0    💬 0    📌 0

3N-6 is a reduced version. You remove 3 translations, which set the position of the whole subsystem in space and don't affect internal dynamics, and 3 rotations, which set the orientation of the whole subsystem and likewise don't affect internal dynamics. Reducing the dimensionality makes compression easier.

09.08.2025 16:32 — 👍 0    🔁 0    💬 1    📌 0
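The 3N-6 reduction above can be checked numerically: for N = 4 particles the 6 interatomic distances are exactly the 3N-6 internal degrees of freedom, and a rigid motion (the 3 translations + 3 rotations being removed) leaves them unchanged. This is a generic illustration, not sPNP-specific.

```python
import numpy as np

rng = np.random.default_rng(0)

def internal_coordinates(X):
    """All N*(N-1)/2 interatomic distances. For N = 4 that is 6 numbers,
    exactly the 3N - 6 internal degrees of freedom."""
    N = len(X)
    return np.array([np.linalg.norm(X[i] - X[j])
                     for i in range(N) for j in range(i + 1, N)])

X = rng.normal(size=(4, 3))          # 4 particles, 3N = 12 coordinates

# Rigid motion: a random orthogonal transform (via QR) plus a translation.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
X_moved = X @ Q.T + rng.normal(size=3)

# The 6 removed DOF (3 translations + 3 rotations) leave internals unchanged:
unchanged = np.allclose(internal_coordinates(X), internal_coordinates(X_moved))
print(unchanged)  # True
```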

The quickest way to decode a closed quantum subsystem may be to reduce from 3N to 3N-6. I think scale is too important for spacetime, such as with projection kernels, so distance/size/scale should be kept instead of trying the full shape-space 3N-7.

09.08.2025 15:48 — 👍 1    🔁 0    💬 1    📌 0

Yeah, I think we just have to deal with the quantum probabilities until we can actually map a closed quantum sub-system of trajectories for determinism. Or even better, if we can decode the entire wave-functional, big Psi, we could uncover determinism, but this may be impossible from spacetime.

09.08.2025 15:37 — 👍 3    🔁 0    💬 1    📌 0

Spacetime is likely a 3N configuration space projection. Quantum is more natural in configuration space. For practical purposes, 3N-6 can reduce dimensions and still represent the spacetime shadow. The wavefunctional is the quantum reality; configuration space is a good explanation for non-locality.

09.08.2025 15:30 — 👍 1    🔁 0    💬 0    📌 0

The only source is the kernel term (and the normalization piece), so energy is not created ex nihilo; it is relocated from microscopic configuration-space energy into the imprint field S(x) through metric-dependent coupling encoded in K. The energy is stored in the curvature of R(X), and the expectation matches.

08.08.2025 21:46 — 👍 0    🔁 0    💬 0    📌 0

Where the energy comes from in sPNP: The metric dependence of K produces local terms proportional to ∇·(S * geodesic-flux). Physically, this is exactly energy transferred from the microscopic configuration-space degrees (their dynamics and geometry) into the coarse-grained imprint that gravitates.

08.08.2025 21:43 — 👍 0    🔁 0    💬 1    📌 0

In sPNP, there is no creation ex nihilo; rather the energy is a re-labelling of stored information/curvature in R[X]. When you project and coarse-grain, the effective spacetime energy density is the expectation of existing configuration-space energy (projection of global amplitude).

08.08.2025 18:39 — 👍 0    🔁 0    💬 0    📌 0

QMM worry: where does the extra gravitational energy come from when imprint energy accrues after formation of matter?
sPNP's cleaner bookkeeping: the quantum-potential term is already present in the configuration-space action (Jacobi–Fisher action); sPNP already contains the Hamiltonian that QMM needs.

08.08.2025 18:33 — 👍 0    🔁 0    💬 1    📌 0

Uniqueness of the Fisher–Rao metric (Cencov’s theorem): invariance under sufficient statistics (Markov maps), coordinate covariance, and local dependence on ρ collectively ensure there is exactly one non-trivial, scale-invariant, second-order metric built from ρ. The Fisher action is the inevitable scaffold for any first-principles derivation.

03.08.2025 02:05 — 👍 0    🔁 0    💬 0    📌 0
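For reference, the metric the theorem singles out is the standard Fisher–Rao metric (textbook definition, not specific to sPNP):

```latex
g_{ij}(\theta) \;=\; \int \rho(x \mid \theta)\,
  \partial_i \log \rho(x \mid \theta)\;
  \partial_j \log \rho(x \mid \theta)\, \mathrm{d}x
```

Cencov's theorem says this is, up to an overall constant, the only Riemannian metric on the space of probability distributions that is monotone under Markov maps, i.e. invariant under sufficient statistics.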

Kirk dies in both instances, unless you can beam Kirk intact. If you reuse Kirk's atoms, it is basically still Kirk, but atoms can't move at the speed of light. But if you use completely new atoms, it is a Kirk clone. If you have the information on all of Kirk's atoms, you can clone Kirk many times.

02.08.2025 15:27 — 👍 1    🔁 0    💬 0    📌 0

SGD = steepest descent in parameter space
NGD = steepest descent in distribution space

Is exact NGD the singularity? Full NGD requires inverting the FIM. Efficient approximations (K-FAC, diagonal FIM, blockwise inverses) are not the same as full NGD, which is O(n³) for n parameters. With it, AI models can better model the world.

02.08.2025 15:09 — 👍 0    🔁 0    💬 0    📌 0
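The SGD vs NGD contrast in the post above can be sketched in a few lines. The ill-conditioned diagonal matrix standing in for the FIM is my own toy setup; the linear solve is the O(n³) step the post refers to.

```python
import numpy as np

def natural_gradient_step(grad, fisher, lr=0.1, damping=1e-3):
    """One exact NGD step: solve (F + damping*I) d = grad. The solve is the
    O(n^3) cost of full NGD; SGD's update is just lr * grad, which is O(n)."""
    n = len(grad)
    return lr * np.linalg.solve(fisher + damping * np.eye(n), grad)

# Toy quadratic loss 0.5 * theta^T F theta with an ill-conditioned FIM stand-in.
F = np.diag([100.0, 1.0])            # curvature differs 100x across directions
theta = np.array([1.0, 1.0])
grad = F @ theta

sgd_step = 0.1 * grad                         # steepest descent in parameter space
ngd_step = natural_gradient_step(grad, F)     # steepest descent in distribution space

print(sgd_step)   # [10.0, 0.1]: skewed toward the stiff direction
print(ngd_step)   # ~[0.1, 0.1]: equal progress along both directions
```

Preconditioning by the inverse metric is what makes NGD invariant to reparameterization: the step is measured in distribution space, not in raw parameter coordinates.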

Fisher Scaling: sPNP can be applied to AI models. Natural gradients are induced by Fisher Info. NGD is invariant to reparameterization and converges faster in terms of iterations. SGD is computationally cheap; NGD is more advanced and becomes practical as more compute is available for inverting the Fisher matrix.

02.08.2025 15:03 — 👍 0    🔁 0    💬 1    📌 0

sPNP: R = |Ψ| endows a true Riemannian metric G_IJ ∝ ∂_I R ∂_J R + (Laplace–Beltrami complement). Acting on R with this metric produces the quantum potential Q(X) and nonzero curvature. Particles then move along the curved geodesics of G_IJ, and those geodesic deviations are physical interactions.

02.08.2025 04:21 — 👍 1    🔁 0    💬 0    📌 0

Distinctions declare that two configurations are different; an informational interaction. Such interactions generate curvature in the metric G_IJ. With the non-zero curvature, particles follow geodesics that encode interaction. Thus, interaction is the phenomenological consequence of distinctions.

01.08.2025 20:42 — 👍 0    🔁 0    💬 1    📌 0

sPNP ontology: Distinctions underpin the geometry of the universe; its curvature, metric structure, and its dynamics. This formative "active information" (Bohm's implicate order) is encoded in the amplitude R = |Ψ| of the universal wavefunctional; precisely quantified via the Fisher Information Metric.

28.07.2025 16:07 — 👍 0    🔁 0    💬 1    📌 0

Sure is interesting. Basically, tailoring a model for a specific job means fewer hallucinations.

28.07.2025 14:47 — 👍 1    🔁 0    💬 0    📌 0

It will become extremely easy to make a very potent model from open weights. Reinforcement fine-tuning, RAG, Test-Time Training, and RSI are likely to be accessible to anyone, which means an advanced malicious model within 5 years. Blue teams will not only have to know cybersecurity but also be experts in models.

26.07.2025 14:32 — 👍 1    🔁 0    💬 0    📌 0

Multi-particle Entanglement: Test whether Fisher curvature in configuration space affects entangled particle dynamics as sPNP predicts.
Spatial resolution and curvature sensitivity: detect second derivatives of the amplitude with enough precision to see deviations from standard quantum mechanics.

26.07.2025 03:25 — 👍 1    🔁 0    💬 1    📌 0

Promising experiments to test R and find curvature in sPNP: Controlled Amplitude Engineering, preparing specific R-profiles in atomic systems and measuring the resulting particle dynamics.
Interferometric Curvature Detection: use weak measurements to map amplitude curvature and correlate it with trajectory deviations.

26.07.2025 03:22 — 👍 0    🔁 0    💬 1    📌 0
Gemini - FIM, Hessian, and SGD Overlap

Even the AI models can understand how sPNP works

gemini.google.com/share/6f08c0...

25.07.2025 18:44 — 👍 0    🔁 0    💬 1    📌 0

Physicists may be hesitant about sPNP because the geometry of the wave function is described using the mathematical tools of Information Geometry. But really, sPNP posits that the wave function itself has a dynamic geometry: G_IJ is constructed from R in |Ψ|, and Q(X) and curvature arise from acting on R.

23.07.2025 18:22 — 👍 0    🔁 0    💬 1    📌 0

Fundamental Fisher coupling, λ_F, whose non-Gaussian UV fixed point at ∼10⁻³ guarantees self-consistency and suffices for all microphysical phenomena. sPNP preserves parsimony: in any minimal truncation you introduce only this one new fundamental coupling, with all other parameters slaved to its RG-flow.

23.07.2025 15:24 — 👍 0    🔁 0    💬 0    📌 0

Curvature is derived from the Jacobi-Fisher metric. The Hessian form of the metric arises from the theory's fiber bundle structure, and from it various curvature-based operators can be constructed. The more familiar Outer Product of Gradients is then understood as a powerful effective representation.

23.07.2025 15:24 — 👍 0    🔁 0    💬 1    📌 0
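The Outer Product of Gradients mentioned above can be illustrated with a toy logistic model of my own (nothing sPNP-specific): the empirical Fisher is the sample average of g gᵀ over per-sample log-likelihood gradients, and at the data-generating parameters it matches the exact conditional FIM up to sampling noise.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy logistic model: inputs X, weights w, labels y drawn from the model itself.
X = rng.normal(size=(2000, 3))
w = np.array([0.5, -1.0, 2.0])
p = 1.0 / (1.0 + np.exp(-X @ w))
y = (rng.random(2000) < p).astype(float)

# Outer Product of Gradients: empirical Fisher F = mean of g g^T over samples,
# where g = (y - p) * x is the per-sample gradient of log p(y | x, w).
g = (y - p)[:, None] * X
F_opg = g.T @ g / len(X)

# Exact conditional FIM averaged over the same inputs: E[p(1-p) x x^T].
F_exact = (X * (p * (1 - p))[:, None]).T @ X / len(X)
err = np.max(np.abs(F_opg - F_exact))
print(err < 0.2)  # True: OPG agrees with the exact FIM up to sampling noise
```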

More can be described through specific formulas, for example a separate section on the Projection Kernel K(X,x) = (4πQ₀²)^(3N/2) e^(-D²(X,x)/4Q₀²) [1+O(Q₀²RF)]. Plus, a lot more work has been done since I wrote this pdf. My Whitewind blogs and BlueSky posts can supplement the heavy LaTeX math in the notes.

23.07.2025 04:34 — 👍 0    🔁 0    💬 1    📌 0
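The leading order of the projection kernel formula in the post above can be evaluated directly. The distance D is assumed Euclidean here purely for illustration (the post does not define it), and the [1 + O(Q₀²RF)] correction is dropped.

```python
import numpy as np

def projection_kernel(X, x, Q0, N):
    """Leading-order kernel from the post:
    K(X, x) = (4 pi Q0^2)^(3N/2) * exp(-D^2(X, x) / (4 Q0^2)),
    with the O(Q0^2 R_F) correction dropped and D taken as the
    Euclidean distance (an illustrative assumption)."""
    D2 = float(np.sum((np.asarray(X) - np.asarray(x)) ** 2))
    return (4 * np.pi * Q0**2) ** (3 * N / 2) * np.exp(-D2 / (4 * Q0**2))

# The kernel peaks at D = 0 and falls off on the scale Q0:
k0 = projection_kernel(np.zeros(3), np.zeros(3), Q0=1.0, N=1)
k1 = projection_kernel(np.ones(3), np.zeros(3), Q0=1.0, N=1)
print(k0 > k1)  # True
```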
