By Kathleen A. Rosser
Researchers both inside and outside the established physics community are currently questioning the theory of General Relativity (GR) for a number of reasons. The present review article is intended to catalogue some of these objections and lend perspective on their possible validity. It is hoped this effort will help reduce the growing confusion that has permeated the literature at all levels, from strict peer-reviewed journals, to publications with little or no peer review, technical books, educational websites, physics forums, and unpublished communications. Also proposed here is the hypothesis that incompleteness is the most critical flaw in the current general relativistic formalism, along with the conjecture that for some physical systems, GR offers no independent information about such observables as gravitational redshift and time dilation.
A list of the common reasons for refuting GR is presented below. These topics will be discussed in detail in later sections.
1) Galactic rotation curve (dark matter): Many physicists and astronomers believe that general relativity fails to explain the unexpectedly rapid orbital motion of the outer regions of galaxies except through the introduction of dark matter, a supposed non-radiating transparent material that has never been directly observed astronomically, nor verified to exist in particle accelerators, despite over half a century of searching.
2) Cosmic acceleration (dark energy): GR does not explain the apparent increasing expansion rate of the universe without the reintroduction of Einstein’s abandoned cosmological constant Λ, which must be fine-tuned in a seemingly improbable way, or the postulation of some form of phantom pressure called dark energy.
3) Incompleteness: Einstein’s field equations are possibly incomplete in that the gravitational mass-energy density ρ(xμ), which presumably comprises the source of the field, does not uniquely determine the metric, or equivalently, does not fully determine the geometry of spacetime, unless one selects an often ad hoc equation of state. Thus ρ(xμ) does not define such observables as time dilation, redshift, and certain properties of motion, except in special cases, which points to an inconsistency in the theory.
4) Speed of gravity: GR predicts that gravitational effects travel at the speed of light. However, many independent researchers, as well as mainstream modified gravity theorists, postulate that the effects of gravity travel at higher or lower speeds.
5) Time dilation: Some researchers deny that time dilation, as predicted by GR, actually exists, asserting that redshift, which is often cited as proof of time dilation, is due to other causes such as motion of the photon through a potential.
6) Spacetime curvature: Some theorists doubt that the curvature of spacetime is the cause of gravitational effects, or even that 4-dimensional spacetime itself has physical meaning.
7) Energy: GR does not offer a definition of the localized energy of the field, which some researchers consider a flaw in the theory.
8) The singularity problem: The GR formalism leads to coordinate singularities as well as to real singularities in the mass density. Yet the formalism is believed to break down at singularities, pointing to a contradiction.
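The rotation-curve tension in item 1 can be made concrete with a few lines of arithmetic. Below is a minimal sketch assuming a simple point-mass Newtonian model and an illustrative, roughly Milky-Way-like luminous mass; all numerical values are assumptions for illustration, not fits to data.

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30       # solar mass, kg
KPC = 3.086e19         # kiloparsec in meters

# Hypothetical luminous mass enclosed within the inner galaxy
# (illustrative, roughly Milky-Way-like)
M_enclosed = 1.0e11 * M_SUN

def keplerian_speed(r_kpc):
    """Newtonian circular speed (km/s) if all mass sits inside radius r."""
    r = r_kpc * KPC
    return math.sqrt(G * M_enclosed / r) / 1000.0

for r_kpc in (5, 10, 20, 40):
    print(f"r = {r_kpc:2d} kpc: predicted {keplerian_speed(r_kpc):6.1f} km/s "
          f"vs observed ~220 km/s (roughly flat)")
```

The Newtonian prediction falls off as 1/√r, while measured curves for spiral galaxies stay roughly flat at large radii; dark matter is the mainstream reconciliation referred to in item 1.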
II. PERSPECTIVES ON THE GENERAL RELATIVISTIC FORMALISM
General relativity, due to the subtlety and complexity of the mathematics, may rival only quantum mechanics as one of the most confusing theories ever developed. . . . READ MORE [14 pages]>>
Diffusion Gravity: An Heuristic Approach
By DH Fulton
The evidence from quantum vacuum research suggests that virtual particles may play a much larger role in gravity than previously attributed. Far from being an empty “stage” or blank slate, the vacuum is an active agent in the transmission of energy (photons), and therefore should also serve as the active medium through which gravity works. The model presented here integrates key concepts of an active quantum vacuum and the fundamental physical process of mass diffusion to provide a prime motivator of gravity as well as the key mechanism for the gravitational force.
Fundamental forces of nature work from differences and the tendency of nature to level those differences. The most studied of these is heat and thermodynamics, for the obvious reason of employing and extracting useful work from those differences (the laws of thermodynamics) to power modern society. Entropy, i.e. the second law, can even be construed as a force, as proposed by Erik Verlinde in motivating his alternative theory of emergent gravity. In a similar way, the laws of mechanics employ another fundamental “leveling” force of nature: mass diffusion. The laws governing this force are attributed to Fick, who developed the formalism for how concentration or mass differences can drive processes to do useful work. Some previous research on gravity as it might integrate with diffusion models was published by Britten in “A Gravitational Diffusion Model without Dark Matter” (1997); however, that model proposed coefficients of diffusion primarily for long range, as an alternative to dark matter at galactic scale.
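The leveling behavior that Fick's laws describe can be sketched numerically. The following is a minimal finite-difference illustration of Fick's second law, dC/dt = D d²C/dx², showing diffusion erasing an initial concentration step; the diffusion coefficient, grid, and step counts are all assumed values chosen only for illustration.

```python
# Illustrative finite-difference sketch of Fick's second law:
# diffusion levels an initial concentration difference.
D = 1.0e-9              # diffusion coefficient, m^2/s (liquid-like, assumed)
dx = 1.0e-3             # grid spacing, m
dt = 0.2 * dx**2 / D    # explicit-Euler step, inside the stability bound

# Initial step profile: high concentration on the left half, low on the right
C = [1.0] * 10 + [0.0] * 10

def step(C):
    """One explicit Euler step with reflective (no-flux) boundaries."""
    n = len(C)
    out = []
    for i in range(n):
        left = C[i - 1] if i > 0 else C[i]
        right = C[i + 1] if i < n - 1 else C[i]
        out.append(C[i] + D * dt / dx**2 * (left - 2 * C[i] + right))
    return out

for _ in range(20000):
    C = step(C)

# The difference has leveled out: every cell approaches the mean, 0.5,
# while total mass is conserved along the way.
print(f"min = {min(C):.3f}, max = {max(C):.3f}")
```

The reflective boundaries make the total concentration a conserved quantity, so the only steady state is the uniform average, which is the "leveling of differences" the paragraph describes.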
Retracing to Newton’s laws, the prime mover for gravity is mass, but there is no mechanism to explain the forces described. As he originally formulated those laws, Newton admitted that he had “no hypothesis” as to the causality, or how the aether might mediate the force. Einstein, in special relativity, subsequently denied that the vacuum was anything other than a void, with his famous statement: “The introduction of a ‘luminiferous ether’ will prove to be superfluous.” But research and experiment have since shown that the vacuum is an active medium, and that quantum fluctuations and virtual particles (VPs) are the processes and players within it. More recently, advanced research on the mechanisms of light propagation has postulated that virtual particles, i.e., virtual fermions, can in fact be the transmitting agents of photons, that is, a mechanism for the transmission of electromagnetic energy through the vacuum. This work suggests that the quantum vacuum and virtual particles are nature’s completely efficient means of transmitting energy, and could likewise be the most efficient means of transmitting gravity. The model proposed herein employs this concept: that virtual particles can be the transmitting agents of gravity through the vacuum, without themselves transmitting any matter or consuming any energy. The difference is important; the medium of the vacuum and its virtual particles can act as carriers of the direction and density of the source mass object, which are the essential information for gravity to affect a destination mass object.
Diffusion from the source mass through the vacuum provides the motive power that propels the virtual particles, with their corresponding information, toward the destination or affected mass. There, the virtual particle flux from the source mass interacts with the destination’s virtual particle flux to generate a net diffusion of direction and density, which manifests as a gravitational force between source and destination. . . . READ MORE AT VIXRA>>
by Vasily Yanchilin
“A fascinating new approach to gravity and cosmology”
—Karl Pomeroy, QI
Posted Quemado Institute
April 12, 2018
This film is about The Quantum Theory of Gravitation, which is based on three simple formulas: for the speed of light, Planck’s constant, and an electron’s mass. These formulas were discovered by Russian scientist Vasily Yanchilin. For more on The Quantum Theory of Gravitation, see Vasily Yanchilin’s recent articles:
The Experiment with a Laser to Refute General Relativity
By Vasily Yanchilin
By Kathleen A. Rosser
A static cosmological metric is derived that accounts for observed cosmic redshift without the requirement for an expanding universe. The metric is interpreted in such a way as to predict a universal potential that accounts for the anomalous acceleration of outlying stars of spiral galaxies (the galactic rotation curve), obviating the need for dark matter or modifications to general relativity.
The Big Bang (BB) theory has been questioned over its 85-year history for a number of fundamental reasons. That all the matter in the universe was created instantly at a single point some finite time in the past defies intuition as well as established conservation laws. That the dynamic general relativistic Friedmann-Lemaitre-Robertson-Walker (FLRW) metric, which forms the basis of the BB theory, is orthogonal in the space and time coordinates raises questions about its mathematical validity, a concern expressed by Friedmann himself in his original paper of 1922 [1]. That Hubble deep-field images displaying thousands of distant galaxies reveal no apparent galactic structural evolution seems to suggest the universe is much older than the BB theory purports. Well-established patterns of stellar evolution indicate some stars predate the Big Bang by billions of years, pointing to a flaw in the theory. Questions may also be raised about the superluminal expansion predicted by the FLRW metric, implying possible violations of locally observed special relativity. That Einstein himself believed singularities cannot exist casts doubt on the singular origin of matter. These objections are foundational.
It is also unsatisfying that the BB theory fails to predict more recent observations, such as the apparent increase of expansion rate known as cosmic acceleration [2,3,4], currently explained by reintroducing Einstein’s abandoned cosmological constant Λ and/or by postulating an ad hoc mysterious quantity called dark energy. The BB hypothesis also does not account for early-epoch cosmic inflation [5], nor does it accommodate the unexpectedly large rotation velocities of outlying stars in spiral galaxies, whose anomalous centripetal acceleration is often attributed to an unidentified substance called dark matter. A further defect in the theory is the premise of homogeneity. The universe in fact manifests large-scale inhomogeneities in the form of clusters, walls and voids. Averaging over these density variations has introduced difficulties of its own, a problem known as back-reaction [6].
Generally, astrophysicists as a community acknowledge these problems inherent in the BB hypothesis [7,8,9]. Scores of papers are published annually in peer-reviewed journals such as Physical Review D exploring possible solutions to these problems [10,11,12]. Nevertheless, the FLRW metric of the expanding universe remains the basis for modern cosmology, and is widely accepted among physicists as a valid model. This standard theory of cosmology has been dubbed the Lambda-Cold-Dark-Matter (ΛCDM) model due to its reliance on Einstein’s cosmological constant to account for cosmic acceleration, and its appropriation of cold dark matter to explain the galactic rotation curve.
The ΛCDM model is the most comprehensive gravitational model of the universe currently known, and has a distinct appeal in that it preserves general relativity (GR) in its original form. Numerous theories of modified gravity, including bimetric, bigravity or massive gravity theories [13,14], scalar-tensor theories (e.g. the Brans-Dicke theory [15]), tensor-vector-scalar (TeVeS) theories [16], vector-tensor theories [17], modified Newtonian dynamics (MOND) [18], and f(R) gravity theories that modify Einstein’s field equations (EFE) [19,20,21,22], have been investigated, but so far none has proven sufficiently compelling to replace GR as the prevailing theory of gravity. Many such variations suffer problems in the limit of solar system scales, where, to agree with observation, they must predict the same results as GR, giving rise to complicated schemes such as the chameleon mechanism [23,24,25] and Galileon fields [26]. Many of these theories fail the test of Occam’s Razor.
The present article discusses the results of an investigation into an alternate description of the universe. This description obviates the need to modify general relativity, while also eliminating any requirement for dark matter. It accounts in a natural way for the redshift-distance relation, and postulates a simple metric explanation for the galactic rotation curve. . . . MORE>> View PDF (17 pages)
Anomalous Magnetic Moment and the g-Factor
One of the most curious properties of elementary particles is their anomalous magnetic moment. This is calculated from what is called the “g-factor”, which is based on the ratio of the magnetic moment due to the spinning particle’s charge, and the angular momentum due to its mass. For classical objects, such as planets, g = 1. But for quantum objects such as electrons, protons, neutrons, muons, mesons and quarks, g can in principle have any value. What this means is, it seems as if the charge of the particle is spinning at a different rate from its mass.
For the electron, g is almost exactly 2, naively implying the electron’s charge is rotating twice as fast as its mass. This would seem to mean the electron has internal structure. But accelerator measurements of its cross section indicate it has a radius of zero. Or so they say. But in fact, the Standard Model of particle physics, which comprises a disjointed collection of calculational techniques and equations, does not really offer a picture of the electron’s structure, if it has one, nor a definite way of predicting its properties. Indeed, the g-factor is an ongoing area of research into physics beyond the Standard Model.
Several papers in Physical Review D, the preeminent particle physics journal, appear each month on the subject of the g-factor and anomalous magnetic moment, which is usually defined as (g-2)/2. This has become a curious subject because the sum total of human knowledge cannot yet explain it.
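The definition a = (g−2)/2 can be checked with a few lines of arithmetic. A minimal sketch, using rounded published values for the electron g-factor and the fine-structure constant (quoted here from standard reference tables, not from this article):

```python
import math

# Measured magnitude of the electron g-factor (rounded reference value)
g_electron = 2.00231930436

# Anomalous magnetic moment as defined above: a = (g - 2) / 2
a_electron = (g_electron - 2) / 2
print(f"a_e = {a_electron:.11f}")            # ~0.00115965218

# Leading-order QED correction (the Schwinger term), a = alpha / (2 pi),
# which already accounts for nearly all of the measured anomaly
alpha = 1 / 137.035999                       # fine-structure constant
a_schwinger = alpha / (2 * math.pi)
print(f"alpha/(2 pi) = {a_schwinger:.11f}")  # ~0.00116141
```

The small residual between the two numbers is what the higher-order Standard Model calculations, and the searches for physics beyond them, are chasing.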
The g-factor is the most critical parameter distinguishing one particle from another. Some of the g-factors are (approximately) 1 for the photon, 0 for the mesons, 2 for the electron and muon, about 5.6 for the proton, and about minus 3.8 for the neutron. Quarks also have g-factors, but these are known only vaguely. Why the g-factor is more critical than, say, other distinguishing properties such as charge, spin (quantized angular momentum), and mass, is that the g-factor plays a profound mathematical role at the very foundations of theory.
This can be understood as follows. The particles that physicists most often deal with are the photon (a quantum of electromagnetic radiation with spin 1), the meson (particles of charge +, −, or 0 and spin 0), the proton (a positively charged particle that occupies the nucleus of the atom and has a spin of 1/2), the electron (a light negatively charged particle that surrounds the nucleus like a cloud and has a spin of 1/2), the neutron (a neutral spin-1/2 particle that also occupies the nuclei of atoms), and the quarks (particles that form the interiors of protons, neutrons and mesons, and have charges of −1/3 or +2/3). Oddly, all of these particles are described by different equations, equations that have very different structures and appearances and seem irreconcilable. These include the Dirac Equation, which is a relativistic generalization of the Schroedinger equation familiar from quantum mechanics and which describes the electron and its antiparticle, the positron; the Klein-Gordon Equation, another completely different relativistic generalization of the Schroedinger equation that applies to spin-0 mesons; and Maxwell’s Equations, which describe the photon. There are others.
It turns out, however, that all of these equations can be combined into a single General Particle Equation that takes the g-factor as a parameter. When the g-factor is inserted into the General Particle Equation and the expression simplified by ordinary algebraic methods, the result is either the Dirac Equation, the Klein-Gordon Equation, or Maxwell’s Equations, depending on the value of g. So, for example, if you plug g=2 into the General Particle Equation and simplify, the result is the Dirac Equation, i.e., the equation for the electron.
The interest here is not so much the mathematics, as that the g-factor is so fundamental, it makes a particle what it is: an electron, a photon, a proton, and so on. So even though a photon seems very distinct from an electron, the two are fundamentally the same phenomenon except for their values of g. This very property is at the forefront of particle physics research today. Its nature surpasses the limits of our present understanding, and enormous work is going into its precise calculation and the theories that should, physicists hope, predict its value.