[557] **viXra:1701.0693 [pdf]**
*submitted on 2017-01-31 17:25:13*

**Authors:** Dongchan Lee

**Comments:** 34 Pages.

This is part 3 of a series on math education stagnation in the developed countries. Similar to the math score stagnation seen in PISA and TIMSS 2015, and to their overall time-series score growth patterns from 1995 or 2000 through 2015, the NAEP math scores for grades 4 and 8 in the USA appear to follow similar patterns, although with time lags of 1-3 years behind the PISA or TIMSS math scores.

**Category:** Education and Didactics

[556] **viXra:1701.0692 [pdf]**
*submitted on 2017-01-31 17:37:51*

**Authors:** Dongchan Lee

**Comments:** 29 Pages.

This is part 2 of a series on math education stagnation in the developed countries. Similar to the math score stagnation seen in PISA and TIMSS 2015, and to their overall time-series score growth patterns from 1995 or 2000 through 2015, the NAEP math scores for grades 4 and 8 in most of the US states appear to follow similar patterns, although with time lags of 1-3 years behind the PISA or TIMSS math scores.

**Category:** Education and Didactics

[555] **viXra:1701.0691 [pdf]**
*submitted on 2017-01-31 18:17:02*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 1 Page.

For a star to form life in the general theory of stellar metamorphosis, the star needs to evolve over very long periods of time. If the star evolves too fast, life will not form. An explanation and a few examples are provided to illustrate this principle.

**Category:** Biochemistry

[554] **viXra:1701.0690 [pdf]**
*submitted on 2017-01-31 11:15:04*

**Authors:** George Rajna

**Comments:** 15 Pages.

The LHCb experiment has found hints of what could be a new piece of the jigsaw puzzle of the missing antimatter in our universe. [11] In a stringent test of a fundamental property of the standard model of particle physics, known as CPT symmetry, researchers from the RIKEN-led BASE collaboration at CERN have made the most precise measurements so far of the charge-to-mass ratio of protons and their antimatter counterparts, antiprotons. [10] The puzzle comes from experiments that aimed to determine how quarks, the building blocks of the proton, are arranged inside that particle. That information is locked inside a quantity that scientists refer to as the proton's electric form factor. The electric form factor describes the spatial distribution of the quarks inside the proton by mapping the charge that the quarks carry. [9] Taking into account the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Lattice QCD gives the same results as the diffraction patterns of the electromagnetic oscillators, explaining the color confinement and the asymptotic freedom of the Strong Interactions.

**Category:** High Energy Particle Physics

[553] **viXra:1701.0689 [pdf]**
*submitted on 2017-01-31 12:19:59*

**Authors:** Martin Lopez-Corredoira

**Comments:** 63 Pages. accepted to be published in "Foundations of Physics"

The main foundations of the standard $\Lambda$CDM model of cosmology are that: 1) The redshifts of the galaxies are due to the expansion of the Universe plus peculiar motions; 2) The cosmic microwave background radiation and its anisotropies derive from the high-energy primordial Universe, when matter and radiation became decoupled; 3) The abundance pattern of the light elements is explained in terms of primordial nucleosynthesis; and 4) The formation and evolution of galaxies can be explained only in terms of gravitation within an inflation + dark matter + dark energy scenario. Numerous tests have been carried out on these ideas and, although the standard model works pretty well in fitting many observations, there are also many data that present apparent caveats to be understood within it. In this paper, I offer a review of these tests and problems, as well as some examples of alternative models.

**Category:** Relativity and Cosmology

[552] **viXra:1701.0688 [pdf]**
*replaced on 2017-02-03 22:45:38*

**Authors:** Brent Jarvis

**Comments:** 6 Pages.

Periodic oscillations are observed in Newton's gravitational constant G that are contemporaneous with length-of-day data obtained from the International Earth Rotation and Reference Systems Service. Preliminary research has determined that the oscillatory period of G is ≈ 5.9 years (5.899 ± 0.062 years). In this paper, the oscillations are shown to be concomitant with the Earth's distance from the Sun and the angular frequency of its orbit. Implications for space exploration and dark matter are also discussed.

**Category:** Relativity and Cosmology

[551] **viXra:1701.0687 [pdf]**
*replaced on 2017-01-31 13:31:27*

**Authors:** David Brown

**Comments:** 5 Pages.

Does string theory with the infinite nature hypothesis imply supersymmetry, while string theory with the finite nature hypothesis implies Wolframian pseudo-symmetry? I conjecture the Milgrom Denial Hypothesis: the main problem with string theory is that string theorists fail to realize that Milgrom is the Kepler of contemporary cosmology. Is the Koide formula merely a coincidence with little or no significance for physics? Does reality consist of a string landscape with many different string vacua? Consider two approximations: (muon mass)/(electron mass) = 206.7683 and exp(pi * sqrt(72/25)) − (muon mass)/(electron mass) = −0.0288 — so what? Is spacetime 4-dimensional? Is spacetime 26-dimensional? Measurements of spacetime using clocks and surveying instruments demonstrate that spacetime is 4-dimensional. I say that, from one point of view, spacetime is 26-dimensional: 26 dimensions = 1 dimension of matter time + 1 dimension of antimatter time + 24 dimensions of (±, ±, ±)-space. What is (±, ±, ±)-space? For the measurement of space, employ 6 particle beams consisting of 3 electron beams and 3 positron beams. For each dimension of space, employ all 3-tuples of beams selected from the 6 beams. By definition, (±, ±, ±)-space consists of 3 dimensions of ordinary space, each of which is measured in 8 different ways by using all of the possible 3-tuples of the 6 beams. The 24 dimensions of (±, ±, ±)-space reduce to the 3 dimensions of ordinary space because quantum field theory is empirically valid — however, (±, ±, ±)-space might be useful for representational redundancy (because of the role that the Leech lattice plays in the foundations of physics). This brief communication offers speculations concerning Wolframian pseudo-symmetry and the Koide formula.
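
The two quoted approximations can be checked numerically in a few lines; a minimal sketch (the 206.7683 figure is simply the muon-to-electron mass ratio quoted in the abstract):

```python
import math

# Muon-to-electron mass ratio as quoted in the text
mass_ratio = 206.7683

# The near-coincidence exp(pi * sqrt(72/25)) vs. the mass ratio
approximation = math.exp(math.pi * math.sqrt(72 / 25))
difference = approximation - mass_ratio

print(f"exp(pi*sqrt(72/25)) = {approximation:.4f}")
print(f"difference          = {difference:.4f}")  # ~ -0.029, matching the quoted -0.0288
```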

**Category:** Quantum Gravity and String Theory

[550] **viXra:1701.0686 [pdf]**
*submitted on 2017-01-31 05:09:00*

**Authors:** Thomas Preusser

**Comments:** 10 Pages.

In mid-2016 scientists at the Large Hadron Collider (LHC) announced that a possible new subatomic particle beyond the Higgs in mass/energy, at 750 GeV/c², went statistically unconfirmed at new, higher collider energies. This paper offers new theoretical concepts predicting a gluon-like dark matter subatomic particle, called the netwon, at 750 GeV/c². Since dark matter is “dark”, it is detected more by inference than by direct observation. Moreover, particles with fractional “network charge”, a new theoretical concept developed in this paper, seem observationally troublesome because of their variability. This includes neutrinos, gluons, and the new 750 GeV/c² particle. The new Electron-Ion Collider (EIC) is proposed in part to deal with these troublesome variabilities. Therefore the hunt at 750 GeV/c² at the LHC should continue, but modified with this new theoretical basis to be more inferential.

**Category:** High Energy Particle Physics

[549] **viXra:1701.0685 [pdf]**
*submitted on 2017-01-31 05:28:31*

**Authors:** Prado, PF et al

**Comments:** 77 Pages. Unfinished draft.

A very tentative model.

**Category:** General Mathematics

[548] **viXra:1701.0684 [pdf]**
*replaced on 2017-02-08 03:50:56*

**Authors:** Henok Tadesse

**Comments:** 9 Pages.

The origin of the force holding protons and neutrons together in the nucleus has been one of the daunting puzzles of physics, regardless of the Standard Model explanation. One possible consideration is the force of gravity as responsible for the stability of the nucleus. However, this idea will be immediately dismissed because gravitational force as we know it is weaker than electromagnetic force by a factor of about 8 × 10^−37. This is the very reason that gravity has eluded the attention of physicists as a possible explanation of nuclear force. Nature has hidden its mystery for almost a century by looking ridiculous. We know gravitation as introduced by Newton and have been stuck with that for centuries. This paper reveals a drastically different law of gravitation that ultimately resolves the mystery of nuclear force. This theory also has the potential to explain the phenomenon of cosmological acceleration and the Pioneer anomaly. Gravity is a force that behaves differently at vastly different distance scales: nuclear and atomic scale, macroscopic scale and astronomical scale.

**Category:** Nuclear and Atomic Physics

[547] **viXra:1701.0683 [pdf]**
*submitted on 2017-01-31 07:15:53*

**Authors:** Nikola Perkovic

**Comments:** 6 Pages.

Combining the equation for mass-energy equivalence and the de Broglie equations for wavelength with Feynman's work on quantum electrodynamics, this paper will provide an equation of quantum equivalence using the fine-structure constant, which is measured to incredible accuracy in the study of QED. This equation will serve to prove that mass-energy equivalence is a product/consequence of quantum effects. Electrons will be used to test the equation for both rest mass and rest energy as well as the wavelength, with the help of the Rydberg constant to simplify the calculation.
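
The Rydberg constant does tie together the quantities the abstract mentions, through the standard relation R∞ = α²·m_e·c/(2h). As a sketch of that kind of simplification (using CODATA values; this is the textbook relation, not the paper's own equation):

```python
# Recover the electron rest mass from the Rydberg constant via the
# standard relation R_inf = alpha^2 * m_e * c / (2h),
# i.e. m_e = 2 h R_inf / (alpha^2 c).
h = 6.62607015e-34        # Planck constant, J s
c = 299792458.0           # speed of light, m/s
alpha = 7.2973525693e-3   # fine-structure constant
R_inf = 1.0973731568e7    # Rydberg constant, 1/m

m_e = 2 * h * R_inf / (alpha**2 * c)
E_rest = m_e * c**2       # electron rest energy, J

print(f"m_e    = {m_e:.4e} kg")   # ~9.109e-31 kg
print(f"E_rest = {E_rest:.4e} J") # ~8.187e-14 J (~511 keV)
```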

**Category:** Quantum Physics

[546] **viXra:1701.0682 [pdf]**
*submitted on 2017-01-30 17:11:35*

**Authors:** Federico Gabriel

**Comments:** 2 Pages.

In this article, a prime number distribution formula is given. The formula is based on the periodic property of the sine function and an important trigonometric limit.
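
The abstract does not reproduce the formula itself. A classical example in the same spirit (not necessarily the author's formula) is Wilson's theorem expressed trigonometrically, as in Willans' construction: cos²(π·((n−1)!+1)/n) equals 1 exactly when n is prime (or n = 1), because the period of the trigonometric function picks out the divisibility condition. A sketch using the equivalent exact integer test, which avoids floating-point factorials:

```python
from math import factorial

def is_prime_wilson(n: int) -> bool:
    """Wilson's theorem: n > 1 is prime iff (n-1)! + 1 is divisible by n.
    This integer divisibility test is what cos^2(pi*((n-1)!+1)/n) == 1 encodes."""
    if n < 2:
        return False
    return (factorial(n - 1) + 1) % n == 0

primes = [n for n in range(2, 30) if is_prime_wilson(n)]
print(primes)  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Like most sine-based prime formulas, this is a characterization rather than an efficient algorithm: the factorial grows far too quickly for practical use.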

**Category:** Number Theory

[545] **viXra:1701.0681 [pdf]**
*submitted on 2017-01-30 20:26:31*

**Authors:** Kenneth D. Oglesby

**Comments:** 3 Pages. Related to www.mcphysics.org website and viXra papers #1701.0002, #1611.0080 and #1609.0359

MC Physics previously proposed unifying all fundamental forces as being derived from electrostatic charge force, whose strength is caused by, and interacts only with and between, quantized mono-charges of a given charge type and set charge strength. A separate paper described mono-charges and the F*SCoTt process that builds all particles, atoms and matter.
A modified Coulomb's Law equation for that unification was proposed, Charge Force F = C1 * C2 / R^z, which also replaces Newton's Law of Gravitation and uses a relativistically impacted space exponent z.
For each individual mono-charge, and from the measured experimental data: z = 1.0 (estimated range 0.5 to 1.5) for fully relativistically compressed space (down to 2-dimensional, circular dilution) of the lowest-charged mono-charges (e.g. photons of light) moving at the relativistic speed of light; z = 2 for normal, mostly static space of mixed low- and high-charged mono-charges (e.g. in typical binary mass bodies), as for gravity; and z > 3 for the stretched/expanded space of static mono-charges of the highest strength, i.e. quarks.
The total force-transaction exponent z between two mono-charges (MC1 and MC2) is proposed to be: z = (z1 * z2)^0.5
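
As an illustrative sketch of evaluating the proposed law (the charge values and the z1, z2 pairing below are hypothetical placeholders, not values from the paper):

```python
import math

def charge_force(c1: float, c2: float, r: float, z1: float, z2: float) -> float:
    """Modified Coulomb law F = C1*C2 / R^z with the combined exponent
    z = sqrt(z1 * z2) proposed in the abstract."""
    z = math.sqrt(z1 * z2)
    return c1 * c2 / r ** z

# Hypothetical example: a fully relativistic mono-charge (z1 = 1)
# interacting with a static one (z2 = 2) gives z = sqrt(2) ~ 1.414
print(charge_force(1.0, 1.0, 2.0, 1.0, 2.0))  # 1 / 2^sqrt(2) ~ 0.375
```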

**Category:** Nuclear and Atomic Physics

[544] **viXra:1701.0679 [pdf]**
*submitted on 2017-01-30 21:21:09*

**Authors:** Miguel A. Sanchez-Rey

**Comments:** 2 Pages.

Establish topological schemes in metamorphic space as A-scheme and B-scheme.

**Category:** Mathematical Physics

[543] **viXra:1701.0678 [pdf]**
*submitted on 2017-01-31 01:22:05*

**Authors:** Victor Christianto, Florentin Smarandache, Yunita Umniyati

**Comments:** 20 Pages. This paper has been submitted to BAOJ Physics journal

It has been known for a long time that the cosmic sound wave has been there since the early epoch of the Universe. Signatures of its existence abound. However, such an acoustic model of cosmology is rarely developed fully into a complete framework, from the notion of space and cancer therapy up to the sky. This paper may be the first attempt towards such a complete description of the Universe based on the classical wave equation of sound. It is argued that one can arrive at a consistent description of space, elementary particles, the Sachs-Wolfe acoustic theorem, and even a novel approach to cancer therapy, starting from this simple classical wave equation of sound. We also discuss a plausible extension of the acoustic Sachs-Wolfe theorem, based on its analogue with the Klein-Gordon equation, to become the Acoustic Sachs-Wolfe-Christianto-Smarandache-Umniyati (ASWoCSU) equation. It is our hope that the newly proposed equation can be verified with observational data. But we admit that our model is still in its infancy; more research is needed to fill in all the missing details.
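
The classical 1-D wave equation of sound, u_tt = c²·u_xx, which the paper takes as its starting point, is easy to illustrate numerically. A generic finite-difference sketch (leapfrog scheme; at Courant number c·Δt/Δx = 1 the scheme reproduces d'Alembert's travelling-wave solution exactly at the grid points); this is standard numerics, not the authors' model:

```python
import math

# 1-D wave equation u_tt = c^2 u_xx on a periodic domain, leapfrog scheme.
N, L, c = 200, 1.0, 1.0
dx = L / N
dt = dx / c                      # Courant number = 1: scheme is exact on the grid

def pulse(x: float) -> float:
    """Initial sound pulse (narrow Gaussian, effectively periodic)."""
    return math.exp(-200 * (x - 0.5) ** 2)

u_prev = [pulse(i * dx) for i in range(N)]
# First step from a Taylor expansion, assuming zero initial velocity
u = [u_prev[i] + 0.5 * (u_prev[(i + 1) % N] - 2 * u_prev[i] + u_prev[(i - 1) % N])
     for i in range(N)]

# After time L/c the two split half-pulses wrap around and recombine.
for _ in range(N - 1):
    u_next = [2 * u[i] - u_prev[i]
              + (u[(i + 1) % N] - 2 * u[i] + u[(i - 1) % N])
              for i in range(N)]
    u_prev, u = u, u_next

err = max(abs(u[i] - pulse(i * dx)) for i in range(N))
print(f"max deviation after one period: {err:.2e}")
```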

**Category:** Relativity and Cosmology

[542] **viXra:1701.0677 [pdf]**
*replaced on 2017-02-04 04:19:33*

**Authors:** Hervé Le Cornec

**Comments:** 6 Pages. New version with minor modifications. Excuse me for the inconvenience.

Looking at the hydrogen atom, we investigated the possibility of using the electron's rotation speed in the dilation factor of special relativity, even though the electron is in a non-inertial frame. Doing so, we were able to demonstrate that the electron's charge-to-mass ratio is the consequent relativistic frequency that appears to the observer. We also show that a magnetic moment, very similar to the one of quantum mechanics, must appear, although we stay within the fields of classical and relativistic physics. These facts, in excellent agreement with experiment, lead us to propose extending Einstein's postulate of inertial frames to all frames having a constant speed.

**Category:** Classical Physics

[541] **viXra:1701.0676 [pdf]**
*submitted on 2017-01-31 02:35:29*

**Authors:** René Friedrich

**Comments:** 8 Pages.

The well-proven principles of special and general relativity permit the derivation of an answer to the eighty-year-old question of how general relativity may harmonize with quantum mechanics.
The solution is mainly found in a physical theory where it was not expected: special relativity discusses neither gravity nor quantum phenomena. However, it is the relative spacetime concept of special relativity that is considered to be incompatible with the absolute concepts of space and time in quantum mechanics.
The new approach is based on the discovery that the postulates of special relativity impose not only the relative concept of spacetime, but also absolute concepts of time and space underlying relative spacetime. These absolute concepts prove to be compatible with the absolute space and time concepts of quantum mechanics, and they indicate how to apply the gravity concept of the Schwarzschild metric within quantum mechanics: quantum gravity happens at the particle level, not by quantization of spacetime.

**Category:** Quantum Gravity and String Theory

[540] **viXra:1701.0675 [pdf]**
*submitted on 2017-01-31 03:29:01*

**Authors:** George Rajna

**Comments:** 23 Pages.

In a new paper, researchers Ohad Lewin-Epstein, Ranit Aharonov, and Lilach Hadany at Tel-Aviv University in Israel have theoretically shown that microbes could influence their hosts to act altruistically. [12] A new study reveals how two populations of neurons in the brain contribute to the brain's inability to correctly assign emotional associations to events. Learning how this information is routed and misrouted could shed light on mental illnesses including depression, addiction, anxiety, and posttraumatic stress disorder. [11] In dynamic neuronal networks, pervasive oscillatory activity is usually explained by pointing to pacemaking elements that synchronize and drive the network. Recently, however, scientists at The Weizmann Institute of Science in Israel studied synchronized periodic bursting that emerged spontaneously in a network of in vitro rat hippocampus and cortex neurons, finding that roughly 60% of all active neurons were self-sustained oscillators when disconnected from the network – and that each neuron oscillated at its own frequency, which is controlled by the neuron's excitability. [10] Most biology students will be able to tell you that neural signals are sent via mechanisms such as synaptic transmission, gap junctions, and diffusion processes, but a new study suggests there's another way that our brains transmit information from one place to another. [9] Physicists are expected to play a vital role in this research, and already have an impressive record of developing new tools for neuroscience. From two-photon microscopy to magneto-encephalography, we can now record activity from individual synapses to entire brains in unprecedented detail. But physicists can do more than simply provide tools for data collection. [8] Discovery of quantum vibrations in 'microtubules' inside brain neurons supports controversial theory of consciousness. 
The human body is a constant flux of thousands of chemical/biological interactions and processes connecting molecules, cells, organs, and fluids, throughout the brain, body, and nervous system. Up until recently it was thought that all these interactions operated in a linear sequence, passing on information much like a runner passing the baton to the next runner. However, the latest findings in quantum biology and biophysics have discovered that there is in fact a tremendous degree of coherence within all living systems. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making possible to understand the Quantum Biology.

**Category:** Physics of Biology

[539] **viXra:1701.0674 [pdf]**
*submitted on 2017-01-31 03:50:36*

**Authors:** George Rajna

**Comments:** 16 Pages.

High-energy electrons synced to ultrafast laser pulse to probe how vibrational states of atoms change in time. [10] A small team of researchers with affiliations to institutions in Italy, Japan and the U.S. has created a simulation that suggests that it should be possible for a single photon to simultaneously excite two atoms. [9] Molecules vibrate in many different ways—like tiny musical instruments. [8] For centuries, scientists believed that light, like all waves, couldn't be focused down smaller than its wavelength, just under a millionth of a metre. Now, researchers led by the University of Cambridge have created the world's smallest magnifying glass, which focuses light a billion times more tightly, down to the scale of single atoms. [7] A Purdue University physicist has observed a butterfly Rydberg molecule, a weak pairing of two highly excitable atoms that he predicted would exist more than a decade ago. [6] In a scientific first, a team of researchers from Macquarie University and the University of Vienna have developed a new technique to measure molecular properties – forming the basis for improvements in scientific instruments like telescopes, and with the potential to speed up the development of pharmaceuticals. [5] In the quantum world, physicists study the tiny particles that make up our classical world-neutrons, electrons, photons-either one at a time or in small numbers because the behaviour of the particles is completely different on such a small scale. If you add to the number of particles that are being studied, eventually there will be enough particles that they no longer act quantum mechanically and must be identified as classical, just like our everyday world. But where is the line between the quantum world and the classical world? A group of scientists from Okinawa Institute of Science and Technology Graduate University (OIST) explored this question by showing what was thought to be a quantum phenomenon can be explained classically. 
[4] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.

**Category:** Quantum Physics

[538] **viXra:1701.0673 [pdf]**
*replaced on 2017-03-21 02:29:18*

**Authors:** A. Blato

**Comments:** 8 Pages. Version 8 in English.

This article presents an invariant formulation of special relativity which can be applied in any inertial reference frame. In addition, a new universal force is proposed.

**Category:** Relativity and Cosmology

[537] **viXra:1701.0672 [pdf]**
*replaced on 2017-02-01 03:20:34*

**Authors:** Sylwester Kornowski

**Comments:** 7 Pages.

The Scale-Symmetric Theory (SST) shows that Halton Arp's “quantized” inherent redshift results from the shifted luminosity of quasars at certain redshifts, not from a higher abundance of quasars at those redshifts. The two functions describing the dependence of the cosmological and inherent light travel time (LTT) on redshift are continuous, but for the inherent LTT there are increases in luminosity at 15 different redshifts: 0.061, 0.30, 0.60, 0.96, 1.41, 1.96, 2.63, 3.46, 4.48, 5.73, and so on; we obtained perfect consistency with observational facts. This leads to an illusion that the quasars with shifted luminosity are more numerous. SST shows that quasars are very distant objects because, due to the inherent LTT, even quasars with very low redshift are already at an LTT of 6.8 Gyr. Here we describe the mechanism leading to the shifted luminosities. The cosmological and inherent LTTs result from different mechanisms of emission of photons by cosmic objects via the annihilation of particle-antiparticle pairs into two photons: the inherent LTT is produced by accretion discs, whereas the cosmological LTT concerns the supernovae. Contrary to SST, within the General Theory of Relativity (GR) we cannot explain the origin of the shifted luminosity of quasars at strictly defined redshifts, so GR cosmology is only an approximate description of the expanding Universe.

**Category:** Quantum Gravity and String Theory

[536] **viXra:1701.0671 [pdf]**
*submitted on 2017-01-30 11:39:16*

**Authors:** Edgar Valdebenito

**Comments:** 6 Pages.

In this note we present infinite products for some classical constants.

**Category:** General Mathematics

[535] **viXra:1701.0670 [pdf]**
*submitted on 2017-01-30 11:45:58*

**Authors:** Edgar Valdebenito

**Comments:** 5 Pages.

In this note we present a collection of double integrals involving the constant pi.

**Category:** General Mathematics

[534] **viXra:1701.0669 [pdf]**
*submitted on 2017-01-30 09:07:54*

**Authors:** George Rajna

**Comments:** 19 Pages.

A UK, Canadian and Italian study has provided what researchers believe is the first observational evidence that our universe could be a vast and complex hologram. [13] Cosmologists trying to understand how to unite the two pillars of modern science – quantum physics and gravity – have found a new way to make robust predictions about the effect of quantum fluctuations on primordial density waves, ripples in the fabric of space and time. [12] Physicists have performed a test designed to investigate the effects of the expansion of the universe—hoping to answer questions such as "does the expansion of the universe affect laboratory experiments?", "might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle?", and "does spacetime have a foam-like structure that slightly changes the speed of photons over time?", an idea that could shed light on the connection between general relativity and quantum gravity. [11] Einstein's equivalence principle states that an object in gravitational free fall is physically equivalent to an object that is accelerating with the same amount of force in the absence of gravity. This principle lies at the heart of general relativity and has been experimentally tested many times. Now in a new paper, scientists have experimentally demonstrated a conceptually new way to test the equivalence principle that could detect the effects of a relatively new concept called spin-gravity coupling. [10] A recent peer-reviewed paper by physicist James Franson from the University of Maryland in the US has initiated a stir among physics community. Issued in the New Journal of Physics, the paper points to evidence proposing that the speed of light as defined by the theory of general relativity, is slower than originally thought. [9] Gravitational time dilation causes decoherence of composite quantum systems. 
Even if gravitons are there, it's probable that we would never be able to perceive them. Perhaps, assuming they continue inside a robust model of quantum gravity, there may be secondary ways of proving their actuality. [7] The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; it is the mysterious Higgs Field giving mass to the particles. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The self maintained electric potential of the accelerating charges equivalent with the General Relativity space-time curvature, and since it is true on the quantum level also, gives the base of the Quantum Gravity. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Astrophysics

[533] **viXra:1701.0668 [pdf]**
*replaced on 2017-01-30 10:23:21*

**Authors:** Ameet Sharma

**Comments:** 11 Pages.

We propose developing an XML-based system to enhance scientific papers and articles: a system whereby the premises of arguments are made explicit in XML tags. These tags provide links between papers that exhibit deductive knowledge dependencies more clearly, and they allow us to construct deductive networks, which are visual representations of deductive knowledge dependencies. A deductive network (DN) is a kind of Bayesian network, but without probabilities.
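
A minimal sketch of what such premise tags might look like, parsed into dependency edges with the standard library (the tag names `paper`, `premise`, and `ref` here are hypothetical placeholders, not a schema the paper defines):

```python
import xml.etree.ElementTree as ET

# Hypothetical markup: each paper lists the premises it depends on.
doc = """
<paper id="P3">
  <premise ref="P1">Lemma proved in paper P1</premise>
  <premise ref="P2">Dataset described in paper P2</premise>
</paper>
"""

root = ET.fromstring(doc)
# Edges of the deductive network: cited paper -> citing paper
edges = [(p.get("ref"), root.get("id")) for p in root.findall("premise")]
print(edges)  # [('P1', 'P3'), ('P2', 'P3')]
```

Collecting such edges across many papers would yield the directed graph the abstract calls a deductive network.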

**Category:** Data Structures and Algorithms

[532] **viXra:1701.0667 [pdf]**
*replaced on 2017-04-17 15:27:38*

**Authors:** Hyoyoung Choi

**Comments:** 6 Pages.

The singularity problem is a long-standing weak point in the theory of general relativity. Most scholars assume that the solution to this singularity lies in quantum mechanics. However, waiting for quantum gravity theory to be completed in order to solve the singularity problem in a black hole is wrong. Here we show that gravitational self-energy, which has a negative value, can solve the singularity problem and rescue general relativity. A black hole does not have a singularity, and there exists a zone of uniform energy density within the black hole. The distribution of mass cannot be compressed below a radius of 0.3 R_S (R_S: Schwarzschild radius). Also, gravitational self-energy and R_gs guarantee uniform density due to a repulsive gravity effect, and this can be grounds for the expansion in the early universe and for its uniformity.
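
The quoted 0.3 R_S is at least consistent with a back-of-envelope Newtonian estimate: equating the magnitude of the uniform-sphere self-energy |U| = 3GM²/(5R) to the rest energy Mc² gives R = 3GM/(5c²) = 0.3·R_S, since R_S = 2GM/c². A sketch of that arithmetic (a consistency check under Newtonian assumptions, not the paper's derivation):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M = 1.989e30       # one solar mass, kg (the mass cancels in the ratio)

R_s = 2 * G * M / c**2       # Schwarzschild radius
R = 3 * G * M / (5 * c**2)   # |U| = 3GM^2/(5R) = Mc^2, solved for R

print(R / R_s)  # 0.3, independent of M
```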

**Category:** Relativity and Cosmology

[531] **viXra:1701.0666 [pdf]**
*submitted on 2017-01-29 23:37:06*

**Authors:** Xiaodong Liu, Yu Liang, Qichang Liang

**Comments:** 6 Pages.

In this work, a parasitic dipole was mounted inside a waveguide cavity. The amplitude of the oscillating wave through the waveguide was amplified by 15% by the negative impedance reflected from the parasitic resonator to the waveguide.

**Category:** Classical Physics

[530] **viXra:1701.0665 [pdf]**
*submitted on 2017-01-30 01:45:41*

**Authors:** Miguel A. Sanchez-Rey

**Comments:** 3 Pages.

Depth and dynamics.

**Category:** Social Science

[529] **viXra:1701.0664 [pdf]**
*replaced on 2017-04-16 16:16:48*

**Authors:** Andrei Lucian Dragoi

**Comments:** 32 Pages.

This article proposes conjectures stronger than the Binary and Ternary Goldbach Conjectures (BGC and TGC) [1,2,3,4] [5,6,7], briefly called the “Vertical Goldbach Conjectures” (VBGC and VTGC), which are essentially meta-conjectures (as VBGC states an infinite number of conjectures stronger than BGC). VBGC was discovered in 2007 [1] and refined until 2016 [2] by using the arrays (S_p and S_i,p) of matrices of Goldbach index-partitions (GIPs) (simple M_p,n and recursive M_i,p,n, with iteration order i ≥ 0), which are a useful tool for studying BGC by focusing on prime indexes (as the function P_n that numbers the primes is a bijection). Simple M (M_p,n) and recursive M (M_i,p,n) are related to the concept of generalized “primeths” (a term first used by Fernandez N. in his “The Exploring Primeness Project”), the generalization with iteration order i ≥ 0 of the known “higher-order prime numbers” (alias “superprime numbers”, “super-primes” or “prime-indexed primes” [PIPs]): a subset of (simple or recursive) primes that themselves have prime indexes (iPx is the x-th i-primeth, with iteration order i ≥ 0, as explained later on). The author also offers a synthesis of some Goldbach-like conjectures (GLCs), including those “stronger” than BGC, and a new class of GLCs “stronger” than BGC, among which VBGC (essentially a variant of BGC applied to a serial array of subsets of primeths with a general iteration order i ≥ 0) distinguishes itself as a very important conjecture of primes (with great importance in the optimization of the experimental verification of BGC, and with other potentially useful theoretical and practical applications in mathematics [including cryptography and fractals] and physics [including crystallography and M-theory]), as well as a very special self-similar property of the primes subset concerned (notation explained later on in this article).
Keywords: prime (number); primes with prime indexes; the i-primeths (with iteration order i ≥ 0); the Binary Goldbach Conjecture (BGC); the Ternary Goldbach Conjecture (TGC); Goldbach index-partition (GIP); fractal patterns of the number and distribution of Goldbach index-partitions; Goldbach-like conjectures (GLC); the Vertical Binary Goldbach Conjecture (VBGC) and the Vertical Ternary Goldbach Conjecture (VTGC) as applied to i-primeths.
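
The building blocks are easy to state in code. A sketch that generates i-primeths (primes iteratively indexed by primes) and counts Goldbach partitions, assuming 1-based prime indexing; the precise statement of VBGC is not reproduced here:

```python
def primes_up_to(n: int) -> list:
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return [i for i, is_p in enumerate(sieve) if is_p]

PRIMES = primes_up_to(10 ** 6)

def primeths(order: int, count: int) -> list:
    """i-primeths: order 0 -> primes, order 1 -> prime-indexed primes
    (super-primes), and so on, using 1-based prime indexing."""
    seq = PRIMES[:]
    for _ in range(order):
        seq = [seq[k - 1] for k in seq if k - 1 < len(seq)]
    return seq[:count]

def goldbach_partitions(n: int, pool: list) -> int:
    """Number of ways to write even n as p + q (p <= q) with p, q in pool."""
    pool_set = set(pool)
    return sum(1 for p in pool if p <= n - p and n - p in pool_set)

print(primeths(1, 5))                    # [3, 5, 11, 17, 31]  (super-primes)
print(goldbach_partitions(100, PRIMES))  # 6 partitions of 100
```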

**Category:** Number Theory

[528] **viXra:1701.0663 [pdf]**
*submitted on 2017-01-29 16:00:15*

**Authors:** Ahmed Ibrahim Mohamed Ahmed

**Comments:** 7 Pages. E-mail: 15004@stemegypt.edu.eg

The whole world suffers from a huge problem: the lack of energy, whether because of insufficient production or increasing consumption. This problem has consequences, such as an increasing share of pollutants and harmful gases in the environment from using fossil fuel as a source of energy to compensate for the shortage, so the whole world is trying to exploit alternative sources such as renewable energy, which is clean, cheap and could solve the energy problem. Urine Hydrogen Power is believed to be a huge factor in solving the energy problem, as urine would be used to generate electricity. It is an efficient, sustainable and economic solution, since urine is produced everywhere; humans alone are estimated to produce 6.4 trillion liters per year. The project meets our grand challenge, as we work on renewable energy resources. This project is estimated to produce about 3 moles of hydrogen, with a weight of 6 grams, from the electrolysis of one liter of urine; a fuel cell is then used to convert this hydrogen into electricity by combining it with oxygen. Such a solution would not cost much and would be efficient, so it meets the design requirements of any successful solution. In conclusion, the results of our tests, which were much better than we expected, showed that this project is a promising solution to one of the world's largest demands, which is energy.
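As a rough sanity check of the figures quoted (one liter of urine yielding about 3 moles, or 6 grams, of hydrogen), the arithmetic can be sketched as follows; the urea-electrolysis stoichiometry and the 60% fuel-cell efficiency are illustrative assumptions, not values from the paper:

```python
# Stoichiometry of urea electrolysis: CO(NH2)2 + H2O -> N2 + CO2 + 3 H2
n_H2 = 3.0            # mol H2 per litre of urine (figure quoted above)
M_H2 = 2.016          # g/mol, molar mass of H2
mass_H2 = n_H2 * M_H2
print(f"H2 per litre: {mass_H2:.2f} g")        # ~6 g, as stated

# Electricity recoverable in a fuel cell, assuming H2 lower heating
# value of 241.8 kJ/mol and an illustrative 60% cell efficiency
LHV_H2 = 241.8e3      # J/mol
eta = 0.60
energy_J = n_H2 * LHV_H2 * eta
print(f"Electricity per litre: {energy_J / 3600:.0f} Wh")
```

The mass figure reproduces the abstract's "about 6 grams"; the electricity figure (on the order of 0.1 kWh per liter) depends entirely on the assumed efficiency.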

**Category:** Chemistry

[527] **viXra:1701.0662 [pdf]**
*submitted on 2017-01-29 11:15:37*

**Authors:** Lamont Williams

**Comments:** 19 Pages.

The hierarchy problem — the problem of why gravity is far weaker than electromagnetism — is one of the greatest problems in physics. In this study, it is hypothesized that the disparity between the forces stems from their having an inverse, or seesaw-like, relationship — with one strength value naturally being high when the other value is low. In accordance with this seesaw-like relationship, it is further hypothesized that, as energy is increased, the strength of electromagnetism falls while the strength of gravity rises. The author suggests that theory and observation indicating a rise in electromagnetic strength with increasing energy are not accounting for gravity's contribution to the calculated and measured coupling. It is shown that removing this contribution exposes the inverse relationship between the forces and, importantly, the lowering of electromagnetism's strength over the increasing energy levels. Taken together, the concepts presented here may help in solving the hierarchy problem. This, in turn, may point the way to combining gravity and electromagnetism into a single framework and ultimately unifying general relativity and quantum mechanics.
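The weakness of gravity that the abstract refers to can be quantified with textbook constants: for two protons, both forces fall off as 1/r², so their ratio is independent of separation.

```python
# Gravitational vs. electrostatic force between two protons
# (CODATA values, rounded)
G   = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
k_e = 8.988e9     # Coulomb constant, N m^2 C^-2
m_p = 1.6726e-27  # proton mass, kg
q_e = 1.6022e-19  # elementary charge, C

# Both forces scale as 1/r^2, so the ratio is distance-independent
ratio = (G * m_p ** 2) / (k_e * q_e ** 2)
print(f"F_grav / F_coulomb = {ratio:.2e}")  # ~8.1e-37
```

Roughly 36 orders of magnitude separate the two couplings at this (low-energy) scale, which is the disparity any proposed seesaw mechanism would have to account for.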

**Category:** High Energy Particle Physics

[526] **viXra:1701.0661 [pdf]**
*submitted on 2017-01-29 08:45:47*

**Authors:** George Rajna

**Comments:** 23 Pages.

Biologists at The Scripps Research Institute (TSRI) have identified a brain hormone that appears to trigger fat burning in the gut. Their findings in animal models could have implications for future pharmaceutical development. [12] A new study reveals how two populations of neurons in the brain contribute to the brain's inability to correctly assign emotional associations to events. Learning how this information is routed and misrouted could shed light on mental illnesses including depression, addiction, anxiety, and posttraumatic stress disorder. [11] In dynamic neuronal networks, pervasive oscillatory activity is usually explained by pointing to pacemaking elements that synchronize and drive the network. Recently, however, scientists at The Weizmann Institute of Science in Israel studied synchronized periodic bursting that emerged spontaneously in a network of in vitro rat hippocampus and cortex neurons, finding that roughly 60% of all active neurons were self-sustained oscillators when disconnected from the network – and that each neuron oscillated at its own frequency, which is controlled by the neuron's excitability. [10] Most biology students will be able to tell you that neural signals are sent via mechanisms such as synaptic transmission, gap junctions, and diffusion processes, but a new study suggests there's another way that our brains transmit information from one place to another. [9] Physicists are expected to play a vital role in this research, and already have an impressive record of developing new tools for neuroscience. From two-photon microscopy to magneto-encephalography, we can now record activity from individual synapses to entire brains in unprecedented detail. But physicists can do more than simply provide tools for data collection. [8] Discovery of quantum vibrations in 'microtubules' inside brain neurons supports controversial theory of consciousness. 
The human body is a constant flux of thousands of chemical/biological interactions and processes connecting molecules, cells, organs, and fluids, throughout the brain, body, and nervous system. Up until recently it was thought that all these interactions operated in a linear sequence, passing on information much like a runner passing the baton to the next runner. However, the latest findings in quantum biology and biophysics have discovered that there is in fact a tremendous degree of coherence within all living systems. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to understand Quantum Biology.

**Category:** Physics of Biology

[525] **viXra:1701.0660 [pdf]**
*replaced on 2017-04-24 01:32:44*

**Authors:** Carlos Castro

**Comments:** 18 Pages.

We revisit the construction of diffeomorphic but $not$ isometric metric solutions to the Schwarzschild metric. These solutions require the introduction of non-trivial areal-radial functions and are characterized by the key property that the location of the radial horizon is $displaced$ continuously towards the singularity ($ r = 0 $). In the limiting case scenario the locations of the singularity and horizon $merge$, and any infalling observer hits a null singularity at the very moment he/she crosses the horizon. This fact may have important consequences for the resolution of the firewall problem and the complementarity controversy in black holes. This construction allows one to borrow the results obtained over the past two decades pertaining to the study of the Renormalization Group (RG) improvement of Einstein's equations, which was based on the possibility that Quantum Einstein Gravity might be non-perturbatively renormalizable and asymptotically safe due to the presence of interacting (non-Gaussian) ultraviolet fixed points. The particular areal-radial function that eliminates the interior of a black hole, and furnishes a truly static metric solution everywhere, is used to establish the desired energy-scale relation $ k = k (r) $, which is obtained from the $k$ (energy) dependent modifications to the running Newtonian coupling $G (k) $, cosmological constant $\Lambda (k) $ and spacetime metric $g_{ij, (k) } (x)$. (Anti) de Sitter-Schwarzschild metrics are also explored as examples. We conclude with a discussion of the role that Asymptotic Safety might have in the geometry of phase spaces (cotangent bundles of spacetime); namely, in establishing a quantum spacetime geometry/classical phase-space geometry correspondence $g_{ij, (k) } (x) \leftrightarrow g_{ij} (x, E) $.

**Category:** Quantum Gravity and String Theory

[524] **viXra:1701.0659 [pdf]**
*submitted on 2017-01-29 04:58:58*

**Authors:** George Rajna

**Comments:** 13 Pages.

As a massive star dies, expelling most of its guts across the universe in a supernova explosion, its iron heart, the star's core, collapses to create the densest form of observable matter in the universe: a neutron star. [7] NASA's Chandra X-ray Observatory has discovered the first direct evidence for a superfluid, a bizarre, friction-free state of matter, at the core of a neutron star. Superfluids created in laboratories on Earth exhibit remarkable properties, such as the ability to climb upward and escape airtight containers. The finding has important implications for understanding nuclear interactions in matter at the highest known densities. [6] This paper explains the Accelerating Universe and the Special and General Relativity from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the moving electric charges. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Relativistic Quantum Theories. The acceleration caused by the Big Bang created the radial currents of matter, and since matter is composed of negative and positive charges, these currents create magnetic fields and attracting forces between the parallel moving electric currents. This is the gravitational force experienced by matter; the mass is also a result of the electromagnetic forces between the charged particles. The positively and negatively charged currents attract each other either by the magnetic forces or by the much stronger electrostatic forces. The gravitational force attracts the matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and energy.

**Category:** Astrophysics

[523] **viXra:1701.0658 [pdf]**
*submitted on 2017-01-28 22:50:15*

**Authors:** Vladimir F. Tamari

**Comments:** 5 Pages.

A patient describes and illustrates his experience of seeing closed-eye hallucinations after waking up from a 17-hour surgery under total anaesthesia.

**Category:** Mind Science

[522] **viXra:1701.0657 [pdf]**
*submitted on 2017-01-29 02:27:53*

**Authors:** Nikola Perkovic

**Comments:** 6 Pages.

Determining the mass of hadrons has been a predicament in Quantum Chromodynamics; suggestions and attempts to use lattice QCD for such determinations have not provided satisfying results. This paper suggests a new method that offers 99% accuracy, much higher than lattice QCD, as well as simplicity. The methodology provided in this paper is relatively simple, which makes the calculation easier without losing time on extremely complex equations that serve no practical purpose, since the accuracy of lattice QCD in determining hadronic mass is approximately 10%, which is underwhelming. The aforementioned new method will be applied to protons, neutrons and pions.

**Category:** Nuclear and Atomic Physics

[521] **viXra:1701.0656 [pdf]**
*replaced on 2017-01-31 05:57:33*

**Authors:** Rodney Bartlett

**Comments:** 12 Pages.

Fate has me doing the reverse of what "viXra Info" says (http://vixra.org/info). It states, "Acceptance onto viXra is just a first step which needs to be followed up by submitting to a journal …" I first sent this article to a couple of journals, got rejected, then – to secure my place as the first submitter of these ideas (as far as I know) - I adapted it for "The founders of viXra (who) believe that the universal right of free speech applies to all works of science and all researchers should be allowed to place their ideas in public view for scrutiny."
For decades, I've had an unshakeable belief in physics' Unification of everything in space and time. This caused me to decide that topological materials on Earth, and topological fields surrounding some astronomical bodies, must co-exist with a universal topology. This leads to a non-expanding universe. Edwin Hubble, the astronomer credited with discovery of cosmic expansion, always believed "expanding models are a forced interpretation of the observational results." A topological cosmology allows us to, in his words, "find ourselves in the presence of one of the principles of nature that is still unknown to us today". (see "Effects of Red Shifts on the Distribution of Nebulae" by E. Hubble, Ap. J., 84, 517, 1936). The subjects of unification and topology should be of interest to nonspecialists because a) the trend of modern physics is towards finding a unified theory (new physics) that explains everything – matter from the subatomic to the cosmic scale, all forces, quantum mechanics, relativity, and b) the subject of topology won the Nobel Prize for Physics in 2016.
Also included is a subsection proposing the existence of zero compactified dimensions with 8 macroscopic dimensions (5 of space, 3 of time), and solution of the dark energy and dark matter problems by reference to gravitation and the Complex Number Plane made physical. Finally, this updated version includes a few lines responding to "From Planck Data to Planck Era: Observational Tests of Holographic Cosmology" by Niayesh Afshordi, Claudio Corianò, Luigi Delle Rose, Elizabeth Gould, and Kostas Skenderis: Phys. Rev. Lett. 118, 041301 (2017) - Published 27 January 2017 (http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.118.041301)

**Category:** Quantum Gravity and String Theory

[520] **viXra:1701.0654 [pdf]**
*submitted on 2017-01-28 13:52:43*

**Authors:** Ricardo Gobato, Marilene Turini Piccinato, Eduardo Di Mauro, André Tsutomu Ota, M. F. Costa

**Comments:** 1 Page. Panel presented the First Brazilian School of Bioinformatics (EBB-2008), from August 25 to 28, 2008, at the Federal University of ABC (UFABC)

Phenalenyl, or perinaphthenyl (C13H9), is an organic compound that can be used as a spin probe. One of the co-authors characterized phenalenyl experimentally with the electron paramagnetic resonance (EPR) technique. When the temperature is less than 250 C, the EPR signal disappears because dimerization occurs through a chemical bond, eliminating the unpaired spin. In the medium, degradation also occurs due to oxidation.
In this work we present the results of geometry optimization calculations obtained with the density functional theory (DFT) / B3LYP method in the 6-311+G(2d,p) basis set, with charge -1 and multiplicity 3. The best results compared with the experimental data, the spin densities and the hyperfine coupling constants, were obtained using the 6-311+G(3df,3pd) basis set, with charges varying from -1 to 1.

**Category:** Condensed Matter

[519] **viXra:1701.0653 [pdf]**
*submitted on 2017-01-28 10:44:20*

**Authors:** J.L.Paillet, A.Meulenberg

**Comments:** 12 Pages.

In previous works, we discussed arguments for and against the deep orbits, as exemplified in published solutions. We therefore considered the works of Maly and Va’vra on the topic, the most complete solution available and one showing an infinite family of EDO solutions. In particular, we analyzed in depth the second of these papers, in which they consider a finite nucleus and look for solutions with a Coulomb potential modified inside the nucleus. In the present paper, we quickly recall our analysis, verification, and extension of their results. Moreover, we answer a recent criticism that the EDOs would represent negative-energy states and therefore would not qualify as an answer to the questions posed by Cold Fusion results. We can prove, by means of a simple algebraic argument based on the solution process, that at the transition region the energy of the EDOs is positive. Next, we deepen the discussion of the essential role of Special Relativity as the source of the EDOs, which we addressed in previous papers. The central topic of our present study, however, is an initial analysis of the magnetic interactions near the nucleus, with the aim of answering important physical questions: do the EDOs satisfy the Heisenberg Uncertainty Relation (HUR)? Are the orbits stable? So, we examine some works related to the Vigier-Barut Model, with potentials including magnetic coupling. We also carried out approximate computations to evaluate the strength of these interactions and the possibility of their answering some of our questions. As a first result, we can expect the HUR to be respected by EDOs, due to the high energies of the magnetic interactions near the nucleus. Present computations for stability do not yet give a plain result; we need further studies and tools based on QED to face the complexity of the near-nuclear region. For the creation of EDOs, we outline a possibility based on magnetic coupling.
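The HUR question raised above can be illustrated with a standard order-of-magnitude estimate (not the authors' calculation): an electron confined within a distance Δx of the nucleus must carry momentum of order ħ/Δx, hence an ultra-relativistic energy of order (ħ/Δx)·c.

```python
# Order-of-magnitude HUR estimate: an electron confined within dx of
# the nucleus carries momentum ~ hbar/dx, hence energy ~ (hbar/dx)*c
hbar = 1.0546e-34   # J s
c = 2.9979e8        # m/s
MeV = 1.6022e-13    # J per MeV

for dx_fm in (1.0, 10.0, 100.0):     # confinement scale in femtometres
    dx = dx_fm * 1e-15
    E = (hbar / dx) * c / MeV        # ultra-relativistic energy estimate
    print(f"dx = {dx_fm:5.1f} fm -> E ~ {E:6.1f} MeV")
# dx = 1 fm gives ~197 MeV, since hbar*c is about 197 MeV*fm
```

Energies of hundreds of MeV at femtometre confinement are far above atomic binding scales, which is why near-nuclear orbits require interaction energies of comparable magnitude for the HUR to be satisfied.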

**Category:** Mathematical Physics

[518] **viXra:1701.0652 [pdf]**
*submitted on 2017-01-28 12:49:11*

**Authors:** George Rajna

**Comments:** 21 Pages.

Physicists uncover clues to the mechanism behind magnetic reconnection. [12] Scientists at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) and Princeton University have proposed a groundbreaking solution to a mystery that has puzzled physicists for decades. At issue is how magnetic reconnection, a universal process that sets off solar flares, northern lights and cosmic gamma-ray bursts, occurs so much faster than theory says should be possible. [11] A new method of generating superstrong magnetic fields has been proposed by Russian scientists in collaboration with foreign colleagues. [10] By showing that a phenomenon dubbed the "inverse spin Hall effect" works in several organic semiconductors (including carbon-60 buckyballs), University of Utah physicists changed magnetic "spin current" into electric current. The efficiency of this new power conversion method isn't yet known, but it might find use in future electronic devices including batteries, solar cells and computers. [9] Researchers from the Norwegian University of Science and Technology (NTNU) and the University of Cambridge in the UK have demonstrated that it is possible to directly generate an electric current in a magnetic material by rotating its magnetization. [8] This paper explains the magnetic effect of the electric current from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the changing relativistic mass and the Gravitational Force, giving a Unified Theory of the physical forces.
Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Quantum Physics

[517] **viXra:1701.0651 [pdf]**
*submitted on 2017-01-28 08:06:57*

**Authors:** Robert B. Easter, Eckhard Hitzer

**Comments:** 10 pages. In proceedings: S. Sivasundaram (ed.), International Conference in Nonlinear Problems in Aviation and Aerospace ICNPAA 2016, AIP Conf. Proc., Vol. 1798, 020066 (2017); doi: 10.1063/1.4972658. 4 color figures.

The Double Conformal Space-Time Algebra (DCSTA) is a high-dimensional 12D Geometric Algebra G(4,8) that extends the concepts introduced with the Double Conformal / Darboux Cyclide Geometric Algebra (DCGA) G(8,2) with entities for Darboux cyclides (incl. parabolic and Dupin cyclides, general quadrics, and ring torus) in spacetime with a new boost operator. The base algebra in which spacetime geometry is modeled is the Space-Time Algebra (STA) G(1,3). Two Conformal Space-Time subalgebras (CSTA) G(2,4) provide spacetime entities for points, flats (incl. worldlines), and hyperbolics, and a complete set of versors for their spacetime transformations that includes rotation, translation, isotropic dilation, hyperbolic rotation (boost), planar reflection, and (pseudo)spherical inversion in rounds or hyperbolics. The DCSTA G(4,8) is a doubling product of two G(2,4) CSTA subalgebras that inherits doubled CSTA entities and versors from CSTA and adds new bivector entities for (pseudo)quadrics and Darboux (pseudo)cyclides in spacetime that are also transformed by the doubled versors. The "pseudo" surface entities are spacetime hyperbolics or other surface entities using the time axis as a pseudospatial dimension. The (pseudo)cyclides are the inversions of (pseudo)quadrics in rounds or hyperbolics. An operation for the directed non-uniform scaling (anisotropic dilation) of the bivector general quadric entities is defined using the boost operator and a spatial projection. DCSTA allows general quadric surfaces to be transformed in spacetime by the same complete set of doubled CSTA versor (i.e., DCSTA versor) operations that are also valid on the doubled CSTA point entity (i.e., DCSTA point) and the other doubled CSTA entities. The new DCSTA bivector entities are formed by extracting values from the DCSTA point entity using specifically defined inner product extraction operators. 
Quadric surface entities can be boosted into moving surfaces with constant velocities that display the length contraction effect of special relativity. DCSTA is an algebra for computing with quadrics and their cyclide inversions in spacetime. For applications or testing, DCSTA G(4,8) can be computed using various software packages, such as Gaalop, the Clifford Multivector Toolbox (for MATLAB), or the symbolic computer algebra system SymPy with the GAlgebra module.

**Category:** Mathematical Physics

[516] **viXra:1701.0650 [pdf]**
*submitted on 2017-01-28 10:24:55*

**Authors:** Manuel Abarca Hernandez

**Comments:** 17 Pages.

In this work, two new non-baryonic DM density profiles inside the halo region of the Milky Way (MW hereafter) have been calculated, and it has been demonstrated that the two are mathematically equivalent. Data were taken from the rotation curve published in [17] Bhattacharjee, P. 2014.
The first profile is called the Direct DM density because it is obtained directly from the velocity, expressed as a power regression of radius in the halo rotation curve. In other words, the velocity of the rotation curve depends on radius as a power function.
The second one, DM density as a power of E (E being the gravitational field), was introduced by the author in previous papers, [8] Abarca, M. 2016, where it was used to study non-baryonic DM in several galaxies. It is called “as a power of E” because the DM density depends on E as a power function.
The hypothesis at the basis of the theory is that non-baryonic DM is generated locally by the gravitational field itself according to a power law, DM density = A·E^B, where A and B are coefficients and E is the gravitational field intensity.
For the reasons behind this daring statement, the reader can consult [1] Abarca, M. 2014, Dark matter model by quantum vacuum; [8] Abarca, M. 2016, Dark matter density on big galaxies depend on gravitational field as Universal law; and other papers quoted in the bibliography.
The method followed in this paper is briefly as follows. First, the rotation curve and a table of data points inside the MW halo are presented. These data come from [17] Bhattacharjee, P., Chaudary, S., Kundu, S. 2014. In addition, a power regression of the rotation-curve points in the halo region is fitted, with the function v = a·r^b.
In the fourth chapter a mathematical method is developed to obtain a new DM density depending on radius, called the Direct DM density because it is obtained directly from the power regression of velocity versus radius. The Direct DM density obtained from the rotation curve of [17] Bhattacharjee, P. 2014 is also compared with that obtained from the rotation curve of [5] Sofue, Y. 2015. It is shown that the relative difference oscillates between 2.6% at 40 kpc and 3.8% at 190 kpc, which is a very small difference. It is very good news that two prominent teams of researchers got such similar results.
In the fifth chapter it is demonstrated that the Direct DM profile is mathematically equivalent to a DM density depending on the gravitational field as a power function, i.e. DM density = A·E^B, where A and B are expressed in terms of a and b (the parameters of the power regression of velocity).
In the sixth chapter it is found that for radii bigger than 40 kpc the ratio of baryonic density to DM density is under 4%, so it is reasonable to consider the baryonic density negligible for the purpose of developing the theory introduced in this work.
In the seventh chapter the Direct DM density obtained in this paper is compared with the NFW density profile fitted by Sofue in his paper [5] Sofue, Y. 2015. Throughout the domain the NFW profile is bigger than the Direct DM profile. The relative difference oscillates between 25% at 40 kpc and 22% at 190 kpc.
In my opinion this remarkable fact could be explained because the NFW profile is fitted with the total DM enclosed inside the galactic disc, and, as is known, inside the bulge and disc there is an unknown amount of baryonic DM, such as brown dwarfs and cold gas clouds. However, the Direct DM profile is fitted with data whose radii are bigger than 40 kpc, where baryonic matter is negligible. It is clear that the extra DM density data inside the disc influence the whole NFW profile, so it is right to conclude that the relative difference between the Direct DM and NFW profiles might be explained by baryonic DM inside the bulge and disc.
In the eighth chapter the DM density as a power of E in the MW is compared with the DM density as a power of E in M31, which was published in [11] Abarca, M. 2016. Results show that at a specific E both DM densities are very similar. Relative differences are under 15% inside the main part of the domain. This fact strongly supports the author's hypothesis of DM as a power of E as a Universal law.
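The claimed equivalence of the two profiles follows from simple Newtonian algebra (assuming circular orbits and spherical symmetry): if v = a·r^b, then M(r) = v²r/G ∝ r^(2b+1), so ρ ∝ M′(r)/r² ∝ r^(2b−2), while E = GM/r² ∝ r^(2b−1); hence ρ ∝ E^((2b−2)/(2b−1)). A minimal numeric sketch, using synthetic halo points rather than Bhattacharjee's data:

```python
import math

# Synthetic halo rotation-curve points obeying v = a * r^b exactly;
# illustrative values only, not Bhattacharjee's data
a_true, b_true = 300.0, -0.1        # km/s at 1 kpc, and the exponent
radii = [40.0, 60.0, 90.0, 130.0, 190.0]            # kpc
velocities = [a_true * r ** b_true for r in radii]  # km/s

# Power regression = least-squares straight line in log-log space
n = len(radii)
lx = [math.log(r) for r in radii]
ly = [math.log(v) for v in velocities]
b = (n * sum(x * y for x, y in zip(lx, ly)) - sum(lx) * sum(ly)) / \
    (n * sum(x * x for x in lx) - sum(lx) ** 2)

# v ~ r^b  =>  M ~ v^2 r ~ r^(2b+1),  rho ~ M'/r^2 ~ r^(2b-2),
# E = GM/r^2 ~ r^(2b-1),  hence  rho ~ E^((2b-2)/(2b-1))
exp_rho_r = 2 * b - 2
exp_rho_E = (2 * b - 2) / (2 * b - 1)
print(f"b = {b:.3f}  rho ~ r^{exp_rho_r:.2f} ~ E^{exp_rho_E:.2f}")
```

The regression recovers b, and the two derived exponents show why the Direct profile (in r) and the power-of-E profile describe the same density.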

**Category:** Astrophysics

[515] **viXra:1701.0649 [pdf]**
*submitted on 2017-01-28 01:09:57*

**Authors:** Arturo Tozzi

**Comments:** 3 Pages.

Recently introduced versions of the Borsuk-Ulam theorem (BUT) state that a feature on an n-manifold projects to two features with matching description on an n+1 manifold. Starting from this rather simple “abstract” claim, a fruitful general framework has been built, able to elucidate disparate “real” physical and biological phenomena, from quantum entanglement to brain activity, from gauge theories to pre-big-bang scenarios. One of the main concerns about such a topological approach to systems features is that it speaks in rather general terms, leaving aside the peculiar features of individuals and of single physical and biological processes. In order to tackle this issue, in this brief note we ask: what does “matching description” mean? Does matching description have anything to do with “identity”?
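For n = 1 the classical theorem guarantees that any continuous function on a circle takes equal values at some pair of antipodal points. A minimal numeric sketch (the function f is arbitrary, chosen only for illustration):

```python
import math

def f(theta):
    """An arbitrary continuous function on the circle (illustration only)."""
    return math.sin(theta) + 0.5 * math.cos(3 * theta) + 0.2 * math.sin(7 * theta)

def antipodal_match(f, tol=1e-12):
    """Find theta with f(theta) = f(theta + pi).  Since
    g(t) = f(t) - f(t + pi) satisfies g(t + pi) = -g(t), g changes
    sign on [0, pi], and the intermediate value theorem gives a root."""
    g = lambda t: f(t) - f(t + math.pi)
    lo, hi = 0.0, math.pi
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

theta = antipodal_match(f)
print(abs(f(theta) - f(theta + math.pi)))  # effectively 0
```

The two antipodal points found this way carry the "matching description" (equal feature value) that the note interrogates.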

**Category:** Topology

[514] **viXra:1701.0647 [pdf]**
*submitted on 2017-01-28 03:12:53*

**Authors:** M. MADANI Bouabdallah

**Comments:** 7 Pages. Only Mr. Andrzej Schinzel (IMPAN) agreed to examine my text at the beginning of January; this resulted in 3 observations. The first 2 were resolved (lemmas 1 and 2), and the 3rd was the subject of a disagreement. I requested arbitration from Messrs. Pierre Deligne, E. Bom

J.P. Gram (1903) writes on p. 298 of his paper
'Note sur les zéros de la fonction zéta de Riemann':
'But the most interesting result yielded by this computation is that it reveals the irregularity found in the series of the α. It is very probable that these roots are intimately linked to the prime numbers.
The search for this dependence, that is, the way in which a given α is expressed by means of the prime numbers, will be the object of later studies.'
The proof of the Riemann hypothesis is, likewise, based on the definition of a map between the set P of the prime numbers and the set S of the zeros of ζ.

**Category:** Number Theory

[513] **viXra:1701.0645 [pdf]**
*submitted on 2017-01-28 04:38:58*

**Authors:** George Rajna

**Comments:** 25 Pages.

Electrical engineers at Duke University have created the world's first electromagnetic metamaterial made without any metal. The device's ability to absorb electromagnetic energy without heating up has direct applications in imaging, sensing and lighting. [14] Paint these days is becoming much more than it used to be. Already researchers have developed photovoltaic paint, which can be used to make "paint-on solar cells" that capture the sun's energy and turn it into electricity. Now in a new study, researchers have created thermoelectric paint, which captures the waste heat from hot painted surfaces and converts it into electrical energy. [13] Scientists at Aalto University, Finland, have made a breakthrough in physics. They succeeded in transporting heat maximally effectively, ten thousand times further than ever before. The discovery may lead to a giant leap in the development of quantum computers. [12] Maxwell's demon, a hypothetical being that appears to violate the second law of thermodynamics, has been widely studied since it was first proposed in 1867 by James Clerk Maxwell. But most of these studies have been theoretical, with only a handful of experiments having actually realized Maxwell's demon. [11] In 1876, the Austrian physicist Ludwig Boltzmann noticed something surprising about his equations that describe the flow of heat in a gas. Usually, the colliding gas particles eventually reach a state of thermal equilibrium, the point at which no net flow of heat energy occurs. But Boltzmann realized that his equations also predict that, when gases are confined in a specific way, they should remain in persistent non-equilibrium, meaning a small amount of heat is always flowing within the system. [10] There is also a connection between statistical physics and evolutionary biology, since the arrow of time is at work in biological evolution as well.
From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat. [8] This paper contains a review of quantum entanglement investigations in living systems, and in the quantum mechanically modeled photoactive prebiotic kernel systems. [7] The human body is a constant flux of thousands of chemical/biological interactions and processes connecting molecules, cells, organs, and fluids, throughout the brain, body, and nervous system. Up until recently it was thought that all these interactions operated in a linear sequence, passing on information much like a runner passing the baton to the next runner. However, the latest findings in quantum biology and biophysics have discovered that there is in fact a tremendous degree of coherence within all living systems. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, presenting it as a natural part of the Relativistic Quantum Theory and making it possible to understand Quantum Biology.

**Category:** Quantum Physics

[512] **viXra:1701.0644 [pdf]**
*submitted on 2017-01-28 05:05:24*

**Authors:** Adrian Ferent

**Comments:** 55 Pages. © 2015 Adrian Ferent

I discovered a new Gravitation theory after I realized that Einstein Gravitation theory is a wrong theory!
“Mass does not bend space”
Adrian Ferent
“The gravitons emitted by a black hole do not follow the curvature created by the black hole, because Einstein Gravitation theory is wrong”
Adrian Ferent
Einstein Gravitation theory is wrong because Einstein field equations are wrong.
Einstein Gravitation theory is wrong because Mass does not bend space.
Einstein Gravitation theory is wrong because Gravitational waves are NOT ripples in the curvature of spacetime that propagate as waves with the speed of light.
Einstein Gravitation theory is wrong because Einstein equivalence principle is wrong.
Einstein Gravitation theory is wrong because it is limited to the speed of light.
Because Einstein Gravitation theory is wrong, these pictures are wrong!
Physicists have not understood Gravitation; they have followed Einstein Gravitation theory for the last 100 years.
Physicists developed wrong theories like String theory and LQG; they tried to unify the forces of nature with Einstein Gravitation theory, a wrong theory.

**Category:** Quantum Gravity and String Theory

[511] **viXra:1701.0643 [pdf]**
*submitted on 2017-01-28 06:12:43*

**Authors:** Victor D. Krasnov

**Comments:** 10 Pages.

The existing laws that describe planetary motion fail to predict and explain the presence of rotational plane inclination and the angle of inclination of this plane. These laws also fail to explain planetary rotation in one plane and how planetary motion in the direction of planetary system movement affects orbital parameters. It has been found that planetary motion in the direction of planetary system motion under the effect of the star's attraction gravitational component occurs as cyclic oscillations (motion with cyclically changing speed). A planet's cyclic oscillations form the visible declination observed in the system of coordinates of the planetary system, the rotational plane inclination and the inclination angle. The results obtained demonstrate a new understanding of the mechanisms that form the orbits of planets, and show the decisive role in this process of the star's attraction gravitational component, which acts in the direction of planetary system motion. The results are new and constitute the complete law of motion of objects within planetary-type systems.

**Category:** Relativity and Cosmology

[510] **viXra:1701.0642 [pdf]**
*submitted on 2017-01-28 06:24:00*

**Authors:** George Rajna

**Comments:** 15 Pages.

A small team of researchers with affiliations to institutions in Italy, Japan and the U.S. has created a simulation that suggests that it should be possible for a single photon to simultaneously excite two atoms. [9] Molecules vibrate in many different ways—like tiny musical instruments. [8] For centuries, scientists believed that light, like all waves, couldn't be focused down smaller than its wavelength, just under a millionth of a metre. Now, researchers led by the University of Cambridge have created the world's smallest magnifying glass, which focuses light a billion times more tightly, down to the scale of single atoms. [7] A Purdue University physicist has observed a butterfly Rydberg molecule, a weak pairing of two highly excitable atoms that he predicted would exist more than a decade ago. [6] In a scientific first, a team of researchers from Macquarie University and the University of Vienna have developed a new technique to measure molecular properties – forming the basis for improvements in scientific instruments like telescopes, and with the potential to speed up the development of pharmaceuticals. [5] In the quantum world, physicists study the tiny particles that make up our classical world-neutrons, electrons, photons-either one at a time or in small numbers because the behaviour of the particles is completely different on such a small scale. If you add to the number of particles that are being studied, eventually there will be enough particles that they no longer act quantum mechanically and must be identified as classical, just like our everyday world. But where is the line between the quantum world and the classical world? A group of scientists from Okinawa Institute of Science and Technology Graduate University (OIST) explored this question by showing what was thought to be a quantum phenomenon can be explained classically. 
[4] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.

**Category:** Quantum Physics

[509] **viXra:1701.0641 [pdf]**
*submitted on 2017-01-28 06:41:02*

**Authors:** Victor D. Krasnov

**Comments:** 11 Pages. In Russian.

The existing laws that describe planetary motion fail to predict and explain the presence of rotational plane inclination and the angle of inclination of this plane. These laws also fail to explain planetary rotation in one plane and how planetary motion in the direction of planetary system movement affects orbital parameters. It has been found that planetary motion in the direction of planetary system motion under the effect of the star's attraction gravitational component occurs as cyclic oscillations (motion with cyclically changing speed). A planet's cyclic oscillations form the visible declination observed in the system of coordinates of the planetary system, the rotational plane inclination and the inclination angle. The results obtained demonstrate the new understanding of the mechanisms that form the orbits of planets, and show the decisive role in this process of the star's attraction gravitational component, which acts in the direction of planetary system motion. The results are new and are the complete law of motion of objects within planetary type systems.//
The existing laws describing planetary motion do not predict or explain the presence of the rotational-plane inclination or the magnitude of its angle. They do not explain the rotation of the planets in a single plane, nor the influence that planetary motion in the direction of the planetary system's movement has on orbital parameters.
In the course of the study it was established that the motion of the planets in the direction of the planetary system's movement (in the direction of the star's movement) takes the form of cyclic oscillations formed under the effect of the gravitational component of the star's attraction.
It is shown that a planet's cyclic oscillations in the direction of the planetary system's movement form the observed declination and the rotational-plane inclination observed in the planetary system's coordinate frame.
Calculations confirming the physical model of orbit formation have been performed.
The results obtained deepen our knowledge of the mechanisms that form planetary orbits, and show the decisive role in this process of the gravitational component of the star's attraction acting in the direction of the planetary system's movement. The results do not contradict the laws of Kepler and Newton, but are a further development of them.

**Category:** Relativity and Cosmology

[508] **viXra:1701.0640 [pdf]**
*submitted on 2017-01-28 06:44:22*

**Authors:** Amir Ali Tavajoh

**Comments:** 12 Pages.

Can we apply Kepler’s Laws to the motion of stars of a galaxy?
Is it true that luminous matter contains the total galaxy’s mass?
When we observe galaxies, we see interstellar gas, dust and stars, which are called luminous matter.
In 1922, the Dutch astronomer Jacobus Kapteyn was the first to suggest that dark matter exists.
In 1933, the Bulgarian-born Swiss-American astronomer Fritz Zwicky explained the reason for the existence of dark matter. He realized that gravitational lensing would provide the means for the most direct determination of the mass of very large galactic clusters of galaxies, including dark matter. [1]
Gravitational lensing is a consequence of Einstein's general relativity. It was first observed in 1919, when an apparent angular shift of stars close to the solar limb was measured during a solar eclipse, providing strong proof of Einstein's theory.
Astronomers measure the total mass of a galaxy by Kepler’s laws (especially the law of periods) [2].
T^2 = (4π^2 / GM) · a^3    (1)
a: semi-major axis, in Astronomical Units
M: mass, in Solar Masses
First of all, luminous matter is not equally distributed in a galaxy: astronomers, while evaluating the spectra of a galaxy's stars, found that the stars have different masses.
Also, based on Kepler's laws, stars located closer to the galactic center should have a higher orbital velocity than stars located further from the center; but when astronomers determined the orbital velocities of both kinds of stars from the Doppler effect, by analyzing the absorption lines in their spectra, the velocities turned out to be the same [3].
We come to the conclusion that there should be matter which is not luminous (because it does not have any electromagnetic interaction) that lets this phenomenon take place.
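The Keplerian argument above can be sketched numerically. The following is a minimal Python illustration of equation (1) and of the Keplerian velocity falloff that the observed flat rotation curves contradict; the function names and the galactic mass/radius figures are our own placeholders, not values from the paper.

```python
import math

# Physical constants (standard rounded values)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
AU = 1.496e11      # astronomical unit, m
YEAR = 365.25 * 24 * 3600  # Julian year, s

def kepler_period(a_au, m_solar):
    """Orbital period in years from Kepler's third law, T^2 = (4*pi^2/GM) * a^3."""
    a = a_au * AU
    m = m_solar * M_SUN
    return 2 * math.pi * math.sqrt(a**3 / (G * m)) / YEAR

def circular_velocity(r, m_enclosed):
    """Keplerian circular orbital speed (m/s) at radius r (m) around mass m_enclosed (kg)."""
    return math.sqrt(G * m_enclosed / r)

# Sanity check: 1 AU around 1 solar mass gives ~1 year.
print(round(kepler_period(1.0, 1.0), 2))  # ≈ 1.0

# Keplerian expectation: speed falls as 1/sqrt(r). Observed galactic rotation
# curves are instead roughly flat, which is the argument for dark matter.
# (The enclosed mass and radii below are made-up placeholder values.)
print(circular_velocity(5e19, 1e41) > circular_velocity(2e20, 1e41))  # True
```

A flat rotation curve would require the enclosed mass to grow linearly with radius, which the luminous matter alone cannot supply.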

**Category:** Relativity and Cosmology

[507] **viXra:1701.0639 [pdf]**
*submitted on 2017-01-27 13:45:38*

**Authors:** Gerhard Jan Smit, Jelle Ebel van der Schoot

**Comments:** 15 Pages.

In this article a particle will be presented through which all forces are explained in a satisfactory way. It concerns the so-called dimensional basic (db). After much reflection, Gerhard Jan Smit and Jelle Ebel van der Schoot are of the opinion that with this theory, the foundation of the observed particles and forces has been found.
The accompanying formula is:
In the formula Kr = curvature, x,y,z are coordinates in space/time.
Implications:
- The observed cosmic redshift is a gravitational redshift;
- the cosmic background is formed through the mutual interactions of the 1-db-particles;
- the neutron consists (notwithstanding the current insights) of a foursome of quarks (2 quarks up, 2 quarks down);
- complex particles -rationalized from the basis- can be mathematically determined and simulated;
- the entanglement of particles is caused by curvatures, changes that one of the “partner-particles” experiences will instantaneously be transmitted to the other “partner-particle”;
- electromagnetic fields around energized wires are caused by aspirated 1-db particles. By winding an energized wire into a coil, the electromagnetic fields accumulate, resulting in the fields observed around an energized coil.

**Category:** Quantum Gravity and String Theory

[506] **viXra:1701.0638 [pdf]**
*submitted on 2017-01-27 14:30:51*

**Authors:** John A. Gowan

**Comments:** 5 Pages.

The charges of matter are symmetry debts of light. Gravity is matter's memory that it once was light.

**Category:** General Science and Philosophy

[505] **viXra:1701.0637 [pdf]**
*replaced on 2017-03-26 13:21:27*

**Authors:** Emmanuil Manousos

**Comments:** 215 Pages.

With the term "Law of Selfvariations" we mean an exactly determined increase of the rest mass and the absolute value of the electric charge of material particles. In this article we present the basic theoretical investigation of the law of selfvariations. We arrive at the central conclusion that the interaction of material particles, the corpuscular structure of matter, and the quantum phenomena can be justified by the law of Selfvariations. We predict a unified interaction between particles with a unified mechanism (the Unified Selfvariation Interaction, USVI). Every interaction is described by three distinct terms, with distinct consequences, in the USVI. The theory predicts a wave equation whose special cases are the Maxwell equations, the Schrödinger equation and the related wave equations. The theory provides a mathematical expression for any conserved physical quantity, and for the current density 4-vector in every case. The corpuscular structure and wave behaviour of matter, and the relation between them, emerge clearly, and the theory also predicts the rest masses of material particles. We prove an "internal symmetry" theorem which justifies the cosmological data. The study we present can be the basis for further investigation of the theory and its consequences.

**Category:** Quantum Physics

[504] **viXra:1701.0636 [pdf]**
*replaced on 2017-01-28 08:51:07*

**Authors:** Allen Graycek

**Comments:** 8 Pages.

Very recent studies of halos around galaxies found them to be much more extensive than previous studies indicated, and to have enormous mass consisting of gas and dust from supernovas (SNs). The results of these studies, combined with a recent study of the SN rate of occurrence, can be used to determine age; for the Milky Way, a conservative preliminary calculation gives roughly six trillion years. Other strong evidence indicating a great age exists as well, yet whether any of this will shake up or change current beliefs remains to be seen.

**Category:** Relativity and Cosmology

[503] **viXra:1701.0635 [pdf]**
*submitted on 2017-01-27 09:05:04*

**Authors:** George Rajna

**Comments:** 15 Pages.

Researchers from Ludwig-Maximilians-Universitaet (LMU) Munich have, for the first time, measured the lifetime of an excited state in the nucleus of an unstable element. This is a major step toward a nuclear clock that could keep even better time than today's best atomic timekeepers. [12] The work elucidates the interplay between collective and single-particle excitations in nuclei and proposes a quantitative theoretical explanation. It has as such great potential to advance our understanding of nuclear structure. [11] When two protons approaching each other pass close enough together, they can " feel " each other, similar to the way that two magnets can be drawn closely together without necessarily sticking together. According to the Standard Model, at this grazing distance, the protons can produce a pair of W bosons. [10] The fact that the neutron is slightly more massive than the proton is the reason why atomic nuclei have exactly those properties that make our world and ultimately our existence possible. Eighty years after the discovery of the neutron, a team of physicists from France, Germany, and Hungary headed by Zoltán Fodor, a researcher from Wuppertal, has finally calculated the tiny neutron-proton mass difference. [9] Taking into account the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Lattice QCD gives the same results as the diffraction patterns of the electromagnetic oscillators, explaining the color confinement and the asymptotic freedom of the Strong Interactions.

**Category:** High Energy Particle Physics

[502] **viXra:1701.0634 [pdf]**
*submitted on 2017-01-27 10:05:33*

**Authors:** Mark M. Grinshtein

**Comments:** 7 Pages. in Russian

The article analyzes the possibility of determining the duration of human life. It is shown that no decisive answer can exist due to the multisided nature of this issue; however, classical medicine affirms such a possibility through the telomere theory. The article shows that this theory is groundless from the point of view of Information-Wave Medicine (IWM).

**Category:** Physics of Biology

[501] **viXra:1701.0633 [pdf]**
*submitted on 2017-01-27 05:15:17*

**Authors:** Leo Vuyk

**Comments:** 21 Pages.

Construction principles for chiral “atoms of spacetime” based on geometrical 3-D chiral vacuum lattice models and consequences for spacetime, general relativity based space curvature and time variation, including cyclic Multi- Universal time, and local time.

**Category:** Astrophysics

[500] **viXra:1701.0632 [pdf]**
*submitted on 2017-01-27 05:26:55*

**Authors:** George Rajna

**Comments:** 34 Pages.

New theoretical work shows how much faster quantum information can travel through a system than classical information. [25] Characterising quantum channels with non-separable states of classical light the researchers demonstrate the startling result that sometimes Nature cannot tell the difference between particular types of laser beams and quantum entangled photons. [24] Physicists at Princeton University have revealed a device they've created that will allow a single electron to transfer its quantum information to a photon. [23] A strong, short light pulse can record data on a magnetic layer of yttrium iron garnet doped with Co-ions. This was discovered by researchers from Radboud University in the Netherlands and Bialystok University in Poland. The novel mechanism outperforms existing alternatives, allowing the fastest read-write magnetic recording accompanied by unprecedentedly low heat load. [22] It goes by the unwieldy acronym STT-MRAM, which stands for spin-transfer torque magnetic random access memory. [21] Memory chips are among the most basic components in computers. The random access memory is where processors temporarily store their data, which is a crucial function. Researchers from Dresden and Basel have now managed to lay the foundation for a new memory chip concept. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. 
[17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15]

**Category:** Quantum Physics

[499] **viXra:1701.0631 [pdf]**
*replaced on 2017-03-03 08:56:47*

**Authors:** Mario Everaldo de Souza

**Comments:** 21 Pages. I corrected more the grammar and included an equation that I had forgotten.

A new cosmological model is proposed for the dynamics of the Universe and the formation and evolution of galaxies. It is shown that the matter of the Universe contracts and expands in cycles, and that galaxies in a particular cycle may have imprints from the previous cycle. It is proposed that RHIC's liquid gets trapped in the cores of galaxies in the beginning of each cycle and is liberated throughout time and is, thus, the power engine of AGNs. It is also proposed that the large-scale structure is a permanent property of the Universe, and thus, it is not created. It is proposed that spiral galaxies and elliptical galaxies are formed by mergers of nucleon vortices (vorteons) at the time of the big squeeze and immediately afterwards, and that the merging process, in general, lasts an extremely long time of many billions of years. The origin of quasars is explained and the evaporation rate of RHIC's liquid is calculated. The large mass at the center of quasar PDS 456 is calculated and agrees in order of magnitude with that attributed to a supposed black hole. It is concluded that the Universe is eternal and that space should be infinite or almost so.

**Category:** Relativity and Cosmology

[498] **viXra:1701.0630 [pdf]**
*submitted on 2017-01-26 22:23:47*

**Authors:** Kelvin Kian Loong Wong

**Comments:** 17 Pages. French translation for abstract and keywords

This paper provides a potential pathway to a formal simple proof of Fermat's Last Theorem. The geometrical formulations of n-dimensional hypergeometrical models in relation to Fermat's Last Theorem are presented. By imposing geometrical constraints pertaining to the spatial allowance of these hypersphere configurations, it can be shown that a violation of the constraints confirms the theorem to be true for n equal to infinity.

**Category:** Number Theory

[497] **viXra:1701.0629 [pdf]**
*submitted on 2017-01-27 00:13:53*

**Authors:** Ryan C. Rankin

**Comments:** 14 Pages.

Using the Friedmann-Lemaître-Robertson-Walker (FLRW) universe as a background metric, purely general relativistic (classical) scalar metric perturbations are investigated for small bodies. For the approximation of a point-like perturbing mass in the closed FLRW universe, the scalar perturbation may be written in a form obeying precisely the Dirac equation up to a factor playing the role of Planck's constant. A physical interpretation suggests the scalar perturbation in this form is the wavefunction of quantum mechanics. Such an interpretation indicates the nonlocality of gravitational energy/momentum in General relativity leads naturally to the indeterminacy of quantum mechanics. Some physical consequences and predictions are discussed and briefly explored.
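For reference, the Dirac equation that the scalar perturbation is said to obey has the standard form below; the abstract's claim is that an overall factor, written here as an effective constant ħ_eff, plays the role of Planck's constant (a notational sketch in conventional notation, not taken from the paper):

```latex
\left( i\hbar_{\mathrm{eff}}\,\gamma^{\mu}\partial_{\mu} - mc \right)\psi = 0
```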

**Category:** Relativity and Cosmology

[496] **viXra:1701.0628 [pdf]**
*submitted on 2017-01-26 13:37:52*

**Authors:** George Rajna

**Comments:** 18 Pages.

Are time crystals just a mathematical curiosity, or could they actually physically exist? Physicists have been debating this question since 2012, when Nobel laureate Frank Wilczek first proposed the idea of time crystals. He argued that these hypothetical objects can exhibit periodic motion, such as moving in a circular orbit, in their state of lowest energy, or their "ground state." [28] Researchers from the Foundation for Fundamental Research on Matter and the University of Amsterdam (the Netherlands), together with researchers from the Institute for Materials Science in Tsukuba (Japan), have discovered an exceptional new quantum state within a superconducting material. This exceptional quantum state is characterised by a broken rotational symmetry – in other words, if you turn the material in a magnetic field, the superconductivity isn't the same everywhere in the material. [27] Researchers have produced the first direct evidence of a state of electronic matter first predicted by theorists in 1964. The discovery, described in a paper published online April 13, 2016, in Nature, may provide key insights into the workings of high-temperature superconductors. [26] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, causing naturally the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the wave particle duality and the electron's spin also, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. 
Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Condensed Matter

[495] **viXra:1701.0627 [pdf]**
*submitted on 2017-01-26 13:57:24*

**Authors:** Terubumi Honjou

**Comments:** 3 Pages.

A dark energy pulsation hypothesis challenges the mystery of the cosmic large-scale structure.
A hundred billion galaxies are distributed over bubble-like voids that make up the design structure of space.
Galaxies are distributed on the lattice of the bubbles; no galaxy exists inside a bubble.
That is a mystery of cosmic physics.
An oscillatory universe model challenges these mysteries: mini-big bangs repeat due to the pulsation of the microcosm, and in each of them bubbles form around the galaxies created by the previous mini-big bang, centered in the lattice of bubbles.

**Category:** Astrophysics

[494] **viXra:1701.0626 [pdf]**
*submitted on 2017-01-26 13:54:34*

**Authors:** Edgar Valdebenito

**Comments:** 18 Pages.

In this note we present some formulas related to the recurrences:
(i) u(n+5) = u(n+3) + u(n+2) + u(n), with u(0) = u(1) = u(2) = u(3) = 0 and u(4) = 1;
(ii) v(n+5) = v(n+4) + v(n+1) + v(n), with v(0) = v(1) = v(2) = v(3) = 0 and v(4) = 1.
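The two sequences can be generated directly from the stated recurrences and initial values. A short Python sketch (the function names `u_seq` and `v_seq` are ours, not the author's):

```python
def u_seq(n):
    """(i) u(n+5) = u(n+3) + u(n+2) + u(n), with u(0..3) = 0 and u(4) = 1."""
    u = [0, 0, 0, 0, 1]
    while len(u) <= n:
        k = len(u) - 5  # index such that the term being appended is u(k+5)
        u.append(u[k + 3] + u[k + 2] + u[k])
    return u[n]

def v_seq(n):
    """(ii) v(n+5) = v(n+4) + v(n+1) + v(n), with v(0..3) = 0 and v(4) = 1."""
    v = [0, 0, 0, 0, 1]
    while len(v) <= n:
        k = len(v) - 5
        v.append(v[k + 4] + v[k + 1] + v[k])
    return v[n]

print([u_seq(i) for i in range(10)])  # [0, 0, 0, 0, 1, 0, 1, 1, 1, 3]
print([v_seq(i) for i in range(10)])  # [0, 0, 0, 0, 1, 1, 1, 1, 2, 4]
```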

**Category:** General Mathematics

[493] **viXra:1701.0625 [pdf]**
*submitted on 2017-01-26 14:24:39*

**Authors:** George Rajna

**Comments:** 26 Pages.

According to a new study led by scientists at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and at the University of California, Berkeley, electrons in vanadium dioxide can conduct electricity without conducting heat. [14]
Paint these days is becoming much more than it used to be. Already researchers have developed photovoltaic paint, which can be used to make "paint-on solar cells" that capture the sun's energy and turn it into electricity. Now in a new study, researchers have created thermoelectric paint, which captures the waste heat from hot painted surfaces and converts it into electrical energy. [13]
Scientists at Aalto University, Finland, have made a breakthrough in physics. They succeeded in transporting heat maximally effectively ten thousand times further than ever before. The discovery may lead to a giant leap in the development of quantum computers. [12]
Maxwell's demon, a hypothetical being that appears to violate the second law of thermodynamics, has been widely studied since it was first proposed in 1867 by James Clerk Maxwell. But most of these studies have been theoretical, with only a handful of experiments having actually realized Maxwell's demon. [11]
In 1876, the Austrian physicist Ludwig Boltzmann noticed something surprising about his equations that describe the flow of heat in a gas. Usually, the colliding gas particles eventually reach a state of thermal equilibrium, the point at which no net flow of heat energy occurs. But Boltzmann realized that his equations also predict that, when gases are confined in a specific way, they should remain in persistent non-equilibrium, meaning a small amount of heat is always flowing within the system. [10]
There is also a connection between statistical physics and evolutionary biology, since the arrow of time operates in biological evolution as well.
From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: The former tend to be much better at capturing energy from their environment and dissipating that energy as heat. [8]
This paper contains the review of quantum entanglement investigations in living systems, and in the quantum mechanically modeled photoactive prebiotic kernel systems. [7]
The human body is a constant flux of thousands of chemical/biological interactions and processes connecting molecules, cells, organs, and fluids, throughout the brain, body, and nervous system. Up until recently it was thought that all these interactions operated in a linear sequence, passing on information much like a runner passing the baton to the next runner. However, the latest findings in quantum biology and biophysics have discovered that there is in fact a tremendous degree of coherence within all living systems.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron’s spin also, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making possible to understand the Quantum Biology.

**Category:** Condensed Matter

[492] **viXra:1701.0624 [pdf]**
*submitted on 2017-01-26 14:52:05*

**Authors:** Fu Yuhua

**Comments:** 125 Pages.

Consider all possible situations and select the best outcome: this is our starting point for establishing New Newton Mechanics and for solving many complicated problems. Practice has proved that this idea is successful.

**Category:** Classical Physics

[491] **viXra:1701.0623 [pdf]**
*submitted on 2017-01-26 14:52:51*

**Authors:** George Rajna

**Comments:** 36 Pages.

Nearly a century after it was theorized, Harvard scientists have succeeded in creating the rarest (and potentially one of the most valuable) materials on the planet. [23] ORNL researchers have discovered a new type of quantum critical point, a new way in which materials change from one state of matter to another. [22] New research conducted at the University of Chicago has confirmed a decades-old theory describing the dynamics of continuous phase transitions. [21] Whether it is acoustic waves, quantum matter waves or optical waves of a laser, all kinds of waves can be in different states of oscillation, corresponding to different frequencies. Calculating these frequencies is part of the tools of the trade in theoretical physics. Recently, however, a special class of systems has caught the attention of the scientific community, forcing physicists to abandon well-established rules. [20] Until quite recently, creating a hologram of a single photon was believed to be impossible due to fundamental laws of physics. However, scientists at the Faculty of Physics, University of Warsaw, have successfully applied concepts of classical holography to the world of quantum phenomena. A new measurement technique has enabled them to register the first-ever hologram of a single light particle, thereby shedding new light on the foundations of quantum mechanics. [19] A combined team of researchers from Columbia University in the U.S. and the University of Warsaw in Poland has found that there appear to be flaws in the traditional theory that describes how photodissociation works. [18] Ultra-peripheral collisions of lead nuclei at the LHC accelerator can lead to elastic collisions of photons with photons. [17] Physicists from Trinity College Dublin's School of Physics and the CRANN Institute, Trinity College, have discovered a new form of light, which will impact our understanding of the fundamental nature of light. 
[16] Light from an optical fiber illuminates the metasurface, is scattered in four different directions, and the intensities are measured by the four detectors. From this measurement the state of polarization of light is detected. [15] Converting a single photon from one color, or frequency, to another is an essential tool in quantum communication, which harnesses the subtle correlations between the subatomic properties of photons (particles of light)

**Category:** Condensed Matter

[490] **viXra:1701.0622 [pdf]**
*submitted on 2017-01-26 11:03:48*

**Authors:** David Brown

**Comments:** 7 Pages.

Is F-theory somehow related to the fact that 5^9 divides the order of the monster group? Is Milgrom underestimated by most astrophysicists? Does the Koide formula suggest that string vibrations are confined to 3 copies of the Leech lattice? Is Lestone’s heuristic string theory somehow related to the fact that 7^6 divides the order of the monster group? I say that my 3 most important ideas are: (1) Milgrom is the Kepler of contemporary cosmology. (2) The Koide formula is essential for understanding the foundations of physics. (3) Lestone’s heuristic string theory is essential for understanding the foundations of physics. Do most physicists agree with me concerning the preceding 3 ideas? No, but the passage of time should settle the status of each of the 3 ideas. For the sake of argument, let us assume that Milgrom’s MOND is empirically valid and that conventional physics cannot explain MOND. I have speculated that MOND is explained by the Fernández-Rañada-Milgrom effect and by string theory with the finite nature hypothesis. It seems to me that my previous attempts at explaining a multiverse model for MOND are somewhat unsatisfactory. In this communication I speculate on how the geometry of strings and branes might be a smoothing of a Wolframian model involving the monster group and the 6 pariah groups. I also attempt to clarify my speculation on how the Fernández-Rañada-Milgrom effect might explain the flyby anomaly.
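The two divisibility facts invoked above are quick to confirm from the standard prime factorization of the order of the monster group, |M| = 2^46 · 3^20 · 5^9 · 7^6 · 11^2 · 13^3 · 17 · 19 · 23 · 29 · 31 · 41 · 47 · 59 · 71. A minimal check:

```python
# Standard prime factorization of the order of the monster group M.
MONSTER_FACTORS = {
    2: 46, 3: 20, 5: 9, 7: 6, 11: 2, 13: 3,
    17: 1, 19: 1, 23: 1, 29: 1, 31: 1, 41: 1, 47: 1, 59: 1, 71: 1,
}

def monster_order():
    """Rebuild |M| (about 8.08e53) from its prime factorization."""
    order = 1
    for p, e in MONSTER_FACTORS.items():
        order *= p ** e
    return order

order = monster_order()
print(order % 5**9 == 0, order % 7**6 == 0)    # True True
print(order % 5**10 == 0, order % 7**7 == 0)   # False False
```

The exact powers of 5 and 7 in |M| are 9 and 6, which is the divisibility the abstract's questions turn on.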

**Category:** Quantum Gravity and String Theory

[489] **viXra:1701.0621 [pdf]**
*replaced on 2017-02-06 17:28:19*

**Authors:** Colin Walker

**Comments:** 11 Pages.

A two-dimensional vector can be made from a constant signal component plus a randomly oriented noise component. This simple model can exploit detection and post-selection loopholes to produce Bell correlations within 0.01 of the theoretical cosine expected from quantum mechanics. The model is shown to be in accord with McEachern's hypothesis that quantum correlations are associated with processes which can provide only one bit of information per sample.
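The described construction, a constant signal vector plus a randomly oriented noise vector with a detection threshold supplying the loophole, can be illustrated with a small Monte Carlo. The amplitudes, threshold and sign conventions below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

def correlation(delta, n=200_000, noise=1.0, thresh=1.0):
    """Post-selected correlation between two wings whose analyzers differ by delta.

    Each wing projects (signal + noise) onto its analyzer direction; events whose
    projection falls below the detection threshold are discarded, which is the
    detection/post-selection loophole being exploited.
    """
    lam = rng.uniform(0, 2 * np.pi, n)         # shared hidden signal direction
    phi_a = rng.uniform(0, 2 * np.pi, n)       # independent noise phases
    phi_b = rng.uniform(0, 2 * np.pi, n)
    s_a = np.cos(0.0 - lam) + noise * np.cos(phi_a)      # analyzer A at angle 0
    s_b = -np.cos(delta - lam) + noise * np.cos(phi_b)   # B carries the opposite signal
    detected = (np.abs(s_a) >= thresh) & (np.abs(s_b) >= thresh)
    return np.mean(np.sign(s_a[detected]) * np.sign(s_b[detected]))

for d in (0.0, np.pi / 4, np.pi / 2):
    print(f"E({d:.2f}) = {correlation(d):+.3f}")
```

With these assumed settings the post-selected correlation reaches −1 at zero analyzer offset and roughly 0 at 90°, and at intermediate angles it dips below the triangular correlation of a plain sign model toward the quantum cosine; reproducing the paper's 0.01 agreement would require Walker's actual parameters.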

**Category:** Quantum Physics

[488] **viXra:1701.0620 [pdf]**
*submitted on 2017-01-26 10:02:09*

**Authors:** George Rajna

**Comments:** 11 Pages.

A team of theoretical physicists has proposed a way to simulate black holes on an electronic chip. Additionally, the technology used to create these lab-made black holes may be useful for quantum technologies. [12]
To carry out this experiment, Chen and Mourou suggest a laser pulse could be sent through a plasma target. [11]
Jeff Steinhauer, a physicist at the Israel Institute of Technology, has published a paper in the journal Nature Physics describing experiments in which he attempted to create a virtual black hole in the lab in order to prove that Stephen Hawking's theory of radiation emanating from black holes is correct —though his experiments are based on sound, rather than light. In his paper, he claims to have observed the quantum effects of Hawking radiation in his lab as part of a virtual black hole—which, if proven to be true, will be the first time it has ever been achieved.
New Research Mathematically Proves Quantum Effects Stop the Formation of Black Holes. By merging two seemingly conflicting theories, Laura Mersini-Houghton, a physics professor at UNC-Chapel Hill in the College of Arts and Sciences, has proven, mathematically, that black holes can never come into being in the first place. The work not only forces scientists to reimagine the fabric of space-time, but also to rethink the origins of the universe.
Considering the positive logarithmic values as the measure of entropy and the negative logarithmic values as the measure of information we get the Information – Entropy Theory of Physics, used first as the model of the computer chess program built in the Hungarian Academy of Sciences.
Applying this model to physics, we gain an understanding of the perturbation theory of QED and QCD as the Information measure of Physics, and an insight into current research in Quantum Information Science. The generalization of the Weak Interaction shows the arrow of time in the associated research fields of biophysics and others. We also discuss the event horizon of Black Holes, which closes the information inside.

**Category:** Astrophysics

[487] **viXra:1701.0619 [pdf]**
*submitted on 2017-01-25 17:22:50*

**Authors:** Sylwester Kornowski

**Comments:** 3 Pages.

Here, applying the Scale-Symmetric Theory (SST), we derive a formula that converts the SST spatial distance into the SST light travel time, which for redshifts up to 0.6415 is about 14-17% longer than the General Relativity (GR) light travel time. This causes Type Ia supernovae to appear fainter than they should be, leading to an illusion of acceleration of the expansion of the Universe about 6-7 Gyr ago. SST shows that in reality, to describe the expansion of the Universe correctly, we must take into account the initial conditions for the expansion, the mechanisms of creation of photons, and the quantum entanglement of photons in pairs. We show that there is a stepwise change in the light travel time at a redshift of about 0.64, suggesting that there is no smooth transition from the near Universe to the distant Universe, which is inconsistent with GR. The GR formula correctly describes galaxies at the same spatial distance moving with different recessional velocities, i.e. it concerns the distant Universe.

**Category:** Quantum Gravity and String Theory

[486] **viXra:1701.0618 [pdf]**
*replaced on 2017-03-11 10:01:21*

**Authors:** Juan G. Orozco

**Comments:** 9 Pages.

This paper introduces proofs to several open problems in number theory, particularly the Goldbach Conjecture and the Twin Prime Conjecture. These two conjectures are proven by using a greedy elimination algorithm and incorporating Mertens' third theorem and the twin prime constant. The argument is extended to Germain primes, cousin primes, and other prime-related conjectures. A generalization is provided for all algorithms that result in an Euler product of the form $\prod\left(1-\frac{a}{p}\right)$.
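As a numerical illustration of the Euler-product form the abstract invokes: the twin prime constant, C2 = prod over primes p > 2 of (1 − 1/(p−1)^2) ≈ 0.6601618, is a product of exactly this shape, and a truncated product converges quickly. A minimal sketch (the truncation limit is an arbitrary choice):

```python
def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return [i for i, flag in enumerate(sieve) if flag]

def twin_prime_constant(limit):
    """Truncated product  prod_{2 < p <= limit} (1 - 1/(p-1)^2)."""
    c2 = 1.0
    for p in primes_up_to(limit):
        if p > 2:
            c2 *= 1.0 - 1.0 / (p - 1) ** 2
    return c2

print(twin_prime_constant(10**5))   # close to C2 = 0.6601618...
```

Since every factor is below 1, the truncated product approaches C2 from above; the tail beyond 10^5 contributes less than about 10^-6.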

**Category:** Number Theory

[485] **viXra:1701.0617 [pdf]**
*replaced on 2017-02-01 10:10:22*

**Authors:** Ilya Chernykh

**Comments:** 8 Pages. In Russian language

We propose an extension of real numbers which reveals a surprising algebraic role of Bernoulli numbers, Hurwitz Zeta function, Euler-Mascheroni constant as well as generalized summations of divergent series and integrals. We extend elementary functions to the proposed numerical system and analyze some symmetries of the special elements. This reveals intriguing closed-form relations between trigonometric and inverse trigonometric functions. Besides this we show that the proposed system can be naturally used for fine comparison between countable sets in metric space which respects the intuitive notion of the set's size.

**Category:** Functions and Analysis

[484] **viXra:1701.0616 [pdf]**
*submitted on 2017-01-26 02:05:30*

**Authors:** Muhammad Akram, K. P. Shum

**Comments:** 21 Pages.

Fuzzy graph theory is used for solving real-world problems in different fields, including theoretical computer science, engineering, physics, combinatorics and medical sciences. In this paper, we present concepts of bipolar neutrosophic multigraphs, bipolar neutrosophic planar graphs and bipolar neutrosophic dual graphs, and study some of their related properties. We also describe applications of bipolar neutrosophic graphs in road networks and electrical connections.

**Category:** General Mathematics

[483] **viXra:1701.0615 [pdf]**
*submitted on 2017-01-25 14:19:03*

**Authors:** Imrich Krištof

**Comments:** 5 Pages.

Scientifically important facts and astrophysical events have been observed in the Scorpius constellation during the last twenty years. As an example I can cite the discovery of Nova Scorpii 2007, probably a binary system (V 1280 Sco), described by Japanese astronomer Yukio Samurai on 4th-5th February 2007. In 1998, many scientific journals (Science, Scientific American) published information about a homolog or equivalent of our Sun (G2V) in the Scorpius constellation. In an exceptional but possible case, the existence of a high-tech civilization or metacivilization building a super Dyson sphere could be important for radio-wave or other means of communication. There could exist extrasolar planets with advanced high-tech civilizations, based for example on neutrinic or spintronic technology. The Scorpius constellation contains these typical homologs of our Sun: Shaula (λ Scorpii), Dschubba (δ Scorpii), a tetrastellar system, Acrab (β Scorpii) and the binary sun system Wei (ε Scorpii).

**Category:** Astrophysics

[482] **viXra:1701.0614 [pdf]**
*replaced on 2017-03-20 17:17:40*

**Authors:** S Halayka

**Comments:** 9 Pages.

In this paper, Blinn's metaballs are used to model the pre-ringdown phase of the merger of $n = 2$ Schwarzschild black holes. An analytical solution is provided.
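For reference, Blinn's metaball ("blobby") field is the scalar function f(p) = sum over i of b_i · exp(−a_i · |p − c_i|^2), with a surface drawn at a chosen iso-level; two centers stand in for the two black holes. The strengths, falloffs and coordinates in this sketch are illustrative assumptions, not values from the paper:

```python
import numpy as np

def blinn_field(points, centers, strengths, falloffs):
    """Blinn blobby field f(p) = sum_i b_i * exp(-a_i * |p - c_i|^2)."""
    p = np.atleast_2d(points)[:, None, :]          # shape (P, 1, 3)
    c = np.asarray(centers, float)[None, :, :]     # shape (1, N, 3)
    r2 = ((p - c) ** 2).sum(-1)                    # squared distances, (P, N)
    return (np.asarray(strengths) * np.exp(-np.asarray(falloffs) * r2)).sum(-1)

# Two equal "holes" on the x-axis. Whether the two lobes render as one merged
# surface depends on whether the midpoint field value exceeds the iso-level.
centers = [(-1.0, 0, 0), (1.0, 0, 0)]
f = blinn_field([(0, 0, 0), (1.0, 0, 0)], centers, [1.0, 1.0], [1.0, 1.0])
print(f)   # midpoint value 2*exp(-1) ~ 0.736, on-center value 1 + exp(-4) ~ 1.018
```

Choosing the iso-level between the midpoint value and the on-center value separates the two lobes; lowering it below the midpoint value joins them through a neck, which is the kind of merged shape the pre-ringdown model uses.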

**Category:** Relativity and Cosmology

[481] **viXra:1701.0613 [pdf]**
*submitted on 2017-01-25 14:41:49*

**Authors:** P. R. Silva

**Comments:** 09 pages, 12 references.

Electrons interacting with the QED vacuum and distorting the space-time fabric are considered as a means to express the gravitational constant in terms of electromagnetic parameters. The link between gravity and weak interactions is also worked out in this paper. Finally, we extend the first treatment to hadrons, in order to evaluate the strong interaction coupling at low energies (at the energy scale of the proton mass).

**Category:** General Science and Philosophy

[480] **viXra:1701.0610 [pdf]**
*submitted on 2017-01-25 09:25:59*

**Authors:** Leonid Kanevskyy

**Comments:** 12 Pages. The paper contains the introduction and the abstract in English, the rest of the paper - in German

This paper’s aim is to establish the existence of a new phenomenon: Turbo Self-Injection. It is a new type of circulation of liquid flow that appears inside cylindrical, conical or spherical hollow rotationally symmetric solid bodies. In such bodies, a throttling water jet turns into a circular, vortex flow of liquid, for example water, and creates a static pressure drop that increases the suction capacity of injectors several fold; the injectors also have much smaller sizes and consume considerably less energy than any known models.
To achieve that, I created a mixing chamber in a pipe with a throttling disk on one end and a dead impact wall on the other end, with an outlet sideways in front of the impact wall. The mixing chamber had two more holes: one for air intake and one for liquid soap. As it turned out, the water was not coming out of the holes for air and for soap even when the throttling water jet was hitting the dead flat perpendicular wall in the pipe. Instead, the water was rotating in a vortical manner around the mixing chamber's longitudinal axis, which is parallel to the throttling water jet. In this process, sucked-in air increases the volume and velocity of the circular vortex flow several fold, and a suction pressure is created which is 20 to 30 times higher than in previously known injectors. Because of this, simultaneous suction of both air and liquid soap is possible. In the process a homogeneous fine soap foam is created, with air bubbles of 1-2 millimetre diameter. This submission is a brief summary of the experimental research that I have been conducting on my own for 14 years.

**Category:** Classical Physics

[479] **viXra:1701.0609 [pdf]**
*submitted on 2017-01-25 10:02:39*

**Authors:** George Rajna

**Comments:** 23 Pages.

Utilizing electrons on a liquid helium surface for quantum computing requires isolating individual electrons on a helium surface and controlling their quantum degrees of freedom, either motional or spin. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. 
[9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[478] **viXra:1701.0608 [pdf]**
*submitted on 2017-01-25 10:42:19*

**Authors:** Miguel A. Sanchez-Rey

**Comments:** 3 Pages.

Ideological war-crime and propaganda.

**Category:** Social Science

[477] **viXra:1701.0607 [pdf]**
*submitted on 2017-01-25 10:42:08*

**Authors:** George Rajna

**Comments:** 10 Pages.

To carry out this experiment, Chen and Mourou suggest a laser pulse could be sent through a plasma target. [11]
Jeff Steinhauer, a physicist at the Israel Institute of Technology, has published a paper in the journal Nature Physics describing experiments in which he attempted to create a virtual black hole in the lab in order to prove that Stephen Hawking's theory of radiation emanating from black holes is correct —though his experiments are based on sound, rather than light. In his paper, he claims to have observed the quantum effects of Hawking radiation in his lab as part of a virtual black hole—which, if proven to be true, will be the first time it has ever been achieved.
New Research Mathematically Proves Quantum Effects Stop the Formation of Black Holes. By merging two seemingly conflicting theories, Laura Mersini-Houghton, a physics professor at UNC-Chapel Hill in the College of Arts and Sciences, has proven, mathematically, that black holes can never come into being in the first place. The work not only forces scientists to reimagine the fabric of space-time, but also to rethink the origins of the universe.
Considering the positive logarithmic values as the measure of entropy and the negative logarithmic values as the measure of information we get the Information – Entropy Theory of Physics, used first as the model of the computer chess program built in the Hungarian Academy of Sciences.
Applying this model to physics, we gain an understanding of the perturbation theory of QED and QCD as the Information measure of Physics, and an insight into current research in Quantum Information Science. The generalization of the Weak Interaction shows the arrow of time in the associated research fields of biophysics and others. We also discuss the event horizon of Black Holes, which closes the information inside.

**Category:** Astrophysics

[476] **viXra:1701.0606 [pdf]**
*submitted on 2017-01-25 13:02:07*

**Authors:** George Rajna

**Comments:** 35 Pages.

Advanced photonic nanostructures are well on their way to revolutionising quantum technology for quantum networks based on light. Researchers from the Niels Bohr Institute have now developed the first building blocks needed to construct complex quantum photonic circuits for quantum networks. [25] Characterising quantum channels with non-separable states of classical light, the researchers demonstrate the startling result that sometimes Nature cannot tell the difference between particular types of laser beams and quantum entangled photons. [24] Physicists at Princeton University have revealed a device they've created that will allow a single electron to transfer its quantum information to a photon. [23] A strong, short light pulse can record data on a magnetic layer of yttrium iron garnet doped with Co-ions. This was discovered by researchers from Radboud University in the Netherlands and Bialystok University in Poland. The novel mechanism outperforms existing alternatives, allowing the fastest read-write magnetic recording accompanied by unprecedentedly low heat load. [22] It goes by the unwieldy acronym STT-MRAM, which stands for spin-transfer torque magnetic random access memory. [21] Memory chips are among the most basic components in computers. The random access memory is where processors temporarily store their data, which is a crucial function. Researchers from Dresden and Basel have now managed to lay the foundation for a new memory chip concept. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. 
[18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16]

**Category:** Quantum Physics

[475] **viXra:1701.0605 [pdf]**
*replaced on 2017-02-07 03:28:33*

**Authors:** Solomon I. Khmelnik

**Comments:** 4 Pages.

Based on gravitomagnetism theory, a mathematical model of a cloud is suggested. It allows answering the question posed in the title.

**Category:** Geophysics

[474] **viXra:1701.0604 [pdf]**
*submitted on 2017-01-25 05:44:04*

**Authors:** George Rajna

**Comments:** 29 Pages.

Scientists at the Center for Axion and Precision Physics Research (CAPP), within the Institute for Basic Science (IBS) have optimized some of the characteristics of a magnet to hunt for one possible component of dark matter called the axion. [21] The first sighting of clustered dwarf galaxies bolsters a leading theory about how big galaxies such as our Milky Way are formed, and how dark matter binds them, researchers said Monday. [20] Scientists from The University of Manchester working on a revolutionary telescope project have harnessed the power of distributed computing from the UK's GridPP collaboration to tackle one of the Universe's biggest mysteries – the nature of dark matter and dark energy. [18] In the search for the mysterious dark matter, physicists have used elaborate computer calculations to come up with an outline of the particles of this unknown form of matter. [17] Unlike x-rays that the naked eye can't see but equipment can measure, scientists have yet to detect dark matter after three decades of searching, even with the world's most sensitive instruments. [16] Scientists have lost their latest round of hide-and-seek with dark matter, but they're not out of the game. [15] A new study is providing evidence for the presence of dark matter in the innermost part of the Milky Way, including in our own cosmic neighborhood and the Earth's location. The study demonstrates that large amounts of dark matter exist around us, and also between us and the Galactic center. The result constitutes a fundamental step forward in the quest for the nature of dark matter. [14] Researchers may have uncovered a way to observe dark matter thanks to a discovery involving X-ray emissions. [13] Between 2009 and 2013, the Planck satellite observed relic radiation, sometimes called cosmic microwave background (CMB) radiation. Today, with a full analysis of the data, the quality of the map is now such that the imprints left by dark matter and relic neutrinos are clearly visible. 
[12] The gravitational force attracts matter, causing concentration of matter in a small space and leaving much space with low matter concentration: dark matter and energy. The asymmetry between the masses of the electric charges, for example the proton and electron, can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter. The Weak Interaction changes the temperature-dependent Planck Distribution of the electromagnetic oscillations, changing the non-compensated dark matter rate and giving the responsibility to the sterile neutrino.

**Category:** Astrophysics

[473] **viXra:1701.0603 [pdf]**
*submitted on 2017-01-23 21:19:17*

**Authors:** Ricardo Gobato

**Comments:** 9 Pages. PJSE, v.3, n.1. 1-9 (2017). Parana Journal of Science and Education.

The present work describes the equilibrium configuration of the caramboxin molecule studied using the Hartree-Fock (HF) and Density functional theory (DFT) calculations. With the DFT calculations, the total energy for the singlet state of the caramboxin molecule has been estimated to be -933.3870701 a.u. Furthermore, the binding energy of the caramboxin molecule has been estimated to be 171.636 kJ/mol. The carambola or star fruit is a fruit used for human consumption in juices, desserts, pastries, custards, jellies, or even in natural consumption. Recent research indicates that it has great toxicity for people with kidney failure, and may even lead to death. Experiments demonstrated that it has glutamatergic effects, which means that it affects the function of the neurotransmitter glutamate, thus explaining the neurological effects. Our calculations indicate that the main active sites in caramboxin are the -OH (alcohol) groups and the two carboxyl (-COOH) groups.

**Category:** Condensed Matter

[472] **viXra:1701.0602 [pdf]**
*submitted on 2017-01-24 00:00:25*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In this paper I conjecture that for any 3-Carmichael number (absolute Fermat pseudoprime with three prime factors, see the sequence A087788 in OEIS) of the form (4*h + 1)*(4*j + 1)*(4*k + 1), it holds that h, j and k must share a common factor (in fact, for seven out of a randomly chosen set of ten consecutive, reasonably large such numbers, both j and k are multiples of h). The conjecture is probably true even for the larger set of 3-Poulet numbers (Fermat pseudoprimes to base 2 with three prime factors, see the sequence 215672 in OEIS).
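The conjecture is cheap to probe by brute force: enumerate 3-factor Carmichael numbers via Korselt's criterion, keep those whose prime factors are all ≡ 1 (mod 4), and test one reading of "share a common factor" (gcd(h, j, k) > 1, or one of h, j, k dividing the other two, which matches the parenthetical about j and k being multiples of h). The precise intended statement is the author's; this sketch only checks that reading:

```python
from math import gcd

def primes_up_to(n):
    """Sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(sieve[p * p::p]))
    return [i for i, flag in enumerate(sieve) if flag]

def carmichael_3(limit):
    """3-factor Carmichael numbers n = p*q*r <= limit (Korselt: p-1 | n-1)."""
    ps = [p for p in primes_up_to(limit // 15) if p > 2]
    found = []
    for i, p in enumerate(ps):
        if p ** 3 > limit:
            break
        for j, q in enumerate(ps[i + 1:], i + 1):
            if p * q * q > limit:
                break
            for r in ps[j + 1:]:
                n = p * q * r
                if n > limit:
                    break
                if all((n - 1) % (f - 1) == 0 for f in (p, q, r)):
                    found.append((n, p, q, r))
    return sorted(found)

def shares_factor(p, q, r):
    """One reading of the conjecture for p, q, r all = 1 (mod 4)."""
    h, j, k = (p - 1) // 4, (q - 1) // 4, (r - 1) // 4
    if gcd(gcd(h, j), k) > 1:
        return True
    return any(x and all(v % x == 0 for v in (h, j, k)) for x in (h, j, k))

cams = carmichael_3(200_000)
mod4 = [t for t in cams if all(f % 4 == 1 for f in t[1:])]
print(len(cams), len(mod4))                                  # -> 14 7
print(all(shares_factor(p, q, r) for _, p, q, r in mod4))    # -> True
```

Up to 2×10^5 the qualifying numbers are 1105, 2465, 10585, 29341, 46657, 115921 and 162401, and every one of them passes this reading of the conjecture (162401 = 17·41·233 is the first case where the common factor, 2, is nontrivial and h does not divide j and k).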

**Category:** Number Theory

[471] **viXra:1701.0600 [pdf]**
*submitted on 2017-01-24 02:35:20*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In this paper I conjecture that for any 3-Carmichael number (absolute Fermat pseudoprime with three prime factors, see the sequence A087788 in OEIS) of the form (4*h + 3)*(4*j + 1)*(4*k + 3), it holds that (k – h) and j must share a common factor (sometimes (k – h) is a multiple of j). The conjecture is probably true even for the larger set of 3-Poulet numbers (Fermat pseudoprimes to base 2 with three prime factors, see the sequence 215672 in OEIS).

**Category:** Number Theory

[470] **viXra:1701.0599 [pdf]**
*replaced on 2017-02-09 17:17:05*

**Authors:** Furkan Semih Dündar, Barış Tamer Tonguç

**Comments:** 2 Pages.

A model universe with N protons and electrons, with electromagnetic and spin interactions in the Hamiltonian, is investigated in the context of quantum shape kinematics. We have found that quantum shape space exists for N≥4 particles and has 2N−7 functional degrees of freedom in the case of spin-1/2 particles. The emergence of space is associated with a non-vanishing expectation value ⟨L^2⟩. We have shown that for odd N space always emerges, and for large even N space almost always emerges, because ⟨L^2⟩≠0 for almost all states. In the limit N→∞ the density of states that yield ⟨L^2⟩=0 vanishes. Therefore we conclude that space is almost always emergent in quantum shape kinematics.

**Category:** Quantum Gravity and String Theory

[469] **viXra:1701.0597 [pdf]**
*replaced on 2017-02-15 12:59:03*

**Authors:** Muhammad Akram, Saba Siddique

**Comments:** 16 Pages.

Neutrosophic sets are a generalization of the concepts of fuzzy sets and intuitionistic fuzzy sets. Neutrosophic models give more flexibility, precision and compatibility to the system as compared to the classical, fuzzy and intuitionistic fuzzy models. In this research paper, we present certain types of single-valued neutrosophic graphs, including regular single-valued neutrosophic graphs, totally regular single-valued neutrosophic graphs, edge regular single-valued neutrosophic graphs and totally edge regular single-valued neutrosophic graphs. We also investigate some of their related properties.

**Category:** General Mathematics

[468] **viXra:1701.0596 [pdf]**
*submitted on 2017-01-24 08:29:10*

**Authors:** Wan-Chung Hu

**Comments:** 8 Pages.

Charge relativity means that charge can cause a space-time vortex. Using this concept, we can solve the puzzle of dark matter. Charge causes space-time torsion. By combining the Einstein field equation and the Faraday electromagnetic torsion tensor, we can get a complete universe field equation including gravity, light pressure, and electromagnetism.

**Category:** Classical Physics

[467] **viXra:1701.0595 [pdf]**
*submitted on 2017-01-24 08:33:34*

**Authors:** Satyavarapu Naga Parameswara Gupta

**Comments:** 9 Pages. This essay was submitted to the FQXi Forum: Wandering Towards a Goal Essay Contest (2016-2017), January 24, 2017.

There are many distant galaxies whose distances are about 30 giga light years. There are galaxies which were born just 400 million years after the Big Bang, and others born 6 to 7 billion years after it. In the Dynamic Universe Model, all the galaxies are distributed at different distances and the Universe looks similar to how it is now. On 1 March 2016, the news of the discovery of the most distant galaxy, 'GNZ11 or GN11', having a light travel distance of 13.4 giga light years, a co-moving distance of 32 giga light years and a redshift z of 11.1, created quite a stir. There are many such galaxies, like EGSY8p7 (z = 8.68, age = 13.2) and EGS-zs8-1 (z = 7.73, age = 13.04). To show that the Universe exists beyond GNz11, a galaxy at a distance of 100 times 13.4 billion light years (1.26862E+28 meters) was simulated and named GNz11 in this simulation. 132 more galaxies were assumed in the range (3.02001E+26 to 1.26862E+28) meters. Later, the distance of the first galaxy was reduced by 50%, and the graphs of the Universe became similar in both simulations after 102 iterations.
Slowly, the life of the stars, and subsequently the life of the galaxy, will come to an end because of their electromagnetic radiation. Galaxies tend to evolve from spiral to elliptical structure, and they perish to form blue clouds, a process known as galaxy "quenching". Hence we can say that our Universe has a reproduction ability, which is a very slow process. The Universe produces new galaxies, and the already formed galaxies perish slowly. Ours is a single universe and a closed one. In other words, our Universe reproduces its galaxies as and when light and other electromagnetic radiation condenses to form enough matter.

**Category:** Astrophysics

[466] **viXra:1701.0594 [pdf]**
*submitted on 2017-01-24 08:53:27*

**Authors:** Satyavarapu Naga Parameswara Gupta

**Comments:** 32 Pages. This paper is published in Open Journal of Modelling and Simulation with DOI 10.4236/ojmsi.2017.51009 . ISSN Print: 2327-4018 , ISSN Online: 2327-4026 , Gupta, S.N.P., (2017), 5, 113-144.

There are many blue-shifted galaxies in our universe. Here we review old simulations that made such predictions from the output graphs, using SITA simulations. Four new simulations are also presented. In these sets of simulations, different point masses are placed at different distances in a 3D Cartesian coordinate grid, and these point masses are allowed to move under the universal gravitation force (UGF) acting on each mass at that instant of time at its position. The output pictures depict the three-dimensional orbit formations of the point masses after some iterations. In an orbit so formed, some galaxies are coming near (blue shifted) and some are going away (red shifted). In this paper, the simulations predicted the existence of a large number of blue-shifted galaxies, in an expanding universe, in 2004 itself. Over 8300 blue-shifted galaxies extending beyond the Local Group were discovered by the Hubble Space Telescope (HST) in the year 2009. Thus the Dynamic Universe Model's predictions came true.
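The SITA code itself is not reproduced in the abstract; a minimal stand-in for the described setup (point masses in a 3D Cartesian grid moving under mutual Newtonian gravitation, with approach or recession relative to an observer mass read off as blue or red shift) might look like the following. The units, softening length and leapfrog integrator are assumptions of this sketch:

```python
import numpy as np

G = 1.0            # gravitational constant in simulation units (assumption)
SOFTENING = 0.05   # avoids singular forces at close encounters (assumption)

def accelerations(pos, mass):
    """Pairwise Newtonian acceleration on each point mass."""
    d = pos[None, :, :] - pos[:, None, :]            # d[i, j] = pos[j] - pos[i]
    r2 = (d ** 2).sum(-1) + SOFTENING ** 2
    np.fill_diagonal(r2, np.inf)                     # no self-force
    return (G * mass[None, :, None] * d / r2[..., None] ** 1.5).sum(axis=1)

def step(pos, vel, mass, dt=0.01):
    """One kick-drift-kick leapfrog step."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel

def radial_velocities(pos, vel, observer=0):
    """Radial velocity of every mass w.r.t. one observer mass:
    negative -> approaching (blue shifted), positive -> receding (red shifted)."""
    rel_p = pos - pos[observer]
    rel_v = vel - vel[observer]
    r = np.linalg.norm(rel_p, axis=1)
    r[observer] = np.inf                             # leave the observer out
    return (rel_p * rel_v).sum(axis=1) / r

# Toy run: a small cloud of equal point masses started at rest.
rng = np.random.default_rng(1)
pos = rng.uniform(-1.0, 1.0, (8, 3))
vel = np.zeros((8, 3))
mass = np.ones(8)
for _ in range(200):
    pos, vel = step(pos, vel, mass)
v_rad = radial_velocities(pos, vel)
print("blue shifted:", int((v_rad < 0).sum()), " red shifted:", int((v_rad > 0).sum()))
```

As in the abstract's description, both signs of radial velocity appear in the orbits that form, so some members of the cluster are blue shifted with respect to the observer even though others recede.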

**Category:** Astrophysics

[465] **viXra:1701.0591 [pdf]**
*submitted on 2017-01-24 11:51:47*

**Authors:** Alphonsus J. Fagan

**Comments:** 11 Pages.

The Bekenstein-Hawking formula for black hole entropy indicates that black holes have the highest possible entropy density in the universe. This suggests that, in addition to increasing entropy by its inexorable expansion, the universe can also increase it by creating more black holes. And one way to accomplish this would be to have the gravitational 'constant' ('Big G') increase with time. This paper explores how the dynamic of these two competing entropic processes might play out, how 'Big G' might vary with time, and where to look for evidence.
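The dependence the argument relies on is explicit in the Bekenstein-Hawking formula: with horizon area A = 16πG²M²/c⁴, the entropy S = k_B c³ A / (4Għ) = 4π k_B G M² / (ħc) grows linearly with G at fixed mass. A quick numerical check with rounded CODATA constants:

```python
import math

# Rounded CODATA constants (SI units).
G_NEWTON = 6.674e-11   # m^3 kg^-1 s^-2
HBAR     = 1.055e-34   # J s
C        = 2.998e8     # m s^-1
K_B      = 1.381e-23   # J K^-1
M_SUN    = 1.989e30    # kg

def bh_entropy(mass, big_g=G_NEWTON):
    """Bekenstein-Hawking entropy S = 4*pi*G*M^2*k_B / (hbar*c), in J/K.

    Follows from S = k_B c^3 A / (4 G hbar) with horizon area
    A = 16 pi G^2 M^2 / c^4, so S is linear in G at fixed mass.
    """
    return 4 * math.pi * big_g * mass ** 2 * K_B / (HBAR * C)

print(bh_entropy(M_SUN))   # ~1.4e54 J/K, i.e. ~1e77 in units of k_B
print(bh_entropy(M_SUN, 2 * G_NEWTON) / bh_entropy(M_SUN))   # 2.0: doubling G doubles S
```

So a slowly growing 'Big G' would raise the entropy of every existing black hole (and shrink the mass needed to form new ones), which is the entropic lever the abstract proposes.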

**Category:** Astrophysics

[464] **viXra:1701.0590 [pdf]**
*replaced on 2017-04-04 11:43:56*

**Authors:** Yurii A. Spirichev

**Comments:** 8 Pages.

The article deals with the choice of the energy-momentum tensor in electrodynamics. The electromagnetic force in a continuous medium is considered following the Minkowski and Abraham tensors. From the Minkowski tensor, the equations of conservation of energy-momentum density, the balance of electromagnetic force density in a continuous medium, and the equation for the Abraham force are derived. It is shown that the Abraham force is equal to zero when the canonical material equations are chosen, and that the Minkowski and Abraham momentum densities are equivalent. The arguments in favor of a unique choice between the Minkowski tensor and the Abraham tensor are incomplete.

**Category:** Classical Physics

[463] **viXra:1701.0589 [pdf]**
*replaced on 2017-01-25 14:51:55*

**Authors:** Tamas Lajtner

**Comments:** 16 Pages.

In the space-matter model, both matter and space have three spatial dimensions. Time is the result of the action-reaction of space and matter. The action-reaction motions of space and matter must be synchronized, and this synchronization requires algorithms on both sides: matter and space must each have algorithms. Space cannot be defined without matter. Space is what matter uses as space; matter is what can exist as matter in the given space. The relation of space and matter cannot be created if the amount of information of space and matter cannot maintain their relationship.
In the space-matter model, spatial distance, time and energy can all be expressed solely through space waves. It is possible to express all these phenomena in electronvolts, so meters can be converted into seconds or into kilograms and vice versa. Saying this, we must realize that there is a surprising gateway between space and matter.

**Category:** Relativity and Cosmology

[462] **viXra:1701.0588 [pdf]**
*replaced on 2017-04-14 07:26:33*

**Authors:** Andrei-Lucian Dragoi

**Comments:** 30 Pages.

This paper proposes a generalization of both the binary (strong) and ternary (weak) Goldbach conjectures (BGC and TGC) [1,2,3,4][5,6,7], briefly called the "Vertical" Goldbach conjectures (VBGC and VTGC). These are essentially meta-conjectures, as VBGC states an infinite number of conjectures stronger than BGC. VBGC was discovered in 2007 [1] and refined until 2016 [2] by using the arrays (S_p and S_i,p) of matrices of Goldbach index-partitions (GIPs) (simple M_p,n and recursive M_i,p,n, with iteration order i ≥ 0), which are a useful tool for studying BGC by focusing on prime indexes (as the function P_n that numbers the primes is a bijection). The simple matrices M_p,n and the recursive matrices M_i,p,n are related to the concept of generalized "primeths" (a term first used by Fernandez N. in his "The Exploring Primeness Project"), the generalization with iteration order i ≥ 0 of the known "higher-order prime numbers" (alias "superprime numbers", "super-primes" or "prime-indexed primes" [PIPs]), i.e. the subset of (simple or recursive) primes whose indexes are (also) prime (iP_x being the x-th i-primeth, with iteration order i ≥ 0, as explained later on). The author also offers a synthesis of some Goldbach-like conjectures (GLCs), including those "stronger" than BGC, and a new class of GLCs "stronger" than BGC, among which VBGC (essentially a variant of BGC applied on a serial array of subsets of primeths with a general iteration order i ≥ 0) stands out as a very important conjecture on primes (with potential importance for optimizing the experimental verification of BGC and other theoretical and practical applications in mathematics [including cryptography and fractals] and physics [including crystallography and M-theory]), and as a very special self-similar property of a subset of primes (noted/abbreviated as explained later on in this article).
Keywords: Prime (number), primes with prime indexes, the i-primeths (with iteration order i ≥ 0), the Binary Goldbach Conjecture (BGC), the Ternary Goldbach Conjecture (TGC), Goldbach index-partition (GIP), fractal patterns of the number and distribution of Goldbach index-partitions, Goldbach-like conjectures (GLC), the Vertical Binary Goldbach Conjecture (VBGC) and the Vertical Ternary Goldbach Conjecture (VTGC) as applied on i-primeths
(VBGC 1.5e - the conjecture only - 23.02.2017 - 21 pages) The "Vertical" (generalization of) the Binary Goldbach's Conjecture (VBGC) as applied on “iterative” primes with (recursive) prime indexes (i-primeths). Available from: https://www.researchgate.net/publication/313038562_VBGC_15e_-_the_conjecture_only_-_23022017_-_21_pages_The_Vertical_generalization_of_the_Binary_Goldbach%27s_Conjecture_VBGC_as_applied_on_iterative_primes_with_recursive_prime_indexes_i-primeths [accessed Apr 14, 2017].
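
The "primeths" central to the conjecture are directly computable. The following is a hedged sketch, not the paper's exact VBGC statement: it builds i-primeths by iterating the prime-index map, then checks a Goldbach-type partition into 1-primeths, in the spirit of the "stronger" Goldbach-like conjectures discussed (all function names are ours; the sieve bound is arbitrary).

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: list of primes <= n."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = b"\x00" * len(sieve[p * p :: p])
    return [i for i, flag in enumerate(sieve) if flag]

PRIMES = primes_up_to(200_000)
INDEX = {p: k + 1 for k, p in enumerate(PRIMES)}   # P_1 = 2, P_2 = 3, ...

def is_i_primeth(n, i):
    """A 0-primeth is any prime; an i-primeth is a prime whose index
    in the prime sequence is an (i-1)-primeth (valid within the sieve)."""
    if n not in INDEX:
        return False
    return i == 0 or is_i_primeth(INDEX[n], i - 1)

ONE_PRIMETHS = [p for p in PRIMES if is_i_primeth(p, 1)]
ONE_SET = set(ONE_PRIMETHS)

def primeth_partitions(even_n):
    """Pairs (a, b) of 1-primeths, a <= b, with a + b = even_n."""
    return [(a, even_n - a) for a in ONE_PRIMETHS
            if a <= even_n - a and (even_n - a) in ONE_SET]

print(ONE_PRIMETHS[:5])         # [3, 5, 11, 17, 31]
print(primeth_partitions(100))  # [(17, 83), (41, 59)]
```

For example, 100 = 17 + 83 with 17 = P_7 and 83 = P_23, both indexes prime, so both summands are 1-primeths; restricting the summands to i-primeths with larger i rapidly thins the admissible pairs, which is what makes the "vertical" strengthening non-trivial.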

**Category:** Number Theory

[461] **viXra:1701.0587 [pdf]**
*submitted on 2017-01-25 03:32:22*

**Authors:** George Rajna

**Comments:** 22 Pages.

Patrick Hayden and Robert Myers describe how the study of “qubits”, quantum bits of information, may hold the key to uniting quantum theory and general relativity into a unified theory of quantum gravity. [13]
Cosmologists trying to understand how to unite the two pillars of modern science – quantum physics and gravity – have found a new way to make robust predictions about the effect of quantum fluctuations on primordial density waves, ripples in the fabric of space and time. [12]
Physicists have performed a test designed to investigate the effects of the expansion of the universe—hoping to answer questions such as "does the expansion of the universe affect laboratory experiments?", "might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle?", and "does spacetime have a foam-like structure that slightly changes the speed of photons over time?", an idea that could shed light on the connection between general relativity and quantum gravity. [11]
Einstein's equivalence principle states that an object in gravitational free fall is physically equivalent to an object that is accelerating with the same amount of force in the absence of gravity. This principle lies at the heart of general relativity and has been experimentally tested many times. Now in a new paper, scientists have experimentally demonstrated a conceptually new way to test the equivalence principle that could detect the effects of a relatively new concept called spin-gravity coupling. [10]
A recent peer-reviewed paper by physicist James Franson from the University of Maryland in the US has caused a stir in the physics community. Published in the New Journal of Physics, the paper points to evidence suggesting that the speed of light, as defined by the theory of general relativity, is slower than originally thought. [9]
Gravitational time dilation causes decoherence of composite quantum systems. Even if gravitons are there, it’s probable that we would never be able to perceive them. Perhaps, assuming they continue inside a robust model of quantum gravity, there may be secondary ways of proving their actuality. [7]
The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; this is the mysterious Higgs Field giving mass to the particles. The accelerating electrons explain not only the Maxwell Equations and Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the diffraction pattern to the other, which violates the CP and Time reversal symmetry.
The self-maintained electric potential of the accelerating charges is equivalent to the General Relativity space-time curvature, and since this holds on the quantum level also, it provides the basis of Quantum Gravity.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the relativistic quantum theory.

**Category:** Astrophysics

[460] **viXra:1701.0586 [pdf]**
*submitted on 2017-01-25 04:36:42*

**Authors:** George Rajna

**Comments:** 27 Pages.

The first sighting of clustered dwarf galaxies bolsters a leading theory about how big galaxies such as our Milky Way are formed, and how dark matter binds them, researchers said Monday. [20] Scientists from The University of Manchester working on a revolutionary telescope project have harnessed the power of distributed computing from the UK's GridPP collaboration to tackle one of the Universe's biggest mysteries – the nature of dark matter and dark energy. [18] In the search for the mysterious dark matter, physicists have used elaborate computer calculations to come up with an outline of the particles of this unknown form of matter. [17] Unlike x-rays that the naked eye can't see but equipment can measure, scientists have yet to detect dark matter after three decades of searching, even with the world's most sensitive instruments. [16] Scientists have lost their latest round of hide-and-seek with dark matter, but they're not out of the game. [15] A new study is providing evidence for the presence of dark matter in the innermost part of the Milky Way, including in our own cosmic neighborhood and the Earth's location. The study demonstrates that large amounts of dark matter exist around us, and also between us and the Galactic center. The result constitutes a fundamental step forward in the quest for the nature of dark matter. [14] Researchers may have uncovered a way to observe dark matter thanks to a discovery involving X-ray emissions. [13] Between 2009 and 2013, the Planck satellite observed relic radiation, sometimes called cosmic microwave background (CMB) radiation. Today, with a full analysis of the data, the quality of the map is now such that the imprints left by dark matter and relic neutrinos are clearly visible. [12] The gravitational force attracting the matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and energy. 
There is an asymmetry between the masses of the electric charges, for example the proton and the electron, which can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter. The Weak Interaction changes the temperature-dependent Planck Distribution of the electromagnetic oscillations, changing the non-compensated dark matter rate and assigning the responsibility to the sterile neutrino.

**Category:** Astrophysics

[459] **viXra:1701.0585 [pdf]**
*submitted on 2017-01-23 13:26:30*

**Authors:** Marius Coman

**Comments:** 2 Pages.

In this paper I conjecture that for any 2-Poulet number (a Fermat pseudoprime to base 2 with two prime factors; see sequence A214305 in OEIS) of the form (4*h + 1)*(4*k + 1), the numbers h and k cannot be relatively prime (in fact, for sixteen of the first twenty 2-Poulet numbers of this form, k is a multiple of h, and this is also the case for four of a randomly chosen set of five consecutive, much larger, such numbers).
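
The conjecture is easy to test numerically. A minimal sketch (our own helper names; the search limit is chosen only for illustration) that lists the 2-Poulet numbers below 5000 whose prime factors are both ≡ 1 (mod 4) and reports gcd(h, k):

```python
from math import gcd

def is_prime(n):
    """Trial-division primality test (fine at this scale)."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def two_prime_factors(n):
    """(p, q) if odd n = p*q for primes p < q, else None."""
    for p in range(3, int(n ** 0.5) + 1, 2):
        if n % p == 0:
            q = n // p
            return (p, q) if q != p and is_prime(q) else None
    return None

def poulet_4h1_4k1(limit):
    """2-Poulet numbers below `limit` with both factors ≡ 1 (mod 4),
    reported as (n, h, k, gcd(h, k)) for n = (4h+1)(4k+1)."""
    out = []
    for n in range(3, limit, 2):
        if is_prime(n) or pow(2, n - 1, n) != 1:
            continue                        # not a Fermat pseudoprime base 2
        f = two_prime_factors(n)
        if f is None:
            continue                        # not a product of two distinct primes
        p, q = f
        if p % 4 == q % 4 == 1:
            h, k = (p - 1) // 4, (q - 1) // 4
            out.append((n, h, k, gcd(h, k)))
    return out

for row in poulet_4h1_4k1(5000):
    print(row)
# (2701, 9, 18, 9)
# (3277, 7, 28, 7)
# (4033, 9, 27, 9)
# (4369, 4, 64, 4)
```

For each of these four hits, gcd(h, k) > 1, and in fact k is a multiple of h, matching the pattern the paper reports for most small cases.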

**Category:** Number Theory

[458] **viXra:1701.0583 [pdf]**
*submitted on 2017-01-23 10:13:23*

**Authors:** George Rajna

**Comments:** 26 Pages.

A team of scientists has compositionally modified magnetite to capture visible sunlight and convert this light energy into electrical current. [15] Just like in normal road traffic, crossings are indispensable in optical signal processing. In order to avoid collisions, a clear traffic rule is required. A new method has now been developed at TU Wien to provide such a rule for light signals. [14] Researchers have developed a way to use commercial inkjet printers and readily available ink to print hidden images that are only visible when illuminated with appropriately polarized waves in the terahertz region of the electromagnetic spectrum. [13] That is, until now, thanks to the new solution devised at TU Wien: for the first time ever, permanent magnets can be produced using a 3D printer. This allows magnets to be produced in complex forms and precisely customised magnetic fields, required, for example, in magnetic sensors. [12] For physicists, loss of magnetisation in permanent magnets can be a real concern. In response, the Japanese company Sumitomo created the strongest available magnet—one offering ten times more magnetic energy than previous versions—in 1983. [11] New method of superstrong magnetic fields' generation proposed by Russian scientists in collaboration with foreign colleagues. [10] By showing that a phenomenon dubbed the "inverse spin Hall effect" works in several organic semiconductors-including carbon-60 buckyballs-University of Utah physicists changed magnetic "spin current" into electric current. The efficiency of this new power conversion method isn't yet known, but it might find use in future electronic devices including batteries, solar cells and computers. [9] Researchers from the Norwegian University of Science and Technology (NTNU) and the University of Cambridge in the UK have demonstrated that it is possible to directly generate an electric current in a magnetic material by rotating its magnetization. 
[8] This paper explains the magnetic effect of the electric current from the observed effects of the accelerating electrons, causing naturally the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the wave particle duality and the electron's spin also, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the changing relativistic mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Quantum Physics

[457] **viXra:1701.0581 [pdf]**
*submitted on 2017-01-23 10:59:24*

**Authors:** George Rajna

**Comments:** 34 Pages.

Characterising quantum channels with non-separable states of classical light the researchers demonstrate the startling result that sometimes Nature cannot tell the difference between particular types of laser beams and quantum entangled photons. [24] Physicists at Princeton University have revealed a device they've created that will allow a single electron to transfer its quantum information to a photon. [23] A strong, short light pulse can record data on a magnetic layer of yttrium iron garnet doped with Co-ions. This was discovered by researchers from Radboud University in the Netherlands and Bialystok University in Poland. The novel mechanism outperforms existing alternatives, allowing the fastest read-write magnetic recording accompanied by unprecedentedly low heat load. [22] It goes by the unwieldy acronym STT-MRAM, which stands for spin-transfer torque magnetic random access memory. [21] Memory chips are among the most basic components in computers. The random access memory is where processors temporarily store their data, which is a crucial function. Researchers from Dresden and Basel have now managed to lay the foundation for a new memory chip concept. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. 
[16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15]

**Category:** Quantum Physics

[456] **viXra:1701.0580 [pdf]**
*submitted on 2017-01-23 11:48:53*

**Authors:** George Rajna

**Comments:** 17 Pages.

Cosmologists trying to understand how to unite the two pillars of modern science – quantum physics and gravity – have found a new way to make robust predictions about the effect of quantum fluctuations on primordial density waves, ripples in the fabric of space and time. [12] Physicists have performed a test designed to investigate the effects of the expansion of the universe, hoping to answer questions such as "does the expansion of the universe affect laboratory experiments?", "might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle?", and "does spacetime have a foam-like structure that slightly changes the speed of photons over time?", an idea that could shed light on the connection between general relativity and quantum gravity. [11] Einstein's equivalence principle states that an object in gravitational free fall is physically equivalent to an object that is accelerating with the same amount of force in the absence of gravity. This principle lies at the heart of general relativity and has been experimentally tested many times. Now in a new paper, scientists have experimentally demonstrated a conceptually new way to test the equivalence principle that could detect the effects of a relatively new concept called spin-gravity coupling. [10] A recent peer-reviewed paper by physicist James Franson from the University of Maryland in the US has caused a stir in the physics community. Published in the New Journal of Physics, the paper points to evidence suggesting that the speed of light, as defined by the theory of general relativity, is slower than originally thought. [9] Gravitational time dilation causes decoherence of composite quantum systems. Even if gravitons are there, it's probable that we would never be able to perceive them. Perhaps, assuming they continue inside a robust model of quantum gravity, there may be secondary ways of proving their actuality. 
[7] The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; this is the mysterious Higgs Field giving mass to the particles. The accelerating electrons explain not only the Maxwell Equations and Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side of the diffraction pattern to the other, which violates the CP and Time reversal symmetry. The self-maintained electric potential of the accelerating charges is equivalent to the General Relativity space-time curvature, and since this holds on the quantum level also, it provides the basis of Quantum Gravity. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the relativistic quantum theory.

**Category:** Quantum Gravity and String Theory

[455] **viXra:1701.0576 [pdf]**
*submitted on 2017-01-23 09:11:18*

**Authors:** Dragan Turanyanin, Svetozar Jovičin

**Comments:** 14 Pages.

The aim of this article is to show planar functions (whether polar, Cartesian or parametric) from a different, implicit viewpoint, hence the term inpolars (inpolar curves). A whole set of brand-new planar curves can be seen from this perspective. Their generic mechanism is the so-called inpolar transformation, as well as its inpolar inversion. An entirely new geometric system is defined this way.

**Category:** Geometry

[454] **viXra:1701.0575 [pdf]**
*submitted on 2017-01-23 09:09:10*

**Authors:** Raymond HV Gallucci

**Comments:** 4 Pages.

Following some earlier work by Antoci and Abrams, Crothers has spent at least the past decade arguing the mathematical impossibility of the black hole. After a brief review of the mathematical argument, a physical one is presented, based on an analysis of the 'irresistible force' of increasing gravity allegedly collapsing a neutron star, against the even greater 'immovable object' of increasing density, into a black hole. This physical argument supports the contention of Crothers et al. that a black hole is both a mathematical and a physical impossibility.

**Category:** Relativity and Cosmology

[453] **viXra:1701.0574 [pdf]**
*submitted on 2017-01-22 21:33:04*

**Authors:** Thomas Lambert

**Comments:** 8 Pages.

In recent years, much research has been devoted to the improvement of architecture; unfortunately, few have explored the emulation of the World Wide Web. In fact, few biologists would disagree with the deployment of evolutionary programming. While this discussion is never a confirmed intent, it is derived from known results. Mugwump, our new framework for hash tables [28], is the solution to all of these challenges.

**Category:** Artificial Intelligence

[452] **viXra:1701.0573 [pdf]**
*submitted on 2017-01-22 21:38:03*

**Authors:** Mildred Bennet, Timothy Sato, Frank West

**Comments:** 6 Pages.

Many end-users would agree that, had it not been for systems, the improvement of fiber-optic cables might never have occurred. Given the current status of self-learning symmetries, physicists clearly desire the deployment of courseware, which embodies the compelling principles of unstable operating systems. We construct a novel methodology for the evaluation of hash tables, which we call MOP.

**Category:** Data Structures and Algorithms

[451] **viXra:1701.0572 [pdf]**
*submitted on 2017-01-22 21:57:46*

**Authors:** R. Salvato, G. Casey

**Comments:** 6 Pages.

Many experts would agree that, had it not been for the study of context-free grammar, the understanding of the UNIVAC computer might never have occurred. This is crucial to the success of our work. In fact, few analysts would disagree with the visualization of spreadsheets, which embodies the important principles of software engineering. In order to realize this intent, we describe new robust modalities (Destrer), which we use to validate that architecture and wide-area networks can collude to realize this intent.

**Category:** Data Structures and Algorithms

[450] **viXra:1701.0571 [pdf]**
*submitted on 2017-01-22 13:17:59*

**Authors:** Dragan Turanyanin

**Comments:** 8 Pages.

The gravity phenomenon is observed to be periodic and wave-like by its nature. A wave function describing the state of the space encircling a gravitodynamic vortex is suggested. In the "strong field" area, a quantization of orbits should be quite natural and fully observable; that phenomenon is named gravitonium. The so-called resonance with de Broglie's wave arises naturally, and a direct consequence is the natural existence of Planck's values as the main quanta. The resonance observed there could be a possible mechanism of mass creation. The whole concept leads to a change of the 20th-century geometrization paradigm towards a truly wave-dynamic description of the Universe.

**Category:** Astrophysics

[449] **viXra:1701.0570 [pdf]**
*replaced on 2017-02-07 03:29:37*

**Authors:** Solomon I. Khmelnik

**Comments:** 4 Pages.

Based on the theory of gravitomagnetism, a mathematical model of a cloud is proposed which makes it possible to answer the question posed in the title.

**Category:** Geophysics

[448] **viXra:1701.0569 [pdf]**
*submitted on 2017-01-22 07:52:29*

**Authors:** Jeremy Dunning-Davies

**Comments:** 4 Pages.

Entropy and its physical meaning have been a problem in physics almost since the concept was introduced. Here questions are raised over the correctness of the idea that the Second Law of Thermodynamics may be expressed simply as 'the entropy never decreases'.

**Category:** Thermodynamics and Energy

[447] **viXra:1701.0568 [pdf]**
*submitted on 2017-01-22 09:14:10*

**Authors:** George Rajna

**Comments:** 32 Pages.

Postdoctoral Fellow Dr Seb Weidt, PhD students Kim Lake and Joe Randall at work on the experiment creating 'entanglement' using microwave radiation. [23] A strong, short light pulse can record data on a magnetic layer of yttrium iron garnet doped with Co-ions. This was discovered by researchers from Radboud University in the Netherlands and Bialystok University in Poland. The novel mechanism outperforms existing alternatives, allowing the fastest read-write magnetic recording accompanied by unprecedentedly low heat load. [22] It goes by the unwieldy acronym STT-MRAM, which stands for spin-transfer torque magnetic random access memory. [21] Memory chips are among the most basic components in computers. The random access memory is where processors temporarily store their data, which is a crucial function. Researchers from Dresden and Basel have now managed to lay the foundation for a new memory chip concept. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. 
[15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]

**Category:** Quantum Physics

[446] **viXra:1701.0567 [pdf]**
*replaced on 2017-02-18 16:22:14*

**Authors:** Peter Cameron, Michaele Suisse

**Comments:** 15 Pages.

We present a wavefunction composed of the eight fundamental geometric objects of a minimally complete Pauli algebra of 3D space - point, line, plane, and volume elements - endowed with electromagnetic fields. Interactions are modeled as geometric products of wavefunctions, generating a 4D Dirac algebra of flat Minkowski spacetime. The resulting model is naturally gauge invariant, finite, and confined. With regard to the U1 x SU2 x SU3 gauge group at the core of the Standard Model, natural finiteness and gauge invariance are benign. However, reflections from wavefunction geometric impedance mismatches yield natural confinement to the Compton wavelength, providing a new perspective on both weak and strong nuclear forces.
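
The "minimally complete Pauli algebra of 3D space" is the eight-dimensional geometric (Clifford) algebra Cl(3,0): one scalar, three vector, three plane and one volume basis element. As a hedged, generic illustration of taking geometric products in that algebra (a standard bitmask encoding of basis blades; this is not the authors' construction, and the function names are ours):

```python
def reorder_sign(a, b):
    """Sign from moving the generators of blade `a` past those of `b`
    into canonical order (Euclidean metric, so each e_i squares to +1)."""
    a >>= 1
    swaps = 0
    while a:
        swaps += bin(a & b).count("1")
        a >>= 1
    return -1 if swaps & 1 else 1

def gp(x, y):
    """Geometric product of two Cl(3,0) multivectors, each a dict that
    maps a basis-blade bitmask (bits = generators e1, e2, e3) to a coefficient."""
    out = {}
    for a, ca in x.items():
        for b, cb in y.items():
            blade = a ^ b                  # repeated generators cancel (square to +1)
            out[blade] = out.get(blade, 0.0) + reorder_sign(a, b) * ca * cb
    return {k: v for k, v in out.items() if v != 0}

e1, e2, e3 = {0b001: 1.0}, {0b010: 1.0}, {0b100: 1.0}
e12 = gp(e1, e2)          # plane element (bivector)
i = gp(gp(e1, e2), e3)    # volume element (trivector, the pseudoscalar)

print(gp(e1, e1))         # {0: 1.0}  -> line elements square to +1
print(gp(e12, e12))       # {0: -1.0} -> plane elements square to -1
print(gp(i, i))           # {0: -1.0} -> so does the volume element
```

The eight blades (1, e1, e2, e3, e12, e13, e23, e123) close under this product, which is the structural fact behind modeling interactions as geometric products of eight-component wavefunctions.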

**Category:** High Energy Particle Physics

[445] **viXra:1701.0566 [pdf]**
*submitted on 2017-01-22 00:43:19*

**Authors:** Tufail Abbas

**Comments:** 16 Pages. Keywords: Space, Time, Energy, Charge, Perception of Vision, Dimensions, Rest Velocity, Cubic Lattice.

This paper proposes an introductory conceptual framework for developing an explanation of the Universe based upon a stationary cubic lattice structure arranged as an infinite matrix of infinitesimal cubes. Space and time move through this lattice, resulting in all the mass, charge and energy that we see. It is envisioned that by developing this model based upon the base lattice structure it would be possible to express all physical properties (i.e. space, time, energy, charge, momentum, temperature, velocity, force, etc.) in terms of the geometric parameters (i.e. length, surface area, volume, angle, orientation, cycles) of the base structure. Physical reality being a function of perception, the paper discusses our general understanding of perception (of vision in particular) to make interpretations regarding the nature of observable reality and an understanding of dimensions, based upon reasoning. Based upon these interpretations, the concept of "velocity at rest" is introduced, which is envisioned to provide an explanation of rest mass and gravity.

**Category:** Quantum Gravity and String Theory

[444] **viXra:1701.0565 [pdf]**
*submitted on 2017-01-22 02:59:14*

**Authors:** Faisal Amin Yassein Abdelmossin

**Comments:** 4 Pages.

We have modified the standard Einstein field equations by introducing a general function that depends on the Ricci scalar, without a prior assumption about the mathematical form of the function. By demanding that the covariant derivative of the energy-momentum tensor vanish and applying the Bianchi identity, a first-order ordinary differential equation in the Ricci scalar emerges. On integrating the resulting equation, the constant of integration is interpreted as the cosmological constant introduced by Einstein.
The form of the function of the Ricci scalar and the cosmological constant corresponds to the form of the Einstein-Hilbert Lagrangian appearing in the action integral.
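
The argument compresses into a few lines. The following is a hedged reconstruction from this abstract alone (the symbols $F$, $\kappa$ and $\Lambda$ are our notation, and the assumed form of the modified equations is our reading, not necessarily the paper's):

```latex
% Modified field equations with an undetermined function F of the Ricci scalar R
G_{\mu\nu} + F(R)\,g_{\mu\nu} = \kappa\,T_{\mu\nu},
\qquad G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2}R\,g_{\mu\nu}.
% Taking the covariant divergence: the Bianchi identity gives
% \nabla^{\mu}G_{\mu\nu}=0, and \nabla^{\mu}T_{\mu\nu}=0 is demanded,
% leaving a first-order ordinary differential equation in R:
\nabla^{\mu}\!\bigl[F(R)\,g_{\mu\nu}\bigr] = F'(R)\,\nabla_{\nu}R = 0
\;\Longrightarrow\; \frac{dF}{dR} = 0.
% Integrating introduces a constant, identified with the cosmological constant,
% and the action recovers the Einstein-Hilbert form:
F(R) = \Lambda,
\qquad \mathcal{L} \sim R - 2\Lambda .
```

On this reading, requiring energy-momentum conservation for generic $R$ forces the added function to be constant, which is exactly why the constant of integration plays the role of Einstein's $\Lambda$.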

**Category:** Relativity and Cosmology

[443] **viXra:1701.0564 [pdf]**
*submitted on 2017-01-22 04:25:16*

**Authors:** Amir Deljoo

**Comments:** 11 Pages.

This is a declaration of the identity of mathematics and number theory, or arithmetic. I have defined a pattern here which shows that consciousness is a pure, unique entity that is present everywhere, and that the whole of existence is a graphical manifestation that has arisen to enclose it; I hermetically simplify my intuition in order to transfer it to curious readers. Since explaining the methodology would require thousands of pages, only the final concluded statements and equations are declared here.

**Category:** Set Theory and Logic

[442] **viXra:1701.0563 [pdf]**
*replaced on 2017-01-31 11:38:56*

**Authors:** امیر دلجو

**Comments:** 12 Pages.

This document is a charter stating the nature of mathematics and number theory, or arithmetic. Here I have defined a pattern which shows that consciousness is a simple, omnipresent, unitary being, and that existence as a totality is a graphical manifestation that has arisen to disclose this consciousness; hermetically, I have simplified and written down my intuition of unity and creation in order to transfer it to curious people. Since a methodological exposition of this charter would require thousands of pages, only the final derived statements and equations are declared here.

**Category:** Set Theory and Logic

[441] **viXra:1701.0562 [pdf]**
*submitted on 2017-01-22 06:50:10*

**Authors:** Thomas Görnitz

**Comments:** 34 Pages.

So far, the main obstacle to a scientific conception of consciousness as a real and powerfully acting entity has not been inherent to psychology or brain research, but rather to the field of physics, and a solution had to be sought there. Such a solution is afforded by a new foundation of physical reality established by abstract and absolute quantum bits of information (AQI bits).
To avoid the popular misunderstanding of "information" as "meaning", it was advisable to coin a new designation for the free-of-meaning AQI bits: the AQI bits establish a quantum pre-structure termed "protyposis" (Greek: "pre-formation"), out of which real objects can form, beginning with energetic and material elementary particles.
The protyposis AQI bits provide a pre-structure for all entities encountered in the natural sciences. The AQI bits establish a common basis from which, in the course of cosmological and biological evolution, both the material reality of the brain and the mental reality of consciousness have formed.
A real understanding of quantum structures can remove the resistance against establishing quantum theory in the field of brain research and consciousness. The key to such an understanding lies in the conception of protyposis, abstract quantum information free of any definite meaning. With the AQI bits of the protyposis, massless and massive quantum particles can be constructed, and ultimately even quantum information with a special meaning, such as our grammatically formulated thoughts, becomes possible.
As long as the fundamental basis of quantum theory is misunderstood as a manifold of small objects such as atoms, quarks, or strings, the problem of understanding consciousness has no solution. If, in contrast, quantum theory is understood as a theory based on quantum structures that are not spatially small but truly simple, then there is no longer a fundamental problem in understanding consciousness.

**Category:** Mind Science

[440] **viXra:1701.0561 [pdf]**
*submitted on 2017-01-21 14:30:05*

**Authors:** George R. Briggs

**Comments:** 2 Pages.

As part of the production of 8-fold composite particles of life in the epoch before the big bang, top quarks and their anti-quarks were produced in large numbers. After disruption of the composite entities (Briggs fermibosons), the freed top quarks met and annihilated (in active quasars) early in the present epoch. This annihilation was not complete, however, due to CP violation, resulting in the universe without antimatter that we see today.

**Category:** Relativity and Cosmology

[439] **viXra:1701.0560 [pdf]**
*submitted on 2017-01-21 14:57:21*

**Authors:** Arturo Tozzi, James F Peters, Colin James III

**Comments:** 16 Pages.

The possible presence of further dimensions hidden in our three-dimensional (plus time) world might help elucidate the behavior of countless physical and biological systems, from quantum entanglement to brain function. Nevertheless, suggestions concerning the multidimensional arrangement of physical and biological systems do not merit the status of scientific claims unless the suggested additional dimensions can be verified via empirically testable hypotheses and experimental apparatus. Here we suggest that the widespread nonlinear dynamics and chaotic behavior of physical and biological collective systems might mirror further dimensions hidden in our world. Indeed, bringing together disparate knowledge from seemingly unrelated fields (brane cosmology, fluid dynamics, algebraic topology, computational topology, dynamical systems theory, logic and statistical mechanics), we show how, in logistic maps derived from nonlinear dynamical equations, the typical bifurcation diagrams might arise from linear flow paths that intersect large-sized hidden dimensions at the canonical phase-parameter values between three and four. Therefore, chaotic dynamics suggests the existence of a further hidden dimension in our Universe. We also provide a thermodynamic framework which suggests that cosmic entropy is encompassed in a multidimensional manifold.
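The logistic-map behavior the abstract invokes is easy to reproduce; the sketch below (illustrative only, not the authors' code; all names are ours) iterates x_{n+1} = r·x_n·(1 − x_n) and contrasts the periodic regime just above r = 3 with the chaotic regime near r = 4.

```python
def logistic_orbit(r, x0=0.2, n_transient=500, n_keep=100):
    """Iterate the logistic map x -> r*x*(1-x) and return samples
    of the attractor after discarding a transient."""
    x = x0
    for _ in range(n_transient):       # let the orbit settle onto the attractor
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n_keep):            # sample the attractor
        x = r * x * (1 - x)
        orbit.append(round(x, 6))      # round so a cycle collapses to few values
    return orbit

# Just above r = 3 the map settles onto a 2-cycle (the first bifurcation);
# near r = 4 the orbit is chaotic and the samples effectively never repeat.
print(len(set(logistic_orbit(3.2))))   # 2  -> period-2 oscillation
print(len(set(logistic_orbit(3.99))))  # close to 100 distinct values -> chaos
```

Sweeping r over [3, 4] and plotting the returned samples against r reproduces the familiar bifurcation diagram the abstract refers to.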

**Category:** Quantum Gravity and String Theory

[438] **viXra:1701.0559 [pdf]**
*submitted on 2017-01-21 11:05:33*

**Authors:** George Rajna

**Comments:** 38 Pages.

A Northwestern University team developed a new computational model that performs at human levels on a standard intelligence test. This work is an important step toward making artificial intelligence systems that see and understand the world as humans do. [25] Neuroscience and artificial intelligence experts from Rice University and Baylor College of Medicine have taken inspiration from the human brain in creating a new "deep learning" method that enables computers to learn about the visual world largely on their own, much as human babies do. [24]

**Category:** Artificial Intelligence

[437] **viXra:1701.0558 [pdf]**
*submitted on 2017-01-21 04:56:54*

**Authors:** Thomas Preusser

**Comments:** 6 Pages.

In 2015, scientists at the Large Hadron Collider (LHC) announced the first statistically significant observation of a “pentaquark” subatomic particle (R. Aaij, LHCb collaboration, Phys. Rev. Lett. 115, 072001, 12 August 2015). Such a particle is allowed within the framework of quantum chromodynamics (QCD), the theory encompassing quark and gluon strong binding interactions. The “pentaquark” is an up-down-up-charm-anticharm quark combination, i.e. five quarks, hence the name. The hundreds of papers following the pentaquark discovery mostly try to extend current QCD mathematics to explain it; these explanations fall short of dealing with the pentaquark's networked processes. This paper approaches the pentaquark from a higher-level networked complex adaptive systems perspective. Ultimately this involves a new theory, General Projective Relativity (GPR), based on probabilistic computational entanglement in a projection geometry that goes beyond the holographic, and it ultimately offers promise in furthering scientific knowledge across a wide spectrum, including dark matter and dark energy.

**Category:** High Energy Particle Physics

[436] **viXra:1701.0557 [pdf]**
*submitted on 2017-01-21 05:17:20*

**Authors:** Ulrich Berger

**Comments:** 14 Pages.

Based on the NCU concept presented earlier, this article describes and calculates a historical scenario for the development of the universe without “Dark Energy” but caused by an average excess of unneutralized protons (pn). This historical scenario consists of the following steps:
Starting with one initial quantum fluctuation, a “Primordial Nucleus” is formed from protons generated by further quantum fluctuations. Based on Mach’s principle, the nucleus is held together due to the very high gravitational constant which is given because of the tiny mass the nucleus contains in the beginning.
The more pn are condensed in the nucleus, the lower G becomes according to Mach’s principle. So a turning point is reached, and beyond that point the nucleus explodes at a speed of almost c. This event plays the role of the “Big Bang” in the NCU scenario.
Caused by the extremely high acceleration experienced by the pn, they form an expanding hollow sphere and thereby generate our known 3D space. During the expansion, additional pn are steadily imported from fluctuations at the horizon.
Because of the steadily impacting Coulomb acceleration, all pn collect more and more energy, which is converted into relativistic mass growth; finally that mass is transformed into stable particles, i.e. protons and electrons, the known neutral matter.

**Category:** Relativity and Cosmology

[435] **viXra:1701.0556 [pdf]**
*submitted on 2017-01-20 18:49:12*

**Authors:** Gerges Francis Tawdrous

**Comments:** 107 Pages.

Arabic version of the Tabernacle Geometry.

**Category:** Archaeology

[434] **viXra:1701.0555 [pdf]**
*submitted on 2017-01-20 21:17:29*

**Authors:** Jeffrey Joseph Wolynski

**Comments:** 1 Page.

If there are older stars orbiting younger ones, then star system formation via capture is the law of star system formation.

**Category:** Astrophysics

[433] **viXra:1701.0554 [pdf]**
*submitted on 2017-01-21 03:08:08*

**Authors:** George Rajna

**Comments:** 32 Pages.

Physicists at Princeton University have revealed a device they’ve created that will allow a single electron to transfer its quantum information to a photon. [23]
A strong, short light pulse can record data on a magnetic layer of yttrium iron garnet doped with Co-ions. This was discovered by researchers from Radboud University in the Netherlands and Bialystok University in Poland. The novel mechanism outperforms existing alternatives, allowing the fastest read-write magnetic recording accompanied by unprecedentedly low heat load. [22]
It goes by the unwieldy acronym STT-MRAM, which stands for spin-transfer torque magnetic random access memory. [21]
Memory chips are among the most basic components in computers. The random access memory is where processors temporarily store their data, which is a crucial function. Researchers from Dresden and Basel have now managed to lay the foundation for a new memory chip concept. [20]
Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19]
The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18]
According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17]
EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16]
Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15]
Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information.
In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron’s spin, building the bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, a natural part of the Relativistic Quantum Theory, making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[432] **viXra:1701.0553 [pdf]**
*submitted on 2017-01-20 10:05:13*

**Authors:** George Rajna

**Comments:** 28 Pages.

Physicists have proposed that violations of energy conservation in the early universe, as predicted by certain modified theories of quantum mechanics and quantum gravity, may explain the cosmological constant problem, which is sometimes referred to as "the worst theoretical prediction in the history of physics." [20] Scientists from The University of Manchester working on a revolutionary telescope project have harnessed the power of distributed computing from the UK's GridPP collaboration to tackle one of the Universe's biggest mysteries – the nature of dark matter and dark energy. [18] In the search for the mysterious dark matter, physicists have used elaborate computer calculations to come up with an outline of the particles of this unknown form of matter. [17] Unlike x-rays that the naked eye can't see but equipment can measure, scientists have yet to detect dark matter after three decades of searching, even with the world's most sensitive instruments. [16] Scientists have lost their latest round of hide-and-seek with dark matter, but they're not out of the game. [15] A new study is providing evidence for the presence of dark matter in the innermost part of the Milky Way, including in our own cosmic neighborhood and the Earth's location. The study demonstrates that large amounts of dark matter exist around us, and also between us and the Galactic center. The result constitutes a fundamental step forward in the quest for the nature of dark matter. [14] Researchers may have uncovered a way to observe dark matter thanks to a discovery involving X-ray emissions. [13] Between 2009 and 2013, the Planck satellite observed relic radiation, sometimes called cosmic microwave background (CMB) radiation. Today, with a full analysis of the data, the quality of the map is now such that the imprints left by dark matter and relic neutrinos are clearly visible. 
[12] The gravitational force attracts matter, concentrating it in a small space and leaving much of space with a low matter concentration: dark matter and dark energy. The asymmetry between the masses of the electric charges, for example the proton and electron, can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter. The Weak Interaction changes the temperature-dependent Planck Distribution of the electromagnetic oscillations, changing the non-compensated dark matter rate and giving the responsibility to the sterile neutrino.

**Category:** Astrophysics

[431] **viXra:1701.0552 [pdf]**
*submitted on 2017-01-20 11:16:46*

**Authors:** George Rajna

**Comments:** 17 Pages.

Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor, meaning that it can be made to carry an electrical current with zero resistance. [28] Researchers in Japan have found a way to make the 'wonder material' graphene superconductive, which means electricity can flow through it with zero resistance. The new property adds to graphene's already impressive list of attributes, like the fact that it's stronger than steel, harder than diamond, and incredibly flexible. [27] Superconductivity is a rare physical state in which matter is able to conduct electricity (maintain a flow of electrons) without any resistance. It can only be found in certain materials, and even then it can only be achieved under controlled conditions of low temperatures and high pressures. New research from a team including Carnegie's Elissaios Stavrou, Xiao-Jia Chen, and Alexander Goncharov homes in on the structural changes underlying superconductivity in iron arsenide compounds, those containing iron and arsenic. [26] This paper explains the magnetic effect of the superconductive current from the observed effects of the accelerating electrons, which naturally cause the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the Higgs Field, the changing Relativistic Mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators as well, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Condensed Matter

[430] **viXra:1701.0551 [pdf]**
*replaced on 2017-02-09 23:03:27*

**Authors:** Valentin Danci

**Comments:** 24 Pages. Version 2

Since 1905, when Einstein introduced the Special Relativity Theory, various researchers independently observed that his theory contains at least one more postulate besides the two postulates stated by him explicitly. Putting together all those observations about the different additional postulates, we will describe here how the Special Relativity Theory was unfortunately based on nineteen postulates, and how most of them were implied and used in Einstein's 1905 article, later in his article of 1910, and also further in his manuscript written between 1912 and 1914.

**Category:** Relativity and Cosmology

[429] **viXra:1701.0550 [pdf]**
*submitted on 2017-01-20 06:31:09*

**Authors:** George Rajna

**Comments:** 14 Pages.

Molecules vibrate in many different ways, like tiny musical instruments. [8] For centuries, scientists believed that light, like all waves, couldn't be focused down smaller than its wavelength, just under a millionth of a metre. Now, researchers led by the University of Cambridge have created the world's smallest magnifying glass, which focuses light a billion times more tightly, down to the scale of single atoms. [7] A Purdue University physicist has observed a butterfly Rydberg molecule, a weak pairing of two highly excitable atoms that he predicted would exist more than a decade ago. [6] In a scientific first, a team of researchers from Macquarie University and the University of Vienna have developed a new technique to measure molecular properties, forming the basis for improvements in scientific instruments like telescopes, and with the potential to speed up the development of pharmaceuticals. [5] In the quantum world, physicists study the tiny particles that make up our classical world (neutrons, electrons, photons) either one at a time or in small numbers, because the behaviour of the particles is completely different on such a small scale. If you increase the number of particles being studied, eventually there will be enough particles that they no longer act quantum mechanically and must be identified as classical, just like our everyday world. But where is the line between the quantum world and the classical world? A group of scientists from the Okinawa Institute of Science and Technology Graduate University (OIST) explored this question by showing that what was thought to be a quantum phenomenon can be explained classically. [4] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.

**Category:** Quantum Physics

[428] **viXra:1701.0549 [pdf]**
*submitted on 2017-01-20 07:23:35*

**Authors:** George Rajna

**Comments:** 33 Pages.

ORNL researchers have discovered a new type of quantum critical point, a new way in which materials change from one state of matter to another. [22] New research conducted at the University of Chicago has confirmed a decades-old theory describing the dynamics of continuous phase transitions. [21] No matter whether it is acoustic waves, quantum matter waves or optical waves of a laser—all kinds of waves can be in different states of oscillation, corresponding to different frequencies. Calculating these frequencies is part of the tools of the trade in theoretical physics. Recently, however, a special class of systems has caught the attention of the scientific community, forcing physicists to abandon well-established rules. [20] Until quite recently, creating a hologram of a single photon was believed to be impossible due to fundamental laws of physics. However, scientists at the Faculty of Physics, University of Warsaw, have successfully applied concepts of classical holography to the world of quantum phenomena. A new measurement technique has enabled them to register the first-ever hologram of a single light particle, thereby shedding new light on the foundations of quantum mechanics. [19] A combined team of researchers from Columbia University in the U.S. and the University of Warsaw in Poland has found that there appear to be flaws in traditional theory that describe how photodissociation works. [18] Ultra-peripheral collisions of lead nuclei at the LHC accelerator can lead to elastic collisions of photons with photons. [17] Physicists from Trinity College Dublin's School of Physics and the CRANN Institute, Trinity College, have discovered a new form of light, which will impact our understanding of the fundamental nature of light. [16] Light from an optical fiber illuminates the metasurface, is scattered in four different directions, and the intensities are measured by the four detectors. From this measurement the state of polarization of light is detected. 
[15] Converting a single photon from one color, or frequency, to another is an essential tool in quantum communication, which harnesses the subtle correlations between the subatomic properties of photons (particles of light) to securely store and transmit information. Scientists at the National Institute of Standards and Technology (NIST) have now developed a miniaturized version of a frequency converter, using technology similar to that used to make computer chips. [14]

**Category:** Quantum Physics

[427] **viXra:1701.0548 [pdf]**
*submitted on 2017-01-20 01:16:07*

**Authors:** Elias Khalil

**Comments:** 2 Pages.

A new technology based on nano bubbles developed and patented by a Spanish company, Jeanologia, is known as e-flow. The e-flow ‘breaks up’ the surface of the garment, achieving soft hand feel and controlling shrinkage. A minimal quantity of water is needed and there is zero discharge from the process. Air from the atmosphere is introduced into an electro flow reactor and subjected to an electromechanical shock creating nano bubbles and a flow of wet air. The nano bubble mix is then transported into a rotating tumbler containing the denim garments, and when it comes into contact with them produces a soft and natural hand feel. The garments are then dried in the same tumbler. When treating indigo dyed garments with this technology, some indigo cross contamination may occur that can be eliminated by a dry ozone treatment.

**Category:** Chemistry

[426] **viXra:1701.0546 [pdf]**
*submitted on 2017-01-19 12:04:24*

**Authors:** George Rajna

**Comments:** 22 Pages.

Symmetry is the essential basis of nature, giving rise to conservation laws. At the same time, the breaking of symmetry is indispensable for many phase transitions and nonreciprocal processes. Among various symmetry-breaking phenomena, spontaneous symmetry breaking lies at the heart of many fascinating and fundamental properties of nature. [16] One of the biggest challenges in physics is to understand why everything we see in our universe seems to be formed only of matter, whereas the Big Bang should have created equal amounts of matter and antimatter. CERN's LHCb experiment is one of the best hopes for physicists looking to solve this longstanding mystery. [15] Imperial physicists have discovered how to create matter from light, a feat thought impossible when the idea was first theorized 80 years ago. [14] How can the LHC experiments prove that they have produced dark matter? They can't… not alone, anyway. [13] The race for the discovery of dark matter is on. Several experiments worldwide are searching for the mysterious substance and pushing the limits on the properties it may have. [12] Dark energy is a mysterious force that pervades all space, acting as a "push" to accelerate the universe's expansion. Despite making up 70 percent of the universe, dark energy was only discovered in 1998 by two teams observing Type Ia supernovae. A Type Ia supernova is a cataclysmic explosion of a white dwarf star. The best way of measuring dark energy just got better, thanks to a new study of Type Ia supernovae. [11] Newly published research reveals that dark matter is being swallowed up by dark energy, offering novel insight into the nature of dark matter and dark energy and what the future of our Universe might be. [10] The gravitational force attracts matter, concentrating it in a small space and leaving much of space with a low matter concentration: dark matter and dark energy.
The asymmetry between the masses of the electric charges, for example the proton and electron, can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create electromagnetic radiation of different frequencies at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** High Energy Particle Physics

[425] **viXra:1701.0545 [pdf]**
*submitted on 2017-01-19 12:19:53*

**Authors:** Andrew Beckwith

**Comments:** 11 Pages. submitted to JHEPGC for review

We examine the role of particle nucleation in the initial universe and argue that particle nucleation has a small effect in lowering the initial temperature, in tandem with energy-density and scale-factor contributions. If such scaling exists as a major-order effect, then quenching of temperature proportional to a vacuum nucleation at or before the electroweak era is heavily influenced by a number n, which is either a quantum number (quantum cosmology) or a particle count before the electroweak era. If the supposition is a particle count, say of gravitons carried from a prior universe into today's universe, then by comparing a thermodynamic argument against a modified Heisenberg uncertainty principle, and what this says about particle-count information, we have a richer cosmological picture to contend with. We close with a speculation as to how a quantum-teleportation picture of Pre-Planckian space-time physics may influence our discussion.

**Category:** Quantum Gravity and String Theory

[424] **viXra:1701.0544 [pdf]**
*submitted on 2017-01-19 12:56:31*

**Authors:** Thomas Görnitz

**Comments:** 40 Pages.

Based on the simplest possible quantum structures, that is, the abstract free-of-meaning quantum information (AQI) bits establishing the fundamental substance referred to as protyposis, it is shown, using just three plausible postulates, how a cosmological model can be derived that describes the observational data better than the "flat ΛCDM" standard model. The postulates are the Planck relation, E = hc/λ, the existence of a distinguished velocity, i.e. the velocity of light in vacuum, and the first law of thermodynamics. Assumptions concerning inexplicable fictitious entities, such as "inflation" or "dark energy", can be dispensed with. The model solves the "cosmological problems".
Einstein's equations result by requiring that the cosmic relation between the radius of curvature and the energy density can be transferred to local density variations within the cosmos. General Relativity thus emerges as a classical approximation of the quantum cosmology, clarifying in principle the relations between quantum theory and gravity theory.
The AQI concept allows for a simple derivation of black hole entropies and, moreover, establishes a rationalization of the gauge groups associated with the three fundamental forces. Relativistic particles with and without rest mass can be constructed from the AQI bits, and thus all objects described in the natural sciences. In living beings, the AQI can manifest both in the material body and in the meaningful quantum information of the psyche, eventually closing the "explanatory gap" between "body and mind".

**Category:** Relativity and Cosmology

[423] **viXra:1701.0543 [pdf]**
*submitted on 2017-01-19 13:07:10*

**Authors:** George Rajna

**Comments:** 31 Pages.

A team of researchers from several institutions in Israel has, for the first time, identified a molecule that phages use to communicate with one another. [17] Molecules that change colour can be used to follow in real time how bacteria form a protective biofilm around themselves. This new method, which has been developed in collaboration between researchers at Linköping University and Karolinska Institutet in Sweden, may in the future become significant both in medical care and the food industry, where bacterial biofilms are a problem. [16] Researchers led by Carnegie Mellon University physicist Markus Deserno and University of Konstanz (Germany) chemist Christine Peter have developed a computer simulation that crushes viral capsids. By allowing researchers to see how the tough shells break apart, the simulation provides a computational window for looking at how viruses and proteins assemble. [15] IBM scientists have developed a new lab-on-a-chip technology that can, for the first time, separate biological particles at the nanoscale and could enable physicians to detect diseases such as cancer before symptoms appear. [14] Scientists work toward storing digital information in DNA. [13] Leiden theoretical physicists have proven that DNA mechanics, in addition to genetic information in DNA, determines who we are. Helmut Schiessel and his group simulated many DNA sequences and found a correlation between mechanical cues and the way DNA is folded. They have published their results in PLoS One. [12] We model the electron clouds of nucleic acids in DNA as a chain of coupled quantum harmonic oscillators with dipole-dipole interaction between nearest neighbours, resulting in a van der Waals type bonding. [11] Scientists have discovered a secret second code hiding within DNA which instructs cells on how genes are controlled. The amazing discovery is expected to open new doors to the diagnosis and treatment of diseases, according to a new study. [10] There is also a connection between statistical physics and evolutionary biology, since the arrow of time is at work in biological evolution as well. From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: the former tend to be much better at capturing energy from their environment and dissipating that energy as heat. [8] This paper contains a review of quantum entanglement investigations in living systems, and in the quantum mechanically modeled photoactive prebiotic kernel systems. [7] The human body is a constant flux of thousands of chemical/biological interactions and processes connecting molecules, cells, organs, and fluids throughout the brain, body, and nervous system. Up until recently it was thought that all these interactions operated in a linear sequence, passing on information much like a runner passing the baton to the next runner. However, the latest findings in quantum biology and biophysics have discovered that there is in fact a tremendous degree of coherence within all living systems. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making it possible to understand Quantum Biology.

**Category:** Physics of Biology

[422] **viXra:1701.0542 [pdf]**
*submitted on 2017-01-19 08:52:21*

**Authors:** George Rajna

**Comments:** 31 Pages.

A strong, short light pulse can record data on a magnetic layer of yttrium iron garnet doped with Co-ions. This was discovered by researchers from Radboud University in the Netherlands and Bialystok University in Poland. The novel mechanism outperforms existing alternatives, allowing the fastest read-write magnetic recording accompanied by unprecedentedly low heat load. [22]
It goes by the unwieldy acronym STT-MRAM, which stands for spin-transfer torque magnetic random access memory. [21]
Memory chips are among the most basic components in computers. The random access memory is where processors temporarily store their data, which is a crucial function. Researchers from Dresden and Basel have now managed to lay the foundation for a new memory chip concept. [20]
Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19]
The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18]
According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17]
EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16]
Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15]
Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]
A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]
A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]
With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10]
Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9]
While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information.
In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories.
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry.
The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Digital Signal Processing

[421] **viXra:1701.0541 [pdf]**
*submitted on 2017-01-19 07:15:53*

**Authors:** Manik Dawar

**Comments:** 7 Pages. Contact information: manikdawar@live.com

The entirety of this document assumes the existence of a maximum speed with which any entity in the universe can travel from a set of points in space to any other set of points in space. The consequences for the motion of the constituents of a typical system of particles, when the system is travelling at a speed close to the speed limit of the universe, are initially subjected to a qualitative analysis, the conclusions of which hint at a mechanical definition of time. A quantitative analysis of the same reveals the Lorentz Transformation Factor. The fact that the Lorentz transformation factor is derived on applying the definition of time hinted at by the qualitative analysis supports that definition. The quantitative analysis, however, also reveals a different value (transformation factor*). Both transformation factors are combined into one transformation factor, which, given that n (the number of spatial dimensions in the universe through which any moving object traverses) is large enough, approximately equates to the Lorentz Transformation Factor. Thus, using the results derived here, the value of n might be revealed.
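For reference, the standard Lorentz transformation factor that the quantitative analysis is said to recover (the paper's combined, n-dependent factor is not reproduced here) is

$$\gamma = \frac{1}{\sqrt{1 - v^2/c^2}},$$

where $v$ is the speed of the system and $c$ is the universal speed limit assumed throughout the paper.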

**Category:** Classical Physics

[420] **viXra:1701.0540 [pdf]**
*submitted on 2017-01-18 19:08:15*

**Authors:** Michail Zak

**Comments:** 7 Pages.

A new physical principle for the simulation of PDEs is introduced. It is based upon replacing the PDE to be solved by a system of ODEs for which the PDE represents the corresponding Liouville equation. The proposed approach has polynomial (rather than exponential) algorithmic complexity, and it is applicable to nonlinear parabolic, hyperbolic, and elliptic PDEs.
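As a minimal sketch of the underlying idea (our illustration, not Zak's construction, with an assumed drift `f`): the continuity/Liouville equation ∂ρ/∂t + ∂(f(x)ρ)/∂x = 0 is exactly the PDE governing the density of an ensemble of samples evolving under the ODE dx/dt = f(x), so propagating samples and histogramming them approximates the PDE solution without any spatial grid:

```python
import numpy as np

# Sketch: the Liouville (continuity) equation
#     d(rho)/dt + d(f(x) * rho)/dx = 0
# is the PDE governing the density of samples that evolve under the
# characteristic ODE  dx/dt = f(x).  So we can approximate the PDE
# solution by propagating an ensemble of samples drawn from rho(x, 0).

def f(x):
    return -x  # illustrative drift toward the origin

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=200_000)  # samples of rho(x, 0)

dt, t_final = 1e-3, 1.0
for _ in range(int(t_final / dt)):
    x = x + dt * f(x)  # forward-Euler step of the characteristic ODE

# For f(x) = -x the exact density stays Gaussian, with mean and standard
# deviation both shrinking by a factor exp(-t).
print(x.mean(), x.std())  # ≈ 2*exp(-1) ≈ 0.736 and 0.5*exp(-1) ≈ 0.184
```

The cost scales with the number of samples and time steps rather than with a spatial grid, which is where the polynomial (rather than exponential) complexity of such particle methods comes from.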

**Category:** Mathematical Physics

[419] **viXra:1701.0539 [pdf]**
*submitted on 2017-01-18 22:00:17*

**Authors:** Rochelle Forrester

**Comments:** 11 Pages.

Quantum physicists have made many attempts to solve the quantum measurement problem, but no solution seems to have received widespread acceptance. The time has come for a new approach. In Sense Perception and Reality: A Theory of Perceptual Relativity, Quantum Mechanics and the Observer Dependent Universe and in a new paper The End of Realism I suggest the quantum measurement problem is caused by a failure to understand that each species has its own sensory world and that when we say the wave function collapses and brings a particle into existence we mean the particle is brought into existence in the human sensory world by the combined operation of the human sensory apparatus, particle detectors and the experimental set-up. This is similar to the Copenhagen Interpretation suggested by Niels Bohr and others, but the understanding that the collapse of the wave function brings a particle into existence in the human sensory world removes the need for a dividing line between the quantum world and the macro world. The same rules can apply to both worlds, and the ideas stated in this paper considerably strengthen the Copenhagen Interpretation of quantum mechanics.

**Category:** Quantum Physics

[418] **viXra:1701.0538 [pdf]**
*submitted on 2017-01-19 02:17:57*

**Authors:** George Rajna

**Comments:** 12 Pages.

Usha Mallik and her team used a grant from the U.S. Department of Energy to help build a sub-detector at the Large Hadron Collider, the world's largest and most powerful particle accelerator, located in Switzerland. They're running experiments on the sub-detector to search for a pair of bottom quarks—subatomic yin-and-yang particles that should be produced about 60 percent of the time a Higgs boson decays. [8]
A new way of measuring how the Higgs boson couples to other fundamental particles has been proposed by physicists in France, Israel and the US. Their technique would involve comparing the spectra of several different isotopes of the same atom to see how the Higgs force between the atom's electrons and its nucleus affects the atomic energy levels. [7]
The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; it is the mysterious Higgs Field giving mass to the particles. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate by the diffraction patterns. The accelerating charges explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the wave-particle duality and the electron's spin, building the bridge between the Classical and Relativistic Quantum Theories. The self-maintained electric potential of the accelerating charges is equivalent to the General Relativity space-time curvature, and since this holds on the quantum level also, it gives the base of Quantum Gravity. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** High Energy Particle Physics

[417] **viXra:1701.0537 [pdf]**
*submitted on 2017-01-18 13:20:25*

**Authors:** Gary R. Prok

**Comments:** 2 Pages.

There has been disagreement about the validity of Landauer's Principle, which places a limitation on computational energy efficiency. The Principle is predicated on a finite entropy increase associated with every erasure of a memory register. The existence of a reversible memory register reduces Landauer's Principle to a disproven conjecture.
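For scale, the bound under dispute sets the minimum dissipation per erased bit at k_B·T·ln 2; a quick computation at an assumed room temperature:

```python
import math

# Landauer's principle (the bound whose validity the paper disputes):
# erasing one bit of information dissipates at least k_B * T * ln 2 of heat.
K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # assumed room temperature, K

e_min = K_B * T * math.log(2)
print(f"minimum dissipation: {e_min:.3e} J per bit")  # ≈ 2.87e-21 J
```

The smallness of this number (a few zeptojoules per bit, many orders of magnitude below the dissipation of current logic gates) is why the dispute is a matter of principle rather than of present-day engineering.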

**Category:** Classical Physics

[416] **viXra:1701.0536 [pdf]**
*submitted on 2017-01-18 13:23:54*

**Authors:** Gary R. Prok

**Comments:** 9 Pages.

Maxwell’s demon challenges our interpretation of thermodynamics and our understanding of the Second Law of thermodynamics. The Szilard engine is a gedanken instantiation of Maxwell’s Demon that is amenable to standard thermodynamic analysis. The paradox of Maxwell’s demon as presented by the Szilard engine is considered to have been solved by Landauer’s principle. A classical analysis of the Szilard engine, presented here, shows that Landauer’s principle is not needed to resolve the paradox of the demon. Classical thermodynamics is all that is needed.

**Category:** Classical Physics

[415] **viXra:1701.0535 [pdf]**
*submitted on 2017-01-18 13:49:36*

**Authors:** Michail Zak

**Comments:** 13 Pages.

This work is inspired by the discovery of a new class of dynamical systems described by ODE coupled with their Liouville equation. These systems are called self-controlled, since the role of actuators is played by the probability produced by the Liouville equation. Following the Madelung equation, which belongs to this class, non-Newtonian properties such as randomness, entanglement, and probability interference, typical for quantum systems, are described. Special attention is paid to the capability to violate the second law of thermodynamics, which makes these systems neither Newtonian nor quantum. It is shown that self-controlled dynamical systems can be linked to mathematical models of living systems. The discovery of isolated dynamical systems that can decrease entropy in violation of the second law of thermodynamics, and the resemblance of these systems to living systems, implies that Life can slow down the heat death of the Universe, and that this can be associated with the Purpose of Life.
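A minimal numerical sketch of such a density feedback (our illustration under a Gaussian ansatz, not Zak's own example): choosing the velocity v = D ∂(ln ρ)/∂x turns the Liouville equation into a backward diffusion equation, so an initially Gaussian ensemble contracts and its entropy decreases:

```python
import numpy as np

# Illustration (ours, not Zak's own example) of an ODE whose "actuator"
# is its own Liouville density rho.  For a Gaussian rho with variance s,
# the feedback velocity  v = D * d(ln rho)/dx = -D * x / s  turns the
# Liouville equation into a *backward* diffusion equation, so the
# ensemble contracts: s(t) = s0 - 2*D*t, and the Gaussian entropy
# 0.5*ln(2*pi*e*s) decreases with time.

D, s0 = 0.1, 1.0
rng = np.random.default_rng(1)
x = rng.normal(0.0, np.sqrt(s0), size=100_000)

dt, steps = 1e-3, 2000      # integrate to t = 2, where s = 1 - 2*0.1*2 = 0.6
t = 0.0
for _ in range(steps):
    s = s0 - 2.0 * D * t    # variance predicted by the Gaussian ansatz
    x = x + dt * (-D * x / s)
    t += dt

print(x.var())  # ≈ 0.6, matching the predicted shrinking variance
```

Backward diffusion is ill-posed for general initial densities; the Gaussian ansatz keeps this toy example well-defined, with variance s(t) = s0 − 2Dt, which is the sense in which the ensemble's entropy decreases.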

**Category:** Astrophysics

[414] **viXra:1701.0534 [pdf]**
*submitted on 2017-01-18 12:41:29*

**Authors:** George Rajna

**Comments:** 17 Pages.

An important step towards a completely new experimental access to quantum physics has been made at University of Konstanz. The team of scientists headed by Professor Alfred Leitenstorfer has now shown how to manipulate the electric vacuum field and thus generate deviations from the ground state of empty space which can only be understood in the context of the quantum theory of light. [10] Physicists at the National Institute of Standards and Technology (NIST) have cooled a mechanical object to a temperature lower than previously thought possible, below the so-called "quantum limit." [9] For the past 100 years, physicists have been studying the weird features of quantum physics, and now they're trying to put these features to good use. One prominent example is that quantum superposition (also known as quantum coherence)—which is the property that allows an object to be in two states at the same time—has been identified as a useful resource for quantum communication technologies. [8] Quantum entanglement—which occurs when two or more particles are correlated in such a way that they can influence each other even across large distances—is not an all-or-nothing phenomenon, but occurs in various degrees. The more a quantum state is entangled with its partner, the better the states will perform in quantum information applications. Unfortunately, quantifying entanglement is a difficult process involving complex optimization problems that give even physicists headaches. [7] A trio of physicists in Europe has come up with an idea that they believe would allow a person to actually witness entanglement. Valentina Caprara Vivoli, with the University of Geneva, Pavel Sekatski, with the University of Innsbruck and Nicolas Sangouard, with the University of Basel, have together written a paper describing a scenario where a human subject would be able to witness an instance of entanglement—they have uploaded it to the arXiv server for review by others. 
[6] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Quantum Physics

[413] **viXra:1701.0533 [pdf]**
*submitted on 2017-01-18 05:02:17*

**Authors:** J. Dunning-Davies, J. P. Dunning-Davies

**Comments:** 7 Pages.

Ever since Oliver Heaviside's suggestion of the possible existence of a set of equations, analogous to Maxwell's equations for the electromagnetic field, to describe the gravitational field, others have considered and built on the original notion. However, if such equations do exist and really are analogous to Maxwell's electromagnetic equations, new problems could arise related to presently accepted notions concerning special relativity. This note, as well as offering a translation of a highly relevant paper by Carstoiu, addresses these concerns in the same manner as similar concerns regarding Maxwell's equations were addressed.

**Category:** Mathematical Physics

[412] **viXra:1701.0532 [pdf]**
*submitted on 2017-01-18 07:32:42*

**Authors:** George Rajna

**Comments:** 17 Pages.

One of the deepest mysteries of physics today is why we seem to live in a world composed only of matter, while the Big Bang should have created equal amounts of matter and antimatter. [13] A precise measurement of absolute beam intensity is a key parameter to monitor any losses in a beam and to calibrate the absolute number of particles delivered to the experiments. [12] In a paper published today in the journal Science, the ASACUSA experiment at CERN reported a new precision measurement of the mass of the antiproton relative to that of the electron. [11] When two protons approaching each other pass close enough together, they can "feel" each other, similar to the way that two magnets can be drawn closely together without necessarily sticking together. According to the Standard Model, at this grazing distance, the protons can produce a pair of W bosons. [10] The fact that the neutron is slightly more massive than the proton is the reason why atomic nuclei have exactly those properties that make our world and ultimately our existence possible. Eighty years after the discovery of the neutron, a team of physicists from France, Germany, and Hungary headed by Zoltán Fodor, a researcher from Wuppertal, has finally calculated the tiny neutron-proton mass difference. [9] Taking into account the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Lattice QCD gives the same results as the diffraction patterns of the electromagnetic oscillators, explaining the color confinement and the asymptotic freedom of the Strong Interactions.

**Category:** High Energy Particle Physics

[411] **viXra:1701.0531 [pdf]**
*submitted on 2017-01-17 17:36:56*

**Authors:** Fu Yuhua

**Comments:** 6 Pages.

As No. 3 of a series of papers on comparative physics, this paper mainly discusses the comparative studies between the original law of conservation of energy and the Computer Information Library Clusters; based on the multiform laws of conservation of energy, the concept of "law clusters of conservation of generalized energy" is presented. In these clusters, any physical quantity can be regarded as "generalized energy", and any physical formula and equation can be transformed into a law of conservation; therefore all physical laws, formulas and equations can be classified as "physical law clusters of conservation of generalized energy" (sometimes simplified to "law clusters of conservation of generalized energy"). Within the law clusters of conservation of generalized energy there are some source laws. According to a source law, some related laws, formulas and equations can be derived; for example, the law of gravity and Newton's second law can be derived from the law of conservation of energy; thus "law clusters of conservation of generalized energy" can be simplified to "law clusters of physical source law". As the number of source laws in the law clusters is reduced to some degree, all the laws of physics can be written on a T-shirt in the form of "the simplest law clusters of physical source law". In order to deal with practical problems, a "variational principle of the simplest law clusters of physical source law" can be established.

**Category:** Thermodynamics and Energy

[410] **viXra:1701.0530 [pdf]**
*submitted on 2017-01-17 20:03:50*

**Authors:** Michail Zak

**Comments:** 17 Pages.

A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. The formation of language and fast decision-making processes are discussed as potential applications of the probability interference.

**Category:** Artificial Intelligence

[409] **viXra:1701.0529 [pdf]**
*submitted on 2017-01-17 20:26:55*

**Authors:** Michail Zak

**Comments:** 22 Pages.

The n-body problem is the problem of predicting the individual motions of a group of objects interacting with each other via conservative forces. These forces can be of gravitational origin (celestial mechanics), of inter-molecular origin (molecular dynamics), or representing interactions in structural biology. In the most common version, the trajectories of the objects are determined by numerically solving Newton's equations of motion for a system of interacting particles. Non-conservative versions of the interaction forces become important in the case of the n-body problem that incorporates the effects of the Coulomb potential, radiation pressure, Poynting-Robertson (P-R) drag, and solar wind drag. The general method of numerical solution of the corresponding system of ODE was originally conceived within theoretical physics in the late 1950s [1,2], but is applied today mostly in chemical physics, materials science and the modeling of biomolecules. The most significant "side effect" of the existing numerical methods for n-body problems is chaos, when different numerical runs with the same initial conditions result in different trajectories. Although numerical errors can contribute to chaos, the primary origin of chaos is physical instability [3].
The n-body problem is a classic astronomical and physical problem that naturally follows from the two-body problem first solved by Newton in his Principia in 1687. The efforts of many famous mathematicians have been devoted to this difficult problem, including Euler and Lagrange (1772), Jacobi (1836), Hill (1878), Poincaré (1899), Levi-Civita (1905), and Birkhoff (1915). However, despite centuries of exploration, there is no clear structure of the solution of the general n- or even three-body problem, as there are no coordinate transformations that can simplify the problem, and there is more and more evidence that, in general, the solutions of n-body problems are chaotic. Failure to find a general analytical structure of the solution shifted the effort towards numerical methods, and many ODE solvers offer a variety of advanced numerical methods for the solution. In this work, a general approach to probabilistic description of chaos in n-body problem with conservative
2. Chaos in classical dynamics
We start this section by revisiting the mathematical formalism of chaos in a non-traditional way that is based upon the concept of orbital instability.
The concept of randomness entered Newtonian dynamics almost a century ago: in 1926, J. Synge introduced a new type of instability, orbital instability, in classical mechanics [4], which can be considered a precursor of chaos, formulated a couple of decades later [5]. The theory of chaos was inspired by the fact that in many different domains of science (physics, chemistry, biology, engineering), systems with a similar strange behavior were frequently encountered, displaying irregular and unpredictable behavior called chaotic. Currently the theory of chaos that describes such systems is well established. However, two unsolved problems remain: prediction of chaos (without numerical runs), and analytical description of chaos in terms of the probability density that would formally follow from the original ODE. This paper proposes a contribution to the solution of these problems, illustrated by chaos in inertial systems.
a. Orbital instability as a precursor of chaos.
Chaos is a special type of instability when the system does not have an alternative stable state and displays an irregular aperiodic motion. Obviously this kind of instability can be associated only with ignorable variables, i.e. with variables that do not contribute to the energy of the system. In order to demonstrate this kind of instability, consider an inertial motion of a particle M of unit mass on a smooth pseudosphere S having a constant negative curvature G0, Fig. 1.
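The sensitivity described above can be illustrated numerically with Burrau's "Pythagorean" three-body configuration (masses 3, 4, 5 released at rest at the corners of a 3-4-5 right triangle, G = 1); a sketch with a fixed-step RK4 integrator, which does not resolve the close encounters accurately but suffices to show two nearly identical runs diverging:

```python
import numpy as np

# Sketch: sensitivity to initial conditions in Burrau's "Pythagorean"
# three-body problem.  A fixed-step RK4 integrator is used; it does not
# resolve the close encounters accurately, but that is beside the point
# here: two runs differing by 1e-9 in one coordinate end up on
# completely different trajectories.

G = 1.0
m = np.array([3.0, 4.0, 5.0])

def accelerations(r):
    a = np.zeros_like(r)
    for i in range(3):
        for j in range(3):
            if i != j:
                d = r[j] - r[i]
                a[i] += G * m[j] * d / np.linalg.norm(d) ** 3
    return a

def rk4_step(r, v, dt):
    k1v, k1r = accelerations(r), v
    k2v, k2r = accelerations(r + 0.5 * dt * k1r), v + 0.5 * dt * k1v
    k3v, k3r = accelerations(r + 0.5 * dt * k2r), v + 0.5 * dt * k2v
    k4v, k4r = accelerations(r + dt * k3r), v + dt * k3v
    return (r + dt / 6 * (k1r + 2 * k2r + 2 * k3r + k4r),
            v + dt / 6 * (k1v + 2 * k2v + 2 * k3v + k4v))

def run(eps, t_final=10.0, dt=1e-3):
    r = np.array([[1.0, 3.0], [-2.0, -1.0], [1.0, -1.0]])
    r[0, 0] += eps                    # tiny perturbation of one coordinate
    v = np.zeros_like(r)
    for _ in range(int(t_final / dt)):
        r, v = rk4_step(r, v, dt)
    return r

eps = 1e-9
amp = np.linalg.norm(run(eps) - run(0.0)) / eps
print(amp)  # amplification of the perturbation: many orders of magnitude
```

Since both runs use the same deterministic integrator, the divergence comes from the instability of the dynamics itself, which is exactly why different numerical runs of such problems can produce different trajectories.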

**Category:** Classical Physics

[408] **viXra:1701.0528 [pdf]**
*submitted on 2017-01-17 23:40:30*

**Authors:** Yannan Yang

**Comments:** 2 Pages.

There has been much discussion of the paradox of the magnetic interaction between two parallel moving charged-particle beams. In this paper, an experimental design is proposed by which one can verify whether there really is a magnetic interaction between the two charged-particle beams.
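For background on the textbook account the proposed experiment bears on (our illustration with assumed beam parameters, not the paper's apparatus): two parallel beams modeled as moving line charges exert both a Coulomb repulsion and a magnetic attraction per unit length, and the magnetic part is smaller by exactly (v/c)²:

```python
import math

# Textbook background (assumed illustrative values): two parallel beams
# modeled as infinite line charges of density lam (C/m) moving at speed
# v, a distance d apart.  In the lab frame each beam is also a current
# I = lam * v, so there is a Coulomb repulsion and a magnetic attraction
# per unit length.
EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
MU0 = 1.25663706212e-6    # vacuum permeability, N/A^2
C = 299_792_458.0         # speed of light, m/s

lam, v, d = 1e-9, 0.5 * C, 0.01   # assumed beam parameters

f_electric = lam ** 2 / (2 * math.pi * EPS0 * d)       # repulsive, N/m
f_magnetic = MU0 * (lam * v) ** 2 / (2 * math.pi * d)  # attractive, N/m

# Since mu0 * eps0 = 1/c^2, the ratio is exactly (v/c)^2: the magnetic
# attraction never overcomes the electric repulsion for v < c.
print(f_magnetic / f_electric)  # ≈ 0.25 for v = c/2
```

The (v/c)² ratio is frame-dependent, which is the origin of the paradox the paper addresses: in the beams' rest frame there is no current and hence no magnetic force at all.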

**Category:** Relativity and Cosmology

[407] **viXra:1701.0527 [pdf]**
*submitted on 2017-01-17 15:18:42*

**Authors:** George Rajna

**Comments:** 26 Pages.

Technion researchers have demonstrated, for the first time, that laser emissions can be created through the interaction of light and water waves. This "water-wave laser" could someday be used in tiny sensors that combine light waves, sound and water waves, or as a feature on microfluidic "lab-on-a-chip" devices used to study cell biology and to test new drug therapies. [18] Researchers led by EPFL have built ultra-high quality optical cavities for the elusive mid-infrared spectral region, paving the way for new chemical and biological sensors, as well as promising technologies. [17] The research team led by Professor Hele Savin has developed a new light detector that can capture more than 96 percent of the photons covering visible, ultraviolet and infrared wavelengths. [16] A promising route to smaller, powerful cameras built into smartphones and other devices is to design optical elements that manipulate light by diffraction (the bending of light around obstacles or through small gaps) instead of refraction. [15] Converting a single photon from one color, or frequency, to another is an essential tool in quantum communication, which harnesses the subtle correlations between the subatomic properties of photons (particles of light) to securely store and transmit information. Scientists at the National Institute of Standards and Technology (NIST) have now developed a miniaturized version of a frequency converter, using technology similar to that used to make computer chips. [14] Harnessing the power of the sun and creating light-harvesting or light-sensing devices requires a material that both absorbs light efficiently and converts the energy to highly mobile electrical current. Finding the ideal mix of properties in a single material is a challenge, so scientists have been experimenting with ways to combine different materials to create "hybrids" with enhanced features. [13] Condensed-matter physicists often turn to particle-like entities called quasiparticles, such as excitons, plasmons and magnons, to explain complex phenomena. Now Gil Refael from the California Institute of Technology in Pasadena and colleagues report the theoretical concept of the topological polariton, or "topolariton": a hybrid half-light, half-matter quasiparticle that has special topological properties and might be used in devices to transport light in one direction. [12]

**Category:** Condensed Matter

[406] **viXra:1701.0526 [pdf]**
*submitted on 2017-01-17 12:05:57*

**Authors:** Mark M. Grinshtein

**Comments:** 8 Pages. in Russian

The article reviews the author's concept of information-wave medicine (IWM). It is shown that IWM is not connected with any special powers of the author but is a branch of medical science. It is also shown that biolocation is likewise a science, the mechanism of which is not yet understood.

**Category:** Physics of Biology

[405] **viXra:1701.0525 [pdf]**
*submitted on 2017-01-17 07:37:05*

**Authors:** Sjaak Uitterdijk

**Comments:** 4 Pages.

The article shows that the present worldwide production of sustainable energy is negligible relative to the worldwide need for energy. As a result, increasing the production of sustainable energy in order to try to reduce CO2 emissions will not have any significant effect. Only one measure will do; however, such a measure will not be received as a popular one.

**Category:** Climate Research

[404] **viXra:1701.0524 [pdf]**
*submitted on 2017-01-17 04:12:19*

**Authors:** Terrence J. McMahon

**Comments:** 32 Pages.

Unification of the physical constants is announced, where gravity, quantum theory, and general relativity are linked via new physics. Unification involves a new, combined ‘gravito-electromagnetic’ constant, linked via Pi and Phi (the golden mean). All constants, most of which are found to run at high energies, are related via this expression. Energy, mass, and the gravitational constant are explained in those terms. The photon constant runs inversely to the gravitational constant, while both are united via a running fine-structure constant. Mass is not conserved, running with energy. Energy however is conserved. Space is a superconductor, where photons have mass. The Hubble constant is redefined, providing an alternative cause for red-shifting of photon wavelengths. A brief discussion of these findings offers an explanation via a new cosmological model that does not require inflation, singularities, dark energy, exotic dark matter, or supersymmetry. Anomalies in the Standard Model are explained. Suitable candidates are described for the cosmological constant, and mass density parameter. The Universe is found to be closed. Planck units run, and the Planck constant is calculated from theory, differing by 0.2%, as is the von Klitzing constant. Magnetic permeability, electric permittivity, and wave impedance are calculated from theory here, differing from accepted values (defined by convention) by just 0.2%. The fine structure, gravitational, and Hubble constants are defined, with accuracy for the latter two improved to 10 significant figures. These data described are all in excellent agreement with the Planck survey (2015) results. New, related constants are discussed. A novel explanation is introduced to explain the mass ratio between an electron and a proton. Predictions are made for future values of the principal running constants. These discoveries have substantial consequences for the Standard Model.

**Category:** Quantum Gravity and String Theory

[403] **viXra:1701.0523 [pdf]**
*submitted on 2017-01-17 04:41:39*

**Authors:** Grushka Ya.I.

**Comments:** 158 Pages. Mathematics Subject Classification: 03E75; 70A05; 83A05; 47B99

This work lays the foundations of the theory of kinematic changeable sets ("abstract kinematics"), which is based on the theory of changeable sets. From an intuitive point of view, changeable sets are sets of objects which, unlike the elements of ordinary (static) sets, may be in a process of continuous transformation, and which may change their properties depending on the point of view on them (that is, depending on the reference frame). From a philosophical and imaginative point of view, changeable sets may look like "worlds" whose evolution obeys arbitrary laws.
Kinematic changeable sets are mathematical objects consisting of changeable sets equipped with various geometrical or topological structures (namely metric, topological, linear, Banach, Hilbert and other spaces). In the author's opinion, the theories of changeable and kinematic changeable sets may, in the course of their development and improvement, become tools for solving the sixth Hilbert problem, at least for the physics of the macrocosm. Investigations in this direction may be of interest to astrophysics, because there exists a hypothesis that on the large scale of the Universe physical laws (in particular, the laws of kinematics) may differ from the laws acting in the neighbourhood of our Solar System. These investigations may also be applied to the construction of mathematical foundations of tachyon kinematics.
We believe that the theories of changeable and kinematic changeable sets may be of interest not only to theoretical physics but also to other fields of science, as a new mathematical apparatus for describing the evolution of complex systems.

**Category:** Mathematical Physics

[402] **viXra:1701.0522 [pdf]**
*submitted on 2017-01-16 16:29:56*

**Authors:** Michail Zak

**Comments:** 10 Pages.

The concept of randomness entered Newtonian dynamics almost a century ago: in 1926, J. Synge introduced a new type of instability in classical mechanics - orbital instability [1] - which can be considered a precursor of chaos, formulated a couple of decades later [2]. The theory of chaos was inspired by the fact that in many different domains of science (physics, chemistry, biology, engineering), systems displaying a similar strange behavior - irregular and unpredictable, called chaotic - were frequently encountered. The theory of chaos that describes such systems is now well established. However, two unsolved problems remain: prediction of chaos (without numerical runs), and analytical description of chaos in terms of the probability density that would formally follow from the original ODE. This paper proposes a contribution to the solution of these problems.
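
A minimal numerical sketch of the behavior being discussed (an illustration only, not Zak's method): the largest Lyapunov exponent of the logistic map, a standard textbook example assumed here, is positive, which is the usual numerical signature of chaotic, unpredictable dynamics.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.2, burn_in=1000, n=100_000):
    """Estimate the largest Lyapunov exponent of the logistic map
    x -> r*x*(1-x). A positive result is the usual numerical
    signature of chaos; for r = 4 the exact value is ln 2."""
    x = x0
    for _ in range(burn_in):                 # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n):
        x = r * x * (1.0 - x)
        total += math.log(abs(r * (1.0 - 2.0 * x)))  # log |f'(x)|
    return total / n

print(lyapunov_logistic())  # ≈ 0.693 (= ln 2)
```

Note that estimating the exponent this way requires a numerical run; predicting chaos without such runs is precisely the open problem the abstract mentions.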

**Category:** Classical Physics

[401] **viXra:1701.0521 [pdf]**
*submitted on 2017-01-16 20:05:38*

**Authors:** J. P. Lestone

**Comments:** 3 Pages. 3 pg, 1 figure.

Virtual photons, with a reduced wavelength ƛ, are assumed to interact with isolated charged leptons with a cross section of πƛ². This interaction is assumed to generate stimulated virtual-photon emissions that are capable of being exchanged with other particles, and this exchange of virtual photons is assumed to define the strength of electromagnetism. With the inclusion of near-field effects, the model choices presented give a calculated fundamental unit of charge of 1.60218x10^-19 C. If these choices are corroborated by detailed calculations, an understanding of the numerical value of the fine structure constant may emerge.
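
The quoted charge can be sanity-checked against the textbook definition of the fine structure constant, α = e²/(4πε₀ħc); the sketch below simply inverts that standard relation using CODATA-style constants, and does not implement the paper's virtual-photon model.

```python
import math

# Standard CODATA-style values (SI); none of these come from the paper.
EPS0  = 8.8541878128e-12   # vacuum permittivity, F/m
HBAR  = 1.054571817e-34    # reduced Planck constant, J*s
C     = 2.99792458e8       # speed of light, m/s
ALPHA = 7.2973525693e-3    # fine structure constant

# alpha = e^2 / (4*pi*eps0*hbar*c)  =>  e = sqrt(4*pi*eps0*hbar*c*alpha)
e = math.sqrt(4.0 * math.pi * EPS0 * HBAR * C * ALPHA)
print(f"{e:.6e} C")  # ≈ 1.602177e-19 C, matching the value quoted above
```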

**Category:** Quantum Physics

[400] **viXra:1701.0520 [pdf]**
*replaced on 2017-02-03 08:12:27*

**Authors:** Yibing Qiu

**Comments:** 1 Page.

Abstract: presents a viewpoint regarding the mechanism linking black holes and the disks of galaxies.

**Category:** Astrophysics

[399] **viXra:1701.0519 [pdf]**
*submitted on 2017-01-17 00:26:31*

**Authors:** W. B. Vasantha Kandasamy, K. Ilanthenral, Florentin Smarandache

**Comments:** 278 Pages.

In this book the authors for the first time develop the notion of MOD natural neutrosophic subset special type of topological spaces, using MOD natural neutrosophic dual numbers, MOD natural neutrosophic finite complex numbers, MOD natural neutrosophic-neutrosophic numbers, and so on, to build their respective MOD semigroups. Later they extend this concept to MOD interval subset semigroups and MOD interval neutrosophic subset semigroups. Using these MOD interval semigroups and MOD interval natural neutrosophic subset semigroups, special type of subset topological spaces are built. Further, using these MOD subsets, they build MOD interval subset matrix semigroups and MOD interval subset special type of matrix topological spaces. Likewise, using MOD interval natural neutrosophic subset matrix semigroups, they build MOD interval natural neutrosophic matrix subset special type of topological spaces. They also build MOD subset coefficient polynomial special type of topological spaces. The final chapter mainly proposes several open conjectures about the validity of Kakutani's fixed point theorem for all MOD special type of subset topological spaces.

**Category:** Topology

[398] **viXra:1701.0518 [pdf]**
*submitted on 2017-01-16 13:27:51*

**Authors:** George Rajna

**Comments:** 33 Pages.

Gold is prized for its preciousness and as a conductor in electronics, but it is also important in scientific experimentation. [23] When the temperature of the material changes, both the electronic and the magnetic properties of the materials change with it. [22] In a proof-of-concept study published in Nature Physics, researchers drew magnetic squares in a nonmagnetic material with an electrified pen and then "read" this magnetic doodle with X-rays. [21] Researchers have brought electrides into the nanoregime by synthesizing the first 2D electride material. Electrides are ionic compounds, which are made of negative and positive ions. But in electrides, the negative "ions" are simply electrons, with no nucleus. [20] Microelectromechanical systems, or MEMS, are tiny machines fabricated using equipment and processes developed for the production of electronic chips and devices. [19] Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences have made the world's smallest radio receiver, built out of an assembly of atomic-scale defects in pink diamonds. [18] Smartphones have shiny, flat AMOLED displays. Behind each single pixel of these displays hide at least two silicon transistors which were mass-manufactured using laser annealing technologies. [17] Bumpy surfaces with graphene between would help dissipate heat in next-generation microelectronic devices, according to Rice University scientists. [16] Scientists at The University of Manchester and Karlsruhe Institute of Technology have demonstrated a method to chemically modify small regions of graphene with high precision, leading to extreme miniaturisation of chemical and biological sensors. [15] A new method for producing conductive cotton fabrics using graphene-based inks opens up new possibilities for flexible and wearable electronics, without the use of expensive and toxic processing steps. [14]

**Category:** Condensed Matter

[397] **viXra:1701.0517 [pdf]**
*submitted on 2017-01-16 13:55:40*

**Authors:** Michail Zak

**Comments:** 22 Pages.

This paper presents a non-traditional approach to the theory of turbulence. Its objective is to prove that Newtonian mechanics is fully equipped to describe turbulent motions without the help of experimentally obtained closures. Turbulence is one of the most fundamental problems in theoretical physics that is still unsolved. The term "unsolved" here means that turbulence cannot be properly formulated, i.e. reduced to a standard mathematical procedure such as solving differential equations. In other words, it is not just a computational problem: prior to computations, a consistent mathematical model must be found. Although the applicability of the Navier-Stokes equations as a model for fluid mechanics is not in question, the instability of their solutions for flows with supercritical Reynolds numbers raises a more general question: is Newtonian mechanics complete?
The problem of turbulence (stressed later by the discovery of chaos) demonstrated that Newton's world is far more complex than that represented by classical models. It appears that the Lagrangian and Hamiltonian formulations do not suggest any tools for treating post-instability motions, and this is a major flaw of the classical approach to Newtonian mechanics. An explanation of this limitation is proposed in this paper: the classical formalism based upon Newton's laws exploits additional mathematical restrictions (such as space-time differentiability and the Lipschitz conditions) that are not required by Newton's laws. The only purpose of these restrictions is to allow application of the powerful techniques of classical mathematical analysis. However, in many cases such restrictions are incompatible with physical reality, and the most obvious case of such incompatibility is Euler's model of inviscid fluid, in which the absence of shear stresses is not compensated by a release of additional degrees of freedom as required by the principles of mechanics.
It has recently been demonstrated [3] that, according to the principle of release of constraints, the absence of shear stresses in the Euler equations must be compensated by additional degrees of freedom, and this leads to Reynolds-type enlarged Euler equations (EE equations) with a double-valued velocity field that do not require any closures. In the first part of the paper, the theory is applied to turbulent mixing and illustrated by the propagation of a mixing zone triggered by a tangential jump of velocity. A comparison of the proposed solution with Prandtl's solution is performed and discussed. In the second part of the paper, a semi-viscous version of the Navier-Stokes equations is introduced. The model does not require any closures since the number of equations is equal to the number of unknowns.

**Category:** Classical Physics

[396] **viXra:1701.0516 [pdf]**
*submitted on 2017-01-16 14:14:27*

**Authors:** Michail Zak

**Comments:** 19 Pages.

A stochastic approach to maximization of a functional constrained by the governing equations of a controlled system is introduced and discussed. The idea of the proposed algorithm is the following: represent the functional to be maximized as a limit of a probability density governed by an appropriately selected Liouville equation. The corresponding ODEs then become stochastic, and the sample solution with the largest value has the highest probability of appearing in the ODE simulation. Application to optimal control is discussed. Two limitations of optimal control theory - local maxima and possible instability of the optimal solutions - are removed. Special attention is paid to robot motion planning.

**Category:** Artificial Intelligence

[395] **viXra:1701.0515 [pdf]**
*submitted on 2017-01-16 10:26:57*

**Authors:** George Rajna

**Comments:** 26 Pages.

A widely held understanding of electromagnetic radiation has been challenged in newly published research led at the University of Strathclyde. [19] Technion researchers have demonstrated, for the first time, that laser emissions can be created through the interaction of light and water waves. This "water-wave laser" could someday be used in tiny sensors that combine light waves, sound and water waves, or as a feature on microfluidic "lab-on-a-chip" devices used to study cell biology and to test new drug therapies. [18] Researchers led by EPFL have built ultra-high quality optical cavities for the elusive mid-infrared spectral region, paving the way for new chemical and biological sensors, as well as promising technologies. [17] The research team led by Professor Hele Savin has developed a new light detector that can capture more than 96 percent of the photons covering visible, ultraviolet and infrared wavelengths. [16] A promising route to smaller, powerful cameras built into smartphones and other devices is to design optical elements that manipulate light by diffraction-the bending of light around obstacles or through small gaps-instead of refraction. [15] Converting a single photon from one color, or frequency, to another is an essential tool in quantum communication, which harnesses the subtle correlations between the subatomic properties of photons (particles of light) to securely store and transmit information. Scientists at the National Institute of Standards and Technology (NIST) have now developed a miniaturized version of a frequency converter, using technology similar to that used to make computer chips. [14] Harnessing the power of the sun and creating light-harvesting or light-sensing devices requires a material that both absorbs light efficiently and converts the energy to highly mobile electrical current. 
Finding the ideal mix of properties in a single material is a challenge, so scientists have been experimenting with ways to combine different materials to create "hybrids" with enhanced features. [13] Condensed-matter physicists often turn to particle-like entities called quasiparticles—such as excitons, plasmons, magnons—to explain complex phenomena.

**Category:** Quantum Physics

[394] **viXra:1701.0514 [pdf]**
*submitted on 2017-01-16 12:46:36*

**Authors:** Terubumi Honjou

**Comments:** 4 Pages.

The inflationary model now plays the leading role in cosmology. Inflationary cosmology supposes that, at the moment of its birth, space was the size of an elementary particle, and applies particle physics in an attempt to understand it. For now, however, innumerable inflationary cosmologies have been proposed, and no single one can be identified as correct. With the introduction of concepts such as expansion faster than the velocity of light, and of infinitely many other spaces existing beyond the space in which we live, the field is in confusion.

**Category:** Astrophysics

[393] **viXra:1701.0513 [pdf]**
*submitted on 2017-01-16 06:25:21*

**Authors:** Carlos Castro

**Comments:** 14 Pages. Submitted to the IJGMMP.

Starting with the study of the geometry on the cotangent bundle (phase space), it is shown that the maximal proper force condition, in the case of a uniformly accelerated observer of mass $m$ along the $x$ axis, leads to a minimum value of $x$ lying $inside$ the Rindler wedge and given by the black hole horizon radius $ 2Gm$. Whereas in the uniform circular motion case, we find that the maximal proper force condition implies that the radius of the circle cannot exceed the value of the horizon radius $2Gm$. A correspondence is found between the black hole horizon radius and a singularity in the curvature of momentum space. The fact that the geometry (metric) in phase spaces is observer dependent (on the momentum of the massive particle/observer) indicates further that the matter stress energy tensor and vacuum energy in the underlying spacetime may admit an interpretation in terms of the curvature in momentum spaces. Consequently, phase space geometry seems to be the proper arena for a space-time-matter unification.
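
In the geometric units used here (c = 1), the horizon radius 2Gm is the familiar Schwarzschild radius, r_s = 2Gm/c² in SI units. A quick check of the scale involved, using standard constants and an illustrative solar mass (neither value is taken from the paper):

```python
G     = 6.67430e-11     # gravitational constant, m^3 kg^-1 s^-2
C     = 2.99792458e8    # speed of light, m/s
M_SUN = 1.98892e30      # one solar mass in kg (illustrative choice)

def schwarzschild_radius(m):
    """Horizon radius 2Gm/c^2 for a mass m in kilograms."""
    return 2.0 * G * m / C**2

print(schwarzschild_radius(M_SUN))  # ≈ 2954 m, i.e. about 3 km
```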

**Category:** Quantum Gravity and String Theory

[392] **viXra:1701.0512 [pdf]**
*submitted on 2017-01-16 07:18:40*

**Authors:** George Rajna

**Comments:** 32 Pages.

When the temperature of the material changes, both the electronic and the magnetic properties of the materials change with it. [22] In a proof-of-concept study published in Nature Physics, researchers drew magnetic squares in a nonmagnetic material with an electrified pen and then "read" this magnetic doodle with X-rays. [21] Researchers have brought electrides into the nanoregime by synthesizing the first 2D electride material. Electrides are ionic compounds, which are made of negative and positive ions. But in electrides, the negative "ions" are simply electrons, with no nucleus. [20] Microelectromechanical systems, or MEMS, are tiny machines fabricated using equipment and processes developed for the production of electronic chips and devices. [19] Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences have made the world's smallest radio receiver, built out of an assembly of atomic-scale defects in pink diamonds. [18] Smartphones have shiny, flat AMOLED displays. Behind each single pixel of these displays hide at least two silicon transistors which were mass-manufactured using laser annealing technologies. [17] Bumpy surfaces with graphene between would help dissipate heat in next-generation microelectronic devices, according to Rice University scientists. [16] Scientists at The University of Manchester and Karlsruhe Institute of Technology have demonstrated a method to chemically modify small regions of graphene with high precision, leading to extreme miniaturisation of chemical and biological sensors. [15] A new method for producing conductive cotton fabrics using graphene-based inks opens up new possibilities for flexible and wearable electronics, without the use of expensive and toxic processing steps. [14] A device made of bilayer graphene, an atomically thin hexagonal arrangement of carbon atoms, provides experimental proof of the ability to control the momentum of electrons and offers a path to electronics that could require less energy and give off less heat than standard silicon-based transistors. It is one step forward in a new field of physics called valleytronics. [13]

**Category:** Condensed Matter

[391] **viXra:1701.0511 [pdf]**
*submitted on 2017-01-15 16:19:44*

**Authors:** Stephen C. Pearson.

**Comments:** 42 Pages.

This particular submission contains a copy [PART 5/6] of the author's original paper and is therefore a continuation of his previous submission, namely - "An Introduction to Functions of a Quaternion Hypercomplex Variable - PART 4/6", which has been published under the 'VIXRA' Mathematics subheading:- 'Functions and Analysis'.

**Category:** Functions and Analysis

[390] **viXra:1701.0510 [pdf]**
*submitted on 2017-01-15 17:01:34*

**Authors:** Stephen C. Pearson.

**Comments:** 24 Pages.

This particular submission contains a copy [PART 6/6] of the author's original paper and is therefore a continuation of his previous submission, namely - "An Introduction to Functions of a Quaternion Hypercomplex Variable - PART 5/6", which has been published under the 'VIXRA' Mathematics subheading:- 'Functions and Analysis'.

**Category:** Functions and Analysis

[389] **viXra:1701.0509 [pdf]**
*submitted on 2017-01-15 22:03:42*

**Authors:** Georgina Woodward

**Comments:** 25 Pages.

Starting with the premise that the differences between substantial objects and images are not unimportant - though they may bear the same object name, they are not equivalent - consideration is given to methods of making distance measurements, followed by an examination of the distance measurement methods used in "On the Electrodynamics of Moving Bodies". The category error of not differentiating between objects and images is identified within Einstein's paper. It is postulated that this category error has led to a misunderstanding of the physics of relativity and is the cause of the associated paradoxes. Having clarified the categorical difference between material Object reality and Image reality, the product of information processing, the paradoxes of relativity are considered using that differentiation. A caution is given regarding magic and the "what you see is all there is" bias.

**Category:** History and Philosophy of Physics

[388] **viXra:1701.0508 [pdf]**
*submitted on 2017-01-16 01:10:24*

**Authors:** Nikitin V. N., Nikitin I.V.

**Comments:** 2 Pages.

Our Universe is one of the galaxies of the Multiverse, bounded by a gravitational envelope with a black hole at its center. The white hole has turned black, having released its galactic "tears", and we see that today, as we must see it!

**Category:** Astrophysics

[387] **viXra:1701.0507 [pdf]**
*submitted on 2017-01-16 01:15:27*

**Authors:** Nikitin V. N., Nikitin I.V.

**Comments:** 1 Page.

Once, the tail of an unknown comet "covered" Mars with a red "blanket". The Martian "blueberries" are hail that arose from the torn-off tail of an unknown comet.

**Category:** Astrophysics

[386] **viXra:1701.0506 [pdf]**
*submitted on 2017-01-15 13:43:49*

**Authors:** George Rajna

**Comments:** 37 Pages.

Stem cell therapies hold great promise for restoring function in a variety of degenerative conditions, but one of the logistical hurdles is how to ensure the cells survive in the body long enough to work. [21] A surprising new finding about gene expression could increase our understanding of the aging process. Gene expression is the process by which the information contained within a gene becomes a useful product. [20] Scientists at The Scripps Research Institute (TSRI) have discovered a protein that fine-tunes the cellular clock involved in aging. [19] Genes tell cells what to do—for example, when to repair DNA mistakes or when to die—and can be turned on or off like a light switch. Knowing which genes are switched on, or expressed, is important for the treatment and monitoring of disease. Now, for the first time, Caltech scientists have developed a simple way to visualize gene expression in cells deep inside the body using a common imaging technology. [18] Researchers at The University of Manchester have discovered that a potential new drug reduces the number of brain cells destroyed by stroke and then helps to repair the damage. [17] Researchers at the University of Connecticut have uncovered new information about how particles behave in our bloodstream, an important advancement that could help pharmaceutical scientists develop more effective cancer drugs. [16] For the past 15 years, the big data techniques pioneered by NASA's Jet Propulsion Laboratory in Pasadena, California, have been revolutionizing biomedical research. On Sept. 6, 2016, JPL and the National Cancer Institute (NCI), part of the National Institutes of Health, renewed a research partnership through 2021, extending the development of data science that originated in space exploration and is now supporting new cancer discoveries. 
[15] IBM scientists have developed a new lab-on-a-chip technology that can, for the first time, separate biological particles at the nanoscale and could enable physicians to detect diseases such as cancer before symptoms appear. [14] Scientists work toward storing digital information in DNA. [13]

**Category:** Physics of Biology

[385] **viXra:1701.0505 [pdf]**
*submitted on 2017-01-15 14:34:25*

**Authors:** Stephen C. Pearson.

**Comments:** 42 Pages.

This particular submission contains a copy [PART 3/6] of the author's original paper and is therefore a continuation of his previous submission, namely - "An Introduction to Functions of a Quaternion Hypercomplex Variable - PART 2/6", which has been published under the 'VIXRA' Mathematics subheading:- 'Functions and Analysis'.

**Category:** Functions and Analysis

[384] **viXra:1701.0504 [pdf]**
*submitted on 2017-01-15 15:44:13*

**Authors:** Stephen C. Pearson.

**Comments:** 42 Pages.

This particular submission contains a copy [PART 4/6] of the author's original paper and is therefore a continuation of his previous submission, namely - "An Introduction to Functions of a Quaternion Hypercomplex Variable - PART 3/6", which has been published under the 'VIXRA' Mathematics subheading:- 'Functions and Analysis'.

**Category:** Functions and Analysis

[383] **viXra:1701.0503 [pdf]**
*replaced on 2017-02-06 21:33:19*

**Authors:** Gene H Barbee

**Comments:** 36 Pages. Please contact gene at genebarbee@msn.com

The cosmic web is a filament-like structure that connects galaxies. It has been imaged by gravitational lensing and is thought to be composed mainly of dark matter, since it is not visible in the electromagnetic spectrum. There are computer simulations of the web showing that galaxies are often nodes for multiple branches. View the simulations at https://www.youtube.com/watch?v=ivymdduulFU. WMAP, PLANCK and other background-radiation anisotropy teams have concluded that dark matter is 5 times more prevalent than normal matter. Scientists are trying to identify dark matter, and the unexpected web-like structure adds to the list of cosmology unknowns.
This document proposes that dark matter consists of neutron waves or neutrons (wave/particle duality) contained by a gravitational field. Dark matter density would be the same as normal matter density, but a neutron wave might have a radius of only 1.53e-15 meters (the wavelength of a neutron). This means it could be very elongated (e.g. 5e16 meters). It may coil into a small volume unless stretched by gravity. The neutron wave's location in the long filament is probabilistic, but each filament contains 939 MeV (1.675e-27 kg). A diffuse structure and the absence of electromagnetic features will make it difficult to detect. Originally, dark and normal matter are mixed, and both fall into massive structures like galaxies over time. The residual dark matter probably forms the aligned filaments we see as the cosmic web. It would attract some normal matter and be gravitationally stretched between galaxies. Dark matter has only gravitational interactions. As it moves into galaxies it forms halos and explains anomalous galactic velocity observations.
The author presents a re-analysis of the baryon/photon ratio (critical to residual deuterium abundance data) and reviews the WMAP data that led scientists to conclude that dark matter is 5 times more prevalent than normal matter. A detailed model from matter equality to decoupling is presented. The features of interest are the waves that cause temperature variations in the background radiation. A model that predicts the temperature of the hot spots is presented. Based on a re-analysis of limiting considerations, it is shown that half of all matter is baryons and the other half is dark matter.
Most

**Category:** Relativity and Cosmology

[382] **viXra:1701.0502 [pdf]**
*submitted on 2017-01-15 11:15:18*

**Authors:** Stephen C. Pearson.

**Comments:** 42 Pages.

This particular submission contains (inter alia) a copy [PART 1/6] of the author's original paper, which was completed on 31st March 1984 and comprises a total of 161 handwritten foolscap pages. Its purpose is to enunciate various definitions and theorems pertaining to the following topics, i.e. (a) the algebra of quaternion hypercomplex numbers; (b) functions of a single quaternion hypercomplex variable; (c) the concepts of limit and continuity applied to such functions; (d) the elementary principles of differentiation and integration applied to quaternion hypercomplex functions. Many of the concepts presented therein are analogous to well-established notions from real and complex variable analysis, with any divergent results being due to the non-commutativity of quaternion products.

**Category:** Functions and Analysis

[381] **viXra:1701.0501 [pdf]**
*submitted on 2017-01-15 12:58:04*

**Authors:** Stephen C. Pearson.

**Comments:** 42 Pages.

This particular submission contains a copy [PART 2/6] of the author's original paper and is therefore a continuation of his previous submission, namely - "An Introduction to Functions of a Quaternion Hypercomplex Variable - PART 1/6", which has been published under the 'VIXRA' Mathematics subheading:- 'Functions and Analysis'.

**Category:** Functions and Analysis

[380] **viXra:1701.0500 [pdf]**
*submitted on 2017-01-15 12:36:59*

**Authors:** George Rajna

**Comments:** 34 Pages.

Scientists at The Scripps Research Institute (TSRI) have discovered a protein that fine-tunes the cellular clock involved in aging. [19] Genes tell cells what to do—for example, when to repair DNA mistakes or when to die—and can be turned on or off like a light switch. Knowing which genes are switched on, or expressed, is important for the treatment and monitoring of disease. Now, for the first time, Caltech scientists have developed a simple way to visualize gene expression in cells deep inside the body using a common imaging technology. [18] Researchers at The University of Manchester have discovered that a potential new drug reduces the number of brain cells destroyed by stroke and then helps to repair the damage. [17] Researchers at the University of Connecticut have uncovered new information about how particles behave in our bloodstream, an important advancement that could help pharmaceutical scientists develop more effective cancer drugs. [16] For the past 15 years, the big data techniques pioneered by NASA's Jet Propulsion Laboratory in Pasadena, California, have been revolutionizing biomedical research. On Sept. 6, 2016, JPL and the National Cancer Institute (NCI), part of the National Institutes of Health, renewed a research partnership through 2021, extending the development of data science that originated in space exploration and is now supporting new cancer discoveries. [15] IBM scientists have developed a new lab-on-a-chip technology that can, for the first time, separate biological particles at the nanoscale and could enable physicians to detect diseases such as cancer before symptoms appear. [14] Scientists work toward storing digital information in DNA. [13] Leiden theoretical physicists have proven that DNA mechanics, in addition to genetic information in DNA, determines who we are. Helmut Schiessel and his group simulated many DNA sequences and found a correlation between mechanical cues and the way DNA is folded. 
They have published their results in PLoS One. [12] We model the electron clouds of nucleic acids in DNA as a chain of coupled quantum harmonic oscillators with dipole-dipole interaction between nearest neighbours resulting in a van der Waals type bonding. [11]

**Category:** Physics of Biology

[379] **viXra:1701.0499 [pdf]**
*submitted on 2017-01-15 13:09:15*

**Authors:** George Rajna

**Comments:** 36 Pages.

A surprising new finding about gene expression could increase our understanding of the aging process. Gene expression is the process by which the information contained within a gene becomes a useful product. [20] Scientists at The Scripps Research Institute (TSRI) have discovered a protein that fine-tunes the cellular clock involved in aging. [19] Genes tell cells what to do—for example, when to repair DNA mistakes or when to die—and can be turned on or off like a light switch. Knowing which genes are switched on, or expressed, is important for the treatment and monitoring of disease. Now, for the first time, Caltech scientists have developed a simple way to visualize gene expression in cells deep inside the body using a common imaging technology. [18] Researchers at The University of Manchester have discovered that a potential new drug reduces the number of brain cells destroyed by stroke and then helps to repair the damage. [17] Researchers at the University of Connecticut have uncovered new information about how particles behave in our bloodstream, an important advancement that could help pharmaceutical scientists develop more effective cancer drugs. [16] For the past 15 years, the big data techniques pioneered by NASA's Jet Propulsion Laboratory in Pasadena, California, have been revolutionizing biomedical research. On Sept. 6, 2016, JPL and the National Cancer Institute (NCI), part of the National Institutes of Health, renewed a research partnership through 2021, extending the development of data science that originated in space exploration and is now supporting new cancer discoveries. [15] IBM scientists have developed a new lab-on-a-chip technology that can, for the first time, separate biological particles at the nanoscale and could enable physicians to detect diseases such as cancer before symptoms appear. [14] Scientists work toward storing digital information in DNA. 
[13] Leiden theoretical physicists have proven that DNA mechanics, in addition to genetic information in DNA, determines who we are. Helmut Schiessel and his group simulated many DNA sequences and found a correlation between mechanical cues and the way DNA is folded. They have published their results in PLoS One. [12]

**Category:** Physics of Biology

[378] **viXra:1701.0498 [pdf]**
*replaced on 2017-01-19 10:50:18*

**Authors:** Sylwester Kornowski

**Comments:** 4 Pages.

The Scale-Symmetric Theory (SST) shows that quantum entanglement fixes the speed of light in "vacuum", c, in relation to its source or to the last-interaction object (which can be a detector). As a result, the spatial distances to galaxies differ from the time distances (the light travel time) - this is the duality of relativity. The duality of relativity leads to a running time Hubble constant that creates an illusion of accelerating expansion of the Universe. According to SST, for the nearest Universe the time Hubble constant is 70.52. For the mean time Hubble constant SST gives 64.01 - this should be the mean observed Hubble constant when we apply General Relativity (GR) to the whole observed Universe. If we neglect some part of the distant Universe, then the GR/observed time Hubble constant should lie in the interval [64.01, 70.52]. We emphasize, however, that the real mean spatial Hubble constant calculated within SST is 45.24. It leads to an age of the Universe of 21.614 +- 0.096 Gyr, but the time distance to the most distant observed Universe cannot be longer than 13.866 +- 0.096 Gyr. SST shows that the evolution of galaxies accelerated about 13.1 Gyr ago - this leads to an illusion that cosmic objects are no older than 13.1 Gyr.
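As a numerical sanity check (not part of SST itself), the quoted ages correspond to simple Hubble times 1/H0 for the constants given above; a minimal sketch, with the unit conversions as the only assumptions:

```python
# Hubble time 1/H0 as a rough age estimate (a plausibility check only,
# not the SST calculation itself).
KM_PER_MPC = 3.0857e19       # kilometres in one megaparsec
SECONDS_PER_GYR = 3.1557e16  # seconds in one gigayear (Julian)

def hubble_time_gyr(h0_km_s_mpc: float) -> float:
    """Return 1/H0 in Gyr for H0 given in km/s/Mpc."""
    h0_per_s = h0_km_s_mpc / KM_PER_MPC  # convert H0 to 1/s
    return 1.0 / h0_per_s / SECONDS_PER_GYR

print(round(hubble_time_gyr(45.24), 2))  # spatial Hubble constant quoted above -> ~21.61
```

With H0 = 70.52 the same function returns about 13.87 Gyr, matching the quoted bound on the time distance to the most distant observed Universe.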

**Category:** Quantum Gravity and String Theory

[377] **viXra:1701.0497 [pdf]**
*replaced on 2017-01-21 14:16:05*

**Authors:** Espen Gaarder Haug

**Comments:** 8 Pages.

In this paper we are combining Heisenberg’s uncertainty principle with Haug’s suggested maximum velocity for anything with rest-mass; see [1, 2, 3]. This leads to a suggested exact boundary condition on Heisenberg’s uncertainty principle. The uncertainty in position at the potential maximum momentum for subatomic particles (as derived from the maximum velocity) is half of the Planck length.
Perhaps Einstein was right after all when he stated, “God does not play dice.” Or at least the dice may have a stricter boundary on possible outcomes than we have previously thought.
We also show how this suggested boundary condition seems to make big G consistent with Heisenberg’s uncertainty principle. We obtain a mathematical expression for big G that is fully in line with empirical observations.
Hopefully our analysis can be a small step toward a better understanding of Heisenberg’s uncertainty principle and its interpretations and, by extension, the broader implications for the quantum world.
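The stated half-Planck-length figure can be sketched directly from the uncertainty relation: taking the maximum momentum of a subatomic particle to be of the order of the Planck-mass momentum $m_p c$ (an assumption made here for illustration, not Haug's full derivation), one gets

```latex
\Delta x \,\Delta p \ge \frac{\hbar}{2},
\qquad \Delta p_{\max} \sim m_p c
\quad\Longrightarrow\quad
\Delta x_{\min} \sim \frac{\hbar}{2\, m_p c} = \frac{l_p}{2},
\qquad l_p \equiv \frac{\hbar}{m_p c} = \sqrt{\frac{\hbar G}{c^{3}}} .
```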

**Category:** Quantum Physics

[376] **viXra:1701.0496 [pdf]**
*replaced on 2017-04-16 22:29:29*

**Authors:** Frank Dodd Tony Smith Jr

**Comments:** 22 Pages.

This paper is intended to be only a rough semi-popular overview of how the 240 Root Vectors of E8 can be used to construct a useful Lagrangian describing Gravity and Dark Energy plus the Standard Model. For details and references, see viXra/1602.0319. The 240 Root Vectors of E8 represent the physical forces, particles, and spacetime that make up the construction of a realistic Lagrangian describing the Octonionic Inflation Era followed by a Quaternionic M4 x CP2 Kaluza-Klein Era in which the Higgs emerges by the Mayer mechanism and 2nd and 3rd Generation Fermions appear. By generalizations of the Nambu-Jona-Lasinio models, the Higgs is seen to be a Truth Quark-AntiQuark Condensate giving 3 Mass States of the Higgs and 3 Mass States of the Truth Quark. My analysis of Fermilab and LHC observation data indicates that Fermilab has observed the 3 Truth Quark Mass States and the LHC has observed the 3 Higgs Mass States. The Lagrangian, which is fundamentally classical, is constructed from E8 only, and E8 lives in Cl(16) = Cl(8) x Cl(8), which corresponds to two copies of an E8 Lattice. A separate paper discusses using a third copy of an E8 Lattice in connection with the construction of a realistic Algebraic Quantum Field Theory related to the Leech Lattice. Version 3 (v3) includes CMS analysis of 35.9 /fb of data in the H -> ZZ* -> 4l channel of the 2016 Run of the LHC at 13 TeV. Version 4 (v4) corrects the author name and adds a comparison of the Consensus 1-state model with the E8 3-state model with respect to the Higgs-Tquark Phase Diagram. Versions 5 and 6 (v5 and v6) add details of the CMS histogram and references to details of Nambu-Jona-Lasinio type calculations by Hashimoto, Tanabashi, and Yamawaki.

**Category:** High Energy Particle Physics

[375] **viXra:1701.0495 [pdf]**
*replaced on 2017-04-16 22:37:46*

**Authors:** Frank Dodd Tony Smith Jr

**Comments:** 32 Pages.

This paper is intended to be only a rough semi-popular overview of how the 240 Root Vectors of E8 can be used to construct a useful Lagrangian and Algebraic Quantum Field Theory (AQFT) in which the Bohm Quantum Potential emerges from a 26D String Theory with Strings = World-Lines = Path Integral Paths and the Massless Spin 2 State interpreted as the Bohm Quantum Potential. For details and references, see viXra/1602.0319. The 240 Root Vectors of E8 represent the physical forces, particles, and spacetime that make up the construction of a realistic Lagrangian describing the Octonionic Inflation Era. The Octonionic Lagrangian can be embedded into a Cl(1,25) Clifford Algebra which with 8-Periodicity gives an AQFT. The Massless Spin 2 State of 26D String Theory gives the Bohm Quantum Potential. The Quantum Code of the AQFT is the Tensor Product Quantum Reed-Muller code. A Single Cell of the 26D String Theory model has the symmetry of the Monster Group. Quantum Processes produce Schwinger Sources with size about 10^(-24) cm. Microtubule Structure related to E8 and Clifford Algebra enables Penrose-Hameroff Quantum Consciousness. E8 and Cl(8) may have been encoded in the Great Pyramid. A separate paper discusses using the Quaternionic M4 x CP2 Kaluza-Klein version of the Lagrangian to produce the Higgs and 2nd and 3rd Generation Fermions and a Higgs - Truth Quark System with 3 Mass States for the Higgs and the Truth Quark.

**Category:** High Energy Particle Physics

[374] **viXra:1701.0494 [pdf]**
*submitted on 2017-01-15 00:36:12*

**Authors:** Andrew Walcott Beckwith, Stepan Moshkaliuk

**Comments:** 16 Pages. Last version of a paper cleared by referees at the Ukrainian Journal of Physics. Subsequently accepted for publication after one year of vetting.

We examine conditions for which energy flows in the early universe are modeled as a quantum Hamilton-Jacobi set of equations. Subsequently, we manage to use the Heisenberg Uncertainty principle for metric tensors based upon our Geometrodynamics treatment of the problem.

**Category:** Quantum Gravity and String Theory

[373] **viXra:1701.0493 [pdf]**
*submitted on 2017-01-14 15:13:07*

**Authors:** George Rajna

**Comments:** 28 Pages.

Scientists at the University of Sydney have demonstrated the ability to "see" the future of quantum systems, and used that knowledge to preempt their demise, in a major achievement that could help bring the strange and powerful world of quantum technology closer to reality. [16] New method allows for quick, precise measurement of quantum states. [15] The fact that it is possible to retrieve this lost information reveals new insight into the fundamental nature of quantum measurements, mainly by supporting the idea that quantum measurements contain both quantum and classical components. [14] Researchers blur the line between classical and quantum physics by connecting chaos and entanglement. [13] Yale University scientists have reached a milestone in their efforts to extend the durability and dependability of quantum information. [12] Using lasers to make data storage faster than ever. [11] Some three-dimensional materials can exhibit exotic properties that only exist in "lower" dimensions. For example, in one-dimensional chains of atoms that emerge within a bulk sample, electrons can separate into three distinct entities, each carrying information about just one aspect of the electron's identity—spin, charge, or orbit. The spinon, the entity that carries information about electron spin, has been known to control magnetism in certain insulating materials whose electron spins can point in any direction and easily flip direction. Now, a new study just published in Science reveals that spinons are also present in a metallic material in which the orbital movement of electrons around the atomic nucleus is the driving force behind the material's strong magnetism. [10] Currently, the study of entanglement in condensed matter systems is of great interest. This interest stems from the fact that some behaviors of such systems can only be explained with the aid of entanglement. 
[9] Researchers from the Norwegian University of Science and Technology (NTNU) and the University of Cambridge in the UK have demonstrated that it is possible to directly generate an electric current in a magnetic material by rotating its magnetization. [8] This paper explains the magnetic effect of the electric current from the observed effects of the accelerating electrons, causing naturally the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the wave particle duality and the electron's spin also, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the changing relativistic mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Quantum Physics

[372] **viXra:1701.0492 [pdf]**
*submitted on 2017-01-14 08:20:23*

**Authors:** George Rajna

**Comments:** 31 Pages.

In a proof-of-concept study published in Nature Physics, researchers drew magnetic squares in a nonmagnetic material with an electrified pen and then "read" this magnetic doodle with X-rays. [21] Researchers have brought electrides into the nanoregime by synthesizing the first 2D electride material. Electrides are ionic compounds, which are made of negative and positive ions. But in electrides, the negative "ions" are simply electrons, with no nucleus. [20] Microelectromechanical systems, or MEMS, are tiny machines fabricated using equipment and processes developed for the production of electronic chips and devices. [19] Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences have made the world's smallest radio receiver, built out of an assembly of atomic-scale defects in pink diamonds. [18] Smart phones have shiny flat AMOLED displays. Behind each single pixel of these displays hide at least two silicon transistors which were mass-manufactured using laser annealing technologies. [17] Bumpy surfaces with graphene between them would help dissipate heat in next-generation microelectronic devices, according to Rice University scientists. [16] Scientists at The University of Manchester and Karlsruhe Institute of Technology have demonstrated a method to chemically modify small regions of graphene with high precision, leading to extreme miniaturisation of chemical and biological sensors. [15] A new method for producing conductive cotton fabrics using graphene-based inks opens up new possibilities for flexible and wearable electronics, without the use of expensive and toxic processing steps. [14] A device made of bilayer graphene, an atomically thin hexagonal arrangement of carbon atoms, provides experimental proof of the ability to control the momentum of electrons and offers a path to electronics that could require less energy and give off less heat than standard silicon-based transistors. It is one step forward in a new field of physics called valleytronics. [13] In our computer chips, information is transported in the form of electrical charge. Electrons or other charge carriers have to be moved from one place to another. For years scientists have been working on elements that take advantage of the electrons' angular momentum (their spin) rather than their electrical charge.

**Category:** Condensed Matter

[371] **viXra:1701.0491 [pdf]**
*replaced on 2017-01-17 04:56:58*

**Authors:** Luke Kenneth Casson Leighton

**Comments:** 7 Pages.

This paper continues prior work based on the insight that Rishon ultracoloured triplets (electron, up, neutrino in left and right forms) might simply be elliptically-polarised "mobius light". The important first step is therefore to identify the twelve (24 including both left- and right-handed forms) phases and the correct topology, and then to perform transformations (mirroring, rotation, time-reversal) to double-check which "particles" are identical to each other and which are anti-particle opposites.
Ultimately, a brute-force systematic analysis will allow a formal mathematical group to be dropped seamlessly on top of the twelve (24) particles.

**Category:** High Energy Particle Physics

[370] **viXra:1701.0490 [pdf]**
*submitted on 2017-01-14 08:36:16*

**Authors:** Manuel Simões F., Ricardo Gobato

**Comments:** 1 Page. Panel presented at the XV Week of Physics of the State University of Londrina, Paraná, Brazil, September 2010.

The anisotropic viscosity of liquid crystals (LCs) is one of the most challenging properties of these materials. It was discovered in 1935 by Miesowicz, when he showed that LCs are non-Newtonian fluids, exhibiting viscosities that are direction-dependent when subjected to an external field. Since then, a tremendous amount of experimental and theoretical research has been devoted to the subject, but a satisfactory microscopic theory has never been found. The kinetic approach of Doi has for some time been the most accepted microscopic theory for nematic viscosity, but even though it has the great merit of producing an expression free of adjustable parameters, which captures the essence of the phenomena and provides a semi-microscopic explanation for the origin of the anisotropy, there are well-documented divergences from the experimental data, and it is unable to describe essential aspects of the phenomenology observed in these systems, especially across the range of the nematic phase.
The objective of this work is to study the contribution of the characteristic geometry of the nematic micelle/molecule to the viscosity of nematic liquids. Throughout this work we use the phrase geometry of the nematic grain, or simply geometry of the grain, to designate the geometry that a nematic micelle/molecule acquires under thermal vibration. This concept does not appear to be common in the theory of nematic liquid crystals (NLCs), but it arises naturally from de Gennes's order-parameter theory for NLCs. In addition, to incorporate the contribution of grain geometry to nematic viscosity, we will use the conforming approach of Hess and Balls to formulate the fundamentals of nematic viscosity.

**Category:** Condensed Matter

[369] **viXra:1701.0489 [pdf]**
*submitted on 2017-01-14 08:44:58*

**Authors:** Desire Francine Gobato, Ricardo Gobato, Jonas Liasch

**Comments:** 1 Page. Panel presented at the XVI Week of Physics of the State University of Londrina, Paraná, Brazil, September 2011.

One of the most unusual V/STOL military aircraft programs was the Avro VZ-9 "Avrocar". Designed to be a true flying saucer, the Avrocar was one of the few V/STOL aircraft to be developed in complete secrecy. Despite significant design changes during flight trials, the Avrocar was unable to achieve its objectives, and the program was eventually canceled after spending $10 million between 1954 and 1961. We gather data and information related to the Avrocar project, which was directly linked to advances in the aircraft that were built after it. We also study the data obtained and correlate them with the turbofan engines used today.

**Category:** General Science and Philosophy

[368] **viXra:1701.0488 [pdf]**
*submitted on 2017-01-14 09:15:15*

**Authors:** Viktor S.Dolgikh

**Comments:** 16 Pages. In English and in Russian.

I present this article as a part of my work "HE": the beginning.
A real picture of the creation of the primary, composing elements of matter and the results of their interaction is described.
The introductory and advertising part is omitted because of the expected "sarcasm", which will disappear by the end of the article.
Many years of a practical approach to thinking are the main element of the presented work.
It is given in a very condensed form, without a "tiring" description of the presented picture.
Its final statement is given on page 12.
The following extended explanations and main descriptions of the basic directions:
- the variety of particles resulting from their division, disintegration and unnatural creation;
- the structure of atoms and molecules of matter in the classification of their states;
- the frame structure of the "live" part of this kind of matter, with its diversity;
- the evolution of the development of matter,
which are constantly being worked on, will depend on the interest in this article and will be presented in subsequent publications.
To clarify the text, I am sending the original.

**Category:** Nuclear and Atomic Physics

[367] **viXra:1701.0487 [pdf]**
*submitted on 2017-01-14 05:07:46*

**Authors:** George Rajna

**Comments:** 19 Pages.

Diffraction-based analytical methods are widely used in laboratories, but they struggle to study samples that are smaller than a micrometer in size. [13] In an electron microscope, electrons are emitted by pointy metal tips, so they can be steered and controlled with high precision. Recently, such metal tips have also been used as high-precision electron sources for generating X-rays. [12] In some chemical reactions both electrons and protons move together. When they transfer, they can move concertedly or in separate steps. Light-induced reactions of this sort are particularly relevant to biological systems, such as Photosystem II, where plants use photons from the sun to convert water into oxygen. [11] EPFL researchers have found that water molecules are 10,000 times more sensitive to ions than previously thought. [10] Working with colleagues at the Harvard-MIT Center for Ultracold Atoms, a group led by Harvard Professor of Physics Mikhail Lukin and MIT Professor of Physics Vladan Vuletic have managed to coax photons into binding together to form molecules – a state of matter that, until recently, had been purely theoretical. The work is described in a September 25 paper in Nature. New ideas for interactions and particles: this paper examines the possibility of deriving the Spontaneously Broken Symmetries from the Planck Distribution Law. In this way we get a Unification of the Strong, Electromagnetic, and Weak Interactions from the interference occurrences of oscillators. Understanding that the relativistic mass change is the result of the magnetic induction, we arrive at the conclusion that the Gravitational Force is also based on the electromagnetic forces, giving a Unified Relativistic Quantum Theory of all 4 Interactions.

**Category:** Quantum Physics

[366] **viXra:1701.0486 [pdf]**
*submitted on 2017-01-14 06:34:49*

**Authors:** George Rajna

**Comments:** 16 Pages.

In accordance with the rules of quantum mechanics, the atomic nucleus has discrete energy levels. [13] Research conducted at the National Superconducting Cyclotron Laboratory at Michigan State University has shed new light on the structure of the nucleus, that tiny congregation of protons and neutrons found at the core of every atom. [12] The work elucidates the interplay between collective and single-particle excitations in nuclei and proposes a quantitative theoretical explanation. As such, it has great potential to advance our understanding of nuclear structure. [11] When two protons approaching each other pass close enough together, they can "feel" each other, similar to the way that two magnets can be drawn closely together without necessarily sticking together. According to the Standard Model, at this grazing distance the protons can produce a pair of W bosons. [10] The fact that the neutron is slightly more massive than the proton is the reason why atomic nuclei have exactly those properties that make our world and ultimately our existence possible. Eighty years after the discovery of the neutron, a team of physicists from France, Germany, and Hungary headed by Zoltán Fodor, a researcher from Wuppertal, has finally calculated the tiny neutron-proton mass difference. [9] Taking into account the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Lattice QCD gives the same results as the diffraction patterns of the electromagnetic oscillators, explaining the color confinement and the asymptotic freedom of the Strong Interactions.

**Category:** Nuclear and Atomic Physics

[365] **viXra:1701.0485 [pdf]**
*submitted on 2017-01-13 17:18:21*

**Authors:** Dongchan Lee

**Comments:** 54 Pages. First draft, in PPT-based PDF form.

The recent releases of the PISA 2015 and TIMSS 2015 math results showed clearly that math education growth in most of the developed countries is stagnating or collapsing. Lee demonstrates the overall math score history of both TIMSS and PISA over 1995-2015, what the stagnations mean for the future economies of these nations, and what concrete alternatives exist to overcome the past one to two decades of math education stagnation.

**Category:** Economics and Finance

[364] **viXra:1701.0484 [pdf]**
*submitted on 2017-01-14 01:22:09*

**Authors:** Ernesto Lopez Gonzalez

**Comments:** 87 Pages. (In Spanish)

A new theory of matter and energy is proposed. The main postulate is this: all matter and energy are composed of vibrations of space-time, which is formed by a single 5-brane extended in the three spatial dimensions and compacted in two additional dimensions up to an order of 10^-6 m. It also postulates the existence of a central hole in the plane of the compacted dimensions. The substance forming this 5-brane is considered to have properties similar to a liquid crystal. It is also postulated that all interactions originate from the modification of space-time caused by the vibrations that form matter and energy. In particular we analyze three mechanisms: the drag, the deformation, and the modification of the index of refraction of space-time. With these postulates, and by solving the wave equation, we can deduce the de Broglie wavelength, the uncertainty principle, the charge and mass of the electron solely from its mass, the origin of inertia, the centrifugal force, the electric forces, the gravitational forces, the hydrogen atom orbitals, and the existence of a system of elementary particles formed by the three known neutrinos, the electron, and four partons formed by the combination of the previous four with surface waves in the hypothetical central hole in the plane of the compacted dimensions. The masses of these particles and the strengths of their interactions are estimated. It is then possible to propose a system for hadrons that allows estimation of their masses, magnetic moments, internal distribution of charges, and the Reid potential for the residual nuclear force. Finally, an intuitive explanation of the spin of the particles is provided.

**Category:** Quantum Physics

[363] **viXra:1701.0483 [pdf]**
*submitted on 2017-01-13 13:46:54*

**Authors:** Reuven Tint

**Comments:** 4 Pages. Original paper in Russian.

Section 1 gives a theorem and its proof, complementing the classical formulation of the ABC conjecture, and Section 2 addresses the connection of Frey's elliptic curve with the "Great" Fermat theorem (Fermat's Last Theorem).

**Category:** Number Theory

[362] **viXra:1701.0482 [pdf]**
*submitted on 2017-01-13 09:00:42*

**Authors:** Guilhem Cicolella

**Comments:** 4 Pages.

The only consecutive powers being 8 and 9, the problem consists in demonstrating that the quantity of prime numbers below one billion depends on one single equation, based on two different methods of calculation with congruent results; the ultimate purpose is to prove the existence of an algorithm capable of determining two intricate values more quickly than with a computer (rapid mathematical system, R.M.S.).

**Category:** Number Theory

[361] **viXra:1701.0481 [pdf]**
*submitted on 2017-01-13 09:07:07*

**Authors:** Yannan Yang

**Comments:** 4 Pages.

Mistakes are found in the theoretical derivation process in which the magnetic force is explained as a relativistic side effect of the Coulomb force. As a result, some serious paradoxes become inevitable if we accept the notion that magnetism is a relativistic side effect of electrostatics.

**Category:** Relativity and Cosmology

[360] **viXra:1701.0480 [pdf]**
*submitted on 2017-01-13 09:35:22*

**Authors:** Andrew Beckwith

**Comments:** 7 Pages.

The magnetic field for relic graviton production is linked to the initial strength of the inflaton.

**Category:** Quantum Gravity and String Theory

[359] **viXra:1701.0479 [pdf]**
*submitted on 2017-01-13 06:52:10*

**Authors:** George Rajna

**Comments:** 29 Pages.

Researchers have brought electrides into the nanoregime by synthesizing the first 2D electride material. Electrides are ionic compounds, which are made of negative and positive ions. But in electrides, the negative "ions" are simply electrons, with no nucleus. [20] Microelectromechanical systems, or MEMS, are tiny machines fabricated using equipment and processes developed for the production of electronic chips and devices. [19] Researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences have made the world's smallest radio receiver, built out of an assembly of atomic-scale defects in pink diamonds. [18] Smart phones have shiny flat AMOLED displays. Behind each single pixel of these displays hide at least two silicon transistors which were mass-manufactured using laser annealing technologies. [17] Bumpy surfaces with graphene between them would help dissipate heat in next-generation microelectronic devices, according to Rice University scientists. [16] Scientists at The University of Manchester and Karlsruhe Institute of Technology have demonstrated a method to chemically modify small regions of graphene with high precision, leading to extreme miniaturisation of chemical and biological sensors. [15] A new method for producing conductive cotton fabrics using graphene-based inks opens up new possibilities for flexible and wearable electronics, without the use of expensive and toxic processing steps. [14] A device made of bilayer graphene, an atomically thin hexagonal arrangement of carbon atoms, provides experimental proof of the ability to control the momentum of electrons and offers a path to electronics that could require less energy and give off less heat than standard silicon-based transistors. It is one step forward in a new field of physics called valleytronics. [13] In our computer chips, information is transported in the form of electrical charge. Electrons or other charge carriers have to be moved from one place to another. For years scientists have been working on elements that take advantage of the electrons' angular momentum (their spin) rather than their electrical charge. This new approach, called "spintronics", has major advantages compared to common electronics. It can operate with much less energy. [12]

**Category:** Condensed Matter

[358] **viXra:1701.0478 [pdf]**
*submitted on 2017-01-12 13:25:43*

**Authors:** Tom Masterson

**Comments:** 1 Page. © 1965 by Tom Masterson

A number theory query related to Fermat's last theorem in higher dimensions.

**Category:** Number Theory

[357] **viXra:1701.0477 [pdf]**
*submitted on 2017-01-12 14:18:40*

**Authors:** George Rajna

**Comments:** 22 Pages.

In roughly four billion years, the Milky Way will be no more. Indeed, our home galaxy is on course to collide and unite with the Andromeda Galaxy, at present some two million light years away. [16] A simulation of the powerful jets generated by supermassive black holes at the centers of the largest galaxies explains why some burst forth as bright beacons visible across the universe, while others fall apart and never pierce the halo of the galaxy. [15] Astronomers from Chalmers University of Technology have used the giant telescope Alma to reveal an extremely powerful magnetic field very close to a supermassive black hole in a distant galaxy. The results appear in the 17 April 2015 issue of the journal Science. [14] Quasars, even those that are billions of light years away, are some of the "brightest beacons" in the universe. Yet how can quasars radiate so much energy that they can be seen from Earth? One explanation is that at each quasar's center is a growing supermassive black hole (SMBH). [13] If dark matter comes in both matter and antimatter varieties, it might accumulate inside dense stars to create black holes. [12] For a long time, there were two main theories related to how our universe would end. These were the Big Freeze and the Big Crunch. In short, the Big Crunch claimed that the universe would eventually stop expanding and collapse in on itself. This collapse would result in…well…a big crunch (for lack of a better term). Think "the Big Bang", except just the opposite. That's essentially what the Big Crunch is. On the other hand, the Big Freeze claimed that the universe would continue expanding forever, until the cosmos becomes a frozen wasteland. This theory asserts that stars will get farther and farther apart, burn out, and (since there are no more stars being born) the universe will grow entirely cold and eternally black. [11] Newly published research reveals that dark matter is being swallowed up by dark energy, offering novel insight into the nature of dark matter and dark energy and what the future of our Universe might be. [10] The gravitational force attracts matter, causing concentration of matter in a small space and leaving much space with low matter concentration: dark matter and energy. The asymmetry between the masses of the electric charges, for example the proton and the electron, can be understood via the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron - proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Astrophysics

[356] **viXra:1701.0476 [pdf]**
*submitted on 2017-01-12 16:01:36*

**Authors:** Declan Traill

**Comments:** 7 Pages.

The original mathematical treatment used in the analysis of the Fizeau experiment of 1851, which measured the relative speed of light in a moving medium, assumes that light travels through the water in a smooth continuous flow, at a speed less than the speed of light in a vacuum (relative to the water). It thus assumes that the water's velocity vector can simply be added to that of the light. However, light is transmitted through optical media, such as water, by a continuous process of absorption and re-emission by the water molecules, but travels between them at the full speed of light (in a vacuum). Thus the mathematics describing the process of Fresnel dragging must be formulated differently, and the effect can then be explained by classical physics.

**Category:** Classical Physics

[355] **viXra:1701.0475 [pdf]**
*submitted on 2017-01-12 10:27:06*

**Authors:** Nikolay Dementev

**Comments:** 5 Pages.

Based on the observation of randomly chosen primes, it has been conjectured that the sum of the digits that form any prime number should yield either an even number or another prime number. The conjecture was successfully tested for the first 100 primes.
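The check described is easy to reproduce; a minimal sketch (mine, not the paper's code) that verifies the digit-sum property over the first 100 primes:

```python
def is_prime(n):
    """Trial-division primality test, sufficient for small n."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def first_primes(k):
    """Return the first k primes."""
    primes, n = [], 2
    while len(primes) < k:
        if is_prime(n):
            primes.append(n)
        n += 1
    return primes

def satisfies_conjecture(p):
    """Digit sum of p is either even or itself prime."""
    s = sum(int(c) for c in str(p))
    return s % 2 == 0 or is_prime(s)

# Check the conjecture on the first 100 primes, as in the paper.
assert all(satisfies_conjecture(p) for p in first_primes(100))
```

Extending the search beyond the first 100 primes is a one-line change to the call at the bottom.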

**Category:** Number Theory

[354] **viXra:1701.0474 [pdf]**
*submitted on 2017-01-12 10:57:20*

**Authors:** George Rajna

**Comments:** 26 Pages.

New method allows for quick, precise measurement of quantum states. [15] The fact that it is possible to retrieve this lost information reveals new insight into the fundamental nature of quantum measurements, mainly by supporting the idea that quantum measurements contain both quantum and classical components. [14] Researchers blur the line between classical and quantum physics by connecting chaos and entanglement. [13] Yale University scientists have reached a milestone in their efforts to extend the durability and dependability of quantum information. [12] Using lasers to make data storage faster than ever. [11] Some three-dimensional materials can exhibit exotic properties that only exist in "lower" dimensions. For example, in one-dimensional chains of atoms that emerge within a bulk sample, electrons can separate into three distinct entities, each carrying information about just one aspect of the electron's identity—spin, charge, or orbit. The spinon, the entity that carries information about electron spin, has been known to control magnetism in certain insulating materials whose electron spins can point in any direction and easily flip direction. Now, a new study just published in Science reveals that spinons are also present in a metallic material in which the orbital movement of electrons around the atomic nucleus is the driving force behind the material's strong magnetism. [10] Currently studying entanglement in condensed matter systems is of great interest. This interest stems from the fact that some behaviors of such systems can only be explained with the aid of entanglement. [9] Researchers from the Norwegian University of Science and Technology (NTNU) and the University of Cambridge in the UK have demonstrated that it is possible to directly generate an electric current in a magnetic material by rotating its magnetization. 
[8] This paper explains the magnetic effect of the electric current from the observed effects of the accelerating electrons, causing naturally the experienced changes of the electric field potential along the electric wire. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the wave particle duality and the electron's spin also, building the bridge between the Classical and Quantum Theories. The changing acceleration of the electrons explains the created negative electric field of the magnetic induction, the changing relativistic mass and the Gravitational Force, giving a Unified Theory of the physical forces. Taking into account the Planck Distribution Law of the electromagnetic oscillators also, we can explain the electron/proton mass rate and the Weak and Strong Interactions.

**Category:** Quantum Physics

[353] **viXra:1701.0473 [pdf]**
*submitted on 2017-01-12 12:04:19*

**Authors:** Nikola Perkovic

**Comments:** 4 Pages.

The paper will make new claims regarding the fine structure constant. The specific value of the electromagnetic coupling constant, that is the fine structure constant, will be explained as a consequence of mass energy equivalence. Special Relativity and Quantum Electrodynamics will be used to attain the mass energy equivalence equation and after which a new, quantized equation of mass energy equivalence will be postulated and tested. A new way will be presented to determine the mass of neutrons by using the strong nuclear coupling constant and protons by using the fine structure constant.

**Category:** High Energy Particle Physics

[352] **viXra:1701.0472 [pdf]**
*replaced on 2017-03-27 06:12:42*

**Authors:** Jeffrey S. Keen

**Comments:** 17 Pages, 17 Figures, 7 Tables

This paper addresses two fundamental areas of physics and cosmology that involve a “universal consciousness”. (a) It shows where Einstein was incorrect: it is not only possible to communicate information faster than the speed of light, but this can be instantaneous. (b) The main challenge in physics today is unifying quantum theory with gravity: in this paper it is demonstrated that the extended mind is involved in solving this problem.
I have spent over 30 years researching the mind's interaction with the laws of physics, subtle fields, and the cosmos. This has been achieved by quantifying sensed data and discovering formulae and universal constants. A technique I have developed, involving a singularity, is explained for noetically studying subtle fields and abstract geometry. This has produced some ground-breaking and fundamental findings, demonstrating that the mind is very sensitive to geometry and to both local and astronomical forces.
The most exciting aspects are the quantified results and graphs that have been obtained from a specified subtle energy beam length (L) measured over the last eight years. For example, during the course of a day, a sinusoidal curve is obtained with maxima at sunset and minima at sunrise, even if measurements are made in a darkened room on a cloudy day.
Another example is that the mind can detect a lower gravitational force on Earth, when the sun and moon’s gravity are pulling in opposite directions at full moon, resulting in a peak in L. Likewise, a higher gravitational force, when the sun and moon’s gravity are pulling in the same direction at new moon, results in counter-intuitive shorter lengths of L.
The mind also detects changes in the Newtonian gravitational force, Fg, as the earth orbits the sun. Over the course of a year, a plot of L produces the equation L = 6×10^105 · Fg^(−δ), which has a very high correlation coefficient R² = 0.9745. The power index δ is Feigenbaum's constant to within 0.013% error. This is another example of the mind's ability to interact with gravity and produce a universal constant, suggesting that consciousness is intimately connected to the fabric of the universe and chaos theory.
Any three objects in alignment, be they 3 grains of sand, 3 trees, 3 coins, 3 stones, 3 abstract circles drawn on paper, or even 3 objects in the solar system, all form a strong subtle energy beam that experimentally has been perceived to extend endlessly. In particular, this beam has been measured during alignments across the solar system. These have ranged from eclipses of the sun and moon to a transit of Neptune by the moon. The data was analysed weeks after the events. In all cases L peaked before the predicted time of the occlusion. This time was always identical to the time it takes light to reach an observer on earth from the furthest of the 3 planets in alignment on the day of the experiment. This demonstrates that the mind can communicate not only faster than light, but instantaneously across the solar system, and that the structure of the universe is such as to enable this to happen. It also suggests that macro entanglement is possible.
The findings in this paper significantly impact cosmology, and in particular show that Inflation Theory just after the big bang is unnecessary to explain the current structure of the universe.

**Category:** Relativity and Cosmology

[351] **viXra:1701.0471 [pdf]**
*submitted on 2017-01-12 05:04:13*

**Authors:** George Rajna

**Comments:** 15 Pages.

Physicists at the National Institute of Standards and Technology (NIST) have cooled a mechanical object to a temperature lower than previously thought possible, below the so-called "quantum limit." [9] For the past 100 years, physicists have been studying the weird features of quantum physics, and now they're trying to put these features to good use. One prominent example is that quantum superposition (also known as quantum coherence)—which is the property that allows an object to be in two states at the same time—has been identified as a useful resource for quantum communication technologies. [8] Quantum entanglement—which occurs when two or more particles are correlated in such a way that they can influence each other even across large distances—is not an all-or-nothing phenomenon, but occurs in various degrees. The more a quantum state is entangled with its partner, the better the states will perform in quantum information applications. Unfortunately, quantifying entanglement is a difficult process involving complex optimization problems that give even physicists headaches. [7] A trio of physicists in Europe has come up with an idea that they believe would allow a person to actually witness entanglement. Valentina Caprara Vivoli, with the University of Geneva, Pavel Sekatski, with the University of Innsbruck and Nicolas Sangouard, with the University of Basel, have together written a paper describing a scenario where a human subject would be able to witness an instance of entanglement—they have uploaded it to the arXiv server for review by others. [6] The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin also, building the Bridge between the Classical and Quantum Theories. 
The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential explains also the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Quantum Physics

[350] **viXra:1701.0470 [pdf]**
*submitted on 2017-01-11 17:54:52*

**Authors:** Dongchan Lee

**Comments:** 9 Pages. 1st draft

In this short paper, I demonstrate why the top 5-6 original oil-richest countries need to be excluded from most socio-economic vs. cognitive-skill regressions: they remain outliers far too distant from the otherwise very reliable and stable regression growth coefficients and explanatory powers of the models involved. I include some simple linear regression charts in which these countries sit far out in the north-west corners of the regression lines; their GDP per capita had already reached the top tier of the world by the 1970s with minimal cognitive skills and education inputs. I compare their relative economic strength to the economic miracle powers of Eastern Asia (the 4 Asian Tigers and China), showing that their super-rapid rises were all due to their oil-based economies, and I present their top-6 shares of natural resource rents as a percentage share over the past 40 years. I believe these 4 key factors may allow anyone serious about socio-economic regressions to exclude these 5-6 countries from their analysis. Finally, I make a brief comment about the polar opposite of these countries: poor economies with rapid gains in cognitive skills.

**Category:** Economics and Finance

[349] **viXra:1701.0468 [pdf]**
*submitted on 2017-01-11 23:10:52*

**Authors:** Stephen I. Ternyik

**Comments:** 5 Pages.

The teleology of economic knowledge and causality is discussed, in terms of mathematical and temporal aspects.

**Category:** General Science and Philosophy

[348] **viXra:1701.0467 [pdf]**
*submitted on 2017-01-12 02:38:07*

**Authors:** Nikitin V. N., Nikitin I.V.

**Comments:** 1 Page.

Hydrogen and helium are the subsequent products of the synthesis of elements in stars.

**Category:** Astrophysics

[347] **viXra:1701.0465 [pdf]**
*replaced on 2017-01-14 01:44:02*

**Authors:** Sylwester Kornowski

**Comments:** 10 Pages.

Here, applying the Scale-Symmetric Theory (SST), we answer the following question: What is the origin of the cosmic reionization? The scenario presented here differs radically from that described within mainstream cosmology. Most important are the masses of massive galaxies/quasars and the decays of large cosmic structures due to the stepwise decays of the earliest photons (such decays of photons mimic an acceleration of the expansion of the Universe). The highest rate of reionization of hydrogen should be at redshift z(H,start) = 11.18, whereas complete reionization should occur at z(H,end) = 7.10. For the reionization of helium we obtain, respectively, z(He,start) = 3.63 and z(He,end) = 2.70. The theoretical results are consistent with observational data. This leads to the conclusion that the General Theory of Relativity (GR) correctly describes the regions of reionization. We show that the number and energy of the created photons were sufficient to ionize the intergalactic medium. We answer as well a second very important question: Why did supermassive black holes appear so quickly? We also show that there was an acceleration of the evolution of clusters of galaxies (not an acceleration of the expansion of spacetime!) from about 13.8 down to 13 Gyr and 6.5 down to 5 Gyr ago.

**Category:** Quantum Gravity and String Theory

[346] **viXra:1701.0463 [pdf]**
*submitted on 2017-01-11 09:46:23*

**Authors:** U. Kayser-Herold

**Comments:** 3 Pages.

By oblique reflection of circularly polarized photons on a rotating cylindrical mirror, the frequency of the reflected photons is shifted against the frequency of the incident photons by nearly twice the rotational frequency $n$ of the mirror: $\Delta \nu = 2 n \sin \alpha$, where $\alpha$ is the axial angle of incidence. $\Delta \nu$ can be substantially enhanced by multiple reflections between counter-rotating coaxial mirrors.
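The formula is straightforward to evaluate numerically; the sketch below (not from the paper) also folds in the multiple-reflection enhancement under the simplifying assumption that each reflection contributes the same shift:

```python
import math

def rotational_shift(n_rot_hz, alpha_rad, reflections=1):
    """Frequency shift for circularly polarized light obliquely
    reflected from a mirror rotating at n_rot_hz, per the formula
    dnu = 2 * n * sin(alpha).  Linear accumulation over multiple
    reflections is an illustrative assumption, not the paper's
    derivation."""
    return 2.0 * n_rot_hz * math.sin(alpha_rad) * reflections

# A mirror spinning at 1 kHz with a 30-degree axial angle of
# incidence gives a shift of about 1 kHz per reflection.
single = rotational_shift(1e3, math.radians(30))        # ~1000 Hz
boosted = rotational_shift(1e3, math.radians(30), 10)   # ~10 kHz
```

The example numbers (1 kHz rotation, 30 degrees, 10 reflections) are arbitrary choices for illustration.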

**Category:** Quantum Physics

[345] **viXra:1701.0462 [pdf]**
*submitted on 2017-01-11 05:45:26*

**Authors:** Octavian Cira, Florentin Smarandache

**Comments:** 75 Pages.

We pose the problem of determining the sets of integers in base b ≥ 2 that generate primes using a function.

**Category:** General Mathematics

[344] **viXra:1701.0457 [pdf]**
*submitted on 2017-01-11 05:51:19*

**Authors:** Florentin Smarandache

**Comments:** 12 Pages.

Soft set theory is a general mathematical tool for dealing with uncertain, fuzzy, not clearly defined objects. In this paper we introduce soft mixed neutrosophic N-algebraic structures and discuss some of their characteristics. We also introduce soft mixed dual neutrosophic N-algebraic structures, soft weak mixed neutrosophic N-algebraic structures, soft Lagrange mixed neutrosophic N-algebraic structures, soft weak Lagrange mixed neutrosophic and soft Lagrange free mixed neutrosophic N-algebraic structures, as well as the so-called soft strong neutrosophic loop, which is of pure neutrosophic character. We also introduce some new notions and some basic properties of these newly born soft mixed neutrosophic N-structures related to neutrosophic theory.

**Category:** General Mathematics

[343] **viXra:1701.0452 [pdf]**
*submitted on 2017-01-11 05:55:50*

**Authors:** Said Broumi, Irfan Deli, Florentin Smarandache

**Comments:** 15 Pages.

In this paper, we first give the cartesian product of two neutrosophic multi sets (NMS). Then, we define relations on neutrosophic multi sets to extend the intuitionistic fuzzy multi relations to neutrosophic multi relations. These relations allow the composition of two neutrosophic multi sets. Also, various properties like reflexivity, symmetry and transitivity are studied.

**Category:** General Mathematics

[342] **viXra:1701.0449 [pdf]**
*submitted on 2017-01-11 06:00:01*

**Authors:** Luige Vlădăreanu, Florentin Smarandache, Mumtaz Ali, Victor Vlădăreanu, Mingcong Deng

**Comments:** 6 Pages.

The paper presents automated estimation techniques for robot parameters through system identification, for both PID control and future implementation of intelligent control laws, with the aim of designing the experimental model in a 3D virtual reality for testing and validating control laws in the joints of NAO humanoid robots.

**Category:** General Mathematics

[341] **viXra:1701.0448 [pdf]**
*submitted on 2017-01-11 06:02:38*

**Authors:** Nguyen Xuan Thao, Bui Cong Cuong, Florentin Smarandache

**Comments:** 20 Pages.

A rough fuzzy set is the result of approximation of a fuzzy set with respect to a crisp approximation space. It is a mathematical tool for the knowledge discovery in the fuzzy information systems. In this paper, we introduce the concepts of rough standard neutrosophic sets, standard neutrosophic information system and give some results of the knowledge discovery on standard neutrosophic information system based on rough standard neutrosophic sets.

**Category:** General Mathematics

[340] **viXra:1701.0447 [pdf]**
*submitted on 2017-01-11 06:03:45*

**Authors:** Said Broumi, Mohamed Talea, Assia Bakali, Florentin Smarandache, Mumtaz Ali

**Comments:** 8 Pages.

The main purpose of this paper is to develop an algorithm to find the shortest path in a network in which the weights of the edges are represented by bipolar neutrosophic numbers. Finally, a numerical example is provided to illustrate the proposed approach.

**Category:** General Mathematics

[339] **viXra:1701.0446 [pdf]**
*submitted on 2017-01-11 06:04:50*

**Authors:** Said Broumi, Mohamed Talea, Assia Bakali, Florentin Smarandache, Luige VlĂdĂreanu

**Comments:** 6 Pages.

In this paper, we develop a new approach to deal with the neutrosophic shortest path problem in a network in which each edge weight (or length) is represented as a triangular fuzzy neutrosophic number. The proposed algorithm also gives the shortest path length from the source node to the destination node using a ranking function. Finally, an illustrative example is included to demonstrate our proposed approach.
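A common way to realize such an algorithm is to reduce each fuzzy edge weight to a crisp score via a ranking function and then run an ordinary shortest-path search. The sketch below uses Dijkstra's algorithm with a simple centroid ranking of triangular numbers as a stand-in; the paper's own ranking function for triangular fuzzy neutrosophic numbers is more elaborate, so the ranking choice and all names here are assumptions for illustration only:

```python
import heapq

def rank_triangular(t):
    """Crisp score of a triangular number (a, b, c): the centroid
    (a + b + c) / 3, used here as a stand-in ranking function."""
    a, b, c = t
    return (a + b + c) / 3.0

def shortest_path(graph, source, target):
    """Dijkstra over edges whose fuzzy weights are first reduced
    to crisp scores by the ranking function.  graph maps a node
    to a list of (neighbor, triangular_weight) pairs."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        if u == target:
            break
        for v, w in graph.get(u, []):
            nd = d + rank_triangular(w)
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1], dist[target]
```

For example, with edges s→a and a→t weighted (1, 2, 3) and a direct edge s→t weighted (4, 5, 6), the two-hop route wins because its total crisp score is 4 against 5.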

**Category:** General Mathematics

[338] **viXra:1701.0445 [pdf]**
*submitted on 2017-01-11 06:05:51*

**Authors:** Said Broumi, Mohamed Talea, Assia Bakali, Florentin Smarandache

**Comments:** 7 Pages.

In this study, we propose an approach to determine the shortest path length between a pair of specified nodes s and t on a network whose edge weights are represented by trapezoidal neutrosophic numbers. Finally, an illustrative example is provided to show the applicability and effectiveness of the proposed approach.

**Category:** General Mathematics

[337] **viXra:1701.0444 [pdf]**
*submitted on 2017-01-11 06:06:52*

**Authors:** Kenta Takaya, Toshinori Asai, Valeri Kroumov, Florentin Smarandache

**Comments:** 6 Pages.

In the process of developing a control strategy for mobile robots, simulation is important for testing the software components, robot behavior and control algorithms in different surrounding environments. In this paper we introduce a simulation environment for mobile robots based on ROS and Gazebo. We show that after properly creating the robot models under Gazebo, the code developed for the simulation process can be directly implemented in the real robot without modifications. In this paper autonomous navigation tasks and 3D-mapping simulation using control programs under ROS are presented. Both the simulation and experimental results agree very well and show the usability of the developed environment.

**Category:** General Mathematics

[336] **viXra:1701.0439 [pdf]**
*submitted on 2017-01-11 06:10:57*

**Authors:** Said Broumi, Florentin Smarandache

**Comments:** 21 Pages.

Multi-attribute decision making (MADM) plays an important role in many applications. Due to their efficiency in handling indeterminate and inconsistent information, single valued neutrosophic sets are widely used to model indeterminate information.

**Category:** General Mathematics

[335] **viXra:1701.0438 [pdf]**
*submitted on 2017-01-11 06:11:57*

**Authors:** Florentin Smarandache, Mircea Eugen Şelariu

**Comments:** 13 Pages.

The trilobes are ex-centric circular supermathematics functions (FSM-CE) of angular excentricity.

**Category:** General Mathematics

[334] **viXra:1701.0435 [pdf]**
*submitted on 2017-01-11 06:15:54*

**Authors:** Marcel Migdalovici, Luige Vladareanu, Gabriela Vladeanu, Said broumi, Daniela Baran, Florentin Smarandache

**Comments:** 6 Pages.

The paper surveys some of the authors' concepts concerning the stability regions of dynamic systems, in the general case of dynamic systems that depend on parameters. The property of separation of stable regions in the free-parameter domain is assumed in the paper as an important property of the environment, and it is carried out in the specific case of the walking robot analyzed in the paper.

**Category:** General Mathematics

[333] **viXra:1701.0433 [pdf]**
*submitted on 2017-01-11 06:17:59*

**Authors:** W.B. Vasantha Kandasamy, Ilanthenral Kandasamy, Florentin Smarandache

**Comments:** 10 Pages.

The Collatz conjecture is an open conjecture in mathematics, named after Lothar Collatz, who proposed it in 1937. It is also known as the 3n + 1 conjecture, the Ulam conjecture (after Stanislaw Ulam), Kakutani's problem (after Shizuo Kakutani) and so on. Several generalizations of the Collatz conjecture have been carried out.
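As background for the generalizations discussed, the 3n + 1 iteration itself is tiny; a minimal sketch (mine, not from the paper):

```python
def collatz_steps(n):
    """Number of iterations of the 3n + 1 map needed to reach 1:
    halve n when even, replace n by 3n + 1 when odd."""
    steps = 0
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        steps += 1
    return steps

# 27 is a classic long trajectory: it takes 111 steps to reach 1.
```

The conjecture is that this loop terminates for every positive integer n, which remains unproven.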

**Category:** General Mathematics

[332] **viXra:1701.0424 [pdf]**
*submitted on 2017-01-11 06:27:41*

**Authors:** Florentin Smarandache, Mircea Eugen Șelariu

**Comments:** 13 Pages.

The trilobes are ex-centric circular supermathematics functions (EC-SMF) of angular excentricity.

**Category:** General Mathematics

[331] **viXra:1701.0423 [pdf]**
*submitted on 2017-01-11 06:28:41*

**Authors:** Ilanthenral Kandasamy, Florentin Smarandache

**Comments:** 8 Pages.

Personality tests are most commonly of the objective type, where users rate their own behaviour. Instead of providing a single forced choice, they can be provided with more options. A person may not, in general, be capable of judging his/her behaviour very precisely and categorizing it into a single category. Since it is self-rating, there are a lot of uncertain and indeterminate feelings involved. The results of the test depend a lot on the circumstances under which the test is taken, the amount of time that is spent, the past experience of the person, the emotion the person is feeling, and the person's self-image at that time, and so on.

**Category:** General Mathematics

[330] **viXra:1701.0422 [pdf]**
*submitted on 2017-01-11 06:29:23*

**Authors:** Ovidiu Ilie Şandru, Florentin Smarandache

**Comments:** 3 Pages.

In this paper we present a procedure for algorithmizing the operations needed to automatically move a predefined object in a given video image into a target region of that image, meant to facilitate the development of software applications specialized in solving this kind of problem.

**Category:** General Mathematics

[329] **viXra:1701.0421 [pdf]**
*submitted on 2017-01-11 03:24:38*

**Authors:** Igor Chistiukhin

**Comments:** 18 Pages. Russian language

The article deals with the problems associated with the scientific study of the relationship of the Orthodox Church to the system of the ancient spectacles that existed at the time of the birth of Christianity.

**Category:** Social Science

[328] **viXra:1701.0420 [pdf]**
*replaced on 2017-01-23 17:53:56*

**Authors:** Nikhil Shaw

**Comments:** 8 Pages.

In computer science, a selection algorithm is an algorithm for finding the kth smallest number in a list or array; such a number is called the kth order statistic. This includes the cases of finding the minimum, maximum, and median elements. There are O(n) (worst-case linear time) selection algorithms, and sublinear performance is possible for structured data; in the extreme, O(1) for an array of sorted data. Selection is a subproblem of more complex problems like the nearest neighbour and shortest path problems. Many selection algorithms are derived by generalizing a sorting algorithm, and conversely some sorting algorithms can be derived as repeated application of selection.
Although this new algorithm has a worst case of O(n^2), its average case is near-linear time for an unsorted list.
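The paper's own algorithm is not reproduced here, but for context a standard quickselect shares the complexity profile described above (O(n^2) worst case, near-linear on average); a minimal sketch:

```python
import random

def quickselect(arr, k):
    """Return the kth smallest element (1-indexed) of arr.
    Standard quickselect with random pivots and Hoare-style
    partitioning; shown for context only, not the paper's
    algorithm.  Does not mutate the input."""
    a = list(arr)
    lo, hi = 0, len(a) - 1
    target = k - 1
    while True:
        if lo == hi:
            return a[lo]
        pivot = a[random.randint(lo, hi)]
        i, j = lo, hi
        while i <= j:
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i += 1
                j -= 1
        # Now a[lo..j] <= pivot <= a[i..hi]; narrow to the side
        # containing the target index, or return if it sits in
        # the pivot-valued middle band.
        if target <= j:
            hi = j
        elif target >= i:
            lo = i
        else:
            return a[target]
```

For example, `quickselect(data, 1)` is the minimum, `quickselect(data, len(data))` the maximum, and `quickselect(data, (len(data) + 1) // 2)` a median.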

**Category:** Statistics

[327] **viXra:1701.0419 [pdf]**
*submitted on 2017-01-10 10:38:19*

**Authors:** Azeddine ELHASSOUNY, Florentin SMARANDACHE

**Comments:** 17 Pages.

The purpose of this paper is to present an extension of, and alternative to, the hybrid approach combining Saaty's Analytical Hierarchy Process (AHP) with the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method (AHP-TOPSIS), which is based on AHP and its use of pairwise comparisons, in a new method called α-D MCDM-TOPSIS (α-Discounting Method for Multi-Criteria Decision Making-TOPSIS). The proposed method works not only for preferences that are pairwise comparisons of criteria, as AHP does, but for preferences of any n-wise (with n ≥ 2) comparisons of criteria. Finally, the α-D MCDM-TOPSIS methodology is verified by some examples to demonstrate how it might be applied to different types of matrices and how it allows for consistent, inconsistent, weakly inconsistent, and strongly inconsistent problems.

**Category:** General Mathematics

[326] **viXra:1701.0418 [pdf]**
*submitted on 2017-01-10 10:41:59*

**Authors:** Ilanthenral Kandasamy, Florentin Smarandache

**Comments:** 15 Pages.

Double Refined Indeterminacy Neutrosophic Set (DRINS) is an inclusive case of the refined neutrosophic set, defined by Smarandache [1], which provides the additional possibility to represent with sensitivity and accuracy the uncertain, imprecise, incomplete, and inconsistent information which is available in the real world.

**Category:** General Mathematics

[325] **viXra:1701.0417 [pdf]**
*submitted on 2017-01-10 10:48:31*

**Authors:** Muhammad Gulistan, Majid Khan, Young Bae Jun, Florentin Smarandache, Naveed Yaqoob

**Comments:** 17 Pages.

We generalize the concepts of fuzzy point, intuitionistic fuzzy point, and cubic point by introducing the concept of the neutrosophic cubic point.

**Category:** General Mathematics

[324] **viXra:1701.0416 [pdf]**
*submitted on 2017-01-10 12:01:39*

**Authors:** Jerrold Thacker

**Comments:** 12 Pages.

A review of parallax measurements of stars indicates there are major discrepancies in the distance measurements to red giant stars. There is very strong evidence that the nearest star is Betelgeuse, only a few light-weeks away and actually a dwarf star. It is possible that this dwarf star may be the missing Planet 9.

**Category:** Astrophysics

[323] **viXra:1701.0415 [pdf]**
*submitted on 2017-01-10 12:06:01*

**Authors:** Gao Jian, Xinhan Huang, Min Wang, Xinde Li

**Comments:** 7 Pages.

Several neutrosophic combination rules based on the Dempster-Shafer Theory and the Dezert-Smarandache Theory are presented in the study.

**Category:** General Mathematics

[322] **viXra:1701.0413 [pdf]**
*submitted on 2017-01-10 12:08:23*

**Authors:** Akbar Rezaei, Arsham Borumand Saeid, Florentin Smarandache

**Comments:** 15 Pages.

In this paper, we introduce the notion of (implicative) neutrosophic filters in BE-algebras. The relation between implicative neutrosophic filters and neutrosophic filters is investigated, and we show that in self-distributive BE-algebras these notions are equivalent.

**Category:** General Mathematics

[321] **viXra:1701.0412 [pdf]**
*submitted on 2017-01-10 12:09:13*

**Authors:** Florentin Smarandache, Ștefan Vlăduțescu

**Comments:** 6 Pages.

The study highlights persuasive-fictional inductions that are recorded in journalistic discourse. Subsequently, it constitutes an application of Neutrosophy to journalistic communication. The theoretical premise is that journalism is impregnated with persuasion.

**Category:** General Mathematics

[320] **viXra:1701.0411 [pdf]**
*submitted on 2017-01-10 12:10:09*

**Authors:** Florentin Smarandache, Gheorghe Săvoiu

**Comments:** 36 Pages.

Neutrosophic numbers easily allow modeling the uncertainties of the universe of prices, thus justifying the growing interest in the theoretical and practical aspects of the arithmetic generated by some special numbers in our work. At the beginning of this paper, we reconsider the importance in applied research of instrumental discernment, viewed as the main support of the final measurement validity.

**Category:** General Mathematics

[319] **viXra:1701.0410 [pdf]**
*submitted on 2017-01-10 12:10:53*

**Authors:** Florentin Smarandache

**Comments:** 11 Pages.

We introduce now for the first time the neutrosophic modal logic. The Neutrosophic Modal Logic includes the neutrosophic operators that express the modalities. It is an extension of neutrosophic predicate logic, and of neutrosophic propositional logic.

**Category:** General Mathematics

[318] **viXra:1701.0407 [pdf]**
*submitted on 2017-01-10 12:14:36*

**Authors:** Mumtaz Ali, Florentin Smarandache, Luige Vladareanu

**Comments:** 28 Pages.

Neutrosophic sets and logic play a significant role in approximation theory. They are a generalization of fuzzy sets and intuitionistic fuzzy sets. The neutrosophic set is based on the neutrosophic philosophy, in which every idea Z has an opposite, denoted anti(Z), and a neutral, denoted neut(Z). This is the main feature of neutrosophic sets and logic.

**Category:** General Mathematics

[317] **viXra:1701.0406 [pdf]**
*submitted on 2017-01-10 12:15:32*

**Authors:** Madad Khan, Florentin Smarandache, Sania Afzal

**Comments:** 24 Pages.

In this paper we have defined neutrosophic ideals, neutrosophic interior ideals, neutrosophic quasi-ideals and neutrosophic bi-ideals (neutrosophic generalized bi-ideals) and proved some results related to them.

**Category:** General Mathematics

[316] **viXra:1701.0404 [pdf]**
*submitted on 2017-01-10 12:19:28*

**Authors:** Florentin Smarandache

**Comments:** 8 Pages.

This paper is an extension of the applicability of (t,i,f)-Neutrosophic Structures, in which a new type of structures was introduced for the first time.

**Category:** General Mathematics

[315] **viXra:1701.0403 [pdf]**
*submitted on 2017-01-10 12:20:21*

**Authors:** Mumtaz Ali, Florentin Smarandache

**Comments:** 10 Pages.

The theory of soluble groups and nilpotent groups is old, and hence a generalization is natural. In this paper, we introduce neutrosophic soluble groups and neutrosophic nilpotent groups, which have some kind of indeterminacy. These notions generalize the classic notions of soluble groups and nilpotent groups. We also derive some new types of series, which yield some new notions of soluble and nilpotent groups: mixed neutrosophic soluble groups and mixed neutrosophic nilpotent groups, as well as strong neutrosophic soluble groups and strong neutrosophic nilpotent groups.

**Category:** General Mathematics

[314] **viXra:1701.0402 [pdf]**
*submitted on 2017-01-10 12:21:37*

**Authors:** Florentin Smarandache, Mumtaz Ali, Muhammad Shabir

**Comments:** 11 Pages.

In this paper, for the first time, the authors introduce the notion of the neutrosophic triplet group, which is completely different from the classical group. In the neutrosophic triplet group we apply the fundamental law of neutrosophy that for an idea A we have neut(A) and anti(A), and we capture the picture of neutrosophy in algebraic structures.

**Category:** General Mathematics

[313] **viXra:1701.0400 [pdf]**
*submitted on 2017-01-10 12:26:05*

**Authors:** Mumtaz Ali, Florentin Smarandache, W. B. Vasantha Kandasamy

**Comments:** 13 Pages.

Algebraic codes play a significant role in minimising data corruption caused by defects such as interference, noisy channels, crosstalk, and packet loss. In this paper, we introduce soft codes (soft linear codes) through the application of soft sets, which are an approximated collection of codes.

**Category:** General Mathematics

[312] **viXra:1701.0398 [pdf]**
*replaced on 2017-01-30 10:13:04*

**Authors:** Yibing Qiu

**Comments:** 1 Page.

Abstract: We present a viewpoint on black holes, white holes, pulsars and neutron stars.

**Category:** Astrophysics

[311] **viXra:1701.0397 [pdf]**
*submitted on 2017-01-10 07:35:16*

**Authors:** Quang Nguyen Van

**Comments:** 1 Page.

We have found a solution of FLT for n = 3, which would mean that FLT is wrong. In this paper, we give a counterexample: an integer solution of the equation x^3 + y^3 = z^3 only. It is very large (18 digits).

**Category:** Number Theory

[310] **viXra:1701.0394 [pdf]**
*submitted on 2017-01-10 07:43:01*

**Authors:** Victor Christianto, Florentin Smarandache

**Comments:** 11 Pages.

Hyman Minsky pioneered the idea of the financial instability hypothesis to explain how swings between robustness and fragility in financial markets generate business cycles in the economic system. Therefore, in his model, business cycles and instability are endogenous. The problem now is how to put his idea of financial instability into a working model which can be tested with empirical data. Such a Minskyan model is quite rare, though some economists have tried to achieve that. For example, Toichiro Asada suggested generalized Lotka-Volterra nonlinear systems of equations as a model for Minskyan cycles.
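Asada's generalized Lotka-Volterra approach builds on the classical predator-prey system, whose endogenous oscillations play the role of boom-bust cycles. A minimal numeric sketch of that classical system (my own illustration with made-up economic labels, not Asada's model):

```python
def lotka_volterra(x, y, alpha, beta, gamma, delta, dt, steps):
    """Euler integration of the classical Lotka-Volterra system:
    dx/dt = x*(alpha - beta*y), dy/dt = y*(delta*x - gamma)."""
    path = [(x, y)]
    for _ in range(steps):
        dx = x * (alpha - beta * y)
        dy = y * (delta * x - gamma)
        x, y = x + dx * dt, y + dy * dt
        path.append((x, y))
    return path

# In a Minskyan reading, x and y could stand for, say, profits and debt
# burden; the orbit cycles endogenously rather than converging to rest.
orbit = lotka_volterra(0.5, 0.5, 1.0, 1.0, 1.0, 1.0, 0.01, 1000)
```

The point of the illustration is that the cycle is produced by the system itself, with no external shock term.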

**Category:** General Mathematics

[309] **viXra:1701.0393 [pdf]**
*submitted on 2017-01-10 07:44:18*

**Authors:** Qiang Guo, You He, Yong Deng, Tao Jian, Florentin Smarandache

**Comments:** 35 Pages.

To obtain effective fusion results for multi-source evidence with different importance, an evidence fusion method with importance discounting factors, based on neutrosophic probability analysis in the DSmT framework, is proposed. First, the reasonable evidence sources are selected based on a statistical analysis of the pignistic probability functions of single focal elements.

**Category:** General Mathematics

[308] **viXra:1701.0392 [pdf]**
*submitted on 2017-01-10 07:45:51*

**Authors:** W.B. Vasantha Kandasamy, Ilanthenral Kandasamy, Florentin Smarandache

**Comments:** 9 Pages.

The Collatz conjecture is an open conjecture in mathematics, named after Lothar Collatz, who proposed it in 1937. It is also known as the 3n + 1 conjecture, the Ulam conjecture (after Stanislaw Ulam), Kakutani's problem (after Shizuo Kakutani), and so on.
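The map in question sends n to n/2 when n is even and to 3n + 1 when n is odd; the conjecture asserts that every positive integer eventually reaches 1. A minimal sketch of the iteration (illustrative, not the authors' code):

```python
def collatz_steps(n):
    """Count iterations of the 3n + 1 map until n reaches 1."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

# The trajectory of 27 famously takes 111 steps to reach 1.
```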

**Category:** General Mathematics

[307] **viXra:1701.0391 [pdf]**
*submitted on 2017-01-10 07:47:02*

**Authors:** Mumtaz Ali, Florentin Smarandache

**Comments:** 12 Pages.

Algebraic codes play a significant role in the minimisation of data corruption, which is caused by defects such as interference, noisy channels, crosstalk, and packet loss. In this paper, we introduce soft codes (soft linear codes) through the application of soft sets, which are approximated collections of codes. We also discuss several types of soft codes, such as type-1 soft codes and complete soft codes. Further, we construct the soft generator matrix and soft parity check matrix for the soft linear codes. Moreover, we develop two techniques for the decoding of soft codes.

**Category:** General Mathematics

[306] **viXra:1701.0390 [pdf]**
*submitted on 2017-01-10 07:48:31*

**Authors:** Mehmet Şahin, Necati Olgun, Vakkas Uluçay, Abdullah Kargın, Florentin Smarandache

**Comments:** 28 Pages.

In this paper, we propose transformations based on the centroid points between single valued neutrosophic values. We introduce these transformations according to truth, indeterminacy and falsity value of single valued neutrosophic values. We propose a new similarity measure based on falsity value between single valued neutrosophic sets.

**Category:** General Mathematics

[305] **viXra:1701.0384 [pdf]**
*submitted on 2017-01-10 08:29:52*

**Authors:** Said Broumi, Florentin Smarandache, Mohamed Talea, Assia Bakali

**Comments:** 8 Pages.

In this paper, we first define the concept of bipolar single valued neutrosophic graphs as a generalization of bipolar fuzzy graphs, N-graphs, intuitionistic fuzzy graphs, single valued neutrosophic graphs and bipolar intuitionistic fuzzy graphs.

**Category:** General Mathematics

[304] **viXra:1701.0383 [pdf]**
*submitted on 2017-01-10 08:31:27*

**Authors:** Florentin Smarandache, Ștefan Vlăduțescu, Ioan Constantin Dima, Dan Valeriu Voinea

**Comments:** 8 Pages.

The paper aims to explain the technology of the emergence of information. Our research proves that information, as a communicational product, is the result of processing informational material meanings through certain operations, actions, mechanisms and strategies. Eight computational-communicative operations of building information are determined.

**Category:** General Mathematics

[303] **viXra:1701.0381 [pdf]**
*submitted on 2017-01-10 08:33:59*

**Authors:** Said Broumi, Mohamed Talea, Assia Bakali, Florentin Smarandache

**Comments:** 6 Pages.

In this paper, the authors propose an extended version of Dijkstra's algorithm for finding the shortest path on a network where the edge weights are characterized by interval valued neutrosophic numbers. Finally, a numerical example is given to explain the proposed algorithm.

**Category:** General Mathematics

[302] **viXra:1701.0380 [pdf]**
*submitted on 2017-01-10 08:43:21*

**Authors:** Kalyan Mondal, Surapati Pramanik, Florentin Smarandache

**Comments:** 16 Pages.

This paper presents some similarity measures between complex neutrosophic sets. A complex neutrosophic set is a generalization of the neutrosophic set whose complex-valued truth membership function, complex-valued indeterminacy membership function, and complex-valued falsity membership function are combinations of a real-valued truth amplitude term in association with a phase term, a real-valued indeterminate amplitude term with a phase term, and a real-valued false amplitude term with a phase term, respectively. In the present study, we have proposed complex cosine, Dice and Jaccard similarity measures and investigated some of their properties. Finally, the complex neutrosophic cosine, Dice and Jaccard similarity measures have been applied to a medical diagnosis problem with complex neutrosophic information.
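For orientation, the real-valued (single valued) special case of such a cosine measure averages the cosine of the angle between corresponding (T, I, F) triples; the paper's complex-valued measures add phase terms, which this sketch of mine omits:

```python
import math

def cosine_similarity_svns(A, B):
    """Average cosine similarity between paired (T, I, F) membership triples
    of two single valued neutrosophic sets (assumes no all-zero triple)."""
    total = 0.0
    for (t1, i1, f1), (t2, i2, f2) in zip(A, B):
        num = t1 * t2 + i1 * i2 + f1 * f2
        den = math.sqrt(t1**2 + i1**2 + f1**2) * math.sqrt(t2**2 + i2**2 + f2**2)
        total += num / den
    return total / len(A)
```

Identical sets score 1; in a diagnosis application, the alternative with the highest similarity to the patient's symptom profile would be selected.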

**Category:** General Mathematics

[301] **viXra:1701.0378 [pdf]**
*submitted on 2017-01-10 08:44:47*

**Authors:** George Rajna

**Comments:** 20 Pages.

Controlled direct acceleration of electrons in very strong laser fields can offer a path towards ultra-compact accelerators. [13] In an electron microscope, electrons are emitted by pointy metal tips, so they can be steered and controlled with high precision. Recently, such metal tips have also been used as high precision electron sources for generating X-rays. [12] In some chemical reactions both electrons and protons move together. When they transfer, they can move concertedly or in separate steps. Light-induced reactions of this sort are particularly relevant to biological systems, such as Photosystem II, where plants use photons from the sun to convert water into oxygen. [11] EPFL researchers have found that water molecules are 10,000 times more sensitive to ions than previously thought. [10] Working with colleagues at the Harvard-MIT Center for Ultracold Atoms, a group led by Harvard Professor of Physics Mikhail Lukin and MIT Professor of Physics Vladan Vuletic have managed to coax photons into binding together to form molecules, a state of matter that, until recently, had been purely theoretical. The work is described in a September 25 paper in Nature. New ideas for interactions and particles: This paper examines the possibility of deriving the Spontaneously Broken Symmetries from the Planck Distribution Law. In this way we get a Unification of the Strong, Electromagnetic, and Weak Interactions from the interference occurrences of oscillators. Understanding that the relativistic mass change is the result of magnetic induction, we arrive at the conclusion that the Gravitational Force is also based on the electromagnetic forces, yielding a Unified Relativistic Quantum Theory of all 4 Interactions.

**Category:** High Energy Particle Physics

[300] **viXra:1701.0377 [pdf]**
*submitted on 2017-01-10 08:51:57*

**Authors:** Said Broumi, Mohamed Talea, Assia Bakali, Florentin Smarandache

**Comments:** 5 Pages.

The selection of the shortest path is one of the classic problems in graph theory. In the literature, many algorithms have been developed to solve the shortest path problem in a network. One of the common algorithms for the shortest path problem is Dijkstra's algorithm. In this paper, Dijkstra's algorithm has been redesigned to handle the case in which most of the parameters of a network are uncertain and given in terms of neutrosophic numbers. Finally, a numerical example is given to explain the proposed algorithm.
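The classical algorithm being redesigned can be sketched as follows; in the paper's setting each edge weight would be a neutrosophic number ranked via a score function, whereas this sketch of mine simply uses plain non-negative numbers in place of those scores:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest path lengths; `graph` maps each node to a list
    of (neighbor, weight) pairs. Weights here are ordinary non-negative
    numbers standing in for the scores a score function would produce."""
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale priority-queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist
```

The neutrosophic redesign leaves this relaxation loop intact and changes only how two candidate path lengths are compared.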

**Category:** General Mathematics

[299] **viXra:1701.0376 [pdf]**
*submitted on 2017-01-10 08:54:44*

**Authors:** GUO Qiang, HE You, LI Xian, Florentin Smarandache, XU Shi-you

**Comments:** 12 Pages.

Aiming to solve the problem that evidence information based on the Dezert-Smarandache (DSm) model cannot be effectively conditionally reasoned about in a multi-source heterogeneous network, which leads to a low rate of situation assessment, a situation assessment method in a Conditional Evidential Network based on DSm-Proportional Conflict Redistribution No. 5 (PCR5) is proposed. First, the conditional reasoning formula in the Conditional Evidential Network based on the DSm model is given. Then, the Disjunctive Rule of Combination (DRC) based on DSm-PCR5 is proposed, and the Generalized Bayesian Theorem (GBT) for multiple intersection sets of focal elements can be obtained under the premise that the conditional mass assignment functions of focal elements in the refinement of the hyper-power set are known. Finally, the effectiveness of the proposed method is verified through the results of simulation experiments on situation assessment.

**Category:** General Mathematics

[298] **viXra:1701.0374 [pdf]**
*submitted on 2017-01-10 09:16:25*

**Authors:** Florentin Smarandache

**Comments:** 9 Pages.

In this paper we make distinctions between Classical Logic (where the propositions are 100% true or 100% false) and Neutrosophic Logic (where one deals with partially true, partially indeterminate and partially false propositions) in order to respond to K. Georgiev's [1] criticism. We recall that if an axiom is true in a classical logic system, it does not necessarily follow that the axiom is valid in a modern (fuzzy, intuitionistic fuzzy, neutrosophic, etc.) logic system.

**Category:** General Mathematics

[297] **viXra:1701.0373 [pdf]**
*submitted on 2017-01-10 09:19:44*

**Authors:** Ilanthenral Kandasamy, Florentin Smarandache

**Comments:** 10 Pages.

The Triple Refined Indeterminate Neutrosophic Set (TRINS), which is a case of the refined neutrosophic set, was introduced. It provides the additional possibility of representing with sensitivity and accuracy the uncertain, imprecise, incomplete, and inconsistent information that is available in the real world.

**Category:** General Mathematics

[296] **viXra:1701.0371 [pdf]**
*submitted on 2017-01-10 09:22:27*

**Authors:** Ilanthenral Kandasamy, Florentin Smarandache

**Comments:** 16 Pages.

The Triple Refined Indeterminate Neutrosophic Set (TRINS), a case of the refined neutrosophic set, was introduced in [8]. The uncertain and inconsistent information available in the real world is represented with sensitivity and accuracy by TRINS.

**Category:** General Mathematics

[295] **viXra:1701.0368 [pdf]**
*submitted on 2017-01-10 09:28:40*

**Authors:** Kalyan Mondal, Mumtaz Ali, Surapati Pramanik, Florentin Smarandache

**Comments:** 27 Pages.

This paper presents some similarity measures between complex neutrosophic sets. A complex neutrosophic set is a generalization of the neutrosophic set whose complex-valued truth membership function, complex-valued indeterminacy membership function, and complex-valued falsity membership function are combinations of a real-valued truth amplitude term in association with a phase term, a real-valued indeterminate amplitude term with a phase term, and a real-valued false amplitude term with a phase term, respectively.

**Category:** General Mathematics

[294] **viXra:1701.0366 [pdf]**
*submitted on 2017-01-10 09:32:05*

**Authors:** Said Broumi, Mohamed Talea, Assia Bakali, Florentin Smarandache

**Comments:** 6 Pages.

In this work, a neutrosophic network method is proposed for finding the shortest path length with single valued trapezoidal neutrosophic number. The proposed algorithm gives the shortest path length using score function from source node to destination node.

**Category:** General Mathematics

[293] **viXra:1701.0364 [pdf]**
*submitted on 2017-01-10 09:37:38*

**Authors:** Luige Vladareanu, Mihaiela Iliescu, Hongbo Wang, Feng Yongfei, Victor Vladareanu, Hongnian Yu, Florentin Smarandache

**Comments:** 6 Pages.

This paper presents relevant aspects of the idea of using digital medicine in cancer, so as to shape a viable strategy for creating and implementing an interactive digital platform, NEO-VIP, which should be the basic support for designing the strategy for integrating basic, clinical and environmental research on neoplasia progression to cancer.

**Category:** General Mathematics

[292] **viXra:1701.0363 [pdf]**
*submitted on 2017-01-10 09:39:32*

**Authors:** Florentin Smarandache, Mircea Eugen Şelariu

**Comments:** 6 Pages.

The eccentric beta functions of eccentric variable bexθ and of centric variable Bexα underlie the edifice of the eccentric circular supermathematical functions (FSM−CE).

**Category:** General Mathematics

[291] **viXra:1701.0362 [pdf]**
*submitted on 2017-01-10 09:42:09*

**Authors:** Said Broumi, Mohamed Talea, Assia Bakali, Florentin Smarandache

**Comments:** 7 Pages.

In this article, we extend the concept of neutrosophic graph-based multicriteria decision making method (NGMCDM) to the case of interval valued neutrosophic graph theory. The new concept is called interval valued neutrosophic graph-based multicriteria decision making method (IVNGMCDM for short).

**Category:** General Mathematics

[290] **viXra:1701.0356 [pdf]**
*submitted on 2017-01-10 09:54:12*

**Authors:** Irfan Deli, Yusuf Şubaş, Florentin Smarandache, Mumtaz Ali

**Comments:** 8 Pages.

The interval valued bipolar neutrosophic set (IVBN-set) is a new generalization of the fuzzy set, bipolar fuzzy set, neutrosophic set and bipolar neutrosophic set, allowing uncertain information to be handled more flexibly in the process of decision making.

**Category:** General Mathematics

[289] **viXra:1701.0354 [pdf]**
*submitted on 2017-01-10 09:56:56*

**Authors:** Said Broumi, Florentin Smarandache

**Comments:** 13 Pages.

We first defined interval-valued neutrosophic soft rough sets (IVN-soft rough sets for short), which combine interval-valued neutrosophic soft sets and rough sets, and studied some of their basic properties. This concept is an extension of interval-valued intuitionistic fuzzy soft rough sets (IVIF-soft rough sets).

**Category:** General Mathematics

[288] **viXra:1701.0351 [pdf]**
*submitted on 2017-01-10 10:04:37*

**Authors:** Octavian Cira, F. Smarandache

**Comments:** 26 Pages.

In this article we try to answer questions regarding the set of L.

**Category:** General Mathematics

[287] **viXra:1701.0346 [pdf]**
*submitted on 2017-01-10 05:36:59*

**Authors:** H.S. Dhaliwal

**Comments:** 19 Pages.

I have compared WMAP to Planck, and there appear to be explosion-like phenomena occurring in the CMB. A hot spot, after some time, disappears or explodes in the before-and-after comparisons. There are also observations of hot spots appearing where once there was a cold region. One may say this is noise in the data, to carry on the big bang theory, but there is a significant chance it is not, as these changes are too large. If they are explosions, these hot spots may leave black holes behind and eject matter outwards, similar to a supernova event but on a macro scale. In some observations you will notice a round, hollowish dark dot left after the explosion-like event. This observation, if correct, means we cannot rely on the redshift-distance interpretation as we know it when it comes to the distance between us and the CMB. If in fact these were explosions, they should be significantly different in their frequency. There is also evidence from simulations on the CMB of numerous concentric circles existing in the map. These circles may be the aftereffects of these CMB hot spots exploding. The midpoints of these concentric rings do not contain any hot spots, because the hot spot at the midpoint may have exploded in the past, turning into a cold spot, with the concentric rings as the aftereffect. A superimposition of the concentric circles on the CMB is also included in the paper. Note that some concentric rings have unique dark spots at the midpoint; a black circle is placed over these interesting observations (Obs. 17).

**Category:** Astrophysics

[286] **viXra:1701.0345 [pdf]**
*replaced on 2017-01-10 22:26:14*

**Authors:** Zhang ChengGang

**Comments:** 2 Pages.

The time-independent Schrödinger equation is derived mathematically and physically.

**Category:** Quantum Physics

[285] **viXra:1701.0344 [pdf]**
*submitted on 2017-01-09 16:56:48*

**Authors:** Gerges Francis Tawdrous

**Comments:** 73 Pages. The study concerns the tabernacle geometry only.

This study concerns the geometrical details of the Tabernacle, whose geometrical data is not completely specified in the Holy Bible (Book of Exodus). I therefore had to deduce many geometrical details logically; for that reason the study is interesting, because it depends fundamentally on geometrical logic.

**Category:** General Science and Philosophy

[284] **viXra:1701.0343 [pdf]**
*submitted on 2017-01-09 18:11:54*

**Authors:** Jerrold Thacker

**Comments:** 1 Page.

There is evidence that the attractive force of gravity is logarithmic instead of linear. If this is true, then the motion of stars in spiral galaxies is easily explained and there is no need for dark matter.
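One standard way a claim of this kind is modelled is with a logarithmic gravitational potential Φ(r) = v0² ln r, whose force falls off as 1/r; the circular orbital speed √(r·F) is then the same at every radius, i.e. a flat rotation curve. A sketch of that textbook result (my illustration, with v0 = 220 km/s as a typical assumed value, not a figure from the paper):

```python
import math

V0 = 220.0  # km/s, assumed characteristic speed

def circular_velocity(r):
    """Circular speed for the logarithmic potential Phi(r) = V0^2 * ln(r):
    the force magnitude is V0^2 / r, so v = sqrt(r * F) = V0 at every r."""
    force = V0**2 / r
    return math.sqrt(r * force)
```

The flat curve is the observation usually attributed to dark matter, which is why a modified force law of this shape removes the need for it.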

**Category:** Astrophysics

[283] **viXra:1701.0342 [pdf]**
*submitted on 2017-01-10 00:56:43*

**Authors:** Mahendra Kumar Trivedi, Alice Branton, Dahryn Trivedi, Gopal Nayak, William Dean Plikerd, Peter L. Surguy, Robert John Kock, Rolando Baptista Piedad, Russell Phillip Callas, Sakina A. Ansari, Sandra Lee Barrett, Sara Friedman, Steven Lee Christie

**Comments:** 10 Pages.

With the increasing popularity of herbomineral preparations in healthcare, a new proprietary herbomineral formulation was formulated with ashwagandha root extract and three minerals, viz. zinc, magnesium, and selenium. The aim of the study was to evaluate the immunomodulatory potential of Biofield Energy Healing (The Trivedi Effect®) on the herbomineral formulation using murine splenocyte cells. The test formulation was divided into two parts. One was the control, without the Biofield Energy Treatment. The other part was labelled the Biofield Energy Treated sample, which received the Biofield Energy Healing Treatment remotely from twenty renowned Biofield Energy Healers. Through the MTT assay, all the test formulation concentrations from 0.00001053 to 10.53 µg/mL were found to be safe, with cell viability ranging from 102.61% to 194.57% using splenocyte cells. The Biofield Treated test formulation showed a significant (p≤0.01) inhibition of TNF-α expression by 15.87%, 20.64%, 18.65%, and 20.34% at 0.00001053, 0.0001053, 0.01053, and 0.1053 µg/mL, respectively, as compared to the vehicle control (VC) group. The level of TNF-α was reduced by 8.73%, 19.54%, and 14.19% at 0.001053, 0.01053, and 0.1053 µg/mL, respectively, in the Biofield Treated test formulation compared to the untreated test formulation. The expression of IL-1β was reduced by 22.08%, 23.69%, 23.00%, 16.33%, 25.76%, 16.10%, and 23.69% at 0.00001053, 0.0001053, 0.001053, 0.01053, 0.1053, 1.053, and 10.53 µg/mL, respectively, compared to the VC. Additionally, the expression of MIP-1α was significantly (p≤0.001) reduced by 13.35%, 22.96%, 25.11%, 22.71%, and 21.83% at 0.00001053, 0.0001053, 0.01053, 1.053, and 10.53 µg/mL, respectively, in the Biofield Treated test formulation compared to the VC. The Biofield Treated test formulation significantly down-regulated the MIP-1α expression by 10.75%, 9.53%, 9.57%, and 10.87% at 0.00001053, 0.01053, 0.1053, and 1.053 µg/mL, respectively, compared to the untreated test formulation. The results showed that the IFN-γ expression was also significantly (p≤0.001) reduced by 39.16%, 40.34%, 27.57%, 26.06%, 42.53%, and 48.91% at 0.0001053, 0.001053, 0.01053, 0.1053, 1.053, and 10.53 µg/mL, respectively, in the Biofield Treated test formulation compared to the VC. The Biofield Treated test formulation showed better suppression of IFN-γ expression by 15.46%, 13.78%, 17.14%, and 13.11% at concentrations of 0.001053, 0.01053, 0.1053, and 10.53 µg/mL, respectively, compared to the untreated test formulation. Overall, the results demonstrated that The Trivedi Effect®-Biofield Energy Healing (TEBEH) has the capacity to potentiate the immunomodulatory and anti-inflammatory activity of the test formulation. Biofield Energy may also be useful in organ transplants, anti-aging, and stress management by improving overall health and quality of life.

**Category:** Biochemistry

[282] **viXra:1701.0341 [pdf]**
*replaced on 2017-02-02 01:39:37*

**Authors:** Alexey V. Komkov

**Comments:** 2 Pages.

This work contains certificates for Van der Waerden numbers, found using a SAT solver. These certificates establish the best currently known lower bounds for the Van der Waerden numbers W(7, 3), W(8, 3), W(9, 3), W(10, 3), and W(11, 3).
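A certificate of this kind can be checked mechanically: a colouring of 1..n in which no colour class contains a k-term arithmetic progression proves the lower bound W > n. A minimal checker (my illustration; the author's certificate format is not reproduced here):

```python
def is_certificate(coloring, k):
    """Return True if no colour class of `coloring` (one colour label per
    integer 1..n, given as a list) contains a k-term arithmetic progression."""
    n = len(coloring)
    for start in range(n):
        step = 1
        while start + (k - 1) * step < n:
            if len({coloring[start + j * step] for j in range(k)}) == 1:
                return False  # monochromatic progression found
            step += 1
    return True

# Example: the colouring 0,0,1,1,0,0,1,1 of 1..8 has no monochromatic
# 3-term progression, witnessing the (known) bound W(2 colours, 3) > 8.
```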

**Category:** Combinatorics and Graph Theory

[281] **viXra:1701.0340 [pdf]**
*submitted on 2017-01-10 03:55:52*

**Authors:** Ahsan Amin

**Comments:** 32 Pages.

Our goal is to give a very simple, effective and intuitive algorithm for the solution of the initial value problem for ODEs of 1st and arbitrarily higher order with general (i.e. constant, variable or nonlinear) coefficients, and for systems of these ordinary differential equations. We find an expansion of the differential equation/function to get an infinite series containing iterated integrals evaluated solely at the initial values of the dependent variables in the ordinary differential equation. Our series represents the true series expansion of the closed-form solution of the ordinary differential equation. The method can also be used easily for general 2nd and higher order ordinary differential equations. We explain with examples the steps to the solution of initial value problems of 1st order ordinary differential equations, and later follow with more examples for linear and non-linear 2nd and 3rd order ODEs. We have given Mathematica code for the solution of nth order ordinary differential equations and show with examples how to use the general code on 1st, 2nd and 3rd order ordinary differential equations. We also give Mathematica code for the solution of systems of a large number of general ordinary differential equations, each of arbitrary order. We give an example showing the ease with which our method calculates the solution of a system of three non-linear second order ordinary differential equations.
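Series built from iterated integrals evaluated at the initial point are in the spirit of classical Picard iteration; the following minimal sketch (mine, not the paper's Mathematica code) applies it to y' = y, y(0) = 1, recovering the Taylor coefficients of e^t:

```python
def picard_exponential(iterations):
    """Picard iteration y_{m+1}(t) = 1 + integral_0^t y_m(s) ds for the IVP
    y' = y, y(0) = 1; polynomials stored as coefficient lists [c0, c1, ...]."""
    y = [1.0]  # y_0(t) = 1, the initial value
    for _ in range(iterations):
        # integrate term by term, then add back the initial value 1
        y = [1.0] + [c / (i + 1) for i, c in enumerate(y)]
    return y

def eval_poly(coeffs, t):
    """Evaluate c0 + c1*t + c2*t^2 + ... at t."""
    return sum(c * t**i for i, c in enumerate(coeffs))
```

After m iterations the coefficients are 1/k! for k = 0..m, the partial sum of the exponential series, illustrating how repeated integration from the initial data reproduces a closed-form series.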

**Category:** General Mathematics

[280] **viXra:1701.0339 [pdf]**
*submitted on 2017-01-09 15:20:14*

**Authors:** Desire Francine Gobato

**Comments:** 95 Pages. Portuguese.

Flight safety is one of the main concerns of current aviation, and through prevention countless incidents and accidents are avoided. The objective of this work is to demonstrate, through norms, standards and documents, that a safe way exists of performing acrobatic maneuvers. The work focuses on operational safety in acrobatic maneuvers involving current acrobatic aircraft inside Brazilian airspace. A bibliographical survey was carried out through virtual libraries, covering books related to flight safety and acrobatic flight, in addition to the norms and standards of ANAC. It was thus verified that several ways exist of performing a flight with acrobatics safely, and to that end several norms were developed that must be followed in aerial demonstrations, shows or any event that executes maneuvers or acrobatics in a risky manner, where the aircraft is exposed to its own limits.

**Category:** Classical Physics

[279] **viXra:1701.0337 [pdf]**
*submitted on 2017-01-09 12:02:34*

**Authors:** Desire Francine Gobato, Ricardo Gobato

**Comments:** 1 Page. Panel presented in the THIRD INTERNATIONAL SATELLITE CONFERENCE ON MATHEMATICAL METHODS IN PHYSICS, Londrina - PR (Brazil), October 21 - 26, 2013.

One of the most unusual V/STOL aircraft programs was the Avro VZ-9 "Avrocar". Designed to be a true flying saucer, the Avrocar was one of the few V/STOL aircraft to be developed in complete secrecy. Despite significant design changes during flight test, the Avrocar was unable to achieve its objectives and the program was ultimately canceled after the expenditure of over $10 million (1954-61). The concept of a lift fan driven by a turbojet engine did not die either, and lives on today as a key component of the Lockheed X-35 Joint Strike Fighter contender. While the Avrocar was under development, Peter Kappus of General Electric independently developed a lift fan propulsion system which evolved into the GE/Ryan VZ-11 (later XV-5) "Vertifan".

**Category:** General Mathematics

[278] **viXra:1701.0336 [pdf]**
*submitted on 2017-01-09 12:13:24*

**Authors:** Desire Francine Gobato Fedrigo, Ricardo Gobato

**Comments:** 1 Page. Portuguese. Panel presented in the XVII Physics Week of the State University of Londrina, October 22 to 26, 2012.

To begin a study related to High Speed Theory, it is necessary to know that aircraft are classified as subsonic, transonic, hypersonic or supersonic. Directly related to this are the so-called pressure waves, which are concentric waves imprinted in the air by any object that produces sound or that travels through the Earth's atmosphere; these propagate at a speed of 340 m/s (1224 km/h) at mean sea level. An aircraft in flight produces these waves, which form around it and travel 360º around it. In this study, the nose of the aircraft and the area of greatest curvature on the extrados of the wing are treated as expansion wave formers, in order to study the shock wave, the Mach number and the critical Mach number.

**Category:** Classical Physics

[277] **viXra:1701.0335 [pdf]**
*replaced on 2017-01-26 17:39:30*

**Authors:** Jérémy Kerneis

**Comments:** 21 Pages.

We use three postulates, P1, P2a/b and P3. Combining P1 and P2a with "Sommerfeld's quantum rules" corresponds to the original quantum theory of hydrogen, which produces the correct relativistic energy levels for atoms (Sommerfeld's and Dirac's theories of matter produce the same energy levels, and Schrodinger's theory produces an approximation of those energy levels). P3 can be found in Schrodinger's famous paper introducing his equation, P3 being his first assumption (a second assumption, suppressed here, is required to deduce his equation). P3 implies that the wavefunction is a solution of both Schrodinger's and Klein-Gordon's equations in the non-interacting case while, in the interacting case, it immediately implies "Sommerfeld's quantum rules": P1, P2a and P3 then produce the correct relativistic energy levels of atoms, and we check that the required degeneracy is justified by pure deduction, without any other assumption (Schrodinger's theory only justifies one half of the degeneracy).
We observe that the introduction of an interaction in P1 is equivalent to a modification of the metric inside the wavefunction in P3, such that the equation of motion of a system can be deduced with two different methods, with or without the metric. Replacing the electromagnetic potential P2a by the suggested gravitational potential P2b, the equation of motion (deduced in two ways) is equivalent to the equation of motion of General Relativity in the low-field approximation (with accuracy 10^-6 at the surface of the Sun). We have no coordinate singularity inside the metric. Other motions can be obtained by modifying P2b; the theory is adaptable.
First of all, we discuss classical Kepler problems (the Newtonian motion of the Earth around the Sun), explain the link between Kepler's law of periods (1619) and Planck's law (1900), and observe the links between all historical models of atoms (Bohr, Sommerfeld, Pauli, Schrodinger, Dirac, Fock). This being done, we introduce P1, P2a/b and P3 to describe electromagnetism and gravitation in the same formalism.

**Category:** Quantum Gravity and String Theory

[276] **viXra:1701.0334 [pdf]**
*submitted on 2017-01-08 18:10:20*

**Authors:** Ricardo Gobato

**Comments:** 14 Pages. Parana Journal of Science and Education, v.2, n.3, March 10, 2016 PJSE, ISSN 2447-6153, c 2015-2016

The work is the result of a philosophical study of several passages of the Holy Bible with regard to faith. We analyzed verses that include the mustard seed parables. The study discusses the various concepts of faith as belief and faith as a form of energy, and in this concept of faith as energy we made a connection to matter. We approach the gravitational field using the Law of Universal Gravitation and the equation of equivalence between energy and matter, without relativistic effects. Of the Scriptures, we focus on Matthew 17:20, and according to the concept of faith as a form of energy, we calculate the energy needed to raise a mountain via the conversion of the matter in a mustard seed into energy, comparing a massive iron mountain, Mount Everest and Mount Sinai. We conclude with these concepts and considerations that the energy of "faith" can move a mountain.
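The two quantities being compared in the abstract are straightforward to compute; a sketch with round assumed numbers of mine (a roughly 2 mg seed mass and an order-of-magnitude mountain mass, not figures from the paper):

```python
C = 2.998e8  # speed of light, m/s

def rest_energy(mass_kg):
    """Mass-energy equivalence, E = m c^2."""
    return mass_kg * C**2

def lifting_work(mass_kg, height_m, g=9.81):
    """Work m g h needed to raise a mass by height_m near Earth's surface."""
    return mass_kg * g * height_m

seed_energy = rest_energy(2e-6)          # ~2 mg mustard seed (assumed mass)
mountain_work = lifting_work(1e14, 1.0)  # ~10^14 kg mountain raised 1 m (assumed)
```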

**Category:** General Science and Philosophy

[275] **viXra:1701.0333 [pdf]**
*submitted on 2017-01-08 22:27:43*

**Authors:** Andrew Beckwith

**Comments:** 6 Pages.

We are looking at what if the initial cosmological constant is due to if we furthermore use as the variation of the time component of the metric tensor in Pre-Planckian space-time up to the Planckian space-time initial values. This assumes as an initial inflaton value, as well as employing Non-Linear Electrodynamics to the scale factor in , the upshot is an expression for as an initial inflaton value / squared, which supports Corda's assumptions in the 'Gravity's breath' Electronic Journal of Theoretical Physics article. We close with an idea, to be worked out in further detail, as to density matrices and how they may relate to gravitons traversing from a Pre-Planckian to a Planckian space-time regime, an idea we will write up in far greater detail in a future publication.

**Category:** Quantum Gravity and String Theory

[274] **viXra:1701.0332 [pdf]**
*submitted on 2017-01-08 22:39:04*

**Authors:** Hadi Oqaibi, Anas Fattouh

**Comments:** 10 Pages. International Journal of Innovative Research in Computer and Communication Engineering

Steady-state visual evoked potential (SSVEP) is a well-established paradigm of brain-computer interface (BCI) where the interaction between the user and a controlled device is achieved via brainwave activities and visual stimuli. Although SSVEP-based BCIs are known to have a high information transfer rate (ITR), wrong feedback reduces the performance of these applications. In this paper, we investigate the possibility of enhancing SSVEP-based BCI applications by incorporating the user's emotions. To this end, an SSVEP-based BCI application is designed and implemented where the user has to steer a simulated car moving through a maze to reach a target position. Using standard flickering checkerboards, the user has to select one of two commands, turn right or turn left. After each selection, a visual virtual feedback is shown and the emotional state of the user is estimated from recorded electroencephalogram (EEG) brain activities. This estimated emotion could be used to automatically confirm or cancel the selected command and therefore improve the quality of executed commands.

**Category:** Digital Signal Processing

[273] **viXra:1701.0331 [pdf]**
*submitted on 2017-01-08 14:31:48*

**Authors:** Gerges Francis Tawdrous

**Comments:** 279 Pages.

This study is devoted to the geometries of the tabernacle and the Great Pyramids, in addition to an analysis of the church icons. The study also offers an interpretation of the tabernacle with some deep philosophy of dualism. The study is in Arabic. (Part Two)

**Category:** General Science and Philosophy

[272] **viXra:1701.0330 [pdf]**
*submitted on 2017-01-08 16:08:20*

**Authors:** Ramzi Suleiman

**Comments:** 70 Pages.

We propose a simple, axiom-free modification of Galileo-Newton's dynamics of moving bodies, termed Information Relativity theory. We claim that the theory is capable of unifying physics. The claimed unification is supported by the fact that the same derived set of simple and beautiful transformations applies successfully to predicting and explaining many phenomena and findings in cosmology, quantum mechanics, and more. Our modification of classical physics is done simply by accounting for the travel time of information about a physical measurement, from the reference frame at which the measurement was taken, to an observer in another reference frame, which is in motion relative to the first frame. This minor modification of classical physics turns out to be sufficient for unifying all the dynamics of moving bodies, regardless of their size and mass. Since the theory's transformations and predictions are expressed only in terms of observable physical entities, its testing should be simple and straightforward.
For quantum mechanics, the special version of the theory for translational inertial motion predicts and explains matter-wave duality, quantum phase transition, quantum criticality, entanglement, the diffraction of single particles in the double slit experiment, and the quantum nature of the hydrogen atom. For cosmology, the theory constructs a relativistic quantum cosmology, which provides plausible and testable explanations of dark matter and dark energy, as well as predictions of the mass of the Higgs boson, the GZK cutoff phenomenon, the Schwarzschild radius of black holes (without interior singularity), and the timeline of ionization of chemical elements along the history of the universe.
The general version of the theory for gravitational and electrostatic fields, also detailed in the paper, is shown to be successful in predicting and explaining the strong force, quantum confinement, and asymptotic freedom.

**Category:** Relativity and Cosmology

[271] **viXra:1701.0329 [pdf]**
*submitted on 2017-01-08 11:02:17*

**Authors:** Marius Coman

**Comments:** 4 Pages.

In this paper I make the following conjecture: For any pair of consecutive primes [p1, p2], p2 > p1 > 43, p1 and p2 having the same number of digits, there exists a prime q, 5 < q < p1, such that the number n obtained by concatenating (from left to right) q with p2, then with p1, then again with q is prime. Example: for [p1, p2] = [961748941, 961748947] there exists q = 19 such that n = 1996174894796174894119 is prime. Note that the least values of q that satisfy this conjecture for twenty consecutive pairs of consecutive primes with 9 digits are 19, 17, 107, 23, 131, 47, 83, 79, 61, 277, 163, 7, 41, 13, 181, 19, 7, 37, 29 and 23 (all twenty primes lower than 300!), the corresponding primes n obtained having 20 to 24 digits! This method appears to be a good way to obtain big primes with a high degree of ease and certainty.
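The concatenation construction in this conjecture is easy to check computationally. A minimal sketch (not the author's program) using a deterministic Miller-Rabin test, which is valid for the 20-24 digit numbers involved, verifies the worked example from the abstract:

```python
def is_prime(n: int) -> bool:
    """Miller-Rabin with the first 12 prime bases: deterministic for
    n < 3.3e24, which covers the 20-24 digit concatenations here."""
    if n < 2:
        return False
    bases = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]
    for p in bases:
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d, r = d // 2, r + 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def concat_number(q: int, p1: int, p2: int) -> int:
    """Concatenate q | p2 | p1 | q, as in the conjecture."""
    return int(f"{q}{p2}{p1}{q}")

# The abstract's example: [p1, p2] = [961748941, 961748947], q = 19.
n = concat_number(19, 961748941, 961748947)
assert n == 1996174894796174894119 and is_prime(n)
```

Searching q upward over the primes 7, 11, 13, ... until `is_prime(concat_number(q, p1, p2))` holds reproduces the "least q" values listed in the abstract.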

**Category:** Number Theory

[270] **viXra:1701.0328 [pdf]**
*submitted on 2017-01-07 23:13:35*

**Authors:** Roger Granet

**Comments:** 3 Pages.

The Russell Paradox (1) considers the set, R, of all sets that are not members of themselves. On its surface, it seems like R belongs to itself only if it doesn't belong to itself. This is where the paradox comes from. Here, a solution is proposed that is similar to Russell's method based on his theory of types (1,2) but is instead based on the definition of why things exist as described in previous work (3). In that work, it was proposed that a thing exists if it is a grouping defining what is contained within. A corollary is that a thing, such as a set, does not exist until what is contained within is defined. A second corollary is that after a grouping defining what is contained within is present, and the thing exists, if one then alters the definition of what is contained within, the first existent entity is destroyed and a different existent entity is created. Based on this, set R of the Russell Paradox does not even exist until after the list of the elements it contains (e.g., the list of all sets that aren't members of themselves) is defined. Once this list of elements is completely defined, R then springs into existence. Therefore, because it doesn't exist until after its list of elements is defined, R obviously can't be in this list of elements and, thus, cannot be a member of itself; so, the paradox is resolved. Additionally, one can't then put R back into its list of elements after the fact because if this were done, it would be a different list of elements, and it would no longer be the original set R, but some new set. This same type of reasoning is then applied to the Gödel Incompleteness Theorem, which roughly states that there will always be some statements within a formal system of arithmetic (system P) that are true but that can't be proven to be true. Briefly, this reasoning suggests that arguments such as the Gödel sentence and diagonalization arguments confuse references to future, not yet existent statements with a current and existent statement saying that the future statements are unprovable. Current and existent statements are different existent entities than future, not yet existent statements and should not be conflated. In conclusion, a new resolution of the Russell Paradox and some issues with the Gödel Incompleteness Theorem are described.

**Category:** Set Theory and Logic

[269] **viXra:1701.0327 [pdf]**
*submitted on 2017-01-08 01:29:36*

**Authors:** Sergey G. Fedosin

**Comments:** 69 pages. Journal of Fundamental and Applied Sciences, Vol. 9, No. 1, pp. 411-467 (2017). http://dx.doi.org/10.4314/jfas.v9i1.25

It is shown that the angular frequency of the photon is nothing else than the averaged angular frequency of revolution of the electron cloud’s center during emission and quantum transition between two energy levels in an atom. On the assumption that the photon consists of charged particles of the vacuum field (of praons), the substantial model of a photon is constructed. Praons move inside the photon in the same way as they must move in the electromagnetic field of the emitting electron, while an internal periodic wave structure is formed inside the photon. The properties of praons, including their mass, charge and speed, are derived in the framework of the theory of infinite nesting of matter. At the same time, praons are part of nucleons and leptons just as nucleons are the basis of neutron stars and the matter of ordinary stars and planets. With the help of the Lorentz transformations, which correlate the laboratory reference frame and the reference frame co-moving with the praons inside the photon, transformation of the electromagnetic field components is performed. This allows us to calculate the longitudinal magnetic field and magnetic dipole moment of the photon, and to understand the relation between the transverse components of the electric and magnetic fields, connected by a coefficient in the form of the speed of light. The total rest mass of the particles making up the photon is found; it turns out to be inversely proportional to the nuclear charge number of the hydrogen-like atom which emits the photon. In the presented picture the photon composed of praons moves at a speed less than the speed of light, and it loses the right to be called an elementary particle due to its complex structure.

**Category:** Classical Physics

[268] **viXra:1701.0326 [pdf]**
*submitted on 2017-01-08 03:54:31*

**Authors:** Radhakrishnamurty Padyala

**Comments:** 6 Pages.

The concept of ‘thermal heating efficiency’, G, considered as a dual of the Carnot efficiency, offers a suitable method to test the validity of the second law of thermodynamics. This concept claims to offer us many practical (therefore, experimentally testable) advantages, specifically, economy in heating houses and cooking, besides others. For example, if one unit of fuel when burnt inside the house gives Q joules of heat, the thermodynamic method based on this concept offers as much as 10 Q joules for the same one unit of fuel, giving a 10-fold economy in heating houses. We show in this article that the claimed economy is a myth, and that we can get no more heat into the house using this method than we get by burning the fuel inside the house. We propose the concept of thermal heating efficiency as a suitable method to test the validity of the second law of thermodynamics.
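The "10 Q" figure in this abstract corresponds to the coefficient of performance of an ideal (Carnot) heat pump. A minimal numerical sketch, with temperatures that are our own illustrative assumptions rather than values from the paper, shows where the factor of 10 comes from:

```python
# Ideal (Carnot) heat-pump coefficient of performance between an
# outdoor reservoir at T_cold and a house at T_hot.  The temperatures
# below are illustrative assumptions, not values from the paper.
T_hot = 300.0   # house interior, kelvin
T_cold = 270.0  # outdoors, kelvin

# Heat delivered into the house per unit of work input.
cop = T_hot / (T_hot - T_cold)

# For these temperatures the ideal factor is exactly 10, matching the
# "10 Q" claim the article examines (and argues is a myth in practice).
assert cop == 10.0
```

Note this is the reversible upper bound; the article's point is that the real, realizable heat gain collapses back to Q.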

**Category:** Thermodynamics and Energy

[267] **viXra:1701.0325 [pdf]**
*submitted on 2017-01-08 03:55:27*

**Authors:** Ilija Barukčić

**Comments:** 29 Pages. Copyright © 2017 by Ilija Barukčić, Jever, Germany. Published by:

Epstein-Barr Virus (EBV) has been widely proposed as a possible candidate virus for the viral etiology of human breast cancer, still the most common malignancy affecting females worldwide. Due to possible problems with PCR analyses (contamination), the lack of uniformity in study design and the insufficient mathematical/statistical methods used by the different authors, the findings of several EBV polymerase chain reaction (PCR) studies contradict each other, making it difficult to determine the EBV etiology of breast cancer. In the present study, we performed a re-investigation of some of the known studies. Placed in context, this study supports the hypothesis that EBV is a cause of human breast cancer.

**Category:** Statistics

[266] **viXra:1701.0324 [pdf]**
*submitted on 2017-01-08 04:19:38*

**Authors:** Adam Chmaj

**Comments:** 6 Pages. Original 2014 version of the result is posted here. Some minor corrections are left to the reader.

The existence of traveling waves for the fractional Burgers equation is established, using an operator splitting trick. This solves a 1998 open problem.

**Category:** Functions and Analysis

[265] **viXra:1701.0322 [pdf]**
*submitted on 2017-01-07 10:21:42*

**Authors:** Adrian Ferent

**Comments:** 51 Pages. © 2016 Adrian Ferent

“Dark Energy is Gravitational Waves”
Adrian Ferent
“The momentum of the graviton is negative p = - m × v ”
Adrian Ferent
“Because the momentum is negative, the relativistic mass -m of the graviton is negative!”
Adrian Ferent
The causes of the Planck universe expanding faster:
-The Ferent universe with supermassive black holes is speeding up the expansion of the Planck universe.
-The negative pressure created by gravitons inside the Planck universe is speeding up the expansion of the Planck universe.
-It is not Dark energy that is speeding up the expansion of the Planck universe, because Dark energy does not exist.
“Dark Energy is Gravitons”
Adrian Ferent
The title is ‘Dark Energy is Gravitational Waves, Dark Energy is Gravitons’ because my Gravitation theory is completely different from Einstein’s Gravitation theory, in which gravitational waves are ripples in the curvature of spacetime that propagate as waves at the speed of light.

**Category:** Quantum Gravity and String Theory

[264] **viXra:1701.0321 [pdf]**
*submitted on 2017-01-07 12:02:26*

**Authors:** Ricardo Gobato, Alekssander Gobato, Desire Francine G. Fedrigo

**Comments:** 1 Page. Portuguese. Panel presented at the XVIII Physics Week of the State University of Londrina, from September 9 to 13, 2013.

Argemone Mexicana L., popularly known as Mexican poppy, thorny Mexican poppy, thistle or cardo santo, is a species of poppy found in Mexico and widespread in many parts of the world. It is an extremely resistant plant, tolerant to drought and poor soils, being often the only vegetation cover present in the soil. It has bright yellow latex, and although toxic to grazing animals, it is rarely ingested. It belongs to the family Papaveraceae, informally known as the poppies, an important ethnopharmacological family of 44 genera and about 760 species of flowering plants. The plant is the source of several types of chemical compounds, such as flavonoids, although alkaloids are the most commonly found. In addition to pharmaceutical efficacy, certain parts of the plant also show toxic effects. It is used in different parts of the world for the treatment of various diseases including tumors, warts, skin diseases, rheumatism, inflammation, jaundice, leprosy, microbial infections and malaria, and as a larvicide against Aedes aegypti, the dengue vector.

**Category:** Physics of Biology

[263] **viXra:1701.0320 [pdf]**
*submitted on 2017-01-07 12:05:30*

**Authors:** Marius Coman

**Comments:** 3 Pages.

In this paper I make the following conjecture: For any pair of twin primes [p, p + 2], p > 5, there exists a prime q, 5 < q < p, such that the number n obtained by concatenating (from left to right) q with p + 2, then with p, then again with q is prime. Example: for [p, p + 2] = [18408287, 18408289] there exists q = 37 such that n = 37184082891840828737 is prime. Note that the least values of q that satisfy this conjecture for twenty consecutive pairs of twins with 8 digits are 19, 7, 19, 11, 23, 23, 47, 7, 47, 17, 13, 17, 17, 37, 83, 19, 13, 13, 59 and 97 (all twenty primes lower than 100!), the corresponding primes n obtained having 20 digits! This method appears to be a good way to obtain big primes with a high degree of ease and certainty.
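The least-q search described here can be sketched directly; `find_least_q` is a hypothetical helper name, and the primality test is a deterministic Miller-Rabin valid well beyond the 20-digit numbers involved (a minimal sketch, not the author's program):

```python
def is_prime(n: int) -> bool:
    """Miller-Rabin with the first 12 prime bases (deterministic for n < 3.3e24)."""
    if n < 2:
        return False
    bases = [2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37]
    for p in bases:
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d, r = d // 2, r + 1
    for a in bases:
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False
    return True

def find_least_q(p: int):
    """Least prime q with 5 < q < p whose concatenation q|p+2|p|q is prime."""
    for q in range(7, p):
        if is_prime(q) and is_prime(int(f"{q}{p + 2}{p}{q}")):
            return q
    return None

# Abstract's example pair [18408287, 18408289]: q = 37 works,
# so the least q found is at most 37.
q = find_least_q(18408287)
assert q is not None and q <= 37
```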

**Category:** Number Theory

[262] **viXra:1701.0319 [pdf]**
*submitted on 2017-01-07 04:17:29*

**Authors:** Espen Gaarder Haug

**Comments:** 6 Pages.

In this paper we look at the ultimate limits of a photon propulsion rocket. The maximum velocity for a photon propulsion rocket is just below the speed of light and is a function of the reduced Compton wavelength of the heaviest subatomic particles in the rocket. We are basically combining the relativistic rocket equation with Haug’s new insight into the maximum velocity for anything with rest mass; see [1, 2, 3].
An interesting new finding is that in order to accelerate any sub-atomic “fundamental” particle to its maximum velocity, the particle rocket basically needs two Planck masses of initial load. This might sound illogical until one understands that subatomic particles with different masses have different maximum velocities. This can be generalized to large rockets, and gives us the maximum theoretical velocity of a fully-efficient and ideal rocket. Further, no additional fuel is needed to accelerate a Planck mass particle to its maximum velocity; this also might sound absurd, but it has a very simple and logical solution that is explained in this paper.
This paper is Classified!

**Category:** Relativity and Cosmology

[261] **viXra:1701.0316 [pdf]**
*replaced on 2017-01-08 06:47:33*

**Authors:** Luke Kenneth Casson Leighton

**Comments:** 7 Pages.

In a prior paper, ultracolour was added back into the Extended Rishon Model, and the I-Frame structure was explored using the proton as an example. Because Maxwell's equations have to be obeyed, the Rishons must have actual phase, position, momentum and velocity. The only pattern of motion that fitted these stringent requirements was the Rishons circulating on Möbius strips. Fascinatingly and very excitingly, exactly such a previously theoretical elliptically-transverse Möbius topology of light was experimentally confirmed last year. The next logical task, of writing out Rishon triplets in a circle as the actual starting phases of the elliptically polarized Möbius-walking light, has proven to be a huge breakthrough, providing startling insight with massive implications: implying the existence of two previously undiscovered quarks very similar to up and down (provisionally nicknamed over and under), logically and naturally confirming that "decay" is just a "phase transform", and generally being really rather disruptive to both the Standard Model and the Extended Rishon Model. A huge task therefore lies ahead: to revisit the available data on particle decays and masses (bearing in mind that the Standard Model's statistical-inference confirmation techniques assume the up and over, and the down and under, to be the same particles), so this paper endeavours to lay some groundwork and ask pertinent questions.

**Category:** High Energy Particle Physics

[260] **viXra:1701.0315 [pdf]**
*submitted on 2017-01-06 21:56:05*

**Authors:** Georgina Woodward

**Comments:** Pages.

This paper is focused on the underlying foundational Object reality. It shows that when that underlying reality is considered, physics is not as strange as the mathematics used within classical and quantum physics models would seem to imply. Some philosophical implications are considered, regarding: true relations and the appearance of relations; qualities and attributes rather than properties; and the law of non-contradiction. The double slit experiment problem and the problem of provocation are discussed. The Harry Beck Tube map analogy shows how something highly accurate in certain respects, and able to produce reliable predictions, can nevertheless fail to be an accurate representation of reality.

**Category:** History and Philosophy of Physics

[259] **viXra:1701.0314 [pdf]**
*submitted on 2017-01-07 04:07:08*

**Authors:** Abdelmajid Ben Hadj Salem

**Comments:** 10 Pages. In French.

In an earlier article, E. Grafarend and B. Schaffrin studied the geometry of non-linear adjustment of the planar trisection problem using the Gauss-Markov model and the method of least squares. This paper develops the same method, working through an example of the determination of a point by trilateration in the three-dimensional geodetic setting: finding the coordinates (x, y, z) of an unknown point from measured distances to n known points.
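The trilateration problem described here (recover (x, y, z) from distances to n known points) admits a compact non-linear least-squares solution by Gauss-Newton iteration. This is a minimal stdlib-only sketch with made-up anchor coordinates, not the paper's geodetic computation:

```python
import math

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def trilaterate(anchors, dists, guess, iters=50):
    """Gauss-Newton least squares: minimize the sum of squared range residuals."""
    x, y, z = guess
    for _ in range(iters):
        JTJ = [[0.0] * 3 for _ in range(3)]  # normal-equation matrix J^T J
        JTr = [0.0] * 3                       # gradient term J^T r
        for (ax, ay, az), d in zip(anchors, dists):
            dx, dy, dz = x - ax, y - ay, z - az
            rng = math.sqrt(dx * dx + dy * dy + dz * dz)
            res = rng - d                     # range residual to this anchor
            J = [dx / rng, dy / rng, dz / rng]
            for i in range(3):
                JTr[i] += J[i] * res
                for j in range(3):
                    JTJ[i][j] += J[i] * J[j]
        step = solve3(JTJ, JTr)
        x, y, z = x - step[0], y - step[1], z - step[2]
    return x, y, z

# Synthetic example: four non-coplanar anchors, exact distances.
anchors = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0), (0.0, 0.0, 10.0)]
true_pt = (1.0, 2.0, 3.0)
dists = [math.dist(true_pt, a) for a in anchors]
est = trilaterate(anchors, dists, (5.0, 5.0, 5.0))
assert all(abs(e - t) < 1e-6 for e, t in zip(est, true_pt))
```

With noisy measured distances and n > 4 points the same iteration returns the least-squares estimate, which is the overdetermined case the paper treats.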

**Category:** Geophysics

[258] **viXra:1701.0313 [pdf]**
*submitted on 2017-01-06 14:50:00*

**Authors:** A. Gobato, D. F. G. Fedrigo, R. Gobato

**Comments:** 1 Page. Panel presented at the XIX Physics Week, at the State University of Londrina, from September 15 to 19, 2014.

Many sources of energy have been researched for their proper capture; some stand out for their ease of obtaining, others for low cost, and others for being renewable. Here we study one such source, sea-wave energy, whose capture is still under development. This energy comes from the sea waves and is 100% renewable; the capture system treated here is the Pelamis system. Over the years, energy has become vital for human beings, allowing us comfort, leisure and mobility, among other factors. The search for cheap and renewable energy sources has grown significantly in recent years, mainly in response to the degradation of nature, leading scientists and engineers to search for new technologies. Some countries lack many forms of energy capture from alternative sources such as wind, solar, thermoelectric and marine energy, among many others. Among the methods of energy capture mentioned above, a project created by the Chinese in 1988, with the intention of being an inexhaustible and totally clean energy source, caught our attention: a system in which energy is generated by the movement of the waves. This project is called Pelamis.

**Category:** Thermodynamics and Energy

[257] **viXra:1701.0312 [pdf]**
*submitted on 2017-01-06 14:58:00*

**Authors:** R. Gobato, D. F. G. Fedrigo, A. Gobato

**Comments:** 1 Page. Portuguese

Argemone Mexicana L., popularly known as Mexican poppy, thorny Mexican poppy, thistle or cardo santo, is a species of poppy found in Mexico and widespread in many parts of the world. It is an extremely resistant plant, tolerant to drought and poor soils, being often the only vegetation cover present in the soil. It has bright yellow latex, and although toxic to grazing animals, it is rarely ingested. It belongs to the family Papaveraceae, informally known as the poppies, an important ethnopharmacological family of 44 genera and about 760 species of flowering plants. The plant is the source of several types of chemical compounds, such as flavonoids, although alkaloids are the most commonly found. In addition to pharmaceutical efficacy, certain parts of the plant also show toxic effects. It is used in different parts of the world for the treatment of various diseases including tumors, warts, skin diseases, rheumatism, inflammation, jaundice, leprosy, microbial infections, malaria and agrobacteria, among others, and as a larvicide against Aedes aegypti, the vector of dengue.

**Category:** Condensed Matter

[256] **viXra:1701.0311 [pdf]**
*submitted on 2017-01-06 15:12:15*

**Authors:** R. Gobato, D. F. G. Fedrigo, A. Gobato

**Comments:** 1 Page. Portuguese

The use of inorganic crystals in technology goes back a long way, from the quartz crystals of common radio receivers to computer chips with new semiconductor materials. Elements such as Se, Li, Be and Si are of great use in technology. The use of new inorganic crystals in technology has been widely studied. The development of new compounds from this arrangement can bring technological advances in the most diverse areas of knowledge. The probable difficulty of finding such crystals in nature, or of synthesizing them, suggests an advanced study of the theme. A preliminary literature search did not indicate any compounds of the said arrangement of these chemical elements. From this fact, our study may lead to obtaining new crystals to be used in the materials industry. To this end, a computational study using software with Molecular Mechanics, ab initio, DFT and empirical methods, together with microscopic and conoscopic analysis, could lead to the obtaining of such crystals.

**Category:** Condensed Matter

[255] **viXra:1701.0310 [pdf]**
*submitted on 2017-01-06 10:31:33*

**Authors:** Jose P. Koshy

**Comments:** 5 Pages. The paper will soon be submitted to a relevent journal

The Fine Structure Constant, regarded as a magic number, was introduced in 1916. Now, after 100 years, its mystery is revealed: it is a constant related to spherical packing. Its approximate value can be given as a ≈ (42/1838)/(3.14), where 1838 represents the number of entities packed, 42 the number of entities along the diameter, and 3.14 the mathematical constant pi.
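The arithmetic of the claimed approximation is easy to check. A minimal sketch comparing (42/1838)/3.14 with the CODATA value of the fine structure constant (the comparison tolerance is our own choice, not the paper's):

```python
# The paper's packing-based approximation vs. the measured constant.
approx = (42 / 1838) / 3.14
alpha = 0.0072973525693  # CODATA 2018 fine structure constant

# The approximation agrees with the measured value only to about 0.3%,
# i.e. roughly the first three significant figures.
assert abs(approx - alpha) / alpha < 0.005
assert abs(approx - alpha) / alpha > 0.001
```

The second assertion makes the point quantitative: the match is coarse, not exact.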

**Category:** Quantum Physics

[254] **viXra:1701.0309 [pdf]**
*replaced on 2017-02-04 13:13:24*

**Authors:** Hans Detlef Hüttenbach

**Comments:** 5 Pages.

This paper is on the mathematical structure of space, time, and gravity. It is shown that electrodynamics is neither charge inversion invariant, nor is it time inversion invariant.

**Category:** Mathematical Physics

[253] **viXra:1701.0308 [pdf]**
*submitted on 2017-01-06 11:40:42*

**Authors:** Vito R. D'Angelo

**Comments:** 4 Pages.

The paper presents the construction of the Planckian hierarchical schematic, comprised of four very well-known Planck constants, i.e., h, h-bar, lp and tp; the reintroduction of a forgotten constant, 1/2 of h-bar; and the postulation of a new Planck constant, the Planck circumference, symbol (P), for which the Planck length is the diameter. Pi then emerges naturally as the ratio of the Planck length and the Planck circumference. Also introduced is the ratio of 1/2(h-bar) and the Planck circumference, with a value of 1.038499006, referred to as the ratio of attribute. The crux of the paper is to show that the dimensionless ratios of the Planckian schematic (i.e., 2, 1.038499006, pi and 299792458) can be utilized to enumerate the Planck momentum, Planck mass and Planck energy constants.

**Category:** High Energy Particle Physics

[252] **viXra:1701.0307 [pdf]**
*submitted on 2017-01-06 11:34:02*

**Authors:** George Rajna

**Comments:** 28 Pages.

Giant atoms could help unveil 'dark matter' and other cosmic secrets. [23] Astronomers in the US are setting up an experiment which, if it fails – as others have – could mark the end of a 30-year-old theory. [22] Russian scientists have discovered that the proportion of unstable particles in the composition of dark matter in the days immediately following the Big Bang was no more than 2 percent to 5 percent. Their study has been published in Physical Review D. [21] Researchers from the University of Amsterdam's (UvA) GRAPPA Center of Excellence have just published the most precise analysis of the fluctuations in the gamma-ray background to date. [20] The Dark Energy Spectroscopic Instrument, called DESI, has an ambitious goal: to scan more than 35 million galaxies in the night sky to track the expansion of our universe and the growth of its large-scale structure over the last 10 billion years. [19] If the axion exists and is the main component of Dark Matter, the very relic axions that would be bombarding us continuously could be detected using microwave cavities resonant to the axion mass, immersed in powerful magnetic fields. [18] In yet another attempt to nail down the elusive nature of dark matter, a European team of researchers has used a supercomputer to develop a profile of the yet-to-be-detected entity that appears to pervade the universe. [17] MIT physicists are proposing a new experiment to detect a dark matter particle called the axion. If successful, the effort could crack one of the most perplexing unsolved mysteries in particle physics, as well as finally yield a glimpse of dark matter. [16] Researchers at Stockholm University are getting closer to light dark-matter particle models. Observations rule out some axion-like particles in the quest for the content of dark matter. The article is now published in Physical Review Letters.
[15] Scientists have detected a mysterious X-ray signal that could be caused by dark matter streaming out of our Sun's core. Hidden photons are predicted in some extensions of the Standard Model of particle physics, and unlike WIMPs they would interact electromagnetically with normal matter. In particle physics and astrophysics, weakly interacting massive particles, or WIMPs, are among the leading hypothetical particle physics candidates for dark matter. The gravitational force attracts the matter, causing concentration of the matter in a small space and leaving much space with low matter concentration: dark matter and energy. The asymmetry between the masses of the electric charges, for example the proton and electron, can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron–proton mass ratio. The lower energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Astrophysics

[251] **viXra:1701.0306 [pdf]**
*submitted on 2017-01-06 11:36:01*

**Authors:** M. Simões F., R. Gobato

**Comments:** 1 Page. Portuguese

Spectroscopy is a technique for collecting physicochemical data through the transmission, absorption or reflection of radiant energy incident on a sample. The spectrometers normally employed for such spectra are difficult-to-access equipment because of their high cost, found mainly in research laboratories. Our work replaces these spectrometers with common, low-cost, easily accessible devices that have a CCD reader. We determine mathematical parameters that characterize, by mapping, the images obtained by common cameras such as cell phones, smartphones, tablets, iPhones, iPads, webcams, etc. The footage obtained by the optical CCD readers of these devices is decoded and separated into quantified RGB color channels. Our technique consists of the analysis of the pixels of images of primary light sources, such as the sun, incandescent lamps, fire, candle flames, matchstick flames, wood combustion, etc.
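The channel-splitting step described here can be sketched very simply; below, a frame is a plain list of (R, G, B) pixel tuples standing in for decoded camera footage (a minimal sketch of one per-channel statistic, not the authors' pipeline):

```python
def channel_means(frame):
    """Average each of the R, G, B channels over a frame given as a list
    of (r, g, b) pixel tuples -- one simple per-channel statistic that
    can help characterize a light source in this kind of analysis."""
    n = len(frame)
    r = sum(p[0] for p in frame) / n
    g = sum(p[1] for p in frame) / n
    b = sum(p[2] for p in frame) / n
    return r, g, b

# A synthetic two-pixel "frame": one pure-red and one pure-green pixel.
assert channel_means([(255, 0, 0), (0, 255, 0)]) == (127.5, 127.5, 0.0)
```

A real pipeline would first decode each video frame to pixel tuples (e.g. with an imaging library) and could track these statistics per frame over time.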

**Category:** Condensed Matter

[250] **viXra:1701.0305 [pdf]**
*submitted on 2017-01-06 11:49:30*

**Authors:** George Rajna

**Comments:** 37 Pages.

Physicists have performed a variation of the famous 200-year-old double-slit experiment that, for the first time, involves "exotic looped trajectories" of photons. These photons travel forward through one slit, then loop around and travel back through another slit, and then sometimes loop around again and travel forward through a third slit. [24] Now in a new paper published in Physical Review Letters, physicists Gael Sentís et al. have taken the change point problem to the quantum domain. [23] When a quantum system changes its state, this is called a quantum jump. Usually, these quantum jumps are considered to be instantaneous. Now, new methods for high-precision measurements allow us to study the time evolution of these quantum jumps. On a time scale of attoseconds, their time structure becomes visible. It is the most accurate time measurement of quantum jumps to date. [22] New research conducted at the University of Chicago has confirmed a decades-old theory describing the dynamics of continuous phase transitions. [21] No matter whether it is acoustic waves, quantum matter waves or optical waves of a laser—all kinds of waves can be in different states of oscillation, corresponding to different frequencies. Calculating these frequencies is part of the tools of the trade in theoretical physics. Recently, however, a special class of systems has caught the attention of the scientific community, forcing physicists to abandon well-established rules. [20] Until quite recently, creating a hologram of a single photon was believed to be impossible due to fundamental laws of physics. However, scientists at the Faculty of Physics, University of Warsaw, have successfully applied concepts of classical holography to the world of quantum phenomena. A new measurement technique has enabled them to register the first-ever hologram of a single light particle, thereby shedding new light on the foundations of quantum mechanics.
[19] A combined team of researchers from Columbia University in the U.S. and the University of Warsaw in Poland has found that there appear to be flaws in traditional theory that describe how photodissociation works. [18] Ultra-peripheral collisions of lead nuclei at the LHC accelerator can lead to elastic collisions of photons with photons. [17]

**Category:** Quantum Physics

[249] **viXra:1701.0304 [pdf]**
*submitted on 2017-01-06 12:53:44*

**Authors:** George Rajna

**Comments:** 29 Pages.

It goes by the unwieldy acronym STT-MRAM, which stands for spin-transfer torque magnetic random access memory. [21] Memory chips are among the most basic components in computers. The random access memory is where processors temporarily store their data, which is a crucial function. Researchers from Dresden and Basel have now managed to lay the foundation for a new memory chip concept. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. 
[13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11]

**Category:** Digital Signal Processing

[248] **viXra:1701.0303 [pdf]**
*submitted on 2017-01-06 08:12:02*

**Authors:** August Lau, Chuan Yin

**Comments:** 8 pages, 11 figures

The goal of seismic processing is to convert input data collected in the field into a meaningful image, based on signal processing, wave-equation processing, and other algorithms. It is normally a global approach like tomography or FWI (full waveform inversion). Seismic imaging or inversion methods fail partly due to the thin-lens effect or rough surfaces. These interfaces are non-invertible. To mitigate the problem, we propose a more stable method for seismic imaging in four steps, as a layer-stripping approach of INVERSION + DATUMING + STATIC + ENHANCEMENT.

**Category:** Geophysics

[247] **viXra:1701.0302 [pdf]**
*submitted on 2017-01-06 08:47:32*

**Authors:** George Rajna

**Comments:** 30 Pages.

Scientists at the U.S. Naval Research Laboratory (NRL) have reported the first direct comparison of the spin polarization generated in the topologically protected Dirac states of a topological insulator (TI) bismuth selenide (Bi2Se3) and the trivial 2-dimensional electron gas (2DEG) states at the surface of indium arsenide (InAs). [22] Topological insulators, an exciting, relatively new class of materials, are capable of carrying electricity along the edge of the surface, while the bulk of the material acts as an electrical insulator. Practical applications for these materials are still mostly a matter of theory, as scientists probe their microscopic properties to better understand the fundamental physics that govern their peculiar behavior. [21] A Florida State University research team has discovered a new crystal structure of organic-inorganic hybrid materials that could open the door to new applications for optoelectronic devices like light-emitting diodes and lasers. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. 
[15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14]

**Category:** Condensed Matter

[246] **viXra:1701.0301 [pdf]**
*submitted on 2017-01-06 05:35:04*

**Authors:** Sai Venkatesh Balasubramanian

**Comments:** 15 Pages.

In this article, the masterpiece of Dikshitar and by extension all of Carnatic Music, namely the ninth Avarana Krithi of Sri Yantra worship in Aahiri Ragam, is discussed in detail, diving deep into its philosophical and spiritual significance. The song is understood to condense the philosophies of all the nine Avaranas into a single composition, merging seamlessly the Vibhaktis, Yoginis, Avarana concepts and Devatas. Thus, it is concluded that this song is a pinnacle of Muthuswami Dikshitar, of Carnatic Music, of the Navavarana concept and of Sri Vidya itself, extolling Kamalambal by viewing the Mother Lalitha as the very Sat-Chit-Aanandam Parabrahman.

**Category:** Religion and Spiritualism

[245] **viXra:1701.0300 [pdf]**
*submitted on 2017-01-06 07:09:48*

**Authors:** Miloje M. Rakocevic

**Comments:** 6 Pages. From Proceedings of the 2nd International Conference “Theoretical Approaches to BioInformation Systems” (TABIS.2013), September 17 – 22, 2013, Belgrade, Serbia.

In two previous works [1], [2] we have shown the determination of the genetic code by the golden and harmonic mean within the standard Genetic Code Table (GCT), i.e. the nucleotide triplet table, whereas in this paper we show the same determination through a specific connection between two tables – the nucleotide doublet table (DT) and the triplet table (TT) – over the polarity of amino acids, measured by Cloister energy. (Miloje M. Rakočević) (Belgrade, 6.01.2017)
(www.rakocevcode.rs) (mirkovmiloje@gmail.com)

**Category:** Quantitative Biology

[244] **viXra:1701.0299 [pdf]**
*replaced on 2017-01-15 11:21:14*

**Authors:** William O. Straub

**Comments:** 13 Pages. Finalized, with typos fixed in Equations (6.2.2) and (6.3.2)

A very elementary overview of the spinor concept, intended as a guide for undergraduates.

**Category:** Mathematical Physics

[243] **viXra:1701.0298 [pdf]**
*replaced on 2017-01-09 04:09:17*

**Authors:** Erman ZENG

**Comments:** 46 Pages.

The quantitative Marxism function system is developed on the basis of the labor theory of value as the micro foundation, including the Marx labour value function, the Marx surplus value function, and the Marx production function. The heterogeneous aggregation problem is overcome by matrix analysis of macro input-output data, yielding price eigenvalues and product values, and thus the details of an economic system such as the rate of profit, the rate of surplus value, and the elasticity of capital output. The falling tendency of the rate of profit may not hold if the economy undergoes a general equilibrium.

**Category:** Economics and Finance

[242] **viXra:1701.0297 [pdf]**
*submitted on 2017-01-06 04:09:40*

**Authors:** Domenico Oricchio

**Comments:** 1 Page.

An attempt at a universal definition of phase transition.

**Category:** Classical Physics

[241] **viXra:1701.0296 [pdf]**
*submitted on 2017-01-05 13:34:46*

**Authors:** Ilija Barukčić

**Comments:** 7 Pages. Copyright © 2017 by Ilija Barukčić, Jever, Germany. Published by:

Epstein-Barr virus (EBV), a herpes virus which persists in memory B cells in the peripheral blood for the lifetime of a person, is associated with some malignancies. Many studies have suggested that the Epstein-Barr virus contributes to the development of Hodgkin's lymphoma (HL) in some cases too. Despite intensive study, the role of Epstein-Barr virus in Hodgkin's lymphoma remains enigmatic. It is the purpose of this publication to prove that the Epstein-Barr virus is a main cause of Hodgkin's lymphoma (k = +0.739814235, p value = 0.000000000000138).
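The paper's underlying 2x2 counts are not reproduced here, so its k cannot be recomputed. For illustration only: one common measure of association for a 2x2 exposure-disease table is the phi (mean-square contingency) coefficient; whether it coincides with the paper's causal measure k is an assumption, and the counts below are hypothetical.

```python
import math

def phi_coefficient(a: int, b: int, c: int, d: int) -> float:
    """Phi coefficient of a 2x2 contingency table laid out as:
                 exposed   not exposed
    cases           a           b
    controls        c           d
    """
    return (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts, not the paper's data:
print(round(phi_coefficient(40, 10, 10, 40), 3))  # → 0.6
```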

**Category:** Statistics

[240] **viXra:1701.0295 [pdf]**
*submitted on 2017-01-05 11:13:15*

**Authors:** George Rajna

**Comments:** 36 Pages.

Researchers have developed a microscope that can chemically identify individual micron-sized particles. [20] Scientists around the Nobel laureate Stefan Hell at the Max Planck Institute for Biophysical Chemistry in Göttingen have now achieved what was for a long time considered impossible – they have developed a new fluorescence microscope, called MINFLUX, which for the first time makes it possible to optically separate molecules that are only nanometers (one millionth of a millimeter) apart from each other. [19] Dipole orientation provides a new dimension in super-resolution microscopy. [18] Fluorescence is an incredibly useful tool for experimental biology and it just got easier to tap into, thanks to the work of a group of University of Chicago researchers. [17] Molecules that change colour can be used to follow in real-time how bacteria form a protective biofilm around themselves. This new method, which has been developed in collaboration between researchers at Linköping University and Karolinska Institutet in Sweden, may in the future become significant both in medical care and the food industry, where bacterial biofilms are a problem. [16] Researchers led by Carnegie Mellon University physicist Markus Deserno and University of Konstanz (Germany) chemist Christine Peter have developed a computer simulation that crushes viral capsids. By allowing researchers to see how the tough shells break apart, the simulation provides a computational window for looking at how viruses and proteins assemble. [15] IBM scientists have developed a new lab-on-a-chip technology that can, for the first time, separate biological particles at the nanoscale and could enable physicians to detect diseases such as cancer before symptoms appear. [14] Scientists work toward storing digital information in DNA. [13] Leiden theoretical physicists have proven that DNA mechanics, in addition to genetic information in DNA, determines who we are.
Helmut Schiessel and his group simulated many DNA sequences and found a correlation between mechanical cues and the way DNA is folded. They have published their results in PLoS One. [12]

**Category:** Quantum Physics

[239] **viXra:1701.0294 [pdf]**
*submitted on 2017-01-05 11:15:03*

**Authors:** Miloje M. Rakocevic

**Comments:** 255 Pages.

This book contains my works published in the period 2005-2013 on my website (also in arXiv). The concept of "harmony" in the title refers to the determination of the genetic code by the golden mean, generalized golden mean and harmonic mean. Some parts of the contents have in the meantime been published in official journals, but most have not, and this was the reason for my decision to publish all the papers here in their entirety.
(Miloje M. Rakočević) (Belgrade, 31.12.2016)
(www.rakocevcode.rs) (mirkovmiloje@gmail.com)

**Category:** Quantitative Biology

[238] **viXra:1701.0293 [pdf]**
*submitted on 2017-01-05 08:58:59*

**Authors:** Espen Gaarder Haug

**Comments:** 12 Pages.

This paper discusses the similarities between Einstein’s length contraction and the FitzGerald, Lorentz, and Larmor length contraction. The FitzGerald, Lorentz, and Larmor length contraction was originally derived only for the case of a frame moving relative to the ether frame, and not for two moving frames. When extending the FitzGerald, Lorentz, and Larmor length transformation to any two frames, we will clearly see that it is different from the Einstein length contraction. Under the FitzGerald, Lorentz, and Larmor length transformation we get both length contraction and length expansion, and non-reciprocality, while under Einstein’s special relativity theory we have only length contraction and reciprocality. However, we show that there is a mathematical and logical link between the two methods of measuring length.
This paper shows that the Einstein length contraction can be derived from assuming an anisotropic one-way speed of light. Further, we show that the reciprocality for length contraction under special relativity is an apparent reciprocality due to Einstein-Poincaré synchronization. The Einstein length contraction is real in the sense that the predictions are correct when measured with Einstein-Poincaré synchronized clocks. Still, we claim that there likely is a deeper and more fundamental reality that is better described with the extended FitzGerald, Lorentz, and Larmor framework, which, in the special case of using Einstein-Poincaré synchronized clocks, gives Einstein’s length contraction. The extended FitzGerald, Lorentz, and Larmor length contraction is also about length expansion, and it is not reciprocal between frames. Still, when using Einstein synchronized clocks the length contraction is apparently reciprocal. An enduring, open question concerns whether or not it is possible to measure the one-way speed of light without relying on Einstein-Poincaré synchronization or slow clock transport synchronization, and whether the one-way speed of light is then anisotropic or isotropic. Several published experiments claim to have found an anisotropic one-way speed of light. These experiments have been ignored or ridiculed, but in our view they should be repeated and investigated further.

**Category:** Relativity and Cosmology

[237] **viXra:1701.0292 [pdf]**
*submitted on 2017-01-05 10:03:01*

**Authors:** George Rajna

**Comments:** 17 Pages.

Physicists have performed a test designed to investigate the effects of the expansion of the universe—hoping to answer questions such as "does the expansion of the universe affect laboratory experiments?", "might this expansion change the lengths of solid objects and the time measured by atomic clocks differently, in violation of Einstein's equivalence principle?", and "does spacetime have a foam-like structure that slightly changes the speed of photons over time?", an idea that could shed light on the connection between general relativity and quantum gravity. [11] Einstein's equivalence principle states that an object in gravitational free fall is physically equivalent to an object that is accelerating with the same amount of force in the absence of gravity. This principle lies at the heart of general relativity and has been experimentally tested many times. Now in a new paper, scientists have experimentally demonstrated a conceptually new way to test the equivalence principle that could detect the effects of a relatively new concept called spin-gravity coupling. [10] A recent peer-reviewed paper by physicist James Franson from the University of Maryland in the US has initiated a stir in the physics community. Published in the New Journal of Physics, the paper points to evidence suggesting that the speed of light as defined by the theory of general relativity is slower than originally thought. [9] Gravitational time dilation causes decoherence of composite quantum systems. Even if gravitons are there, it's probable that we would never be able to perceive them. Perhaps, assuming they persist inside a robust model of quantum gravity, there may be secondary ways of proving their actuality. [7] The magnetic induction creates a negative electric field, causing an electromagnetic inertia responsible for the relativistic mass change; it is the mysterious Higgs Field giving mass to the particles.
The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the Bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The self-maintained electric potential of the accelerating charges is equivalent to the General Relativity space-time curvature, and since this holds on the quantum level also, it gives the basis of the Quantum Gravity. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain the Quantum Entanglement, giving it as a natural part of the relativistic quantum theory.

**Category:** Quantum Gravity and String Theory

[236] **viXra:1701.0291 [pdf]**
*submitted on 2017-01-04 20:44:19*

**Authors:** Ramzi Suleiman

**Comments:** 53 Pages. relevant also to quantum mechanics

We propose a simple, axiom-free modification of Galileo-Newton's dynamics of moving bodies, termed Information Relativity theory. We claim that the theory is capable of unifying physics. The claimed unification is supported by the fact that the same derived set of simple and beautiful transformations applies successfully to predicting and explaining many phenomena and findings in cosmology, quantum mechanics, and more. Our modification of classical physics is done simply by accounting for the time travel of information about a physical measurement, from the reference frame at which the measurement was taken, to an observer in another reference frame, which is in motion relative to the first frame. This minor modification of classical physics turns out to be sufficient for unifying all the dynamics of moving bodies, regardless of their size and mass. Since the theory's transformations and predictions are expressed only in terms of observable physical entities, its testing should be simple and straightforward. For quantum mechanics the theory predicts and explains matter-wave duality, quantum phase transition, quantum criticality, entanglement, the diffraction of single particles in the double slit experiment, the quantum nature of the hydrogen atom, the strong force, quantum confinement, and asymptotic freedom. For cosmology, the theory constructs a relativistic quantum cosmology, which provides plausible and testable explanations of dark matter and dark energy, as well as predictions of the mass of the Higgs boson, the GZK cutoff phenomena, the Schwarzschild radius of black holes (without interior singularity), and the timeline of ionization of chemical elements along the history of the universe. Extensions of the theory to account for the gravitational and electrostatic fields are briefly discussed.

**Category:** Relativity and Cosmology

[235] **viXra:1701.0290 [pdf]**
*submitted on 2017-01-04 22:51:17*

**Authors:** Luke Kenneth Casson Leighton

**Comments:** 7 Pages.

Colour (R,G,B) seems to be fashionable in particle physics theories, where it may be interpreted as phase. In the context of the Extended Rishon Model, where we interpret particles to comprise photons in phase-harmonic, braid-ordered inter-dependence, Colour takes on a very specific relevance and meaning, not least because Maxwell's equations have to be obeyed literally and undeniably, and phase is an absolutely critical part of Maxwell's equations. A number of potential candidate layouts are explored, including taking Sundance O Bilson-Thompson's topological braid-order literally. Ultimately though, the only layout that worked out and still respected the rules of the Extended Rishon Model was to place the Rishons on a Möbius strip, mirroring Williamson's toroidal pattern, which, with its back-to-back two-cycle rotation, reminds us of Qiu-Hong Hu's Hubius Helix. The layout of the 2nd-level I-Frame is therefore explored, using the proton as a candidate.

**Category:** High Energy Particle Physics

[234] **viXra:1701.0289 [pdf]**
*replaced on 2017-01-14 03:42:17*

**Authors:** Tamas Lajtner

**Comments:** 3 Pages.

Space is a three-dimensional extent; matter also has three spatial dimensions. Time is the result of the action-reaction of space and matter. Space is what matter uses as space. Space is not dependent on its texture; it can be made out of matter or non-matter. Time is one characteristic of the given space used by a given matter. Using this new approach, called space-matter theory, we can find that there are different spaces (cp. tunneling), where the same matter has different velocities. These velocities can be greater than c; their value depends on the amount of information that the given space contains.

**Category:** Relativity and Cosmology

[233] **viXra:1701.0288 [pdf]**
*submitted on 2017-01-05 02:23:17*

**Authors:** Binyamin Tsadik

**Comments:** 1 Page.

This paper presents an intuitive approach to understanding entropy. We can thus associate entropy with all forms of energy based on a known form. Boltzmann's equation allows us to determine the chemical entropy of chemical energy, and from this we can determine the entropy of all forms of energy.
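For reference, the Boltzmann equation invoked here is the standard relation S = k_B ln W between entropy and the number of equally likely microstates W; a minimal numerical sketch (the mole-of-gas expansion example is ours, not the paper's):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
N_A = 6.02214076e23  # Avogadro constant, 1/mol (exact in the 2019 SI)

def boltzmann_entropy(microstates: float) -> float:
    """S = k_B * ln(W) for a system with W equally likely microstates."""
    return K_B * math.log(microstates)

# Example: doubling the accessible volume for each of one mole of ideal-gas
# particles multiplies W by 2**N_A, so the entropy rises by N_A * k_B * ln 2,
# i.e. R * ln 2.
delta_S = boltzmann_entropy(2.0) * N_A
print(delta_S)  # ≈ 5.763 J/K
```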

**Category:** Thermodynamics and Energy

[232] **viXra:1701.0287 [pdf]**
*submitted on 2017-01-05 04:09:40*

**Authors:** W.Berckmans

**Comments:** 20 Pages.

The Physical Reality (PhR) model recently presented in viXra.org/abs/16040230 is the outcome of an alternative approach to studying cosmic behavior. It implicitly extends the principle of energy conservation to the whole cosmos since its creation ex nihilo. This article concludes that under such a scenario an attempt to produce energy "for free" (e.g. by cold fusion) would not be doomed to fail.

**Category:** Thermodynamics and Energy

[231] **viXra:1701.0286 [pdf]**
*submitted on 2017-01-05 04:25:49*

**Authors:** Daniele Sasso

**Comments:** 8 Pages.

Planck's relation is not intrinsically quantum, because frequency changes continuously over a very large range of frequencies that includes classic electromagnetic waves, microwaves and nanowaves. These last are placed in the spectrum after microwaves and are associated with the electromagnetic nanofield. The distinction between microwaves and nanowaves makes it possible to define Planck's frequency, the threshold frequency between the two types of wave, located where microwaves end and the infrared band begins (300 GHz). Planck's frequency makes it possible to complete the effective quantization of Planck's relation, because for energy quanta the frequency cannot be smaller than Planck's frequency.

**Category:** Quantum Physics

[230] **viXra:1701.0285 [pdf]**
*submitted on 2017-01-05 06:40:05*

**Authors:** George Rajna

**Comments:** 16 Pages.

Thanks to a new development in nuclear physics theory, scientists exploring expanding fireballs that mimic the early universe have new signs to look for as they map out the transition from primordial plasma to matter as we know it. [11] Now, powerful supercomputer simulations of colliding atomic nuclei, conducted by an international team of researchers including a Berkeley Lab physicist, provide new insights about the twisting, whirlpool-like structure of this soup and what's at work inside of it, and also lights a path to how experiments could confirm these characteristics. [10] The drop of plasma was created in the Large Hadron Collider (LHC). It is made up of two types of subatomic particles: quarks and gluons. Quarks are the building blocks of particles like protons and neutrons, while gluons are in charge of the strong interaction force between quarks. The new quark-gluon plasma is the hottest liquid that has ever been created in a laboratory at 4 trillion C (7 trillion F). Fitting for a plasma like the one at the birth of the universe. [9] Taking into account the Planck Distribution Law of the electromagnetic oscillators, we can explain the electron/proton mass rate and the Weak and Strong Interactions. Lattice QCD gives the same results as the diffraction patterns of the electromagnetic oscillators, explaining the color confinement and the asymptotic freedom of the Strong Interactions.

**Category:** High Energy Particle Physics

[229] **viXra:1701.0284 [pdf]**
*submitted on 2017-01-04 11:30:13*

**Authors:** George Rajna

**Comments:** 29 Pages.

Topological insulators, an exciting, relatively new class of materials, are capable of carrying electricity along the edge of the surface, while the bulk of the material acts as an electrical insulator. Practical applications for these materials are still mostly a matter of theory, as scientists probe their microscopic properties to better understand the fundamental physics that govern their peculiar behavior. [21] A Florida State University research team has discovered a new crystal structure of organic-inorganic hybrid materials that could open the door to new applications for optoelectronic devices like light-emitting diodes and lasers. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. 
[14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13]

**Category:** Condensed Matter

[228] **viXra:1701.0283 [pdf]**
*submitted on 2017-01-04 08:37:24*

**Authors:** Elkin Igor Vladimirovich

**Comments:** 7 Pages. ielkin@yandex.ru

From Einstein's theory of relativity we obtain a formula relating the acceleration and the time derivative of the momentum. The formula includes the speed of light. The metric, according to general relativity, changes in the gravitational field and with time. Since the metric changes, the way distance is measured changes. Since the method of measuring distances changes across various small local areas, the speed of light in each such local area differs negligibly from the speed of light in a different local area, if this speed is measured in units of a certain third local area (e.g., the area with the observer). Therefore, the interaction will differ between attraction and repulsion. From this the inertia of a body is derived.

**Category:** Relativity and Cosmology

[227] **viXra:1701.0282 [pdf]**
*submitted on 2017-01-04 10:17:30*

**Authors:** George Rajna

**Comments:** 27 Pages.

In the proposed model, the universe contains multiple sectors, each of which is governed by its own version of the Standard Model with its own Higgs vacuum expectation value. The sector with the smallest non-zero vacuum expectation value contains our copy of the Standard Model. [18] Physicists have come up with a new model that they say solves five of the biggest unanswered questions in modern physics, explaining the weirdness of dark matter, neutrino oscillations, baryogenesis, cosmic inflation, and the strong CP problem all at once. [17] The universe is unbalanced. Gravity is tremendously weak. But the weak force, which allows particles to interact and transform, is enormously strong. The mass of the Higgs boson is suspiciously petite. And the catalog of the makeup of the cosmos? Ninety-six percent incomplete. [16] One of the biggest challenges in physics is to understand why everything we see in our universe seems to be formed only of matter, whereas the Big Bang should have created equal amounts of matter and antimatter. CERN's LHCb experiment is one of the best hopes for physicists looking to solve this longstanding mystery. [15] Imperial physicists have discovered how to create matter from light-a feat thought impossible when the idea was first theorized 80 years ago. [14] How can the LHC experiments prove that they have produced dark matter? They can't… not alone, anyway. [13] The race for the discovery of dark matter is on. Several experiments worldwide are searching for the mysterious substance and pushing the limits on the properties it may have. [12] Dark energy is a mysterious force that pervades all space, acting as a "push" to accelerate the universe's expansion. Despite being 70 percent of the universe, dark energy was only discovered in 1998 by two teams observing Type Ia supernovae. A Type 1a supernova is a cataclysmic explosion of a white dwarf star. 
The best way of measuring dark energy just got better, thanks to a new study of Type Ia supernovae. [11] Newly published research reveals that dark matter is being swallowed up by dark energy, offering novel insight into the nature of dark matter and dark energy and what the future of our Universe might be. [10] The gravitational force attracting the matter causes concentration of the matter in a small space and leaves much space with low matter concentration: dark matter and energy. The asymmetry between the masses of the electric charges, for example the proton and electron, can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron-proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** High Energy Particle Physics

[226] **viXra:1701.0281 [pdf]**
*submitted on 2017-01-04 06:46:28*

**Authors:** Ryujin Choe

**Comments:** 1 Page.

Every even integer greater than 2 can be expressed as the sum of two primes
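The statement is Goldbach's conjecture. For illustration only (the helper functions below are ours, not from the paper), a brute-force check of the statement for small even numbers:

```python
def is_prime(n: int) -> bool:
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def goldbach_pair(n: int):
    """Return the first pair of primes (p, n - p) summing to even n > 2, or None."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Verify the statement for all even numbers up to 10,000.
assert all(goldbach_pair(n) is not None for n in range(4, 10001, 2))
print(goldbach_pair(100))  # → (3, 97)
```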

**Category:** Number Theory

[225] **viXra:1701.0280 [pdf]**
*submitted on 2017-01-04 06:57:26*

**Authors:** Edgar Valdebenito

**Comments:** 6 Pages.

In this note we present: real solutions of the polynomial equations x^5+x^4+x-1=0 and x^5+x^3+x^2-1=0 via the fixed-point method, relations for the number pi, and a fractal connection.
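The note's own rearrangement is not reproduced here; as a sketch of the general technique for the first equation, one can rewrite x^5+x^4+x-1=0 as x = (1 - x^4)/(1 + x^4) (an assumed rearrangement) and, since that map alone is not contractive near the root, apply a damped fixed-point iteration:

```python
def fixed_point(g, x0, tol=1e-12, max_iter=10000):
    """Damped fixed-point iteration x <- (x + g(x)) / 2."""
    x = x0
    for _ in range(max_iter):
        x_new = 0.5 * (x + g(x))
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("did not converge")

# x^5 + x^4 + x - 1 = 0 rearranged as x = (1 - x^4) / (1 + x^4)
root = fixed_point(lambda x: (1 - x**4) / (1 + x**4), 0.5)
print(root)  # ≈ 0.66796
assert abs(root**5 + root**4 + root - 1) < 1e-9
```

The averaging step halves the map's slope near the fixed point, turning a divergent iteration into a convergent one; the same trick applies to the second equation with a suitable rearrangement.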

**Category:** General Mathematics

[224] **viXra:1701.0279 [pdf]**
*submitted on 2017-01-03 13:45:09*

**Authors:** Erik Blasch, Youssif Al-Nashif, Salim Hariri

**Comments:** 15 Pages.

Information fusion includes signals, features, and decision-level analysis over various types of data including imagery, text, and cyber security detection. With the maturity of data processing, the explosion of big data, and the need for user acceptance, the Dynamic Data-Driven Application System (DDDAS) philosophy fosters insights into the usability of information systems solutions.

**Category:** General Mathematics

[223] **viXra:1701.0278 [pdf]**
*submitted on 2017-01-03 13:46:09*

**Authors:** Madad Khan, Misbah

**Comments:** 25 Pages.

In this paper, we have introduced the notions of (2,2)-regular and strongly regular neutrosophic AG-groupoids and investigated these structures. We have shown that neutrosophic regular, neutrosophic intra-regular, and neutrosophic strongly regular AG-groupoids are the only generalized classes of an AG-groupoid.

**Category:** General Mathematics

[222] **viXra:1701.0277 [pdf]**
*submitted on 2017-01-03 13:47:23*

**Authors:** Jayshree K.

**Comments:** 219 Pages.

As the name suggests, a semigroup is a generalization of a group; because a semigroup need not in general have an element which has an inverse. The algebraic structure enjoyed by a semigroup is a non-empty set together with an associative closed binary operation.

**Category:** General Mathematics

[221] **viXra:1701.0276 [pdf]**
*submitted on 2017-01-03 13:48:38*

**Authors:** Mumtaz Ali, Mohsin Khan

**Comments:** 8 Pages.

This article is a study on the development of neutrosophic triplet ring and neutrosophic triplet field.

**Category:** General Mathematics

[220] **viXra:1701.0275 [pdf]**
*submitted on 2017-01-03 13:49:23*

**Authors:** Anjan Mukherjee, Sadhan Sarkar

**Comments:** 8 Pages.

F. Smarandache introduced the concept of neutrosophic set in 1995 and P. K. Maji introduced the notion of neutrosophic soft set in 2013, which is a hybridization of neutrosophic set and soft set.

**Category:** General Mathematics

[219] **viXra:1701.0274 [pdf]**
*submitted on 2017-01-03 13:50:16*

**Authors:** Gülnur Saffak Atalay, Emin Kasap

**Comments:** 11 Pages.

In this paper, we analyze the problem of constructing a family of surfaces from given special Smarandache curves in Euclidean 3-space.

**Category:** General Mathematics

[218] **viXra:1701.0273 [pdf]**
*submitted on 2017-01-03 13:51:26*

**Authors:** S. Mohana Prakash, P. Betty, K. Sivanarulselvan

**Comments:** 10 Pages.

Biometrics is used to uniquely identify a person based on physical and behavioural characteristics. A unimodal biometric system suffers from various problems such as limited degrees of freedom, spoof attacks, non-universality, noisy data and high error rates. Hence the need for multimodal biometric systems arose: a multimodal biometric system combines different biometric traits and provides better performance than a single biometric trait.

**Category:** General Mathematics

[217] **viXra:1701.0272 [pdf]**
*submitted on 2017-01-03 13:52:20*

**Authors:** Edmundas Kazimieras Zavadskas, Romualdas Baušys, Marius Lazauskas

**Comments:** 14 Pages.

The principles of sustainability have become particularly important in construction, the real estate maintenance sector, and all areas of life in recent years. One of the major problems of urban territories is that domestic and construction waste cannot be removed automatically.

**Category:** General Mathematics

[216] **viXra:1701.0271 [pdf]**
*submitted on 2017-01-03 13:53:15*

**Authors:** Yaman Akbulut, Abdulkadir Sengur, Yanhui Guo

**Comments:** 4 Pages.

Image segmentation is the first step of image processing and image analysis. Texture segmentation is a challenging task in image segmentation applications.

**Category:** General Mathematics

[215] **viXra:1701.0269 [pdf]**
*submitted on 2017-01-03 13:54:47*

**Authors:** Yun Ye

**Comments:** 9 Pages.

A simplified neutrosophic set (SNS) is a subclass of neutrosophic set and contains a single-valued neutrosophic set (SVNS) and an interval neutrosophic set (INS).

**Category:** General Mathematics

[214] **viXra:1701.0268 [pdf]**
*submitted on 2017-01-03 13:57:03*

**Authors:** Alexey Stakhov

**Comments:** 20 Pages.

We consider the important Generalized Golden proportion.

**Category:** General Mathematics

[213] **viXra:1701.0267 [pdf]**
*submitted on 2017-01-03 13:57:58*

**Authors:** J. Dezert, A. Tchamova, P. Konstantinova

**Comments:** 11 Pages.

The main purpose of this paper is to apply and to test the performance of a new method based on belief functions, proposed by Dezert et al., for evaluating the quality of the individual association pairings provided in the optimal data association solution, in order to improve the performance of multisensor-multitarget tracking systems.

**Category:** General Mathematics

[212] **viXra:1701.0266 [pdf]**
*submitted on 2017-01-03 13:58:10*

**Authors:** Rahul Singha Chowdhury, Sourangsu Banerji

**Comments:** 17 Pages.

Current wireless broadband technologies now provide omnipresent broadband access to wireless subscribers, which until recently was confined to wireline users. Traditionally, wireless technologies have been categorized based on their area of coverage. The IEEE 802.11 and ETSI HiperLAN standards are the de facto standards for wireless access in local areas, whereas IEEE 802.16 and 802.22, ETSI HiperACCESS and HiperMAN, WiBro, and HAP technologies have been predominantly used for providing service in metropolitan and cosmopolitan areas.

Many papers written in the recent past cover various types of wireless broadband access technologies, ranging from WLANs to satellite communications. However, previous studies never captured the entire spectrum of such technologies, and even when one particular family of wireless broadband access technology was scrutinized, the authors concentrated only on certain aspects. In this paper, we aim to bridge that gap by summarizing one of the emerging wireless broadband access technologies, the IEEE 802.16 (WiMAX) family, in detail. Though some of the earliest versions of this family date back to the early part of the decade and some are even out of practice, we have included them in our study for the sake of completeness and insight.

[211] **viXra:1701.0265 [pdf]**
*submitted on 2017-01-03 13:58:52*

**Authors:** Lucian Capitanu, Luige Vladareanu, Virgil Florescu

**Comments:** 10 Pages.

The failure mechanism was postulated as a combination of the high level of loading during normal activities and a non-conforming contact mechanism between the femoral condyles and the tibial insert.

**Category:** General Mathematics

[210] **viXra:1701.0263 [pdf]**
*submitted on 2017-01-03 14:01:39*

**Authors:** Temitope Gbolahan Jaiyeola

**Comments:** 14 Pages.

The concept of the Smarandache Bryant Schneider Group of a Smarandache loop is introduced. Relationships between the Bryant Schneider Group and the Smarandache Bryant Schneider Group of an S-loop are discovered, and the latter is found to be useful in finding Smarandache isotopy-isomorphy conditions in S-loops, just as the former is useful in finding isotopy-isomorphy conditions in loops.

**Category:** General Mathematics

[209] **viXra:1701.0260 [pdf]**
*submitted on 2017-01-03 20:01:55*

**Authors:** Yichen Huang

**Comments:** 4 Pages.

No, in a rigorous sense specified below.

**Category:** Condensed Matter

[208] **viXra:1701.0258 [pdf]**
*replaced on 2017-03-04 22:30:59*

**Authors:** Kenneth Dalton

**Comments:** 3 Pages.

Galactic black holes were created during the Big Bang. As such, they were available for clustering in the early Universe. This paper describes the role these clusters could play in explaining dark matter, and it answers the following question: What is the energy source for the extremely hot gas found in galactic clusters?

**Category:** Relativity and Cosmology

[207] **viXra:1701.0255 [pdf]**
*submitted on 2017-01-04 03:26:39*

**Authors:** Chunfang Liu, YueSheng Luo

**Comments:** 12 Pages.

Neutrosophic set (NS) is a generalization of fuzzy set (FS) that is designed for some practical situations in which each element has different truth membership, indeterminacy membership and falsity membership functions.

**Category:** General Mathematics

[206] **viXra:1701.0254 [pdf]**
*submitted on 2017-01-04 03:28:13*

**Authors:** Prem Kumar Singh

**Comments:** 11 Pages.

Recently, three-way concept lattice is studied to handle the uncertainty and incompleteness in the given attribute set based on acceptation, rejection, and uncertain regions.

**Category:** General Mathematics

[205] **viXra:1701.0253 [pdf]**
*submitted on 2017-01-04 03:29:17*

**Authors:** Surapati Pramanik, Durga Banerjee, B. C. Giri

**Comments:** 18 Pages.

This paper presents a TOPSIS approach to solve the chance-constrained multi-objective multi-level quadratic programming problem. The proposed approach combines TOPSIS and fuzzy goal programming.

**Category:** General Mathematics
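The paper combines TOPSIS with fuzzy goal programming under chance constraints; that machinery is not reproduced here, but the crisp TOPSIS core the abstract builds on can be sketched. The decision matrix, weights, and criterion types below are illustrative assumptions.

```python
import math

def topsis(matrix, weights, benefit):
    """Classical crisp TOPSIS: normalize, weight, find ideal/anti-ideal
    points, rank by relative closeness. matrix is alternatives x criteria;
    benefit[j] is True when criterion j is to be maximized."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply criterion weights
    norms = [math.sqrt(sum(matrix[i][j]**2 for i in range(m))) for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(V[i][j] for i in range(m)) if benefit[j]
             else min(V[i][j] for i in range(m)) for j in range(n)]
    anti = [min(V[i][j] for i in range(m)) if benefit[j]
            else max(V[i][j] for i in range(m)) for j in range(n)]
    scores = []
    for i in range(m):
        d_pos = math.sqrt(sum((V[i][j] - ideal[j])**2 for j in range(n)))
        d_neg = math.sqrt(sum((V[i][j] - anti[j])**2 for j in range(n)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores  # closeness coefficients; higher is better

scores = topsis([[3.0, 2.0], [1.0, 1.0]], [0.5, 0.5], [True, True])
```

An alternative that dominates on every benefit criterion coincides with the ideal point and receives closeness 1; the anti-ideal alternative receives 0.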

[204] **viXra:1701.0251 [pdf]**
*submitted on 2017-01-04 03:33:20*

**Authors:** Hua Ma, Zhigang Hu, Keqin Li, Hongyu Zhang

**Comments:** 20 Pages.

Toward trustworthy cloud service selection: A time-aware approach using interval neutrosophic set.

**Category:** General Mathematics

[203] **viXra:1701.0250 [pdf]**
*submitted on 2017-01-04 03:34:42*

**Authors:** Feng Liu

**Comments:** 14 Pages.

Based on extensive exploration of the Chinese classics, this paper finds a fresh new perspective on KM philosophy, especially on the pyramid issue: the primary role in knowledge ontology should be consciousness rather than IT aspects, in which the former determines the perceived results such as data, information, knowledge or wisdom.

**Category:** General Mathematics

[202] **viXra:1701.0245 [pdf]**
*submitted on 2017-01-04 03:42:47*

**Authors:** Chen Jinguang, Zhang Fen, Ma Lili

**Comments:** 6 Pages.

Conflict of evidence is one of the most important factors leading to unsatisfactory fusion results in evidence theory. Thus evidence conflict has been a key issue to be solved in evidence theory. By using a linear combination of ambiguity measure, discord measure and nonspecificity measure, a new uncertainty measurement method for evidence is presented.

**Category:** General Mathematics

[201] **viXra:1701.0244 [pdf]**
*submitted on 2017-01-04 03:44:01*

**Authors:** Jun Ye, Tahir Mahmood, Qaisar Khan

**Comments:** 24 Pages.

In this article we present three similarity measures between simplified neutrosophic hesitant fuzzy sets, which contain the concepts of single valued neutrosophic hesitant fuzzy sets and interval valued neutrosophic hesitant fuzzy sets, based on extensions of the Jaccard, Dice and cosine similarity measures in the vector space.

**Category:** General Mathematics
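The hesitant variants in the paper extend three underlying vector-space measures. As a hedged sketch, here they are for plain single valued neutrosophic sets, each element a (T, I, F) triple; the exact hesitant extensions in the paper differ.

```python
import math

# Each set is a list of (T, I, F) triples over a common universe.

def jaccard(A, B):
    """Jaccard similarity: dot product over (sum of squares minus dot)."""
    s = 0.0
    for (ta, ia, fa), (tb, ib, fb) in zip(A, B):
        dot = ta*tb + ia*ib + fa*fb
        s += dot / (ta**2 + ia**2 + fa**2 + tb**2 + ib**2 + fb**2 - dot)
    return s / len(A)

def dice(A, B):
    """Dice similarity: twice the dot product over the sum of squares."""
    s = 0.0
    for (ta, ia, fa), (tb, ib, fb) in zip(A, B):
        dot = ta*tb + ia*ib + fa*fb
        s += 2 * dot / (ta**2 + ia**2 + fa**2 + tb**2 + ib**2 + fb**2)
    return s / len(A)

def cosine(A, B):
    """Cosine similarity: dot product over the product of vector norms."""
    s = 0.0
    for (ta, ia, fa), (tb, ib, fb) in zip(A, B):
        dot = ta*tb + ia*ib + fa*fb
        s += dot / (math.sqrt(ta**2 + ia**2 + fa**2)
                    * math.sqrt(tb**2 + ib**2 + fb**2))
    return s / len(A)

A = [(0.7, 0.2, 0.1), (0.5, 0.3, 0.4)]
B = [(0.6, 0.3, 0.2), (0.5, 0.3, 0.4)]
```

All three measures return 1 exactly when the two sets coincide elementwise.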

[200] **viXra:1701.0243 [pdf]**
*submitted on 2017-01-04 03:45:30*

**Authors:** Igore Ivanovich Kovalenko, Alena Vladimirovna Shved, Ekaterina Sergeevna Pugachenko

**Comments:** 7 Pages.

When analyzing group expert assessments, effective results can be obtained by properly accounting for various NOT-factors (incompleteness, uncertainty, fuzziness, unreliability, ambiguity, etc.), which in turn provides a basis for choosing appropriate approaches and methods for processing expert information.

**Category:** General Mathematics

[199] **viXra:1701.0242 [pdf]**
*submitted on 2017-01-04 03:47:03*

**Authors:** Igore Kovalenko, Ekaterina Antipova, Sergey Bordun

**Comments:** 5 Pages.

During an expert assessment, conflict situations may arise between the experts' judgments when the assessments of two or more independent expert groups do not intersect.

**Category:** General Mathematics

[198] **viXra:1701.0240 [pdf]**
*submitted on 2017-01-04 05:00:02*

**Authors:** George Rajna

**Comments:** 27 Pages.

A Florida State University research team has discovered a new crystal structure of organic-inorganic hybrid materials that could open the door to new applications for optoelectronic devices like light-emitting diodes and lasers. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. 
Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons.

**Category:** Condensed Matter

[197] **viXra:1701.0239 [pdf]**
*submitted on 2017-01-03 11:09:03*

**Authors:** Rajeswara Reddy P., Naga Raju I., Diwakar Reddy V., Krishnaiah G.

**Comments:** 6 Pages.

In any manufacturing industry, 60%-70% of the total cost of a product pertains to raw material cost. Therefore selecting a material supplier is a very significant factor in improving product quality as well as reducing total cost. Supplier selection considers various factors and numerous alternatives.

**Category:** General Mathematics

[196] **viXra:1701.0238 [pdf]**
*submitted on 2017-01-03 11:10:21*

**Authors:** Andrew Koster, Ana L. C. Bazzan, Marcelo de Souza

**Comments:** 18 Pages.

This paper presents a non-prioritized belief change operator, designed specifically for incorporating new information from many heterogeneous sources in an uncertain environment. We take into account that sources may be untrustworthy and provide a principled method for dealing with the reception of contradictory information.

**Category:** General Mathematics

[195] **viXra:1701.0235 [pdf]**
*submitted on 2017-01-03 11:15:39*

**Authors:** Hong-yu Zhang, Pu Ji, Jian-qiang Wang

**Comments:** 36 Pages.

The selection process of medical treatment options is a multi-criteria decision-making (MCDM) one, and single valued trapezoidal neutrosophic numbers (SVTNNs) are useful in depicting information and fuzziness in selection processes.

**Category:** General Mathematics

[194] **viXra:1701.0234 [pdf]**
*submitted on 2017-01-03 11:16:49*

**Authors:** M. Reehana Parveen, P. Sekar

**Comments:** 16 Pages.

In this paper, for the first time, the authors define the new notion of merging of semilattices. The properties of merged semilattices are studied and several interesting results are proved in this direction.

**Category:** General Mathematics

[193] **viXra:1701.0233 [pdf]**
*submitted on 2017-01-03 11:18:32*

**Authors:** Simon CARLADOUS, Jean-Marc TACNET, Jean DEZERT, Corinne CURT, Mireille BATTON-HUBERT

**Comments:** 7 Pages.

Assessment of the structural, functional and economic effectiveness of civil protective works against natural hazards, such as torrential check dams, is a key issue for infrastructure asset owners and managers such as the French government, which owns many of them. This assessment relies on imperfect information (imprecise, incomplete, uncertain), provided by several sources, which relates both to the structures themselves and to natural phenomena scenarios.

**Category:** General Mathematics

[192] **viXra:1701.0232 [pdf]**
*submitted on 2017-01-03 11:22:24*

**Authors:** Gursangeet Kaur, Jyoti Rani

**Comments:** 5 Pages.

Medical image processing and its segmentation is an active and interesting area for researchers. It has reached a tremendous level in diagnosing tumors after the discovery of CT and MRI. MRI is a useful tool to detect brain tumors, and segmentation is performed to extract the useful portion of an image. The purpose of this paper is to provide an overview of different image segmentation methods, such as the watershed algorithm, morphological operations, neutrosophic sets, thresholding, K-means clustering and fuzzy C-means, using MR images.

**Category:** General Mathematics

[191] **viXra:1701.0231 [pdf]**
*submitted on 2017-01-03 11:24:55*

**Authors:** Rafael Munoz-Salinas, R. Medina-Carnicer, F.J. Madrid-Cuevas, A. Carmona-Poyato

**Comments:** 18 Pages.

Multi-camera people tracking using evidential filters.

**Category:** General Mathematics

[190] **viXra:1701.0230 [pdf]**
*submitted on 2017-01-03 11:42:01*

**Authors:** Faruk Karaaslan

**Comments:** 15 Pages.

In this paper, we propose three similarity measure methods for single valued neutrosophic refined sets and interval neutrosophic refined sets based on the Jaccard, Dice and cosine similarity measures of single valued neutrosophic sets and interval neutrosophic sets.

**Category:** General Mathematics

[189] **viXra:1701.0229 [pdf]**
*submitted on 2017-01-03 11:42:59*

**Authors:** Surapati Pramanik, Durga Banerjee, B. C. Giri

**Comments:** 7 Pages.

In this paper, a multi-criteria group decision-making model is presented based on the tangent similarity measure of neutrosophic refined sets. A simplified form of the tangent similarity measure for neutrosophic refined sets is presented, and a new ranking method is proposed based on the refined tangent similarity measure. The proposed approach is illustrated by solving a teacher selection problem in a neutrosophic refined set environment.

**Category:** General Mathematics

[188] **viXra:1701.0227 [pdf]**
*submitted on 2017-01-03 11:44:43*

**Authors:** Romualdas Bausys, Edmundas-Kazimieras Zavadskas

**Comments:** 16 Pages.

The paper presents the extension of VIKOR method for the solution of the multicriteria decision making problems, namely VIKOR-IVNS. The original VIKOR method was proposed for the solution of the decision problems with the conflicting and non-common measurable criteria. In this paper, a new extension of the crisp VIKOR method has been proposed. This extension is developed in the context of interval-valued neutrosophic sets.

**Category:** General Mathematics

[187] **viXra:1701.0226 [pdf]**
*submitted on 2017-01-03 11:46:18*

**Authors:** Mridula Sarkar, Samir Dey, Tapan Kumar Roy

**Comments:** 7 Pages.

In this paper, a multi-objective non-linear neutrosophic optimization (NSO) approach for optimizing the design of plane truss structure with multiple objectives subject to a specified set of constraints has been developed.

**Category:** General Mathematics

[186] **viXra:1701.0225 [pdf]**
*submitted on 2017-01-03 11:47:18*

**Authors:** Yun Ye

**Comments:** 20 Pages.

Motivated by the ideas of single valued neutrosophic uncertain linguistic sets (SVNULSs) and hesitant fuzzy sets (HFSs), in this article we combine SVNULSs with HFSs to present the concepts of hesitant single valued neutrosophic uncertain linguistic sets (HSVNULSs) and hesitant single valued neutrosophic uncertain linguistic elements (HSVNULEs), and define some basic operational laws of HSVNULEs.

**Category:** General Mathematics

[185] **viXra:1701.0224 [pdf]**
*replaced on 2017-01-30 11:42:37*

**Authors:** Don Brown

**Comments:** 14 Pages.

This paper creates a model that unifies Gravity, Inertia, and Centripetal Force and shows how they are all created by the same mechanical properties. Some additions to the opening statements and some Practical Applications for this theory have been added to the end of the paper.

**Category:** Classical Physics

[184] **viXra:1701.0223 [pdf]**
*submitted on 2017-01-03 11:48:29*

**Authors:** Peide Liu, Fei Teng

**Comments:** 27 Pages.

In this paper, similar to the extension from intuitionistic fuzzy numbers (IFNs) to neutrosophic numbers (NNs), we propose the normal neutrosophic numbers (NNNs) based on the normal intuitionistic fuzzy numbers (NIFNs) to handle the incompleteness, indeterminacy and inconsistency of the evaluation information.

**Category:** General Mathematics

[183] **viXra:1701.0222 [pdf]**
*submitted on 2017-01-03 11:49:40*

**Authors:** Yingwu Fang, Yi Wang, Wei Jin

**Comments:** 5 Pages.

A traditional particle filter (PF) performs poorly when the target's color is similar to the background or the target is occluded; an improved tracking algorithm is presented in the framework of PF and DSmT.

**Category:** General Mathematics

[182] **viXra:1701.0221 [pdf]**
*submitted on 2017-01-03 11:51:02*

**Authors:** Azeddine Elhassouny, Soufiane Idbraim, Aissam Bekkarri, Driss Mammass, Danielle Ducrot

**Comments:** 9 Pages.

In this paper we introduce a new procedure for classification and change detection that integrates, in a fusion process using the hybrid DSmT model, both the contextual information obtained from a supervised ICM classification with constraints and the temporal information from two images taken at two different dates.

**Category:** General Mathematics

[181] **viXra:1701.0220 [pdf]**
*submitted on 2017-01-03 11:53:32*

**Authors:** Jean Dezert, Albena Tchamova, Pavlina Konstantinova

**Comments:** 8 Pages.

The main objective of this paper is to present, apply, and test the effectiveness of a new method based on belief functions, proposed by Dezert et al., for evaluating the quality of the individual association pairings provided in the classical optimal data association solution. The aim is to improve the performance of multitarget tracking systems in clutter when some of the association decisions given in the optimal assignment solution are unreliable and doubtful and lead to potentially critical mistakes.

**Category:** General Mathematics

[180] **viXra:1701.0219 [pdf]**
*submitted on 2017-01-03 11:55:00*

**Authors:** Sylwester Kornowski

**Comments:** 3 Pages.

Here, applying the Scale-Symmetric Theory (SST), we show that there were 133.27 e-folds of the smooth SST inflation. On the other hand, SST leads to the conclusion that the inflation of the Cosmos, which is about 4 orders of magnitude bigger than the present-day Universe, was separated in time from the expansion of the Universe. The CMB was produced when the electromagnetic and weak interactions dominated; this leads to an SST CMB spectral index of 0.96666, which is consistent with the index that results from the Planck-satellite 2015 data (0.968 ± 0.006). But the SST CMB spectral index is 3 orders of magnitude more accurate, so future, more accurate observational data should show whether the SST cosmology is correct.

**Category:** Quantum Gravity and String Theory

[179] **viXra:1701.0217 [pdf]**
*submitted on 2017-01-03 11:56:40*

**Authors:** R. Narmada Devi

**Comments:** 14 Pages.

In this paper, the concept of N-open set in neutrosophic complex topological space is introduced. Some of the interesting properties of neutrosophic complex N-open sets are studied. The idea of neutrosophic complex N-continuous function and its characterization are discussed. Also the interrelation among the sets and continuity are established.

**Category:** General Mathematics

[178] **viXra:1701.0216 [pdf]**
*submitted on 2017-01-03 11:57:44*

**Authors:** Majid Khan, Muhammad Gulistan

**Comments:** 21 Pages.

Operational properties of neutrosophic cubic sets are investigated. The notions of neutrosophic cubic subsemigroups and neutrosophic cubic left (resp. right) ideals are introduced, and several properties are investigated.

**Category:** General Mathematics

[177] **viXra:1701.0215 [pdf]**
*submitted on 2017-01-03 12:00:04*

**Authors:** Jun Ye, Rui Yong, Qi-Feng Liang, Man Huang, Shi-Gui Du

**Comments:** 10 Pages.

Many studies have been carried out to investigate the scale effect on the shear behavior of rock joints.

**Category:** General Mathematics

[176] **viXra:1701.0214 [pdf]**
*submitted on 2017-01-03 12:01:01*

**Authors:** Muhammad Akram, Sundas Shahzadi, A. Borumand Saeid

**Comments:** 16 Pages.

In this paper, we introduce certain concepts, including neutrosophic hypergraph, line graph of a neutrosophic hypergraph, dual neutrosophic hypergraph, tempered neutrosophic hypergraph and transversal neutrosophic hypergraph. We illustrate these concepts by several examples and investigate some of their interesting properties.

**Category:** General Mathematics

[175] **viXra:1701.0213 [pdf]**
*submitted on 2017-01-03 12:01:53*

**Authors:** Debabrata Mandal

**Comments:** 12 Pages.

Hyperstructures, in particular hypergroups, were introduced in 1934 by Marty [12] at the eighth congress of Scandinavian Mathematicians. The notion of algebraic hyperstructure has been developed in the following decades and nowadays by many authors, especially Corsini [2, 3], Davvaz [5, 6, 7, 8, 9], Mittas [13], Spartalis [16], Stratigopoulos [17] and Vougiouklis [20]. Basic definitions and notions concerning hyperstructure theory can be found in [2].

**Category:** General Mathematics

[174] **viXra:1701.0210 [pdf]**
*submitted on 2017-01-03 12:08:30*

**Authors:** Surapati Pramanik

**Comments:** 11 Pages.

This paper proposes the framework of neutrosophic linear programming approach for solving multi objective optimization problems involving uncertainty and indeterminacy.

**Category:** General Mathematics

[173] **viXra:1701.0209 [pdf]**
*submitted on 2017-01-03 12:09:10*

**Authors:** Soumitra De

**Comments:** 7 Pages.

In this paper the author proposes a new search method, called Neutrosophic search, to find the most suitable match for the predicates to answer any imprecise query made by database users. Neutrosophic search is capable of manipulating incomplete as well as inconsistent information, whereas fuzzy relations or vague relations can only handle incomplete information. Neutrosophic logic is an extension of classical logic.

**Category:** General Mathematics

[172] **viXra:1701.0207 [pdf]**
*submitted on 2017-01-03 12:10:59*

**Authors:** Surapati Pramanik

**Comments:** 11 Pages.

For modeling imprecise and indeterminate data in multi-objective decision making, two different methods, neutrosophic multi-objective linear/non-linear programming and neutrosophic goal programming, have very recently been proposed in the literature.

**Category:** General Mathematics

[171] **viXra:1701.0206 [pdf]**
*submitted on 2017-01-03 12:12:26*

**Authors:** Kalyan Mondal, Surapati Pramanik

**Comments:** 10 Pages.

In this paper, a tangent similarity measure for neutrosophic refined sets is proposed and its properties are studied. The concept of this tangent similarity measure for single valued neutrosophic refined sets is an extension of the tangent similarity measure for single valued neutrosophic sets.

**Category:** General Mathematics
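The abstract does not restate the measure itself. The following is a hedged reconstruction of the commonly cited single valued neutrosophic tangent similarity form (the refined-set version in the paper additionally averages over multiple membership sequences, which is not shown here).

```python
import math

def tangent_similarity(A, B):
    """Tangent similarity between two single valued neutrosophic sets,
    given as lists of (T, I, F) triples over the same universe.
    Reconstruction assumption: per element, 1 - tan(pi * d / 12) where
    d = |dT| + |dI| + |dF|, averaged over the universe."""
    return sum(
        1 - math.tan(math.pi * (abs(ta - tb) + abs(ia - ib) + abs(fa - fb)) / 12)
        for (ta, ia, fa), (tb, ib, fb) in zip(A, B)
    ) / len(A)
```

With this form, identical sets score 1, and maximally different elements (d = 3, argument pi/4) score 0, which matches the boundary behavior a similarity measure needs.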

[170] **viXra:1701.0201 [pdf]**
*submitted on 2017-01-03 12:16:38*

**Authors:** A. A. Salama

**Comments:** 13 Pages.

The fundamental concepts of neutrosophic set, introduced by Smarandache in [38, 39, 40] and Salama et al. in [20-37], provide a natural foundation for treating mathematically the neutrosophic phenomena which exist pervasively in our real world and for building new branches of neutrosophic mathematics.

**Category:** General Mathematics

[169] **viXra:1701.0200 [pdf]**
*submitted on 2017-01-03 12:17:25*

**Authors:** Ashraf Al-Quran, Nasruddin Hassan

**Comments:** 12 Pages.

In this paper, we first introduce the concept of neutrosophic vague soft expert sets (NVSESs for short), which combines neutrosophic vague sets and soft expert sets to be more effective and useful. We also define its basic operations, namely complement, union, intersection, AND and OR, along with illustrative examples, and study some related properties with supporting proofs. Lastly, this concept is applied to a decision making problem and its effectiveness is demonstrated using a hypothetical example.

**Category:** General Mathematics

[168] **viXra:1701.0199 [pdf]**
*submitted on 2017-01-03 12:18:06*

**Authors:** R. Narmada Devi

**Comments:** 14 Pages.

In this paper, the concept of N-open set in neutrosophic complex topological space is introduced. Some of the interesting properties of neutrosophic complex N-open sets are studied. The idea of neutrosophic complex N-continuous function and its characterization are discussed. Also the interrelation among the sets and continuity are established.

**Category:** General Mathematics

[167] **viXra:1701.0198 [pdf]**
*submitted on 2017-01-03 12:19:08*

**Authors:** Temitope Gbolahan Jaiyeola, S.P. David

**Comments:** 23 Pages.

In this paper, some new algebraic properties of a middle Bol loop are established.

**Category:** General Mathematics

[166] **viXra:1701.0197 [pdf]**
*submitted on 2017-01-03 12:19:56*

**Authors:** Han-Liang Huang

**Comments:** 12 Pages.

A single-valued neutrosophic set (SVNS) is an instance of a neutrosophic set, which can be used to handle uncertain, imprecise, indeterminate, and inconsistent information in real life. In this paper, a new distance measure between two SVNSs is defined by full consideration of the truth-membership, indeterminacy-membership, and falsity-membership functions for the forward and backward differences.

**Category:** General Mathematics
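The paper's measure builds on forward and backward differences, which the abstract does not spell out. For orientation, here is the standard normalized Hamming distance between SVNSs that such measures refine; it is a baseline sketch, not the paper's definition.

```python
def hamming_distance(A, B):
    """Normalized Hamming distance between two single-valued neutrosophic
    sets, each given as a list of (T, I, F) triples over the same universe.
    This is the classical baseline; the paper's measure additionally uses
    forward/backward differences of the membership functions."""
    return sum(
        abs(ta - tb) + abs(ia - ib) + abs(fa - fb)
        for (ta, ia, fa), (tb, ib, fb) in zip(A, B)
    ) / (3 * len(A))
```

The 1/(3n) normalization keeps the distance in [0, 1]: identical sets are at distance 0, and fully opposite triples such as (1, 0, 0) versus (0, 1, 1) are at distance 1.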

[165] **viXra:1701.0196 [pdf]**
*submitted on 2017-01-03 12:20:43*

**Authors:** Talat Körpınar

**Comments:** 5 Pages.

New types of surfaces are analyzed according to the B-Smarandache TM1 curves of bi-harmonic B-slant helices. The B-Smarandache TM1 curves are characterized in terms of the Bishop curvatures, and other interesting relations are added.

**Category:** General Mathematics

[164] **viXra:1701.0195 [pdf]**
*submitted on 2017-01-03 12:21:37*

**Authors:** Somayeh Motamed, Mahsa Sadeghi Kosarkhizi

**Comments:** 12 Pages.

In this paper we introduce the notions of n-fold BL-Smarandache positive implicative filter and n-fold BL-Smarandache implicative filter in Smarandache residuated lattices and study the relations among them.

**Category:** General Mathematics

[163] **viXra:1701.0194 [pdf]**
*submitted on 2017-01-03 12:22:24*

**Authors:** Somayeh Motamed, Mahsa Sadeghi Kosarkhizi

**Comments:** 12 Pages.

In this paper we introduce the notions of n-fold BL-Smarandache fantastic filter and n-fold BL-Smarandache easy filter in Smarandache residuated lattices and study the relations among them.

**Category:** General Mathematics

[162] **viXra:1701.0193 [pdf]**
*submitted on 2017-01-03 12:23:19*

**Authors:** Elena Barcucci, Antonio Bernini, Stefano Bilotta, Renzo Pinzani

**Comments:** 16 Pages.

Two matrices are said to be non-overlapping if one of them cannot be placed on the other in such a way that the corresponding entries coincide. We provide a set of non-overlapping binary matrices and a formula to enumerate it which involves the k-generalized Fibonacci numbers. Moreover, the generating function for the enumerating sequence is easily seen to be rational.

**Category:** General Mathematics
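The enumeration formula involves the k-generalized Fibonacci numbers, where each term is the sum of the previous k terms. A minimal generator is below; the seed 0, ..., 0, 1 is one common convention, and the paper's indexing may differ.

```python
def k_fibonacci(k, n):
    """First n terms of the k-generalized Fibonacci sequence: each term
    is the sum of the previous k terms. Seeded with k-1 zeros and a one
    (seed conventions vary between papers)."""
    seq = [0] * (k - 1) + [1]
    while len(seq) < n:
        seq.append(sum(seq[-k:]))
    return seq[:n]

print(k_fibonacci(2, 10))  # ordinary Fibonacci numbers
print(k_fibonacci(3, 8))   # tribonacci numbers
```

For k = 2 this reduces to the ordinary Fibonacci sequence, and for k = 3 to the tribonacci sequence.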

[161] **viXra:1701.0192 [pdf]**
*submitted on 2017-01-03 12:24:17*

**Authors:** Mehmet Serhat Can, Ömer Faruk Özgüven

**Comments:** 7 Pages.

The PID controller, the fuzzy logic controller, and hybrid designs formed from them are among the controller types frequently used in control applications.

**Category:** General Mathematics

[160] **viXra:1701.0191 [pdf]**
*submitted on 2017-01-03 12:25:01*

**Authors:** Mehmet Serhat Can, Ömer Faruk Özgüven

**Comments:** 6 Pages.

In this paper, a fuzzy control approach is presented in which the error and the change of error are evaluated with neutrosophic membership functions.

**Category:** General Mathematics

[159] **viXra:1701.0190 [pdf]**
*replaced on 2017-04-16 07:15:21*

**Authors:** Muhammad Akram, Musavarah Sarwar

**Comments:** 21 Pages.

In this research article, we present certain notions of bipolar neutrosophic graphs. We study the dominating and independent sets of bipolar neutrosophic graphs. We describe novel multiple criteria decision making methods based on bipolar neutrosophic sets and bipolar neutrosophic graphs. We develop an algorithm for computing domination in bipolar neutrosophic graphs. We also show that there are some flaws in the definition of Broumi et al. [11].

**Category:** General Mathematics

[158] **viXra:1701.0189 [pdf]**
*submitted on 2017-01-03 12:26:45*

**Authors:** Shawkat Alkhazaleh

**Comments:** 6 Pages.

In this paper, as a generalization of neutrosophic soft sets, we introduce the concept of the n-valued refined neutrosophic soft set and study some of its properties. We also define its basic operations, namely complement, union, intersection, AND and OR, and study their properties.

**Category:** General Mathematics

[157] **viXra:1701.0188 [pdf]**
*submitted on 2017-01-03 12:29:06*

**Authors:** Shawkat Alkhazaleh

**Comments:** 12 Pages.

In this work we use the concept of n-valued refined neutrosophic soft sets and their properties to solve decision making problems. A similarity measure between two n-valued refined neutrosophic soft sets is also proposed.

**Category:** General Mathematics

[156] **viXra:1701.0187 [pdf]**
*submitted on 2017-01-03 12:30:51*

**Authors:** Husein Hadi Abbass, Qasim Mohsin Luhaib

**Comments:** 11 Pages.

In this paper, the notions of Q-Smarandache fuzzy commutative ideal and Q-Smarandache fuzzy sub-commutative ideal of a Q-Smarandache BH-Algebra are introduced, examples and related properties are investigated. Also, the relationships among these notions and other types of Q-Smarandache fuzzy ideal of a Q-Smarandache BH-Algebra are studied.

**Category:** General Mathematics

[155] **viXra:1701.0186 [pdf]**
*submitted on 2017-01-03 12:31:47*

**Authors:** Husein Hadi Abbass, Hayder Kareem Gatea

**Comments:** 9 Pages.

In this paper, we define the concept of a Q-Smarandache implicative ideal with respect to an element of a Q-Smarandache BH-algebra. We state and prove some theorems which determine the relationships among this notion and other types of ideals of a Q-Smarandache BH-algebra.

**Category:** General Mathematics

[154] **viXra:1701.0184 [pdf]**
*submitted on 2017-01-03 12:38:43*

**Authors:** Tuhin Bera, Nirmal Kumar Mahapatra

**Comments:** 20 Pages.

The concepts of neutrosophic normal soft groups, neutrosophic soft cosets, and neutrosophic soft homomorphisms are introduced and illustrated by suitable examples in this paper. Several related properties and structural characteristics are investigated. Some of their basic theorems are also established.

**Category:** General Mathematics

[153] **viXra:1701.0183 [pdf]**
*submitted on 2017-01-03 12:39:55*

**Authors:** Tuhin Bera, Nirmal Kumar Mahapatra

**Comments:** 19 Pages.

In this paper, the cartesian product and the relations on neutrosophic soft sets have been defined in a new approach. Some properties of this concept have been discussed and verified with suitable real life examples.

**Category:** General Mathematics

[152] **viXra:1701.0182 [pdf]**
*submitted on 2017-01-03 12:40:54*

**Authors:** Vildan Cetkin, Banu Pazar Varol, Halis Aygun

**Comments:** 9 Pages.

The target of this study is to observe some of the algebraic structures of a single valued neutrosophic set. So, we introduce the concept of a neutrosophic submodule of a given classical module and investigate some of the crucial properties and characterizations of the proposed concept.

**Category:** General Mathematics

[151] **viXra:1701.0181 [pdf]**
*submitted on 2017-01-03 12:41:41*

**Authors:** Muhammed T. Sariaydin, Vedat Asil

**Comments:** 7 Pages.

In this paper, we study the parallel curve of a space curve according to the parallel transport frame. Then, we obtain new results for some cases of this curve by using the parallel transport frame in Euclidean 3-space. Additionally, we give new examples for these characterizations and illustrate these examples in figures.

**Category:** General Mathematics

[150] **viXra:1701.0180 [pdf]**
*submitted on 2017-01-03 12:44:32*

**Authors:** Esra Betul Koc Ozturk, Ufuk Ozturk, Kazim Ilarslan, Emilija Nešovic

**Comments:** 15 Pages.

In this paper we define nonnull and null pseudospherical Smarandache curves according to the Sabban frame of a spacelike curve lying on pseudosphere in Minkowski 3-space.

**Category:** General Mathematics

[149] **viXra:1701.0179 [pdf]**
*submitted on 2017-01-03 12:46:08*

**Authors:** Ahmad Sabihi

**Comments:** 19 Pages.

We solve some famous conjectures on the distribution of primes. These conjectures are Legendre's, Andrica's, Oppermann's, Brocard's, Cramer's, and Shanks' conjectures, and five of Smarandache's conjectures. We make use of both Firoozbakht's conjecture (recently proved by the author) and Kourbatov's theorem on the distribution of, and gaps between, consecutive primes.

**Category:** General Mathematics

[148] **viXra:1701.0178 [pdf]**
*submitted on 2017-01-03 12:47:38*

**Authors:** Süleyman SENYURT, Yasin ALTUN, Ceyda CEVAHIR

**Comments:** 8 Pages.

In this paper, we investigate special Smarandache curves in terms of the Sabban frame drawn on the surface of the sphere by the unit Darboux vector of the involute curve. We construct the Sabban frame belonging to this curve and show how the position vector of the Smarandache curves is composed of the Sabban vectors belonging to this curve. Then, we calculate the geodesic curvatures of these Smarandache curves. The results found are expressed in terms of the base curve. We also give an example illustrating the results found.

**Category:** General Mathematics

[147] **viXra:1701.0176 [pdf]**
*submitted on 2017-01-03 12:49:38*

**Authors:** Gheorghe Săvoiu, Vasile Dinu

**Comments:** 23 Pages.

The structure of the article comprises three major sections, following the general treatment of the impact of paradoxes within economic theory. A first section describes a necessary investigation into the synthesized universe of paradoxes, in order to exploit Quine's taxonomy of paradoxes and to reveal the importance of the truly paraconsistent paradox, delimiting, in a relative and innovative way, economic paradoxism in the sense of the excessive creative exploitation of paradoxes in the scientific field, as initiated by the mathematician and logician Florentin Smarandache.

**Category:** General Mathematics

[146] **viXra:1701.0174 [pdf]**
*submitted on 2017-01-03 12:51:46*

**Authors:** Jean Dezert, Albena Tchamova, L. Bojilov, Pavlina Konstantinova

**Comments:** 5 Pages.

The main objective of this paper is to investigate the impact of the quality of attribute data source on the performance of a target tracking algorithm. An array of dense scenarios arranged according to the distance between closely spaced targets is studied by different confusion matrices.

**Category:** General Mathematics

[145] **viXra:1701.0173 [pdf]**
*submitted on 2017-01-03 12:53:14*

**Authors:** Mehmet Serhat Can, Omerul Faruk Ozguven

**Comments:** 15 Pages.

In this paper, a method for adjusting the proportional-integral-derivative (PID) coefficients based on the neutrosophic similarity measure is proposed. First, rough PID coefficients are determined by the Ziegler–Nichols method, and the upper and lower limit values for the search range of the PID coefficients are determined.
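The Ziegler–Nichols step that produces the rough coefficients can be sketched as follows, using the classic closed-loop tuning rule from the ultimate gain and ultimate period; these are the textbook values, and the paper's exact setup may differ.

```python
def ziegler_nichols_pid(Ku, Tu):
    """Classic closed-loop Ziegler-Nichols PID tuning rule.
    Ku: ultimate gain at which the closed loop oscillates steadily.
    Tu: period of that oscillation (seconds).
    Returns (Kp, Ki, Kd) for the parallel form
    u(t) = Kp*e + Ki*integral(e) + Kd*de/dt."""
    Kp = 0.6 * Ku
    Ti = Tu / 2.0   # integral time
    Td = Tu / 8.0   # derivative time
    return Kp, Kp / Ti, Kp * Td

Kp, Ki, Kd = ziegler_nichols_pid(Ku=10.0, Tu=2.0)
print(Kp, Ki, Kd)  # 6.0 6.0 1.5
```

These rough values then serve as the center of the search range that the neutrosophic similarity measure refines.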

**Category:** General Mathematics

[144] **viXra:1701.0170 [pdf]**
*submitted on 2017-01-03 13:00:35*

**Authors:** Tahir Mahmood, Qaisar Khan, Mohsin Ali Khan

**Comments:** 16 Pages.

In this paper, we have introduced the concepts of the Q-single valued neutrosophic soft set and the multi Q-single valued neutrosophic set and defined some basic results and related properties. We have also defined the idea of the Q-single valued neutrosophic soft set, which is a generalization of the Q-fuzzy set, Q-intuitionistic fuzzy set, multi Q-fuzzy set, multi Q-intuitionistic fuzzy set, Q-fuzzy soft set, and Q-intuitionistic fuzzy soft set. We have also defined and discussed some properties and operations of the Q-single valued neutrosophic soft set.

**Category:** General Mathematics

[143] **viXra:1701.0169 [pdf]**
*submitted on 2017-01-03 13:01:40*

**Authors:** Ting Peng, Aiping Qu, Xiaoling Wang

**Comments:** 7 Pages.

Segregation is a widespread phenomenon with considerable effects on material performance. To the author's knowledge, there is still no automated, objective quantitative indicator for segregation. To fulfill this task, the segregation of particles is analyzed. Edges of the particles are extracted from the digital picture. Then, the whole picture of particles is split into small rectangles of the same shape.

**Category:** General Mathematics

[142] **viXra:1701.0168 [pdf]**
*submitted on 2017-01-03 13:02:22*

**Authors:** Yang-Fan, Shen Lai-xin

**Comments:** 12 Pages.

A rectangle algorithm is designed to extract ancient dwellings from village satellite images according to their pixel features and shape features. Objects that remain unrecognized need to be distinguished by further extracting their texture features. In order to obtain standardized samples, three pre-processing operations, namely rotating, scaling, and clipping, are designed to unify their sizes and directions.

**Category:** General Mathematics

[141] **viXra:1701.0166 [pdf]**
*submitted on 2017-01-03 13:20:03*

**Authors:** J. Akande, D. K. K. Adjaï, L. H. Koudahoun, Y. J. F. Kpomahou, M. D. Monsia

**Comments:** 6 pages

In quantum mechanics, the wave function and energy are required for the complete characterization of the fundamental properties of a physical system subject to a potential energy. This work proves the existence of a Schrödinger equation with position-dependent mass having the prolate spheroidal wave function as an exact solution, resulting from a classical quadratic Liénard-type oscillator equation. This fact may allow the extension of the current one-dimensional model to three dimensions and increase the understanding of the analytical features of quantum systems.
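For orientation, a position-dependent-mass Schrödinger equation is commonly written in the BenDaniel–Duke ordering shown below; this is one standard choice of kinetic-energy ordering, and the paper's specific mass function m(x), potential, and solutions are not reproduced here.

```latex
-\frac{\hbar^{2}}{2}\,\frac{d}{dx}\!\left(\frac{1}{m(x)}\,\frac{d\psi}{dx}\right)
+ V(x)\,\psi(x) = E\,\psi(x)
```

When m(x) is constant this reduces to the ordinary one-dimensional Schrödinger equation.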

**Category:** Mathematical Physics

[140] **viXra:1701.0165 [pdf]**
*submitted on 2017-01-03 13:06:49*

**Authors:** Yi Wang, Yingwu Fang, Xinyu Da, Shuxin Chen, Hui Wang

**Comments:** 6 Pages.

Compared with DST, DSmT can effectively handle contradictory evidence and resolve the problem of combining conflicting bodies of evidence. In view of occlusion when tracking dynamic targets against complex backgrounds, a new anti-occlusion target tracking algorithm based on DSmT and a particle filter, using color cues, is proposed. The simulation results show that the proposed algorithm is effective and practicable in tracking occluded and intersecting targets. Compared with the existing combination rules, the newly proposed rule applies to both conflicting and coincident cases.

**Category:** General Mathematics

[139] **viXra:1701.0164 [pdf]**
*submitted on 2017-01-03 13:08:18*

**Authors:** Wei-dong Zhu, Fang Liu, Yu-wang Chen, Jian-bo Yang, Dong-ling Xu, Dong-peng Wang

**Comments:** 22 Pages.

Research project evaluation and selection is mainly concerned with evaluating a number of research projects and then choosing some of them for implementation. It involves a complex multiple-experts multiple-criteria decision making process.

**Category:** General Mathematics

[138] **viXra:1701.0163 [pdf]**
*submitted on 2017-01-03 13:09:14*

**Authors:** Surapati Pramanik, Kalyan Mondal

**Comments:** 11 Pages.

Bipolar neutrosophic set theory and rough neutrosophic set theory are emerging as powerful tools for dealing with uncertain, indeterminate, incomplete, and imprecise information. In the present study we develop a hybrid structure called the "rough bipolar neutrosophic set". We define rough bipolar neutrosophic sets and define the union, complement, intersection and containment of rough bipolar neutrosophic sets.

**Category:** General Mathematics

[137] **viXra:1701.0162 [pdf]**
*submitted on 2017-01-03 13:10:06*

**Authors:** I. Arockiarani, C. Antony Crispin Sweety

**Comments:** 14 Pages.

In this paper, we define the rough neutrosophic relation of two universe sets and study the algebraic properties of two rough neutrosophic relations that are interesting in the theory of rough sets. Finally, we present the similarity rough neutrosophic relation with an example.

**Category:** General Mathematics

[136] **viXra:1701.0161 [pdf]**
*submitted on 2017-01-03 13:12:18*

**Authors:** Shelda Sajeev, Mariusz, Gobert Lee

**Comments:** 9 Pages.

Segmentation of Breast Masses in Local Dense Background using Adaptive Clip Limit-CLAHE.

**Category:** General Mathematics

[135] **viXra:1701.0160 [pdf]**
*submitted on 2017-01-03 13:13:15*

**Authors:** Edmundas Kazimieras Zavadskas, Romualdas Baušys, Dragisa Stanujkic, Marija Magdalinovic-Kalinovic

**Comments:** 8 Pages.

In this paper, selection of adequate circuit design of lead-zinc froth flotation, which has a significant impact on the processing costs and useful minerals utilization, is considered.

**Category:** General Mathematics

[134] **viXra:1701.0157 [pdf]**
*submitted on 2017-01-03 13:16:50*

**Authors:** Vakkas Ulucay, Irfan Deli, Mehmet Sahin

**Comments:** 10 Pages.

In this paper, we introduced some similarity measures for bipolar neutrosophic sets, such as the Dice similarity measure, the weighted Dice similarity measure, the hybrid vector similarity measure and the weighted hybrid vector similarity measure.

**Category:** General Mathematics

[133] **viXra:1701.0156 [pdf]**
*submitted on 2017-01-03 13:19:06*

**Authors:** Zhang-peng Tian, Jing Wang, Jian-qiang Wang, Hong-yu Zhang

**Comments:** 31 Pages.

For many companies, green product development has become a key strategic consideration due to regulatory requirements and market trends. In this paper, the life cycle assessment technique is used to develop an innovative multi-criteria group decision-making approach that incorporates power aggregation operators and a TOPSIS-based QUALIFLEX method in order to solve green product design selection problems using neutrosophic linguistic information.

**Category:** General Mathematics

[132] **viXra:1701.0155 [pdf]**
*submitted on 2017-01-03 13:19:54*

**Authors:** Muhammad Akram, Muzzamal Sitara

**Comments:** 9 Pages.

A graph structure is a generalization of undirected graph which is quite useful in studying some structures, including graphs and signed graphs. In this research paper, we apply the idea of single-valued neutrosophic sets to graph structure, and explore some interesting properties of single-valued neutrosophic graph structure. We also discuss the concept of φ-complement of single-valued neutrosophic graph structure.

**Category:** General Mathematics

[131] **viXra:1701.0154 [pdf]**
*submitted on 2017-01-03 13:21:51*

**Authors:** A. Borumand Saeid, A. Ahadpanah, L.Torkzadeh

**Comments:** 9 Pages.

In this article we define the Smarandache BE-algebra.

**Category:** General Mathematics

[130] **viXra:1701.0152 [pdf]**
*submitted on 2017-01-03 13:27:11*

**Authors:** M. Khoshnevian

**Comments:** 7 Pages.

In this paper we present Smarandache's Concurrent Lines Theorem in the geometry of polygons.

**Category:** General Mathematics

[129] **viXra:1701.0151 [pdf]**
*submitted on 2017-01-03 13:29:01*

**Authors:** Süleyman Şenyurt, Yasin Altun, Ceyda Cevahir

**Comments:** 5 Pages.

In this paper, we investigate the Smarandache curves according to the Sabban frame of the fixed pole curve drawn by the unit Darboux vector of the Bertrand partner curve. Some results have been obtained. These results are expressed in terms of the Bertrand curve.

**Category:** General Mathematics

[128] **viXra:1701.0150 [pdf]**
*submitted on 2017-01-03 13:32:14*

**Authors:** M. Elzawy, S. Mosa

**Comments:** 4 Pages.

In this paper, we study Smarandache curves in the 4-dimensional Galilean space G4. We obtain Frenet–Serret invariants for the Smarandache curve in G4.

**Category:** General Mathematics

[127] **viXra:1701.0146 [pdf]**
*submitted on 2017-01-03 13:37:19*

**Authors:** Ion Patrascu

**Comments:** 6 Pages.

An example of a Smarandache geometry.

**Category:** General Mathematics

[126] **viXra:1701.0144 [pdf]**
*submitted on 2017-01-03 13:40:12*

**Authors:** Bui Cong Cuong, Roan Thi Ngan, Le Chi Ngoc

**Comments:** 20 Pages.

In 2013 we introduced a new notion of picture fuzzy sets (PFS), which are direct extensions of the fuzzy sets and the intuitionistic fuzzy sets.

**Category:** General Mathematics

[125] **viXra:1701.0143 [pdf]**
*submitted on 2017-01-03 13:41:00*

**Authors:** P.A.Hummadi, H.M.Yassin

**Comments:** 7 Pages.

In this paper, we study Smarandache special definite groups. We give necessary and sufficient conditions for a group to be a Smarandache special definite group (S-special definite group). Moreover, we study Smarandache special definite subgroups, Smarandache special definite maximal ideals, and Smarandache special definite minimal ideals.

**Category:** General Mathematics

[124] **viXra:1701.0141 [pdf]**
*submitted on 2017-01-03 13:44:11*

**Authors:** Nurten Gurses, Ozcan Bektas‚ Salim Yuce

**Comments:** 18 Pages.

In differential geometry, there are many important consequences and properties of curves studied by some authors [1, 2, 3]. Researchers always introduce some new curves by using the existing studies.

**Category:** General Mathematics

[123] **viXra:1701.0140 [pdf]**
*submitted on 2017-01-03 07:25:38*

**Authors:** Sun Wei-cha, Xu Ai-qiang, J. Wen-hui

**Comments:** 9 Pages.

Approaches for Combination of Interval-Valued Belief Structures.

**Category:** General Mathematics

[122] **viXra:1701.0139 [pdf]**
*submitted on 2017-01-03 07:24:44*

**Authors:** George Rajna

**Comments:** 25 Pages.

When a hydrogen atom in insulin is replaced by an iodine atom, the hormone retains its efficacy but becomes available to the organism more rapidly. Researchers at the University of Basel were able to predict this effect based on computer simulations and then confirm it with experiments. [13] In a landmark experiment at SLAC National Accelerator Laboratory, scientists used an X-ray laser to capture the first snapshots of a chemical interaction between two biomolecules in real time and at an atomic level. [12] An international team of scientists has learned how to determine the spatial structure of a protein obtained with an X-ray laser using the sulfur atoms it contains. This development is the next stage in the project of a group led by Vadim Cherezov to create an effective method of studying receptor proteins. [11] When cryoEM images are obtained from protein nanocrystals, the images themselves can appear to be devoid of any contrast. A group of scientists from the Netherlands have now demonstrated that lattice information can be revealed and enhanced by a specialized filter. [10] There is also a connection between statistical physics and evolutionary biology, since the arrow of time is at work in biological evolution as well. From the standpoint of physics, there is one essential difference between living things and inanimate clumps of carbon atoms: the former tend to be much better at capturing energy from their environment and dissipating that energy as heat. [8] This paper contains a review of quantum entanglement investigations in living systems, and in the quantum mechanically modeled photoactive prebiotic kernel systems. [7] The human body is a constant flux of thousands of chemical/biological interactions and processes connecting molecules, cells, organs, and fluids throughout the brain, body, and nervous system.
Up until recently it was thought that all these interactions operated in a linear sequence, passing on information much like a runner passing the baton to the next runner. However, the latest findings in quantum biology and biophysics have discovered that there is in fact a tremendous degree of coherence within all living systems. The accelerating electrons explain not only the Maxwell Equations and the Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain the Quantum Entanglement, giving it as a natural part of the Relativistic Quantum Theory and making it possible to understand the Quantum Biology.

**Category:** Physics of Biology

[121] **viXra:1701.0138 [pdf]**
*submitted on 2017-01-03 07:27:57*

**Authors:** Nouran M. Radwan, M. Badr Senousy, Alaa El Din M. Riad

**Comments:** 10 Pages.

The notion of uncertainty in expert systems involves dealing with vague data, incomplete information, and imprecise knowledge. Different uncertainty types, namely imprecision, vagueness, ambiguity, and inconsistency, need different handling models.

**Category:** General Mathematics

[120] **viXra:1701.0137 [pdf]**
*submitted on 2017-01-03 07:29:31*

**Authors:** Pu Ji, Hong-yu Zhang, Jian-qiang Wang

**Comments:** 14 Pages.

The personnel selection is a vital activity for companies, and multi-valued neutrosophic sets (MVNSs) can denote the fuzziness and hesitancy in the processes of the personnel selection.

**Category:** General Mathematics

[119] **viXra:1701.0136 [pdf]**
*submitted on 2017-01-03 07:33:47*

**Authors:** I. Deli, Y. Şubaş

**Comments:** 14 Pages.

The concept of a single valued neutrosophic number (SVN-number) is of importance for quantifying an ill-known quantity, and the ranking of SVN-numbers is a very difficult problem in multi-attribute decision making problems.

**Category:** General Mathematics

[118] **viXra:1701.0134 [pdf]**
*submitted on 2017-01-03 07:35:36*

**Authors:** M.Rajeswari, J.Parveen Banu

**Comments:** 9 Pages.

In this paper, we study some basic definitions related to graphs and neutrosophic graphs, as well as some properties of the neutrosophic graphs associated with neutrosophic bigraphs, applying neutrosophic cognitive maps and related techniques to neutrosophic models.

**Category:** General Mathematics

[117] **viXra:1701.0133 [pdf]**
*submitted on 2017-01-03 07:36:53*

**Authors:** Jin-Hee Cho, Kevin Chan, Sibel Adali

**Comments:** 40 Pages.

The concept of trust and/or trust management has received considerable attention in engineering research communities as trust is perceived as the basis for decision making in many contexts and the motivation for maintaining long-term relationships based on cooperation and collaboration.

**Category:** General Mathematics

[116] **viXra:1701.0132 [pdf]**
*submitted on 2017-01-03 07:38:28*

**Authors:** Shoba Dyre, C.P. Sumathi

**Comments:** 17 Pages.

Automatic Fingerprint authentication for personal identification and verification has received considerable attention over the past decades among various biometric techniques because of the distinctiveness and persistence properties of fingerprints.

**Category:** General Mathematics

[115] **viXra:1701.0131 [pdf]**
*submitted on 2017-01-03 07:39:42*

**Authors:** Srilatha V., Veeramuthu Venkatesh

**Comments:** 7 Pages.

Trustworthy contextual data for human action recognition of a remotely monitored person who requires medical care should be generated to avoid hazardous situations and also to provide ubiquitous services in home-based care. This is difficult for numerous reasons. At the first level, the data obtained from heterogeneous sources have different levels of uncertainty.

**Category:** General Mathematics

[114] **viXra:1701.0130 [pdf]**
*submitted on 2017-01-03 07:41:02*

**Authors:** Mehrdad Moghbel, Syamsiah Mashohor, Rozi Mahmud, M. Iqbal Bin Saripan

**Comments:** 18 Pages.

Segmentation of liver tumors from Computed Tomography (CT) and tumor burden analysis play an important role in the choice of therapeutic strategies for liver diseases and treatment monitoring. In this paper, a new segmentation method for liver tumors from contrast-enhanced CT imaging is proposed.

**Category:** General Mathematics

[113] **viXra:1701.0129 [pdf]**
*submitted on 2017-01-03 07:43:57*

**Authors:** Temitope Gbolahan Jaiyeola

**Comments:** 10 Pages.

The basic properties of S2ndBLs are studied. These properties are all Smarandache in nature. The results in this work generalize the basic properties of Bol loops, found in the Ph.D. thesis of D. A. Robinson. Some questions for further studies are raised.

**Category:** General Mathematics

[112] **viXra:1701.0128 [pdf]**
*submitted on 2017-01-03 07:47:02*

**Authors:** Linfan Mao

**Comments:** 28 Pages.

However, even if it is non-solvable, it still characterizes biological systems if it can be classified into solvable subsystems. The main purpose of this paper is to characterize the biological behavior of such systems with global stability by a combinatorial approach, i.e., to establish the relationship between solvable subsystems of a biological n-system and Eulerian subgraphs of the labeling bi-digraph of →G, and to characterize n-systems with linear growth rate and the global stability on subgraphs.

**Category:** General Mathematics

[111] **viXra:1701.0127 [pdf]**
*submitted on 2017-01-03 08:49:03*

**Authors:** I. Deli, Y. Şubaş

**Comments:** 12 Pages.

This paper proposes the concept of the bipolar neutrosophic refined set and some of its operations. Firstly, score, certainty and accuracy functions to compare bipolar neutrosophic refined information are defined. Secondly, to aggregate bipolar neutrosophic refined information, a bipolar neutrosophic refined weighted average operator and a bipolar neutrosophic refined weighted geometric operator are developed.

**Category:** General Mathematics

[110] **viXra:1701.0126 [pdf]**
*submitted on 2017-01-03 08:50:11*

**Authors:** Mehmet Şahin, İrfan Deli, Vakkas Uluçay

**Comments:** 10 Pages.

In this paper, we introduce the concept of the bipolar neutrosophic soft expert set and some of its operations. Also, we propose score, certainty and accuracy functions to compare bipolar neutrosophic soft expert sets. We give examples for these concepts.

**Category:** General Mathematics

[109] **viXra:1701.0125 [pdf]**
*submitted on 2017-01-03 08:51:45*

**Authors:** Shaima Elnazer, Mohamed Morsy, Mohy Eldin A.Abo-Elsoud

**Comments:** 7 Pages.

An improved segmentation approach based on neutrosophic sets (NS) and modified non-local fuzzy c-means clustering (NLFCM) is proposed. The brain tumor MRI image is transformed into the NS domain, which is described using three subsets, namely: the percentage of truth in a subset T%, the percentage of indeterminacy in a subset I%, and the percentage of falsity in a subset F%.
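A transform of this kind can be sketched as below, following a convention common in neutrosophic image processing: T from the normalized intensity, I from the deviation from a local mean (indeterminacy), and F = 1 - T. This is only one common mapping; the paper's exact formulation may differ.

```python
def ns_transform(gray):
    """Map a grayscale image (2-D list of intensities) into the
    neutrosophic domain as three per-pixel subsets:
      T = normalized intensity (truth),
      I = normalized deviation from the local 3x3 mean (indeterminacy),
      F = 1 - T (falsity)."""
    h, w = len(gray), len(gray[0])
    flat = [p for row in gray for p in row]
    gmin, gmax = min(flat), max(flat)
    span = (gmax - gmin) or 1

    def local_mean(i, j):
        # Mean over the 3x3 neighborhood, clipped at the image border.
        vals = [gray[a][b]
                for a in range(max(0, i - 1), min(h, i + 2))
                for b in range(max(0, j - 1), min(w, j + 2))]
        return sum(vals) / len(vals)

    T = [[(gray[i][j] - gmin) / span for j in range(w)] for i in range(h)]
    D = [[abs(gray[i][j] - local_mean(i, j)) for j in range(w)] for i in range(h)]
    dmax = max(max(row) for row in D) or 1
    I = [[D[i][j] / dmax for j in range(w)] for i in range(h)]
    F = [[1 - T[i][j] for j in range(w)] for i in range(h)]
    return T, I, F
```

Clustering is then performed on the T, I, F maps rather than on raw intensities, which is what lets the method handle indeterminate regions explicitly.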

**Category:** General Mathematics

[108] **viXra:1701.0124 [pdf]**
*submitted on 2017-01-03 08:52:53*

**Authors:** Mircea Eugen Şelariu

**Comments:** 11 Pages.

The evolutive supermathematical functions (FSM-Ev) are discussed in the papers [1], [2], [3], [4], [5], [6], [7], [8]. They are combinations of the four types of FSM (centric functions (FC), excentric functions (FE), elevated functions (FEl) and exotic functions (FEx)), taken two at a time, called centricoexcentric functions, elevatoexotic functions, centricoelevated functions, and so forth.

**Category:** General Mathematics

[107] **viXra:1701.0123 [pdf]**
*submitted on 2017-01-03 08:55:39*

**Authors:** Paul K. Davis, David Manheim, Walter L. Perry, John S. Hollywood

**Comments:** 24 Pages.

We describe research fusing heterogeneous information in an effort eventually to detect terrorists, reduce false alarms, and exonerate those falsely identified. The specific research is more humble, using synthetic data and first versions of fusion methods. Both the information and the fusion methods are subject to deep uncertainty.

**Category:** General Mathematics

[106] **viXra:1701.0122 [pdf]**
*submitted on 2017-01-03 08:56:31*

**Authors:** Muhammad Akram, Anam Luqman

**Comments:** 20 Pages.

A directed hypergraph is a powerful tool for solving problems that arise in different fields, including computer networks, social networks and collaboration networks. In this research paper, we apply the concept of single-valued neutrosophic sets to directed hypergraphs.

**Category:** General Mathematics

[105] **viXra:1701.0121 [pdf]**
*submitted on 2017-01-03 08:58:17*

**Authors:** Azeddine Elhassouny, Souﬁane Idbraim, Driss Mammass, Danielle Ducrot

**Comments:** 6 Pages.

The objective of this work is, in the first place, to integrate in a fusion process, using a hybrid DSmT model, both the contextual information obtained from a supervised ICM classification with constraints and the temporal information from two images taken at two different dates.

**Category:** General Mathematics

[104] **viXra:1701.0120 [pdf]**
*submitted on 2017-01-03 08:59:21*

**Authors:** N. Kannappa, B. Fairosekani

**Comments:** 6 Pages.

In this paper, we introduce the Smarandache-2-algebraic structure of soft neutrosophic near-ring, namely Smarandache-soft neutrosophic near-ring.

**Category:** General Mathematics

[103] **viXra:1701.0119 [pdf]**
*submitted on 2017-01-03 09:03:57*

**Authors:** S. Maharasi, V. Mahalakshmi, S. Jayalakshmi

**Comments:** 10 Pages.

In this paper, with a new idea, we define the weak bi-ideal and investigate some of its properties. We characterize weak bi-ideals by bi-ideals of a bi-near-ring.

**Category:** General Mathematics

[102] **viXra:1701.0118 [pdf]**
*submitted on 2017-01-03 09:05:05*

**Authors:** Alexandre Delahaye

**Comments:** 159 Pages.

Faced with the increasing spatial resolution of satellite optical sensors, new strategies must be developed to classify remote-sensing images.

**Category:** General Mathematics

[101] **viXra:1701.0117 [pdf]**
*submitted on 2017-01-03 09:06:23*

**Authors:** Sebastian Breker, Bernhard Sick

**Comments:** 10 Pages.

The use of expert knowledge is always more or less afflicted with uncertainties, for many reasons: expert knowledge may be imprecise, imperfect, or erroneous, for instance.

**Category:** General Mathematics

[100] **viXra:1701.0116 [pdf]**
*submitted on 2017-01-03 09:08:25*

**Authors:** Henry Ibstedt

**Comments:** 21 Pages.

This study has been inspired by questions asked by Ch. Ashbacher in the Journal of Recreational Mathematics.

**Category:** General Mathematics

[99] **viXra:1701.0112 [pdf]**
*submitted on 2017-01-03 09:29:04*

**Authors:** Steven A. Israel, Erik Blasch

**Comments:** 26 Pages.

Decision support systems enable users to quickly assess data, but they require significant resources to develop and are often relevant to limited domains. This chapter identifies the implicit assumptions that require contextual analysis for decision support systems to be effective for providing a relevant threat assessment.

**Category:** General Mathematics

[98] **viXra:1701.0111 [pdf]**
*submitted on 2017-01-03 09:32:34*

**Authors:** Sarah Calderwood, Kevin McAreavey, Weiru Liu, Jun Hong

**Comments:** 26 Pages.

There has been much interest in the Belief-Desire-Intention (BDI) agent-based model for developing scalable intelligent systems, e.g. using the AgentSpeak framework. However, reasoning from sensor information in these large-scale systems remains a significant challenge.

**Category:** General Mathematics

[97] **viXra:1701.0110 [pdf]**
*submitted on 2017-01-03 09:34:10*

**Authors:** Chunfang Liua, YueSheng Luo

**Comments:** 7 Pages.

The simplified neutrosophic set (SNS) is a generalization of the fuzzy set that is designed for some incomplete, uncertain and inconsistent situations in which each element has different truth membership, indeterminacy membership and falsity membership functions.

**Category:** General Mathematics

[96] **viXra:1701.0109 [pdf]**
*submitted on 2017-01-03 09:36:18*

**Authors:** Lilian Shi

**Comments:** 17 Pages.

In order to process the vagueness in vibration fault diagnosis of rolling bearing, a new correlation coefficient of simplified neutrosophic sets (SNSs) is proposed. Vibration signals of rolling bearings are acquired by an acceleration sensor, and a morphological filter is used to reduce the noise effect.

**Category:** General Mathematics

[95] **viXra:1701.0108 [pdf]**
*submitted on 2017-01-03 09:37:36*

**Authors:** Rıdvan Sahin, Peide Liu

**Comments:** 9 Pages.

As a combination of the hesitant fuzzy set (HFS) and the single-valued neutrosophic set (SVNS), the single-valued neutrosophic hesitant fuzzy set (SVNHFS) is an important concept for handling the uncertain and vague information of real life. It consists of three membership functions incorporating hesitancy: the truth-hesitancy membership function, the indeterminacy-hesitancy membership function and the falsity-hesitancy membership function, and it encompasses the fuzzy set, intuitionistic fuzzy set (IFS), HFS, dual hesitant fuzzy set (DHFS) and SVNS. Correlation and the correlation coefficient have been applied widely in many research domains and practical fields.

**Category:** General Mathematics

[94] **viXra:1701.0107 [pdf]**
*submitted on 2017-01-03 09:38:36*

**Authors:** Jun YE

**Comments:** 24 Pages.

This paper proposes the concept of an interval neutrosophic hesitant fuzzy set (INHFS) and the operational relations of INHFSs. Then, we develop correlation coefficients of INHFSs and investigate the relation between the similarity measures and the correlation coefficients.

**Category:** General Mathematics

[93] **viXra:1701.0103 [pdf]**
*submitted on 2017-01-03 09:43:16*

**Authors:** I. R. Sumathi, I.Arockiarani

**Comments:** 10 Pages.

In this paper we have introduced the concept of cosine similarity measures for neutrosophic soft set and interval valued neutrosophic soft set. An application is given to show its practicality and effectiveness.
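The per-element cosine form underlying such measures can be sketched as follows for the plain single-valued case (the paper's soft-set and interval-valued variants add parameter and interval structure on top of this skeleton, so this is only an illustration).

```python
import math

def cosine_similarity_ns(A, B):
    """Cosine similarity between two single-valued neutrosophic sets,
    each given as a list of (t, i, f) triples over the same universe:
    the per-element cosine of the two membership vectors, averaged."""
    assert len(A) == len(B)
    total = 0.0
    for (t1, i1, f1), (t2, i2, f2) in zip(A, B):
        num = t1 * t2 + i1 * i2 + f1 * f2
        den = (math.sqrt(t1**2 + i1**2 + f1**2) *
               math.sqrt(t2**2 + i2**2 + f2**2))
        total += num / den if den else 1.0  # two zero triples count as identical
    return total / len(A)

A = [(0.8, 0.1, 0.1), (0.6, 0.2, 0.3)]
print(cosine_similarity_ns(A, A))  # identical sets -> ~1.0
```

The measure lies in [0, 1] and equals 1 exactly when the two membership vectors are parallel element-wise.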

**Category:** General Mathematics

[92] **viXra:1701.0102 [pdf]**
*submitted on 2017-01-03 09:45:00*

**Authors:** Xiao-hui Wu, Jian-qiang Wang, Juan-juan Peng, Xiao-hong Chen

**Comments:** 13 Pages.

Simplified neutrosophic sets (SNSs) can effectively solve the uncertainty problems, especially those involving the indeterminate and inconsistent information. Considering the advantages of SNSs, a new approach for multi-criteria decision-making (MCDM) problems is developed under the simplified neutrosophic environment.

**Category:** General Mathematics

[91] **viXra:1701.0101 [pdf]**
*submitted on 2017-01-03 09:46:24*

**Authors:** Jean Dezert, Deqiang Han, Jean-Marc Tacnet, Simon Carladous, Yi Yang

**Comments:** 9 Pages.

In this paper we propose a new general method for decision-making under uncertainty based on the belief interval distance. We show through several simple illustrative examples how this method works and its ability to provide reasonable results.

**Category:** General Mathematics

[90] **viXra:1701.0100 [pdf]**
*submitted on 2017-01-03 09:48:11*

**Authors:** A. Sobchak, E. Shostak, T. Tseplyaeva, O. Popova, A. Firsova

**Comments:** 8 Pages.

Under conditions of post-industrial society, a steady trend is the formation and support of the functioning of virtual instrument-making enterprises (VIE) in the field of high technologies.

**Category:** General Mathematics

[89] **viXra:1701.0098 [pdf]**
*submitted on 2017-01-03 09:51:13*

**Authors:** Mehmet Serhat CAN, Ömerül Faruk ÖZGÜVEN

**Comments:** 13 Pages.

In this paper, we propose a method based on the fuzzy logic controller (FLC), created by using neutrosophic membership values. This method is named the proportional integral derivative-neutrosophic valued fuzzy logic controller (PID-NFLC).

**Category:** General Mathematics

[88] **viXra:1701.0097 [pdf]**
*submitted on 2017-01-03 09:52:40*

**Authors:** Preeti Meena, Malti Bnasal

**Comments:** 6 Pages.

This research paper proposes an intellectual method for the classification of different types of Electromyography (EMG) signals like normal, myopathy and neuropathy signals. Inside the human body, contraction of muscles and nerves occur at every second.

**Category:** General Mathematics

[87] **viXra:1701.0096 [pdf]**
*submitted on 2017-01-03 09:53:45*

**Authors:** Ilanthenral Kandasamy

**Comments:** 20 Pages.

Neutrosophy (neutrosophic logic) is used to represent uncertain, indeterminate, and inconsistent information available in the real world.

**Category:** General Mathematics

[86] **viXra:1701.0095 [pdf]**
*submitted on 2017-01-03 09:56:04*

**Authors:** Oussama Derbel, Rene Jr. Landry

**Comments:** 6 Pages.

This paper proposes a driving risk model based on the information given by the driver-vehicle-environment entities.

**Category:** General Mathematics

[85] **viXra:1701.0094 [pdf]**
*submitted on 2017-01-03 09:57:19*

**Authors:** Guo Qiang, He You

**Comments:** 7 Pages.

To improve the radar emitter recognition rate in cases where radar emitter characteristic parameters overlap with each other and multiple modes exist, a DSm (Dezert-Smarandache) evidence modeling and radar emitter fusion recognition method based on the cloud model is proposed.

**Category:** General Mathematics

[84] **viXra:1701.0093 [pdf]**
*submitted on 2017-01-03 09:59:09*

**Authors:** Suha Yılmaz, Umit Ziya Savcı

**Comments:** 4 Pages.

In this work, curves of constant breadth are defined and some characterizations of closed dual curves of constant breadth according to the Bishop frame are presented in dual Euclidean space. Also, a third-order vectorial differential equation is obtained in dual Euclidean 3-space.

**Category:** General Mathematics

[83] **viXra:1701.0092 [pdf]**
*submitted on 2017-01-03 10:01:00*

**Authors:** Tanju Kahraman, Hasan Hüseyin Ugurlu

**Comments:** 13 Pages.

In this paper, we give a Darboux approximation for dual Smarandache curves of a timelike curve on the unit dual Lorentzian sphere.

**Category:** General Mathematics

[82] **viXra:1701.0091 [pdf]**
*submitted on 2017-01-03 10:02:17*

**Authors:** Gheorghe Săvoiu, Vasile Dinu, Emilia Gogu, Hosney Zurub

**Comments:** 16 Pages.

This paper describes, in its introduction, its main objective and some of its investigative premises, emphasizing the need to address micro- and macroeconomic models using the major principles of statistical thinking.

**Category:** General Mathematics

[81] **viXra:1701.0089 [pdf]**
*submitted on 2017-01-03 10:03:01*

**Authors:** George Rajna

**Comments:** 28 Pages.

Memory chips are among the most basic components in computers. The random access memory is where processors temporarily store their data, which is a crucial function. Researchers from Dresden and Basel have now managed to lay the foundation for a new memory chip concept. [20] Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. 
Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". [11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons.

**Category:** Data Structures and Algorithms

[80] **viXra:1701.0088 [pdf]**
*submitted on 2017-01-03 10:04:18*

**Authors:** Hossein Jafari, Xiangfang Li, Lijun Qian

**Comments:** 8 Pages.

Dezert-Smarandache Theory (DSmT) of plausible and paradoxical reasoning has excellent performance when the data contain uncertainty or conflicts.

**Category:** General Mathematics

[79] **viXra:1701.0087 [pdf]**
*submitted on 2017-01-03 10:06:01*

**Authors:** A.S. Shankar, A. Asokan, D. Sivakumar

**Comments:** 9 Pages.

Brain tumor is a serious disease which commonly occurs in human beings. Studies based on brain tumors confirm that people affected by brain tumors die due to erroneous detection. In this paper, an enhanced Fuzzy C-Means (FCM) segmentation technique is proposed for detecting brain tumors.
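The baseline Fuzzy C-Means iteration that such enhanced variants build on can be sketched on 1-D data (the paper's enhancement and the image-specific details are not reproduced here; the initialization scheme below is an assumption for determinism).

```python
def fcm(points, c=2, m=2.0, iters=100):
    """Plain fuzzy c-means on 1-D data: alternate the membership update
    u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)) with the weighted center
    update v_i = sum_k u_ik^m x_k / sum_k u_ik^m."""
    pts = sorted(points)
    # deterministic init: c points spread evenly over the sorted data
    centers = [pts[k * (len(pts) - 1) // (c - 1)] for k in range(c)]
    for _ in range(iters):
        U = []
        for x in points:
            d = [abs(x - v) or 1e-12 for v in centers]  # avoid zero distance
            U.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(c))
                      for i in range(c)])
        centers = [sum(U[k][i] ** m * points[k] for k in range(len(points))) /
                   sum(U[k][i] ** m for k in range(len(points)))
                   for i in range(c)]
    return sorted(centers)

data = [0.0, 0.1, 0.2, 9.9, 10.0, 10.1]
print(fcm(data))  # two centers, one near 0.1 and one near 10.0
```

In image segmentation the `points` would be pixel intensities (or feature vectors), and the fuzzy memberships `U` give the soft assignment of each pixel to tissue/lesion classes.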

**Category:** General Mathematics

[78] **viXra:1701.0086 [pdf]**
*submitted on 2017-01-03 10:09:09*

**Authors:** Simon Carladous, Jean-Marc Tacnet, Jean Dezert, Deqiang Han, Mireille Batton-Hubert

**Comments:** 8 Pages.

Real life Decision-Making Problems (DMPs) depend on several criteria for which precise or imperfect evaluation can be provided. Economic aspects are also of main importance to compare and choose strategies or measures. For instance, in mountainous areas, risk managers must rank several protective actions against torrential floods to choose the best one.

**Category:** General Mathematics

[77] **viXra:1701.0085 [pdf]**
*submitted on 2017-01-03 10:10:02*

**Authors:** Yun Ye

**Comments:** 18 Pages.

An interval neutrosophic set (INS) is a subclass of a neutrosophic set and a generalization of an interval-valued intuitionistic fuzzy set; the characteristics of an INS are independently described by the interval numbers of its truth-membership, indeterminacy-membership, and falsity-membership degrees.

**Category:** General Mathematics

[76] **viXra:1701.0084 [pdf]**
*submitted on 2017-01-03 10:14:23*

**Authors:** Refaul Ferdous, Faisal Khan, Rehan Sadiq, Paul Amyotte, Brian Veitch

**Comments:** 22 Pages.

Process systems in chemical engineering are infamous for fugitive emissions, toxic releases, fire and explosions, and operation disruptions. These incidents have considerable potential to cause an accident and incur environmental and property damage, economic loss, sickness, injury, or death of workers in the vicinity.

**Category:** General Mathematics

[75] **viXra:1701.0083 [pdf]**
*submitted on 2017-01-03 10:15:43*

**Authors:** Yun Ye

**Comments:** 14 Pages.

This paper proposes a dimension root distance and its similarity measure of single-valued neutrosophic sets (SVNSs), and then develops the fault diagnosis method of hydraulic turbine by using the dimension root similarity measure of SVNSs.

**Category:** General Mathematics

[74] **viXra:1701.0082 [pdf]**
*submitted on 2017-01-03 10:18:45*

**Authors:** Zhigang Yao, Jean-Marie Le Bars, Christophe Charrier, Christophe Rosenberger

**Comments:** 18 Pages.

This article chiefly focuses on Fingerprint Quality Assessment (FQA) applied to the Automatic Fingerprint Identification System (AFIS). In our research work, different FQA solutions proposed so far are compared by using several quality metrics selected from the existing studies.

**Category:** General Mathematics

[73] **viXra:1701.0081 [pdf]**
*submitted on 2017-01-03 10:20:43*

**Authors:** E.M.El-Nakeeb, Hewayda El Ghawalby, A.A.Salama, S.A.El-Hafeez

**Comments:** 23 Pages.

The aim in this paper is to introduce a new approach to mathematical morphology based on neutrosophic set theory. Basic definitions for neutrosophic morphological operations are extracted and a study of their algebraic properties is presented. In our work we demonstrate that neutrosophic morphological operations inherit properties and restrictions of fuzzy mathematical morphology.

**Category:** General Mathematics

[72] **viXra:1701.0079 [pdf]**
*submitted on 2017-01-03 10:23:09*

**Authors:** S.Mohana Prakash, P.Betty, K.Sivanarulselvan

**Comments:** 5 Pages.

Biometrics is used to uniquely identify a person based on physical and behavioural characteristics. A unimodal biometric system suffers from various problems such as degrees of freedom, spoof attacks, non-universality, noisy data and error rates.

**Category:** General Mathematics

[71] **viXra:1701.0078 [pdf]**
*submitted on 2017-01-03 10:24:55*

**Authors:** J. Martina Jency, I. Arockiarani

**Comments:** 9 Pages.

This paper introduces the concept of fuzzy neutrosophic equivalence relations and discusses some of their properties. Also we define the fuzzy neutrosophic transitive closure and investigate its properties.

**Category:** General Mathematics

[70] **viXra:1701.0076 [pdf]**
*submitted on 2017-01-03 10:27:24*

**Authors:** I. Arockiarani, J. Martina Jency

**Comments:** 14 Pages.

The focus of this paper is to present the concept of fuzzy neutrosophic relations. Further we study the composition of fuzzy neutrosophic relations with the choice of t-norms and t-conorms and characterize their properties.

**Category:** General Mathematics

[69] **viXra:1701.0075 [pdf]**
*submitted on 2017-01-03 10:28:27*

**Authors:** I.R.Sumath, I.Arockiarani

**Comments:** 7 Pages.

In this paper we propose the three types of similarity measures between fuzzy neutrosophic soft sets based on value matrix of fuzzy neutrosophic soft sets. Furthermore, we demonstrate the efficiency of the proposed similarity measures through the application in decision making.

**Category:** General Mathematics

[68] **viXra:1701.0074 [pdf]**
*submitted on 2017-01-03 10:29:19*

**Authors:** J. Martina Jency, I. Arockiarani

**Comments:** 12 Pages.

In this paper we introduce the notion of fuzzy neutrosophic subgroups. Also we obtain the fuzzy neutrosophic subgroups generated by fuzzy neutrosophic set and investigate some of their properties.

**Category:** General Mathematics

[67] **viXra:1701.0073 [pdf]**
*submitted on 2017-01-03 10:30:31*

**Authors:** Sorin Nadaban, Simona Dzitac, Ioan Dzitac

**Comments:** 9 Pages.

The aim of this survey paper is to offer a general view of the developments of fuzzy TOPSIS methods. We begin with a literature review and we explore different fuzzy models that have been applied to the decision making field. Finally, we present some applications of fuzzy TOPSIS.

**Category:** General Mathematics

[66] **viXra:1701.0072 [pdf]**
*submitted on 2017-01-03 10:31:36*

**Authors:** Adrian Rubio-Solis, George Panoutsos

**Comments:** 8 Pages.

In this paper we introduce a fuzzy uncertainty assessment methodology based on Neutrosophic Sets (NS). This is achieved via the implementation of a Radial Basis Function Neural-Network (RBF-NN) for multiclass classification that is functionally equivalent to a class of Fuzzy Logic Systems (FLS).

**Category:** General Mathematics

[65] **viXra:1701.0071 [pdf]**
*submitted on 2017-01-03 10:39:35*

**Authors:** Hee Sik Kim Neggers, Keum Sook So

**Comments:** 10 Pages.

In this paper, we introduce the notion of generalized Fibonacci sequences over a groupoid and discuss it in particular for the case where the groupoid contains idempotents and pre-idempotents. Using the notion of Smarandache-type P-algebra, we obtain several relations on groupoids which are derived from generalized Fibonacci sequences.
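The underlying recursion is easy to state: a generalized Fibonacci sequence over a groupoid (X, *) iterates the binary operation in place of addition. A minimal sketch follows; the ordering convention for a non-commutative operation is the paper's own choice, and the example operation on Z_4 is purely illustrative.

```python
def groupoid_fibonacci(op, x0, x1, n):
    """Generalized Fibonacci sequence over a groupoid (X, op):
    x_{k+2} = op(x_k, x_{k+1}), starting from x0, x1.
    (In a non-commutative groupoid the order of the arguments matters;
    the paper fixes its own convention.)"""
    seq = [x0, x1]
    while len(seq) < n:
        seq.append(op(seq[-2], seq[-1]))
    return seq

# Over (Z, +) this recovers the ordinary Fibonacci numbers:
print(groupoid_fibonacci(lambda a, b: a + b, 0, 1, 8))  # [0, 1, 1, 2, 3, 5, 8, 13]
# Over a finite groupoid, e.g. Z_4 with a*b = (a + 2*b) mod 4, the
# sequence is eventually periodic:
print(groupoid_fibonacci(lambda a, b: (a + 2 * b) % 4, 1, 2, 8))
```

On a finite groupoid every such sequence eventually loops, which is what makes idempotents and pre-idempotents relevant to its structure.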

**Category:** General Mathematics

[64] **viXra:1701.0070 [pdf]**
*submitted on 2017-01-03 10:41:07*

**Authors:** Bingyi Kang, Hongming Mo, Rehan Sadiq, Yong Deng

**Comments:** 11 Pages.

A fuzzy cognitive map (FCM) is a cognitive map with relations between its elements. FCM has been widely used in many applications.

**Category:** General Mathematics

[63] **viXra:1701.0068 [pdf]**
*submitted on 2017-01-03 10:44:15*

**Authors:** Yilin Dong, Xinde Li, Jean Dezert, Pei Li, Xianghui Li

**Comments:** 41 Pages.

Recent studies of alternative Probabilistic Transformation (PT) in Dempster-Shafer (DS) theory have mainly focused on investigating various schemes for assigning the mass of compound focal elements to each singleton in order to obtain a Bayesian belief function for decision making problems.

**Category:** General Mathematics

[62] **viXra:1701.0067 [pdf]**
*submitted on 2017-01-03 05:02:46*

**Authors:** Simon Maskell

**Comments:** 19 Pages.

The Dezert-Smarandache theory (DSmT) and transferable belief model (TBM) both address concerns with the Bayesian methodology as applied to applications involving the fusion of uncertain, imprecise and conflicting information. In this paper, we revisit these concerns regarding the Bayesian methodology in the light of recent developments in the context of the DSmT and TBM. We show that, by exploiting recent advances in the Bayesian research arena, one can devise and analyse Bayesian models that have the same emergent properties as DSmT and TBM. Specifically, we define Bayesian models that articulate uncertainty over the value of probabilities (including multimodal distributions that result from conflicting information) and we use a minimum expected cost criterion to facilitate making decisions that involve hypotheses that are not mutually exclusive. We outline our motivation for using the Bayesian methodology and also show that the DSmT and TBM models are computationally expedient approaches to achieving the same endpoint. Our aim is to provide a conduit between these two communities such that an objective view can be shared by advocates of all the techniques.
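The minimum expected cost criterion mentioned above reduces to a one-line decision rule once a posterior over hypotheses and a cost table are fixed. The numbers below are illustrative, including a non-exclusive "A or B" decision of the kind the authors allow.

```python
def min_expected_cost(posterior, costs):
    """Pick the decision minimizing expected cost under a posterior over
    hypotheses; decisions need not correspond one-to-one with hypotheses
    (e.g. a hedged 'declare A or B' decision is allowed)."""
    def expected(d):
        return sum(posterior[h] * costs[d][h] for h in posterior)
    return min(costs, key=expected)

posterior = {"A": 0.45, "B": 0.45, "C": 0.10}
costs = {   # cost of each decision under each true hypothesis (illustrative)
    "declare A":      {"A": 0.0, "B": 1.0, "C": 1.0},
    "declare B":      {"A": 1.0, "B": 0.0, "C": 1.0},
    "declare C":      {"A": 1.0, "B": 1.0, "C": 0.0},
    "declare A or B": {"A": 0.2, "B": 0.2, "C": 1.0},  # cheaper when A/B conflict
}
print(min_expected_cost(posterior, costs))  # "declare A or B"
```

Because the evidence for A and B conflicts, the hedged compound decision wins (expected cost 0.28 versus 0.55 for either singleton), mirroring how DSmT assigns belief to compound focal elements.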

**Category:** General Mathematics

[61] **viXra:1701.0066 [pdf]**
*submitted on 2017-01-03 05:06:21*

**Authors:** Xinde Li, Fengyu Wang

**Comments:** 24 Pages.

Aiming at the counterintuitive phenomena of the Dempster-Shafer method in combining highly conflicting evidences, a combination method of evidences based on clustering analysis is proposed in this paper. At first, the cause of conflicts is disclosed from the point of view of internal and external contradiction. And then, a new similarity measure based on it is proposed by comprehensively considering the Pignistic distance and the sequence according to the size of the basic belief assignments over focal elements.

**Category:** General Mathematics

[60] **viXra:1701.0065 [pdf]**
*submitted on 2017-01-03 05:12:15*

**Authors:** Temitope Gbolahan Jaiyeola

**Comments:** 7 Pages.

The present study further strengthens the use of the Keedwell CIPQ against attack on a system by the use of the Smarandache Keedwell CIPQ for cryptography in a similar spirit in which the cross inverse property has been used by Keedwell. This is done as follows. By constructing two S-isotopic S-quasigroups(loops) U and V such that their Smarandache automorphism groups are not trivial, it is shown that U is a SCIPQ(SCIPL) if and only if V is a SCIPQ(SCIPL). Explanations and procedures are given on how these SCIPQs can be used to double encrypt information.

**Category:** General Mathematics

[59] **viXra:1701.0064 [pdf]**
*submitted on 2017-01-03 05:13:40*

**Authors:** Yun Ye

**Comments:** 11 Pages.

Based on the concept of neutrosophic linguistic numbers (NLNs) in the symbolic neutrosophic theory presented by Smarandache in 2015, this paper first proposes basic operational laws of NLNs and the expected value of an NLN to rank NLNs. Then, we propose the NLN weighted arithmetic average (NLNWAA) and NLN weighted geometric average (NLNWGA) operators and discuss their properties. Further, we establish a multiple attribute group decision-making (MAGDM) method by using the NLNWAA and NLNWGA operators under an NLN environment. Finally, an illustrative example on a decision-making problem of manufacturing alternatives in a flexible manufacturing system is given to show the application of the proposed MAGDM method.
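The aggregation skeleton of a weighted arithmetic average operator can be sketched by treating each NLN, illustratively, as a triple of linguistic-scale indices. Note this is only the generic weighted mean; the actual NLNWAA/NLNWGA operators are defined through the paper's own operational laws.

```python
def nlnwaa(nlns, weights):
    """Weighted arithmetic average over a list of triples of
    linguistic-scale indices (t, i, f) -- the aggregation skeleton only;
    the paper's NLNWAA uses its own NLN operational laws."""
    assert abs(sum(weights) - 1.0) < 1e-9  # weights must be normalized
    return tuple(sum(w * x[k] for w, x in zip(weights, nlns))
                 for k in range(3))

ratings = [(5, 2, 1), (4, 3, 2), (6, 1, 1)]   # three experts, scale 0..6
weights = [0.5, 0.3, 0.2]
print(nlnwaa(ratings, weights))  # approximately (4.9, 2.1, 1.3)
```

In the MAGDM setting each triple would be one decision maker's evaluation of an alternative, and the aggregated triple is then ranked via an expected-value function.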

**Category:** General Mathematics

[58] **viXra:1701.0063 [pdf]**
*submitted on 2017-01-03 05:15:40*

**Authors:** Mutasem K. Alsmadi

**Comments:** 10 Pages.

It is really important to diagnose a jaw tumor in its early stages to improve its prognosis. A differential diagnosis can be performed using X-ray images; therefore, accurate and fully automatic jaw lesion image segmentation is a challenging and essential task. The aim of this work was to develop a novel, fully automatic and effective method for jaw lesion segmentation in panoramic X-ray images. A hybrid Fuzzy C-Means and Neutrosophic approach is used for segmenting the jaw image and detecting the jaw lesion region in panoramic X-ray images, which may help in diagnosing jaw lesions.

**Category:** General Mathematics

[57] **viXra:1701.0062 [pdf]**
*submitted on 2017-01-03 05:18:40*

**Authors:** Xindong Peng, Chong Liu

**Comments:** 11 Pages.

This paper presents two novel single-valued neutrosophic soft set (SVNSS) methods. First, we initiate a new axiomatic definition of the single-valued neutrosophic similarity measure, which is expressed by a single-valued neutrosophic number (SVNN) that will reduce the information loss and retain more of the original information. Then, the objective weights of the various parameters are determined via grey system theory. Combining objective weights with subjective weights, we present the combined weights, which can reflect both the subjective considerations of the decision maker and the objective information. Later, we present two algorithms to solve the decision making problem based on Evaluation based on Distance from Average Solution (EDAS) and the similarity measure. Finally, the effectiveness and feasibility of the approaches are demonstrated by a numerical example.
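The crisp core of the EDAS step (scoring by distance from the column-average solution) can be sketched as follows, with plain numbers standing in for the single-valued neutrosophic evaluations and all criteria assumed to be benefit criteria.

```python
def edas(matrix, weights):
    """EDAS (Evaluation based on Distance from Average Solution) for
    benefit criteria: score alternatives by weighted positive/negative
    distances from the column-average solution."""
    m, n = len(matrix), len(matrix[0])
    avg = [sum(row[j] for row in matrix) / m for j in range(n)]
    sp, sn = [], []
    for row in matrix:
        pda = [max(0.0, row[j] - avg[j]) / avg[j] for j in range(n)]
        nda = [max(0.0, avg[j] - row[j]) / avg[j] for j in range(n)]
        sp.append(sum(w * p for w, p in zip(weights, pda)))
        sn.append(sum(w * q for w, q in zip(weights, nda)))
    nsp = [s / max(sp) if max(sp) else 1.0 for s in sp]
    nsn = [1.0 - s / max(sn) if max(sn) else 1.0 for s in sn]
    return [(p + q) / 2.0 for p, q in zip(nsp, nsn)]

matrix = [[7, 6, 8],   # alternative 1 (crisp stand-ins for the
          [5, 9, 6],   #  single-valued neutrosophic evaluations)
          [8, 7, 7]]
scores = edas(matrix, [0.4, 0.3, 0.3])
print(max(range(3), key=lambda i: scores[i]))  # index of the best alternative
```

In the paper's setting the entries would first be mapped from SVNNs to comparable values (e.g. via a score function) before this distance-from-average ranking is applied.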

**Category:** General Mathematics

[56] **viXra:1701.0061 [pdf]**
*submitted on 2017-01-03 05:22:36*

**Authors:** Henry Ibstedt

**Comments:** 13 Pages.

This study is an extension of work done by Ch. Ashbacher. Iteration results have been refined in terms of invariants and loops.

**Category:** General Mathematics

[55] **viXra:1701.0060 [pdf]**
*submitted on 2017-01-03 05:23:52*

**Authors:** Ishmeet Kaur, Lalit Mann Singh

**Comments:** 7 Pages.

Diabetic Retinopathy is a disease which causes a menace to the eyesight. The detection of this disease at an early stage can save the person from vision loss. The examination of the retinal blood vessel structure can help to detect the disease, so segmentation of the retinal blood vessel vasculature is important and is appreciated by ophthalmologists. In this paper, we present an approach to blood vessel segmentation using computational intelligence by deploying fuzzy c-means and neutrosophic sets.

**Category:** General Mathematics

[54] **viXra:1701.0059 [pdf]**
*submitted on 2017-01-03 05:25:42*

**Authors:** Yildiray Celik

**Comments:** 9 Pages.

The concept of the neutrosophic soft set is a new mathematical tool for dealing with uncertainties that is free from the difficulties affecting existing methods. The theory has rich potential for applications in several directions. In this paper, a new approach is proposed to construct a decision method for medical diagnosis by using fuzzy neutrosophic soft sets. Also, we develop a technique to diagnose which patient is suffering from what disease. Our data with respect to the case study have been provided by a medical center in Ordu, Turkey.

**Category:** General Mathematics

[53] **viXra:1701.0058 [pdf]**
*submitted on 2017-01-03 05:26:46*

**Authors:** Lihua Yang, Baolin Li

**Comments:** 10 Pages.

As a generalization of intuitionistic fuzzy sets, neutrosophic sets (NSs) can better handle incomplete, indeterminate and inconsistent information, and they have attracted widespread attention from researchers.

**Category:** General Mathematics

[52] **viXra:1701.0057 [pdf]**
*submitted on 2017-01-03 05:28:31*

**Authors:** Rodolfo Alvarado-Cervantes, Edgardo M. Felipe-Riveron, Vladislav Khartchenko, Oleksiy Pogrebnyak

**Comments:** 17 Pages.

In this article, we present an adaptive color similarity function defined in a modified hue-saturation-intensity color space, which can be used directly as a metric to obtain pixel-wise segmentation of color images, among other applications. The color information of every pixel is integrated as a unit by an adaptive similarity function, thus avoiding color information scattering.

**Category:** General Mathematics

[51] **viXra:1701.0056 [pdf]**
*submitted on 2017-01-03 05:32:36*

**Authors:** Stefan Korolija, Eva Tuba, Milan Tuba

**Comments:** 8 Pages.

Digital images and digital image processing have been widely researched in the past decades, and medical images hold a special place in this field. Magnetic resonance images are a very important class of medical images, and their enhancement is very significant for the diagnostic process.

**Category:** General Mathematics

[50] **viXra:1701.0055 [pdf]**
*submitted on 2017-01-03 05:34:01*

**Authors:** Ashit Kumar Dutta

**Comments:** 3 Pages.

Cancer is considered one of the most dangerous diseases in the world. Research shows that there is no medicine to cure the disease. Chemotherapy is one of the treatments used to prolong the patient's lifetime, but its side effects cause further problems for the patient. Cognitive maps have provided solutions to many complex problems.

**Category:** General Mathematics

[49] **viXra:1701.0054 [pdf]**
*submitted on 2017-01-03 05:35:29*

**Authors:** Anchal Sharma, Shaveta Rani

**Comments:** 5 Pages.

Conceptual segmentation is a critical technique in medical imaging. The processes of identification and division of the optic disc and veins are the fundamental steps for the analysis of several diseases that cause visual deficiency, such as diabetic retinopathy, hypertension, glaucoma and other blindness-causing ailments.

**Category:** General Mathematics

[48] **viXra:1701.0053 [pdf]**
*submitted on 2017-01-03 05:38:14*

**Authors:** Guo Qiang, He You, Guan Xin, Gai Ming-jiu

**Comments:** 7 Pages.

A DSmT Approximate Reasoning Method on the Condition of Non-zero Multiple Focal Elements.

**Category:** General Mathematics

[47] **viXra:1701.0052 [pdf]**
*submitted on 2017-01-03 05:50:29*

**Authors:** Mohammadreza Karimipoor, Vahid Abolghasemi, Saideh Ferdowsi

**Comments:** 7 Pages.

In this paper, a denoising method based on dictionary learning is proposed. With the increasing use of digital images, methods that can remove noise based on image content, and not restrictedly based on statistical properties, have been widely extended. The major weakness of dictionary learning methods is that all of them require a long training process and a very large storage memory for storing features extracted from the training images.

**Category:** General Mathematics

[46] **viXra:1701.0051 [pdf]**
*submitted on 2017-01-03 05:53:10*

**Authors:** Edi Sutoyo, Mungad Mungad, Suraya Hamid, Tutut Herawan

**Comments:** 31 Pages.

Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc.

**Category:** General Mathematics

[45] **viXra:1701.0050 [pdf]**
*submitted on 2017-01-03 05:54:33*

**Authors:** D. Shona, M. Senthilkumar

**Comments:** 6 Pages.

With the hasty intensification of Internet communication and the accessibility of systems to network infringement, network security has become requisite. This paper focuses on the development of an efficient IDS in MANETs.

**Category:** General Mathematics

[44] **viXra:1701.0049 [pdf]**
*submitted on 2017-01-03 05:59:26*

**Authors:** B. Kavitha, D. Karthikeyan, P. Sheeba Maybell

**Comments:** 9 Pages.

In the real world, it is a fact that one must deal with uncertainty where security is concerned.

**Category:** General Mathematics

[43] **viXra:1701.0048 [pdf]**
*submitted on 2017-01-03 06:00:59*

**Authors:** Mumtaz Ali, Nguyen Van Minh, Le Hoang Son

**Comments:** 76 Pages.

The neutrosophic set has the ability to handle uncertain, incomplete, inconsistent, and indeterminate information in a more accurate way. In this paper, we propose a neutrosophic recommender system to predict diseases based on neutrosophic sets, which includes a single-criterion neutrosophic recommender system (SCNRS) and a multi-criterion neutrosophic recommender system (MC-NRS).

**Category:** General Mathematics

[42] **viXra:1701.0047 [pdf]**
*submitted on 2017-01-03 06:05:13*

**Authors:** Qiang Guo, You He, Xin Guan, Li Deng, Lina Pan, Tao Jian

**Comments:** 11 Pages.

The computational complexity of Dezert-Smarandache Theory (DSmT) increases exponentially with the linear increment of the number of elements in the discernment frame, and this limits the wide application and development of DSmT.

**Category:** General Mathematics

[41] **viXra:1701.0046 [pdf]**
*submitted on 2017-01-03 06:09:12*

**Authors:** Qiang Guo, You He, Tao Jian, Haipeng Wang, Shutao Xia

**Comments:** 14 Pages.

Due to the huge computational complexity of Dezert-Smarandache Theory, its applications, especially for multi-source (more than two sources) complex fusion problems, have been limited.

**Category:** General Mathematics

[40] **viXra:1701.0045 [pdf]**
*submitted on 2017-01-03 06:11:37*

**Authors:** A. Thamaraiselvi, R. Santhi

**Comments:** 10 Pages.

Neutrosophic sets have been introduced as a generalization of crisp sets, fuzzy sets, and intuitionistic fuzzy sets to represent uncertain, inconsistent, and incomplete information about a real-world problem. For the first time, this paper attempts to introduce the mathematical representation of a transportation problem in a neutrosophic environment.
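For reference, the classical (crisp) transportation model that the paper lifts into a neutrosophic environment admits a simple initial feasible solution via the north-west corner rule; the costs and quantities below are illustrative.

```python
def northwest_corner(supply, demand):
    """Initial basic feasible solution of a (crisp) balanced transportation
    problem by the north-west corner rule; the paper extends such models
    with neutrosophic data, which this classical skeleton does not cover."""
    supply, demand = list(supply), list(demand)
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        q = min(supply[i], demand[j])   # ship as much as the corner allows
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            i += 1                      # source exhausted: move down
        else:
            j += 1                      # destination satisfied: move right
    return alloc

supply, demand = [20, 30], [10, 25, 15]   # balanced: both sum to 50
alloc = northwest_corner(supply, demand)
cost = [[8, 6, 10], [9, 12, 13]]          # illustrative unit costs
total = sum(cost[i][j] * alloc[i][j] for i in range(2) for j in range(3))
print(alloc, total)
```

This only gives a starting allocation; an optimality step (e.g. MODI/stepping-stone) would follow, and in the neutrosophic setting the unit costs themselves carry truth/indeterminacy/falsity structure.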

**Category:** General Mathematics

[39] **viXra:1701.0044 [pdf]**
*submitted on 2017-01-03 06:16:44*

**Authors:** Radim Jirousek, Prakash P. Shenoy

**Comments:** 29 Pages.

We propose a new definition of entropy for basic probability assignments (BPA) in the Dempster-Shafer (D-S) theory of belief functions, which is interpreted as a measure of total uncertainty in the BPA. Our definition is different from the definitions proposed by Höhle, Smets, Yager, Nguyen, Dubois-Prade, Lamata-Moral, Klir-Ramer, Klir-Parviz, Pal et al., Maeda-Ichihashi, Harmanec-Klir, Jousselme et al., and Pouly et al. We state a list of five desired properties of entropy for the D-S belief functions theory that are motivated by Shannon's definition of entropy for probability functions, together with the requirement that any definition should be consistent with the semantics of the D-S belief functions theory.
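Two classical ingredients that many of the cited entropy definitions combine, Dubois-Prade nonspecificity and the Shannon entropy of the pignistic transform, can be sketched as follows; the paper's own definition differs from these earlier proposals.

```python
import math

def nonspecificity(m):
    """Dubois-Prade nonspecificity of a BPA m: sum_A m(A) * log2 |A|.
    Zero for a Bayesian BPA (all focal elements are singletons)."""
    return sum(v * math.log2(len(A)) for A, v in m.items())

def pignistic_entropy(m):
    """Shannon entropy of the pignistic probability
    BetP(x) = sum over focal sets A containing x of m(A) / |A|."""
    bet = {}
    for A, v in m.items():
        for x in A:
            bet[x] = bet.get(x, 0.0) + v / len(A)
    return -sum(p * math.log2(p) for p in bet.values() if p > 0)

# A BPA with one singleton and one compound focal element:
m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.5}
print(nonspecificity(m), pignistic_entropy(m))
```

For this BPA the nonspecificity is 0.5 bit (half the mass sits on a two-element set) while the pignistic entropy measures the residual conflict after spreading that mass evenly.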

**Category:** General Mathematics

[38] **viXra:1701.0043 [pdf]**
*submitted on 2017-01-03 06:20:41*

**Authors:** Nouran M. Radwan, M.Badr Senousy, Alaa El M. Riad

**Comments:** 11 Pages.

There has been a sudden increase in the usage of Learning Management System applications to support the learner's learning process in higher education.

**Category:** General Mathematics

[37] **viXra:1701.0042 [pdf]**
*submitted on 2017-01-03 06:26:26*

**Authors:** Angel Garcés Doz

**Comments:** 117 Pages.

1) Using the partition function for a system in thermodynamic equilibrium, and replacing the energy-beta factor (beta = 1/[Boltzmann constant x temperature]) by the imaginary parts of the nontrivial zeros of Riemann's zeta function, a function is obtained that equates the value of the elementary electric charge with the square root of the product of the Planck mass, the electron mass, and the constant of universal gravitation.
2) Using this same partition function (thermodynamic equilibrium), the Planck constant (the Planck mass squared, multiplied by the constant of universal gravitation) is calculated with complete accuracy, as a direct function of the square of the quantized elementary electric charge.
These two fundamental equations imply the existence of a repulsive acceleration of the quantum vacuum. As a direct consequence of this repulsive acceleration, the repulsive energy of the quantum vacuum is derived directly from the general relativity equation for the critical density.
As a further consequence of this repulsive acceleration, we establish provisional equations that allow us to calculate, with good approximation, the rotation speed within galaxies, as well as the diameters of galaxies and clusters of galaxies.
To obtain these results with complete accuracy, several initial hypotheses are established. These hypotheses, one could say, become physical-mathematical theorems when they are demonstrated by empirical data. Among others, the baryon density as well as the mass density are calculated accurately.
The hypothesis-axioms are demonstrated by their practical application to the empirical calculation of the baryon density, the antimatter-matter asymmetry factor, the Higgs vacuum value, the Higgs boson mass (mh1), and the mass prediction of the stop squark, about 745-750 GeV. This boson would not have been discovered because its decay would be hidden by the almost equal masses of the particles involved in the decay. We think that this type of hidden decay is a general feature of supersymmetry.
The physico-mathematical concept of quantum entropy (entropy of information) acquires a fundamental relevance in the axiomatization of the theories of unification.
Another fundamental consequence is that time would be an emergent dimension in the part of the universe called real (finite limit velocities). In the part of the virtual universe and not observable; The time would be canceled, would acquire the value t = 0.This property would explain the instantaneity of the change of correlated observables of interlaced particles; And the instantaneous collapse of the wave function, once it is disturbed (measured, observed with energy transmission to the observed or measured system).
To get a zero time, special relativity must be extended to hyperbolic geometries (virtual quantum wormholes). This natural generalization implies the existence of infinite speeds, on the strict condition of zero energy and zero time (canceled). This has a direct relationship with soft photons and soft gravitons with zero energy, from the radiation of a black hole; And that they would solve the problem of the loss of information of the black holes.
The main equation of unification of electromagnetism and gravitation; It seems necessarily imply the existence of wormholes, as geometrical manifestation of hyperboloid of one sheet, and two sheets. In the concluding chapter we discuss this point; and others highly relevants.
The relativistic invariance of elementary quantized electric charge is automatically derived.

**Category:** Quantum Gravity and String Theory

[36] **viXra:1701.0041 [pdf]**
*submitted on 2017-01-03 06:22:34*

**Authors:** Rui Paúl, Eugenio Aguirre, Miguel García-Silvente, Rafael Muñoz-Salinas

**Comments:** 16 Pages.

This paper describes a system capable of detecting and tracking various people using a new approach based on color, stereo vision and fuzzy logic. Initially, in the people detection phase, two fuzzy systems are used to filter out false positives of a face detector.

**Category:** General Mathematics

[35] **viXra:1701.0040 [pdf]**
*submitted on 2017-01-03 06:28:31*

**Authors:** Yang Yi, Han Deqiang, Jean Dezert

**Comments:** 10 Pages.

In the theory of belief functions, the measure of uncertainty is an important concept, used for representing certain types of uncertainty incorporated in bodies of evidence, such as discord and non-specificity.
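As a generic illustration of such uncertainty measures (not code from the paper): the classical non-specificity measure of a body of evidence weights each focal element's mass by the logarithm of its cardinality, so a Bayesian mass function (singletons only) has zero non-specificity. A minimal sketch:

```python
import math

def non_specificity(m):
    """Dubois-Prade non-specificity: sum of m(A) * log2(|A|) over the
    focal elements A (zero for a Bayesian mass function)."""
    return sum(mass * math.log2(len(focal)) for focal, mass in m.items())

# Mass function on the frame {a, b, c}; keys are frozensets (focal elements).
m = {
    frozenset({"a"}): 0.5,
    frozenset({"a", "b"}): 0.3,
    frozenset({"a", "b", "c"}): 0.2,
}
print(round(non_specificity(m), 4))  # 0.617
```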

**Category:** General Mathematics

[34] **viXra:1701.0039 [pdf]**
*submitted on 2017-01-03 06:30:33*

**Authors:** HU Lifanga, HE You, GUAN Xin, DENG Yong, HAN Deqiang

**Comments:** 12 Pages.

The mapping from the belief domain to the probability domain is a controversial issue. Its original purpose is to make (hard) decisions, but, contrary to a widespread erroneous idea/claim, this is not the only use of such mappings nowadays.
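For readers outside this literature: one of the best-known belief-to-probability mappings is the pignistic transformation, which splits each focal element's mass evenly among its singleton members. A minimal sketch (a generic illustration, not the specific mapping debated in the paper):

```python
def pignistic(m):
    """Pignistic transformation BetP: each focal element's mass is
    split equally among its singleton members."""
    betp = {}
    for focal, mass in m.items():
        for x in focal:
            betp[x] = betp.get(x, 0.0) + mass / len(focal)
    return betp

# Mass function on the frame {a, b}; keys are frozensets (focal elements).
m = {frozenset({"a"}): 0.4, frozenset({"a", "b"}): 0.6}
betp = pignistic(m)
print({k: round(v, 6) for k, v in sorted(betp.items())})  # {'a': 0.7, 'b': 0.3}
```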

**Category:** General Mathematics

[33] **viXra:1701.0038 [pdf]**
*submitted on 2017-01-03 06:32:52*

**Authors:** Şerif Özlü, İrfan Deli

**Comments:** 6 Pages.

In this paper, we give a new similarity measure on npn-soft set theory which is an extension of the correlation measure of neutrosophic refined sets. Using the similarity measure, we propose a new method for decision-making problems. Finally, we give an example showing how the diagnosis of diseases could be improved by incorporating clinical results and other competing diagnoses in an npn-soft environment.
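The paper's npn-soft measure is not reproduced here, but a common Hamming-distance-based similarity for single-valued neutrosophic sets gives the flavor of how such measures drive diagnosis: the candidate disease whose prototype is most similar to the patient's symptom profile is selected. A hedged sketch with hypothetical profiles:

```python
def svn_similarity(A, B):
    """Hamming-distance-based similarity between two single-valued
    neutrosophic sets, given as lists of (truth, indeterminacy, falsity)
    triples with components in [0, 1]."""
    dist = sum(abs(t1 - t2) + abs(i1 - i2) + abs(f1 - f2)
               for (t1, i1, f1), (t2, i2, f2) in zip(A, B)) / (3 * len(A))
    return 1 - dist

# Hypothetical profiles over two symptoms.
patient = [(0.8, 0.1, 0.1), (0.3, 0.4, 0.5)]
disease = [(0.7, 0.2, 0.1), (0.4, 0.4, 0.4)]
print(round(svn_similarity(patient, disease), 4))  # 0.9333, higher = closer match
```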

**Category:** General Mathematics

[32] **viXra:1701.0036 [pdf]**
*submitted on 2017-01-03 06:36:38*

**Authors:** Partha Pratim Dey, Surapati Pramanik, Bibhas Chandra Giri

**Comments:** 12 Pages.

The paper proposes a multi-attribute decision-making method based on extended grey relational analysis under an interval neutrosophic environment. The interval neutrosophic set is an important decision-making apparatus that can handle imprecise, indeterminate, and inconsistent information.
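For orientation, the core of classical (crisp) grey relational analysis, which the paper extends, is the grey relational coefficient: a per-criterion score of how close a candidate sequence lies to a reference sequence. A minimal per-sequence simplification (in full GRA the min/max deltas are taken over all alternatives), assuming sequences already normalized to [0, 1]:

```python
def grey_relational_coeffs(reference, comparison, zeta=0.5):
    """Grey relational coefficients between a reference sequence and one
    comparison sequence; zeta is the distinguishing coefficient,
    conventionally 0.5."""
    deltas = [abs(r - c) for r, c in zip(reference, comparison)]
    d_min, d_max = min(deltas), max(deltas)
    return [(d_min + zeta * d_max) / (d + zeta * d_max) for d in deltas]

ref = [1.0, 1.0, 1.0]   # ideal alternative, per criterion
alt = [0.9, 0.6, 1.0]   # candidate alternative, per criterion
coeffs = grey_relational_coeffs(ref, alt)
print([round(c, 4) for c in coeffs])  # [0.6667, 0.3333, 1.0]
```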

**Category:** General Mathematics

[31] **viXra:1701.0035 [pdf]**
*submitted on 2017-01-03 06:38:10*

**Authors:** Peide Liu, Lili Zhang

**Comments:** 15 Pages.

The neutrosophic hesitant fuzzy set is a generalization of the neutrosophic set and the hesitant fuzzy set, which can easily express the uncertain, incomplete, and inconsistent information in cognitive activity; the VIKOR (from the Serbian: VIseKriterijumska Optimizacija I Kompromisno Resenje) method is an effective decision-making tool which can select the optimal alternative by the maximum “group utility” and minimum “individual regret” with cognitive computation.
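For context, a sketch of the classical crisp VIKOR procedure that the paper extends to neutrosophic hesitant fuzzy information: compute the group utility S, the individual regret R, and the compromise index Q, then prefer the alternative with the lowest Q. All data below are illustrative:

```python
def vikor(matrix, weights, v=0.5):
    """Classical VIKOR for benefit criteria: returns group utility S,
    individual regret R, and compromise index Q per alternative.
    Assumes non-degenerate criterion and S/R ranges."""
    n = len(weights)
    best = [max(row[j] for row in matrix) for j in range(n)]
    worst = [min(row[j] for row in matrix) for j in range(n)]
    S, R = [], []
    for row in matrix:
        d = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
             for j in range(n)]
        S.append(sum(d))  # group utility (lower is better)
        R.append(max(d))  # individual regret (lower is better)
    Q = [v * (s - min(S)) / (max(S) - min(S))
         + (1 - v) * (r - min(R)) / (max(R) - min(R))
         for s, r in zip(S, R)]
    return S, R, Q

# Three alternatives scored on two equally weighted benefit criteria.
S, R, Q = vikor([[7, 9], [8, 8], [9, 6]], [0.5, 0.5])
print(Q)  # [1.0, 0.0, 1.0]: alternative 2 (index 1) is the best compromise
```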

**Category:** General Mathematics

[30] **viXra:1701.0032 [pdf]**
*submitted on 2017-01-03 06:43:09*

**Authors:** JIN Hong-bin, LAN Jiang-qiao

**Comments:** 5 Pages.

The computational complexity of evidence theory is a hot issue in current research. Dezert-Smarandache theory (DSmT) introduces conflicting focal elements, which makes the computational complexity increase sharply. This paper starts from the focal element control rule used by most approximate calculation methods. The examples show that the improved rule is effective and feasible in both the Shafer model and the DSm model.

**Category:** General Mathematics

[29] **viXra:1701.0031 [pdf]**
*submitted on 2017-01-03 06:46:34*

**Authors:** Zhang-peng Tian, Jing Wang, Jian-qiang Wang, Hong-yu Zhang

**Comments:** 13 Pages.

Multi-objective optimization by ratio analysis plus the full multiplicative form (MULTIMOORA) is a useful method to apply in multi-criteria decision-making due to the flexibility and robustness it introduces into the decision process.

**Category:** General Mathematics

[28] **viXra:1701.0030 [pdf]**
*submitted on 2017-01-03 06:50:50*

**Authors:** P. Kechichian, B. Champagne

**Comments:** 14 Pages.

Recently, a coupled echo canceller was proposed that uses two short adaptive filters for sparse echo cancellation.
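For readers unfamiliar with the baseline: an adaptive echo canceller identifies the echo path with an adaptive filter (commonly NLMS) and subtracts the estimated echo from the microphone signal. A minimal single-filter NLMS sketch on synthetic data (the paper's coupled two-filter scheme is not reproduced here):

```python
import random

def nlms_echo_cancel(far_end, mic, taps=16, mu=0.5, eps=1e-8):
    """Normalized LMS echo canceller: adapt an FIR estimate of the echo
    path and return the echo-cancelled error signal."""
    w = [0.0] * taps            # adaptive filter weights
    buf = [0.0] * taps          # recent far-end samples, newest first
    errors = []
    for x, d in zip(far_end, mic):
        buf = [x] + buf[:-1]
        y = sum(wi * xi for wi, xi in zip(w, buf))   # estimated echo
        e = d - y                                    # cancelled signal
        norm = sum(xi * xi for xi in buf) + eps
        w = [wi + mu * e * xi / norm for wi, xi in zip(w, buf)]
        errors.append(e)
    return errors

# Synthetic echo path: 2-sample delay with gain 0.6, white far-end signal.
random.seed(0)
far = [random.uniform(-1, 1) for _ in range(2000)]
mic = [0.0, 0.0] + [0.6 * x for x in far[:-2]]
err = nlms_echo_cancel(far, mic)
print(sum(e * e for e in err[-200:]))  # residual echo energy, near zero after convergence
```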

**Category:** General Mathematics

[27] **viXra:1701.0029 [pdf]**
*submitted on 2017-01-03 06:52:16*

**Authors:** Hong-yu Zhang, Pu Ji, Jian-qiang Wang, Xiao-hong Chen

**Comments:** 18 Pages.

This paper presents a new correlation coefficient measure, which satisfies the requirement that the measure equals one if and only if two interval neutrosophic sets (INSs) are the same. An objective weighting of INSs is also presented to unearth and utilize deeper information that is uncertain.

**Category:** General Mathematics

[26] **viXra:1701.0028 [pdf]**
*submitted on 2017-01-03 06:53:41*

**Authors:** P. Rajeswara Reddy, I. Naga Raju, V. Diwakar Reddy, G. Krishnaiah

**Comments:** 7 Pages.

In today’s practical work environment, group decision making is essential to choose the best alternative from a set of alternatives characterized by multiple criteria. In manufacturing environments, frequent group decision making is common practice and involves conflicting, multi-criteria problems.

**Category:** General Mathematics

[25] **viXra:1701.0027 [pdf]**
*submitted on 2017-01-03 06:54:53*

**Authors:** Yin-xiang Ma, Jian-qiang Wang, Jing Wang, Xiao-hui Wu

**Comments:** 21 Pages.

Selecting medical treatments is a critical activity in medical decision-making. Usually, medical treatments are selected by doctors, patients, and their families based on various criteria.

**Category:** General Mathematics

[24] **viXra:1701.0026 [pdf]**
*submitted on 2017-01-03 06:59:41*

**Authors:** K.M. Amin, A.I.Shahin, Yanhui Guo

**Comments:** 11 Pages.

Many studies have confirmed the seriousness of breast cancer as one of the most lethal tumors affecting women worldwide.

**Category:** General Mathematics

[23] **viXra:1701.0025 [pdf]**
*submitted on 2017-01-03 07:04:28*

**Authors:** Hong-yu Zhang, Pu Ji, Jian-qiang Wang, Xiao-hong Chen

**Comments:** 17 Pages.

Decision support models for selecting satisfactory restaurants have attracted numerous researchers' attention.

**Category:** General Mathematics

[22] **viXra:1701.0024 [pdf]**
*submitted on 2017-01-03 07:06:00*

**Authors:** Keli Hu, Jun Ye, En Fan, Shigen Shen, Longjun Huang, Jiatian Pi

**Comments:** 13 Pages.

Although appearance-based trackers have been greatly improved in the last decade, they are still struggling with challenges like occlusion, blur, fast motion, deformation, etc. As is well known, occlusion remains one of the major challenges for visual tracking.

**Category:** General Mathematics

[21] **viXra:1701.0023 [pdf]**
*submitted on 2017-01-03 07:07:13*

**Authors:** Jenice Aroma R., Kumudha Raimond

**Comments:** 5 Pages.

Satellite image based applications are highly utilized nowadays, from simple purposes like vehicle navigation to complex surveillance and virtual environment modeling projects. With an increasing population, the depletion of natural resources is unavoidable, and it leads to increased threats from natural hazards. In order to protect against and mitigate the physical losses from the devastation of property, risk mapping models such as weather forecasts, drought modeling, and other hazard assessment models are needed.

**Category:** General Mathematics

[20] **viXra:1701.0021 [pdf]**
*submitted on 2017-01-03 07:13:01*

**Authors:** A. Elhassouny, S. Idbraim, A. Bekkari, D. Mammass, D. Ducrot

**Comments:** 11 Pages.

In this paper, we introduce a new procedure called DSmT-ICM with adaptive decision rule, which is an alternative and extension of Multisource Classification Using ICM (Iterated Conditional Mode) and Dempster-Shafer theory (DST).

**Category:** General Mathematics

[19] **viXra:1701.0019 [pdf]**
*submitted on 2017-01-03 07:16:19*

**Authors:** J. Martina Jency, I. Arockiarani

**Comments:** 4 Pages.

In this paper a new approach is proposed to meet the challenges in medical diagnosis using the fuzzy neutrosophic composition relation.

**Category:** General Mathematics

[18] **viXra:1701.0018 [pdf]**
*submitted on 2017-01-03 07:19:33*

**Authors:** Romualdas Bausys, Edmundas Kazimieras Zavadskas, Artūras Kaklauskas

**Comments:** 15 Pages.

The paper presents multicriteria decision making method with single value neutrosophic sets (SVNS), namely COPRAS-SVNS. The complex proportional assessment method (COPRAS) has shown accurate results for the solution of various multicriteria decision making problems in the engineering field.

**Category:** General Mathematics

[17] **viXra:1701.0017 [pdf]**
*submitted on 2017-01-03 07:22:23*

**Authors:** Simon Carladous, Jean-Marc Tacnet, Jean Dezert

**Comments:** 10 Pages.

Experts take into account several criteria to assess the eﬀectiveness of torrential ﬂood protection systems. In practice, scoring each criterion is imperfect.

**Category:** General Mathematics

[16] **viXra:1701.0016 [pdf]**
*replaced on 2017-01-03 08:04:26*

**Authors:** Luke Kenneth Casson Leighton

**Comments:** 19 Pages.

This document is in effect a journal of the past thirty years of exploring particle physics, with a special focus on the electron. With the exception of this abstract, a first-person dialog has unusually been chosen after discovering that it can be more effective in communicating certain logical reasoning chains of thought. The story begins in 1986 with the rediscovery of the Rishon Model, later expanded in 2012, followed by an exploration of possible meaning as to why the four Rishons would exist at all, and why they would exist as triplets: what possible physical underlying mechanism would give us "Rishons"? The following hypothesis is therefore put forward:
All evidence explored so far supports the hypothesis that all particles are made of phased-array photons in a tight and infinitely-cyclic recurring loop, in a self-contained non-radiating E.M. field that obeys nothing more than Maxwell's Equations (applied from first principles), with the addition that particles that are not non-radiating are going to be unstable to some degree (i.e. will undergo "decay"). Rishons themselves are not actual particles per se but simply represent the phase and braiding order of the constituent photons.
A number of researchers have explored parts of this field, but have not pulled all of the pieces together.

**Category:** High Energy Particle Physics

[15] **viXra:1701.0015 [pdf]**
*submitted on 2017-01-03 01:26:13*

**Authors:** Mahendra Kumar Trivedi, Alice Branton, Dahryn Trivedi, Gopal Nayak, Barry Dean Wellborn, Deborah Lea Smith, Dezi Ann Koster, Elizabeth Patric, Jagdish Singh, Kathleen Starr Vagt, Krista Joanne Callas, Olga Mirgalijeva

**Comments:** 10 Pages.

A proprietary herbomineral formulation was formulated with four ingredients: a mixture of the minerals (zinc, magnesium, and selenium) and the herbal root extract ashwagandha. The aim of the study was to evaluate the immunomodulatory potential of Biofield Energy Healing (The Trivedi Effect®) on the herbomineral formulation in splenocyte cells, which were isolated from Biofield Treated mice. The test formulation was divided into two parts. One part was denoted as the control without any Biofield Energy Treatment, while the other part was defined as the Biofield Energy Treated sample, which received the Biofield Energy Healing Treatment remotely from seven renowned Biofield Energy Healers. The splenocyte cells were treated with the test formulation at concentrations ranging from 0.00001053 to 10.53 µg/mL and analyzed after 48 hours of treatment by MTT assay. The cell viability data showed safe concentrations up to 1.053 µg/mL with viability ranging from 69.22% to 123.88% in the test formulation groups. The expression of TNF-α was decreased by 4.82% at 1.053 µg/mL in the Biofield Energy Treated test formulation compared with the vehicle control. The level of TNF-α was significantly decreased by 2.02%, 4.92%, and 18.78% at 0.00001053, 0.001053, and 1.053 µg/mL, respectively in the Biofield Energy Treated test formulation group as compared to the untreated test formulation. The expression of IL-1β was significantly reduced by 83.65%, 92.15%, 27.30%, and 41.88% at 0.00001053, 0.0001053, 0.001053, and 1.053 µg/mL, respectively in the Biofield Energy Treated test formulation compared with the vehicle control. The Biofield Treated test formulation showed significant reduction of IL-1β by 17.26%, 92.61% (p≤0.001), 34.62% (p≤0.05), and 16.13% at 0.00001053, 0.0001053, 0.001053, and 1.053 µg/mL, respectively compared with the untreated test formulation.
Additionally, the expression of chemokine MIP-1α was significantly reduced by 17.03%, 10.99%, 22.33%, 24.21%, 21.61%, and 30.67% at 0.00001053, 0.0001053, 0.001053, 0.01053, 0.1053, and 1.053 µg/mL, respectively in the Biofield Treated test formulation compared with the vehicle control. The MIP-1α expression was significantly reduced by 19.32% and 12.56% at 0.01053 and 0.1053 µg/mL, respectively in the Biofield Treated test formulation compared with the untreated test formulation. The overall results demonstrated that the Biofield Energy Treated test formulation significantly down-regulated the expression of TNF-α, IL-1β, and MIP-1α in the Biofield Treated mice splenocyte cells compared to the untreated test formulation. These data suggest that the Biofield Treated test formulation can be used for autoimmune and inflammatory diseases, stress management and anti-aging by improving overall health.
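The viability percentages quoted above come from an MTT readout; as a generic, hedged illustration (not the authors' protocol), viability is conventionally the blank-corrected optical density of the treated well expressed as a percentage of the vehicle control:

```python
def percent_viability(od_treated, od_control, od_blank=0.0):
    """MTT assay readout: viability of a treated well relative to the
    vehicle control, after blank subtraction, as a percentage."""
    return 100.0 * (od_treated - od_blank) / (od_control - od_blank)

# Hypothetical optical densities (e.g. absorbance at 570 nm).
print(percent_viability(0.62, 0.50))  # 124.0: values above 100% read as proliferation
```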

**Category:** Biochemistry

[14] **viXra:1701.0014 [pdf]**
*replaced on 2017-02-06 00:15:30*

**Authors:** Barry Foster

**Comments:** 2 Pages.

This is a two-page attempt using simple concepts

**Category:** Number Theory

[13] **viXra:1701.0013 [pdf]**
*submitted on 2017-01-03 01:45:00*

**Authors:** Koji KOBAYASHI

**Comments:** 4 Pages.

This paper describes an approach to P vs NP using topology. We modify the computation history into a “problem forest”, and define special problem families, the “Wildcard problem” and the “Maximal complement Wildcard problem”, to simplify the relations between inputs.
The “problem forest” is a directed graph with transition-function edges and computational-configuration nodes with the effective range of the tape. The problem forest of a DTM is a two-tree graph whose roots are the accepting and rejecting configurations, whose leaves are the inputs, and whose trunks are computational configurations with the effective range of the tape. This tree shows the TM's interpretation of the symmetry and asymmetry of each input. From the viewpoint of the problem forest, some NTM inputs are partly merged, while all DTM inputs are totally separated. Therefore an NTM can implicitly compute some types of partial (symmetry) overlap, which a DTM has to compute explicitly.
“WILDCARD (the Wildcard problem family)” and “MAXCARD (the Maximal complement Wildcard problem family)” are special problem families that push the variations of NTM branches into the inputs. If a “CONCRETE (Concrete Problem)” that generates MAXCARD is P-complete, then MAXCARD is in PH, and its inputs have many overlaps. A DTM cannot compute these overlap conditions implicitly, and these conditions are necessary to compute a MAXCARD input, so a DTM has to compute them explicitly. These conditions are of super-polynomial size, and a DTM takes super-polynomially many steps to compute them explicitly. That is, PH is not P, and NP is not P.

**Category:** General Science and Philosophy

[12] **viXra:1701.0012 [pdf]**
*submitted on 2017-01-02 10:39:11*

**Authors:** Clive Jones

**Comments:** 2 Pages.

An exploration of prime-number summing grids

**Category:** Number Theory

[11] **viXra:1701.0011 [pdf]**
*submitted on 2017-01-02 11:22:36*

**Authors:** George Rajna

**Comments:** 25 Pages.

Researchers have built a record energy-efficient switch, which uses the interplay of electricity and a liquid form of light, in semiconductor microchips. The device could form the foundation of future signal processing and information technologies, making electronics even more efficient. [19] The magnetic structure of a skyrmion is symmetrical around its core; arrows indicate the direction of spin. [18] According to current estimates, dozens of zettabytes of information will be stored electronically by 2020, which will rely on physical principles that facilitate the use of single atoms or molecules as basic memory cells. [17] EPFL scientists have developed a new perovskite material with unique properties that can be used to build next-generation hard drives. [16] Scientists have fabricated a superlattice of single-atom magnets on graphene with a density of 115 terabits per square inch, suggesting that the configuration could lead to next-generation storage media. [15] Now a researcher and his team at Tyndall National Institute in Cork have made a 'quantum leap' by developing a technical step that could enable the use of quantum computers sooner than expected. [14] A method to produce significant amounts of semiconducting nanoparticles for light-emitting displays, sensors, solar panels and biomedical applications has gained momentum with a demonstration by researchers at the Department of Energy's Oak Ridge National Laboratory. [13] A source of single photons that meets three important criteria for use in quantum-information systems has been unveiled in China by an international team of physicists. Based on a quantum dot, the device is an efficient source of photons that emerge as solo particles that are indistinguishable from each other. The researchers are now trying to use the source to create a quantum computer based on "boson sampling". 
[11] With the help of a semiconductor quantum dot, physicists at the University of Basel have developed a new type of light source that emits single photons. For the first time, the researchers have managed to create a stream of identical photons. [10] Optical photons would be ideal carriers to transfer quantum information over large distances. Researchers envisage a network where information is processed in certain nodes and transferred between them via photons. [9] While physicists are continually looking for ways to unify the theory of relativity, which describes large-scale phenomena, with quantum theory, which describes small-scale phenomena, computer scientists are searching for technologies to build the quantum computer using Quantum Information. In August 2013, the achievement of "fully deterministic" quantum teleportation, using a hybrid technique, was reported. On 29 May 2014, scientists announced a reliable way of transferring data by quantum teleportation. Quantum teleportation of data had been done before but with highly unreliable methods. The accelerating electrons explain not only the Maxwell Equations and Special Relativity, but also the Heisenberg Uncertainty Relation, the Wave-Particle Duality and the electron's spin, building the bridge between the Classical and Quantum Theories. The Planck Distribution Law of the electromagnetic oscillators explains the electron/proton mass rate and the Weak and Strong Interactions by the diffraction patterns. The Weak Interaction changes the diffraction patterns by moving the electric charge from one side to the other side of the diffraction pattern, which violates the CP and Time reversal symmetry. The diffraction patterns and the locality of the self-maintaining electromagnetic potential also explain Quantum Entanglement, making it a natural part of the Relativistic Quantum Theory and making it possible to build the Quantum Computer with the help of Quantum Information.

**Category:** Quantum Physics

[10] **viXra:1701.0010 [pdf]**
*submitted on 2017-01-02 09:46:24*

**Authors:** Johan Noldus

**Comments:** 2 Pages.

I present a short note on my expectations regarding this subject, based upon my thoughts in my previous two books on relativistic quantum theory and a note on the emergence of consensus in science.

**Category:** History and Philosophy of Physics

[9] **viXra:1701.0009 [pdf]**
*submitted on 2017-01-02 10:00:30*

**Authors:** Ilija Barukčić

**Comments:** 9 Pages. Copyright © 2016 by Ilija Barukčić, Jever, Germany. Published by Journal of Biosciences and Medicines, Vol.5 No. 2, p. 1-9. https://doi.org/10.4236/jbm.2017.52001

Background: Many studies have documented an association between Helicobacter pylori infection and the development of human gastric cancer. None of these studies was able to identify Helicobacter pylori as a cause, or as the cause, of human gastric cancer. The basic relation between gastric cancer and Helicobacter pylori still remains uncertain.
Objectives: This systematic review and re-analysis of the available long-term, prospective study of 1526 Japanese patients by Naomi Uemura et al. is performed so that some new and meaningful inferences can be drawn.
Materials and Methods: Data obtained by Naomi Uemura et al., who conducted a long-term, prospective study of 1526 Japanese patients with a mean follow-up of about 7.8 years and endoscopy at enrolment and between one and three years after enrolment, were re-analysed.
Statistical analysis used:
The method of the conditio sine qua non relationship was used to prove the hypothesis: without a Helicobacter pylori infection, no development of human gastric cancer. The mathematical formula of the causal relationship was used to test the hypothesis of whether there is a cause-effect relationship between a Helicobacter pylori infection and human gastric cancer. Significance was indicated by a P value of less than 0.05.
Results:
Based on the data published by Uemura et al., we were able to provide evidence that without a Helicobacter pylori infection there is no development of human gastric cancer. In other words, a Helicobacter pylori infection is a conditio sine qua non of human gastric cancer. In the same respect, the data of Uemura et al. provide significant evidence that a Helicobacter pylori infection is the cause of human gastric cancer.
Conclusions:
Without a Helicobacter pylori infection, no development of human gastric cancer. Helicobacter pylori is the cause (k = +0.07368483, P value = 0.00399664) of human gastric cancer.
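The reported coefficient k is computed from the 2x2 table of infection status versus cancer outcome, and in form it matches the phi (mean-square contingency) coefficient of a 2x2 table; that reduction, and the counts below, are our illustrative assumptions, chosen so the result reproduces the reported k:

```python
import math

def phi_2x2(a, b, c, d):
    """Phi (mean-square contingency) coefficient of a 2x2 table:
        a = exposed & diseased,    b = exposed & not diseased,
        c = unexposed & diseased,  d = unexposed & not diseased."""
    return (a * d - b * c) / math.sqrt((a + b) * (c + d) * (a + c) * (b + d))

# Illustrative counts with disease occurring only among the exposed (c = 0),
# the conditio-sine-qua-non pattern.
print(phi_2x2(36, 1210, 0, 280))  # ≈ 0.0737 (cf. the reported k = +0.07368483)
```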

**Category:** Statistics

[8] **viXra:1701.0008 [pdf]**
*submitted on 2017-01-02 04:55:37*

**Authors:** Ryujin Choe

**Comments:** 2 Pages.

There are infinitely many twin primes

**Category:** Number Theory

[7] **viXra:1701.0007 [pdf]**
*submitted on 2017-01-01 21:36:01*

**Authors:** John R. Springer

**Comments:** 17 Pages.

A new model of particle structure is presented for the lowest stable hadrons and leptons which shows, first, the complete internal quark/gluon structure of the proton, neutron, eta, neutral kaon, and neutral pion mesons and (surprisingly) the muon. It can be extended to include, without gluons, the electron, neutrinos, and even the photon.
Second, it shows the origin of mass. While mass cannot be assigned to individual quarks which do not exist alone, it can be assigned in totality to a small number of gluons of positive and negative associated mass (±14me). This makes the basic unit of mass the electron mass.
Third, it shows that mixing of internal quark states (like the neutral kaon) is common in all particles. In fact, it shows the source of mixing, entanglement, and oscillations.
The key to this discovery is the finding that quarks do not exist as single isolated quark-antiquark pairs but only as triads and antitriads. Quark-antiquark pairing does occur but only within a quark triad-antitriad pair. With these claims, a thorough analysis of particle properties, especially mass, yields the precise structure and mass of internal structures; essentially a small number (possibly a string or helix) of quark triad-antitriad pairs. The proton and neutron, in addition, each contain one unpaired triad: uud and ddu, respectively. Proof for the model is presented which involves a quark structure for the muon and its subsequent decay.

**Category:** High Energy Particle Physics

[6] **viXra:1701.0006 [pdf]**
*replaced on 2017-01-17 11:32:58*

**Authors:** Luke Kenneth Casson Leighton

**Comments:** 4 Pages.

The de Vries formula, discovered in 2004, is undeniably accurate to current experimental and theoretical measurements (3.1e-10 to within CODATA 2014's value, currently 2.3e-10 relative uncertainty). Its Kolmogorov Complexity is extremely low, and it is as elegant as Euler's Identity formula. Having been discovered by a Silicon Design Engineer, no explanation was offered except for the hint that it is based on the well-recognised first approximation for g/2: 1 + alpha / 2pi. Purely taking the occurrence of the fine structure constant in the electron, in light of G Poelz and Dr Mills' work, as well as the Ring Model of the early 1900s, this paper offers a tentative explanation for alpha as being a careful dynamic balanced inter-relationship between each radiated loop as emitted from whatever constitutes the "source" of the energy at the heart of the electron. Mills and the original Ring Model use the word "nonradiating", which is believed to be absolutely critical.
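The "well-recognised first approximation for g/2" mentioned above is Schwinger's one-loop term, which is easy to check numerically against the measured electron g/2:

```python
import math

alpha = 1 / 137.035999              # fine-structure constant (CODATA 2014 region)
g_half_measured = 1.00115965218     # measured electron g/2

g_half_schwinger = 1 + alpha / (2 * math.pi)  # first approximation: 1 + alpha/2pi
print(g_half_schwinger)                        # ≈ 1.0011614
print(g_half_schwinger - g_half_measured)      # ≈ 1.8e-6; higher-order terms make up the rest
```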

**Category:** High Energy Particle Physics

[5] **viXra:1701.0005 [pdf]**
*submitted on 2017-01-02 04:09:13*

**Authors:** Mahendra Kumar Trivedi, Alice Branton, Dahryn Trivedi, Gopal Nayak, Barry Dean Wellborn, Deborah Lea Smith, Dezi Ann Koster, Elizabeth Patric, Jagdish Singh, Kathleen Starr Vagt, Krista Joanne Callas, Olga Mirgalijeva

**Comments:** 10 Pages.

The use of herbomineral formulations in the healthcare sector has increased due to their high safety and better therapeutic action. A new proprietary herbomineral formulation was formulated with a combination of the herbal root extract ashwagandha and minerals viz. zinc, magnesium, and selenium. The aim of the study was to evaluate the immunomodulatory potential of Biofield Energy Healing (The Trivedi Effect®) on the test formulation in splenocytes. The test formulation was divided into two parts; one was the control without the Biofield Energy Treatment, while the other part was defined as the Biofield Energy Treated sample, which received the Biofield Energy Healing Treatment remotely by seven renowned Biofield Energy Healers. The MTT assay showed that the test formulation exhibited safe concentrations up to 1.053 µg/mL with cell viability ranging from 88.19% to 110.30% in the Biofield Energy Treated sample. The Biofield Energy Healing significantly enhanced the cell viability as compared with the untreated test formulation. The expression of TNF-α was significantly inhibited by 7.23% at 1.053 µg/mL in the Biofield Energy Treated test formulation compared with the vehicle control. The level of TNF-α was significantly decreased by 3.90%, 11.74%, 3.12%, and 9.17% at 0.001053, 0.01053, 0.1053 and 1.053 µg/mL, respectively in the Biofield Energy Treated test formulation compared with the untreated test formulation. Additionally, the expression of IL-1β was significantly reduced by 28.98%, 51.23%, 53.06%, 48.98%, 55.71%, and 59.10% at 0.00001053, 0.0001053, 0.001053, 0.01053, 0.1053, and 1.053 µg/mL, respectively in the Biofield Energy Treated test formulation compared with the vehicle control. Further, the Biofield Treated test formulation showed significant reduction of IL-1β by 35.07% (p≤0.05), 47.46% (p≤0.05), and 57.51% (p≤0.01) at 0.001053, 0.1053, and 1.053 µg/mL, respectively compared with the untreated test formulation. 
Similarly, the MIP-1α expression was inhibited by the Biofield Energy Treated formulation and showed immunosuppressive activity at 0.00001053, 0.0001053, 0.001053, and 0.01053 µg/mL by 19.38%, 24.97%, 31.23%, and 25.41%, respectively compared with the vehicle control group. The Biofield Treated test formulation significantly (p<0.001) reduced the MIP-1α expression by 19.33%, 22.57%, and 30.50% at 0.0001053, 0.001053, and 0.01053 µg/mL, respectively compared with the untreated test formulation. Overall, The Trivedi Effect®- Biofield Energy Healing (TEBEH) significantly down-regulated the pro-inflammatory cytokines and potentiated the immunosuppressive effect of the Biofield Energy Treated test formulation, which can be better utilized for organ transplants, autoimmune diseases, inflammatory disorders, anti-aging, stress management, overall health and quality of life, etc.

**Category:** Biochemistry

[4] **viXra:1701.0004 [pdf]**
*submitted on 2017-01-02 04:10:25*

**Authors:** Mahendra Kumar Trivedi, Alice Branton, Dahryn Trivedi, Gopal Nayak, Michael Peter Ellis, James Jeffery Peoples, James Joseph Meuer, Johanne Dodon, John Lawrence Griffin, John Suzuki, Joseph Michael Foty, Judy Weber, Julia Grace McCammon

**Comments:** 10 Pages.

With the increasing popularity of herbomineral preparations in healthcare, a new proprietary herbomineral formulation was formulated with ashwagandha root extract and minerals viz. zinc, magnesium, and selenium. The aim of the study was to evaluate the immunomodulatory potential of Biofield Energy Healing (The Trivedi Effect®) on the herbomineral test formulation using mice splenocytes. The test formulation was divided into two parts. One part was the control without the Biofield Treatment. The other part was labelled the Biofield Treated sample, which received the Biofield Energy Healing Treatment remotely from twenty renowned Biofield Energy Healers. The splenocyte cells were exposed to the test formulation at concentrations ranging from 0.00001053 to 10.53 µg/mL for cell viability by MTT assay, with cell viability ranging from 77.50% to 176.52%. TNF-α was significantly inhibited by 15.88%, 15.28%, 12.30%, 12.60%, and 22.72% at 0.00001053, 0.001053, 0.1053, 1.053, and 10.53 µg/mL, respectively in the Biofield Treated test formulation compared to the vehicle control (VC). TNF-α was significantly reduced by 2.33% and 8.35% at 1.053 and 10.53 µg/mL, respectively compared to the untreated test formulation. IL-1β was significantly reduced by 30.81%, 27.36%, 23.92%, 18.40%, 11.27%, and 21.16% at 0.00001053, 0.0001053, 0.001053, 0.01053, 0.1053, and 1.053 µg/mL, respectively in the Biofield Treated test formulation compared to the VC. IL-1β was significantly reduced by 48.63% (p≤0.001) and 15.28% at 0.00001053 and 0.0001053 µg/mL, respectively in the Biofield Treated test formulation compared to the untreated test formulation. MIP-1α expression was inhibited by the Biofield Treated test formulation and showed immunosuppressive activity at 0.01053, 0.1053, 1.053, and 10.53 µg/mL by 22.33%, 16.25%, 15.58%, and 21.83%, respectively compared to the VC.
The Biofield Treated test formulation significantly reduced the MIP-1α expression by 13.27% and 15.67% (p<0.05) at 0.01053 and 10.53 µg/mL, respectively compared to the untreated test formulation. The results showed the expression of IFN-γ was significantly reduced by 33.45%, 25.38%, 37.15%, 27.74%, 32.44%, 23.03%, and 44.21% at 0.00001053, 0.0001053, 0.001053, 0.01053, 0.1053, 1.053, and 10.53 µg/mL, respectively in the Biofield Treated test formulation compared to the VC. Further, the IFN-γ level was significantly decreased by 19.02% at 10.53 µg/mL in the Biofield Treated test formulation compared to the untreated test formulation. Overall, the results demonstrate that The Trivedi Effect® Biofield Energy Healing (TEBEH) significantly enhanced the anti-inflammatory and immunomodulatory properties of the treated formulation, and may also be useful in organ transplants, anti-aging, and stress management by improving overall health and quality of life.
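The inhibition figures quoted throughout this abstract are percent reductions relative to the vehicle control, measured over a decade dilution series. A minimal sketch of that arithmetic (the function name and the readings below are illustrative assumptions, not the study's raw data) might look like:

```python
# Hypothetical sketch of the percent-inhibition arithmetic reported in the
# abstract: each value is a percent reduction relative to the vehicle control.
# The numeric readings here are illustrative, not the study's measurements.

def percent_inhibition(treated: float, control: float) -> float:
    """Percent reduction of a cytokine reading relative to the control."""
    return (1.0 - treated / control) * 100.0

# Decade dilution series quoted in the abstract (µg/mL):
# 0.00001053, 0.0001053, ..., 10.53
dilutions = [0.00001053 * 10**i for i in range(7)]

if __name__ == "__main__":
    # A treated reading of 77.28 against a control of 100 gives 22.72% inhibition.
    print(round(percent_inhibition(77.28, 100.0), 2))  # → 22.72
```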

**Category:** Biochemistry

[3] **viXra:1701.0003 [pdf]**
*replaced on 2017-01-06 16:04:29*

**Authors:** Tian Hao, Yuanze Xu, Ting Hao

**Comments:** Pages.

On the basis of the very successful free volume theory, applicable over a wide range of length scales from electrons to small molecules, macromolecules, colloidal particles, and even granules, hypothetical particles dubbed “freevons” are proposed to fill the free volume available in any system, from the microscopic atomic world to the macroscopic universe. Eyring’s rate process theory, which has wide applicability to many chemical and physical phenomena, is assumed to govern the behavior of freevons. It turns out that for the universe to keep expanding in an accelerated manner, the freevons must form paired structures, like electrons paired in the superconducting state and helium-3 atoms paired in the superfluid state; at about 5 K a temperature-induced phase transition occurs, and the Hubble constant is predicted to increase dramatically, implying that the universe will inflate even more rapidly. The universe is therefore viewed as particles, the stars and galaxies, dispersed in a superfluid freevon sea, and gravity is considered to be induced by the force density, usually defined as the negative gradient of pressure in fluid mechanics. Newton’s gravity equation is thus easily obtained under these assumptions, and Coulomb’s law can be obtained by the same approach. In this superfluidity framework, the volumes rather than the masses of particles are found to be important in determining the gravitational forces. The expansion driving force comes from the activity or concentration gradient of freevons formed at the beginning of the big bang, and there is no need to postulate dark energy or dark matter to be responsible for the accelerated inflation; alternatively, freevons themselves can be considered “dark matter”. Our approach may bridge many different theories in astrophysics and provide a possibility of using quantum mechanics to study the universe through the quantum mechanical behavior of freevons.
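For reference, the fluid-mechanics definition of force density that the abstract invokes, together with the standard Newtonian and Coulomb force laws it claims to recover, can be written as follows (this restates textbook forms only, not the author's derivation):

```latex
% Force density as the negative pressure gradient (standard fluid mechanics),
% and the two inverse-square laws the abstract claims to reproduce:
\mathbf{f} = -\nabla p, \qquad
F_{\mathrm{grav}} = G\,\frac{m_1 m_2}{r^2}, \qquad
F_{\mathrm{Coul}} = \frac{1}{4\pi\varepsilon_0}\,\frac{q_1 q_2}{r^2}
```

How the pressure gradient of a freevon sea yields these inverse-square forms is the content of the paper itself and is not reproduced here.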

**Category:** Astrophysics

[2] **viXra:1701.0002 [pdf]**
*submitted on 2017-01-01 14:43:43*

**Authors:** Kenneth D. Oglesby

**Comments:** Pages.

MC Physics: Fundamental Force Unification Using Mono-Charges.
Quantization of the basic elemental electrostatic charge occurred in the earliest Universe, creating an uneven distribution of mono-charges (MCs) by charge type and charge strength. Mono-charges form all matter and cause all force in the Universe. Charge strength provides the inertial mass of all MCs and, therefore, of all matter.
All force (electrostatic, magnetic, strong nuclear, weak nuclear, and gravity) is electrostatic in nature and is instantly and continuously applied only between mono-charges. All force (as well as MC/matter/mass and Space) is modified by movement in Space (dS/dT), especially at high relativistic velocities. Magnetic poles and force are caused by moving mono-charges, indicating a link between magnetism, inertia, and relativity, all of which are resistances to spatial or velocity change. Magnetic force acts only on magnetic poles, i.e. moving MCs.
Following the F-SCoTt matter formation processes, the modified Coulomb's Law, and the modified Newton's Laws, cooling of the ultra-high-kinetic-energy early Universe allowed progressively weaker force bondings of quantized MCs to stably form particles, starting with quarks, then protons, nuclei, electrons, neutrinos, and lastly photons of light. From their joining processes, all particles have internal movements (rotation, vibration, wobble) that cause variations (oscillations, vibrations) in their projected forces, causing temporary charge imbalances and excess weaker MC joinings. Such movements cause stronger MCs to have more focused force projection source vectors than weaker MCs, which have a fuzzy collective force projection source from (mostly) neutralized matter. This is the most likely cause of the attractive force being slightly stronger than the repulsive force for masses, i.e. causing gravitational force.

**Category:** Nuclear and Atomic Physics

[1] **viXra:1701.0001 [pdf]**
*submitted on 2017-01-01 07:53:05*

**Authors:** George Rajna

**Comments:** 26 Pages.

Astronomers in the US are setting up an experiment which, if it fails – as others have – could mark the end of a 30-year-old theory. [22] Russian scientists have discovered that the proportion of unstable particles in the composition of dark matter in the days immediately following the Big Bang was no more than 2 percent to 5 percent. Their study has been published in Physical Review D. [21] Researchers from the University of Amsterdam's (UvA) GRAPPA Center of Excellence have just published the most precise analysis of the fluctuations in the gamma-ray background to date. [20] The Dark Energy Spectroscopic Instrument, called DESI, has an ambitious goal: to scan more than 35 million galaxies in the night sky to track the expansion of our universe and the growth of its large-scale structure over the last 10 billion years. [19] If the axion exists and is the main component of Dark Matter, the very relic axions that would be bombarding us continuously could be detected using microwave cavities resonant at the axion mass, immersed in powerful magnetic fields. [18] In yet another attempt to nail down the elusive nature of dark matter, a European team of researchers has used a supercomputer to develop a profile of the yet-to-be-detected entity that appears to pervade the universe. [17] MIT physicists are proposing a new experiment to detect a dark matter particle called the axion. If successful, the effort could crack one of the most perplexing unsolved mysteries in particle physics, as well as finally yield a glimpse of dark matter. [16] Researchers at Stockholm University are getting closer to light dark-matter particle models. Observations rule out some axion-like particles in the quest for the content of dark matter. The article is now published in Physical Review Letters. [15] Scientists have detected a mysterious X-ray signal that could be caused by dark matter streaming out of our Sun's core.
Hidden photons are predicted in some extensions of the Standard Model of particle physics, and unlike WIMPs they would interact electromagnetically with normal matter. In particle physics and astrophysics, weakly interacting massive particles, or WIMPs, are among the leading hypothetical particle physics candidates for dark matter. The gravitational force attracts matter, concentrating it in a small space and leaving much space with a low matter concentration: dark matter and energy. The asymmetry between the masses of the electric charges, for example the proton and the electron, can be understood by the asymmetrical Planck Distribution Law. This temperature-dependent energy distribution is asymmetric around the maximum intensity, where the annihilation of matter and antimatter is a high-probability event. The asymmetric sides create different frequencies of electromagnetic radiation at the same intensity level, compensating each other. One of these compensating ratios is the electron–proton mass ratio. The lower-energy side has no compensating intensity level; it is the dark energy, and the corresponding matter is the dark matter.

**Category:** Astrophysics